
Simplify Database Ad-Hoc Queries with Neurelo AI

Of all the use cases for generative AI (GenAI), it is software development that is currently generating the most breathless excitement (and in some quarters, skepticism). Depending on which survey you read, developers now write code 55% faster, or over 40% of all code is now written by AI — achieved simply by prompting an LLM with natural-language instructions. It's not all ponies and rainbows, however. Other surveys point out that the amount of code churned (reverted within two weeks of being committed) has doubled over the past three years.

Whichever stats you believe, there is no question that AI code assistance has become a mainstay of modern app development. 

Here at Neurelo, we see developers taking measured, practical approaches to AI assistance. They want to take advantage of GenAI smarts to automate tedious boilerplate, but maintain oversight of more complex and sophisticated tasks. That is exactly what we enable them to do with our complex query APIs with AI assist.

The Neurelo advantage for AI query generation

There are some great AI code assistants out there helping developers write app code. But many of those assistants struggle with the more specialized code needed to query the application database, especially when the app has to do anything more complex than simple point queries. Think filtering, joining, and aggregating data for real-time analytics, or transforming data for retrieval-augmented generation (RAG) and ML model training and inference. In these scenarios, using AI that intimately understands both the schema and the query syntax of your database unlocks massive gains in developer productivity, code quality, and application performance.
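
To make that concrete, here is the kind of query we have in mind, sketched by hand against a hypothetical orders/customers data model using the MongoDB Node.js driver. The collection and field names are invented for illustration; this is not output from Neurelo.

```typescript
import { MongoClient } from "mongodb";

// Hypothetical data model: an "orders" collection that references a "customers" collection.
// A filter + join + aggregate like this is where generic code assistants tend to stumble.
async function revenueByRegion(uri: string) {
  const client = new MongoClient(uri);
  await client.connect();
  try {
    const orders = client.db("shop").collection("orders");
    return await orders
      .aggregate([
        { $match: { status: "completed", createdAt: { $gte: new Date("2024-01-01") } } },
        { $lookup: { from: "customers", localField: "customerId", foreignField: "_id", as: "customer" } },
        { $unwind: "$customer" },
        { $group: { _id: "$customer.region", revenue: { $sum: "$total" }, orders: { $sum: 1 } } },
        { $sort: { revenue: -1 } },
      ])
      .toArray();
  } finally {
    await client.close();
  }
}
```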

How does AI assistance in Neurelo help you make those gains? Because it is implemented as part of a complete cloud data API platform for your database, it sends the LLM not just your prompt but also the metadata generated by our schema editor. Supplying your schema metadata as in-context learning lets the LLM understand the structure of your data model. On top of that, our LLM has been fine-tuned on the query syntax of your chosen database. Combining all this insight, context, and tuning, Neurelo automatically generates and optimizes the code needed to execute your most complex queries.
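
To illustrate the general idea of in-context schema learning (this is not Neurelo's actual internal pipeline, which isn't public), a minimal sketch using the OpenAI Node.js SDK might look like the following. The schema string, prompt wording, and model name are placeholders.

```typescript
import OpenAI from "openai";

// Illustration only: the user's prompt is augmented with schema metadata so the model
// can reason about the real data model rather than guessing at table and field names.
const schemaMetadata = `
model Customer { id: Int, region: String }
model Order { id: Int, customerId: Int -> Customer.id, total: Float, status: String, createdAt: DateTime }
`; // placeholder for what a schema editor might emit

async function generateQuery(userPrompt: string): Promise<string | null> {
  const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment
  const completion = await openai.chat.completions.create({
    model: "gpt-4o", // Neurelo uses a fine-tuned, database-aware model
    messages: [
      {
        role: "system",
        content: `Generate a PostgreSQL query for the data model below.\n${schemaMetadata}`,
      },
      { role: "user", content: userPrompt },
    ],
  });
  return completion.choices[0].message.content;
}
```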

AI Assist displays the generated query based on your prompt, as illustrated in the image below. You can review the query in the Neurelo playground to check it for correctness, make any necessary adjustments, or even regenerate it from scratch. Soon you'll also be able to feed the generated query into your database's explain plan for deeper evaluation and tuning.
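
Until that built-in explain-plan support ships, you can run the generated SQL through your database's planner yourself. Here is a rough sketch for PostgreSQL using the node-postgres (pg) client, assuming DATABASE_URL points at your database:

```typescript
import { Client } from "pg";

// Manual workaround: feed a generated query into PostgreSQL's EXPLAIN yourself.
// Note that ANALYZE actually executes the query; drop it to see only the estimated plan.
async function explainGeneratedQuery(generatedSql: string) {
  const client = new Client({ connectionString: process.env.DATABASE_URL });
  await client.connect();
  try {
    const { rows } = await client.query(`EXPLAIN (ANALYZE, FORMAT JSON) ${generatedSql}`);
    console.dir(rows[0]["QUERY PLAN"], { depth: null });
  } finally {
    await client.end();
  }
}
```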

When you are happy with the query and commit to your branch, Neurelo instantly deploys it as a custom API endpoint. You can then configure the endpoint with rate limits, timeouts, and access controls, all from within the Neurelo platform. It is worth emphasizing that only a single database tool spanning every stage of your application lifecycle can deliver this integrated experience: if you need to change your data model, Neurelo AI Assist can help regenerate and redeploy the query code for you.
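
Your application then calls that endpoint like any other HTTP API. The sketch below shows the general shape of such a call from TypeScript; the host, path, and header name are illustrative placeholders rather than Neurelo's actual endpoint format, so check the API reference for the real details.

```typescript
// Calling a deployed custom-query endpoint from application code.
// The host, path, and header below are illustrative placeholders, not Neurelo's real API shape.
async function fetchRevenueByRegion(): Promise<unknown> {
  const response = await fetch("https://api.example-neurelo-host.com/custom/revenue-by-region", {
    method: "GET",
    headers: { "X-API-KEY": process.env.NEURELO_API_KEY ?? "" },
  });
  if (!response.ok) {
    throw new Error(`Custom query endpoint returned ${response.status}`);
  }
  return response.json();
}
```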

Review our AI Assisted Query documentation for detailed instructions on how to build custom queries, and advanced options to control LLM usage and output. 

The future: domain specific, context aware LLMs with more control

One consistent piece of positive feedback we've received from our users is the accuracy of the query code generated by our AI assist. Thanks to schema awareness and continuous training on database nuances, over 90% of users adopt AI Assist after connecting Neurelo to their database. Many are now using Neurelo AI Assist as a replacement for traditional query builder tools.

But we aren't declaring victory just yet. The pace of GenAI progress keeps accelerating, and going forward we will be harnessing state-of-the-art advancements to make AI Assist in Neurelo even more useful to you. Today we default to generating query code with our fine-tuned OpenAI GPT-4o model, which in our testing provides the best price-performance balance. You also have the flexibility to use Neurelo's AI Assist with your own OpenAI account if you prefer.

We are now working to give you more optionality. While OpenAI is great, there have been huge advances in code generation by other models. Larger context windows, training on more recent code samples and docs, and improved instruction following with autonomous agents are all improving the quality of generated code. We plan to run multiple fine-tuned LLMs side-by-side to generate your queries, giving you the ability to compare different query plans. You will also be able to optimize your queries for different objectives — for example prioritizing execution speed or system efficiency.

While our AI Assist will get even smarter and automate more of your query code generation, only you know your users. So you will remain in control, harnessing Neurelo AI Assist to do the heavy lifting so you are free to focus your creativity on designing and building even better applications.

Try Neurelo's AI query builder

Hopefully we’ve piqued your interest in experiencing the power of Neurelo — a new way to build and run applications with PostgreSQL, MongoDB, and MySQL. You can try Neurelo for free simply by creating an account. Please leave comments below if you have any questions, concerns, or feedback on how we can continue to develop AI Assist.