In a previous article I described how generative AI could change the data analytics landscape. One such area is data engineering.
Databricks, a premier technology used by data engineers and data scientists alike, has now announced AI Functions in public preview.
AI Functions will allow data professionals to integrate Large Language Models (LLMs) into their work directly from Databricks SQL, streamlining development and deployment without worrying about the underlying infrastructure. This should speed up data engineering delivery significantly, letting you ship data solutions faster and ultimately get through more work.
"Imagine that you’re an analyst and have been given a historical list of thousands of call transcripts with the task of providing a report that breaks down all of the calls into one of four categories [Frustrated, Happy, Neutral, Satisfied]. Typically, this would require you to request the data science team create a classification model. Instead, with AI Functions, you can prompt a Large Language Model, such as OpenAI’s ChatGPT model, directly from SQL."
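Based on that scenario, a classification query might look roughly like the sketch below. The `ai_generate_text` function name comes from the preview announcement, but the secret scope, key name, Azure deployment parameters, and table name are all placeholders you would swap for your own:

```sql
-- Classify each call transcript into one of four sentiment categories
-- via an Azure OpenAI deployment (placeholder names throughout).
SELECT
  transcript_id,
  ai_generate_text(
    CONCAT(
      'Classify the sentiment of this call transcript as exactly one of ',
      '[Frustrated, Happy, Neutral, Satisfied]. Reply with the label only: ',
      transcript
    ),
    'azure_openai/gpt-35-turbo',
    'apiKey', SECRET('my_scope', 'azure_openai_key'),  -- hypothetical secret scope/key
    'deploymentName', 'my-gpt-deployment',             -- hypothetical deployment name
    'resourceName', 'my-openai-resource',              -- hypothetical Azure resource
    'temperature', CAST(0.0 AS DOUBLE)                 -- deterministic output for classification
  ) AS sentiment
FROM call_transcripts;                                 -- hypothetical source table
```

Setting the temperature to 0 keeps the model's answers consistent across runs, which matters when the output feeds a downstream report.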
Currently, AI Functions supports Azure OpenAI and OpenAI models such as GPT-3.5 Turbo, with future releases planned to accommodate other LLMs, including open-source models like Dolly.
AI Functions let you harness the power of Large Language Models through a familiar SQL interface, and once the appropriate LLM prompt is created, it can be seamlessly transformed into a production pipeline using Databricks tools like Delta Live Tables or scheduled Jobs.
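As a sketch of that path to production, the prompt could be wrapped in a reusable SQL function and then referenced from a Delta Live Tables pipeline defined in SQL. The function, secret, and table names here are illustrative, not from the announcement:

```sql
-- Wrap the prompt in a reusable SQL function (names are illustrative).
CREATE OR REPLACE FUNCTION classify_sentiment(transcript STRING)
RETURNS STRING
RETURN ai_generate_text(
  CONCAT('Classify the sentiment of this call transcript as exactly one of ',
         '[Frustrated, Happy, Neutral, Satisfied]. Reply with the label only: ',
         transcript),
  'azure_openai/gpt-35-turbo',
  'apiKey', SECRET('my_scope', 'azure_openai_key')  -- hypothetical secret scope/key
);

-- Reference the function from a Delta Live Tables (SQL) pipeline definition,
-- so classification runs as part of a managed, scheduled pipeline.
CREATE OR REFRESH LIVE TABLE call_sentiment AS
SELECT transcript_id, classify_sentiment(transcript) AS sentiment
FROM LIVE.call_transcripts;  -- hypothetical upstream live table
```

Wrapping the prompt in a function keeps the prompt text in one place, so analysts can call `classify_sentiment` without copying the LLM configuration into every query.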
The advantage of this is easy customization of LLMs for specific business needs, seamless integration between AI and your data, and faster delivery.
See the full announcement and examples here.