Creating a Context-Aware AI Assistant in APEX Using a Private LLM
Oracle APEX includes a built-in AI Assistant that integrates with popular cloud-hosted LLMs, such as Meta's Llama and OpenAI's GPT models. You can give the AI Assistant context through RAG Sources. That context can come from static content (plain text), or it can be drawn from your own data via a SQL Query or a PL/SQL Function. This helps the AI Assistant give smarter answers grounded in your data rather than only the generic knowledge it was trained on. The trade-off is that with the Query or Function route you send some of your data to the LLM provider, and you may not want to do that.
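As a sketch of the PL/SQL Function route, the idea is simply a function that returns a CLOB of context text, which APEX then passes along to the LLM with the user's question. The table and function names below are hypothetical, invented for illustration:

```sql
-- Hypothetical PL/SQL Function used as a RAG Source:
-- it builds and returns a CLOB of context text.
create or replace function get_ticket_context return clob is
  l_context clob;
begin
  l_context := 'Open support tickets:' || chr(10);
  for r in (
    select ticket_id, subject, status
      from support_tickets              -- hypothetical table
     where status = 'OPEN'
     order by created_on desc
     fetch first 20 rows only           -- keep the context small
  ) loop
    l_context := l_context
      || 'Ticket ' || r.ticket_id || ': ' || r.subject
      || ' (' || r.status || ')' || chr(10);
  end loop;
  return l_context;
end get_ticket_context;
/
```

Note that everything this function returns leaves your database and is sent to the LLM provider, which is exactly the concern a private LLM addresses.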
There is a way to run your own LLM instance inside your own infrastructure, whether that's your personal computer or a server dedicated to the task. I'll show you how to set up a private LLM instance and connect your APEX builder to it, so you can leverage an open source model and spin up a context-aware AI Assistant.
