Perform LLM Orchestration and Chat with Azure SQL using Azure OpenAI


Details
This project demonstrates how to perform LLM orchestration on Azure SQL using Azure OpenAI.
Vector Search in Azure SQL and SQL Server 2025
Azure SQL and SQL Server 2025 now support a native vector data type, which enables semantic search directly inside the database.
Traditionally, SQL search is performed using standard SQL queries or Full-Text Search. However, in this approach, search operations are executed through vector search, enabling a more context-aware and intelligent retrieval mechanism. This means you can now query Azure SQL using natural language prompts instead of rigid SQL queries.
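To give a concrete picture, here is a minimal sketch of what a vector-enabled items table could look like in Azure SQL or SQL Server 2025; the embedding column, its dimension, and the description column are illustrative assumptions, not the project's actual schema:

-- Each row stores its structured attributes plus an embedding of its description.
CREATE TABLE dbo.items
(
    id          INT IDENTITY PRIMARY KEY,
    category    NVARCHAR(100),
    brand       NVARCHAR(100),
    size        NVARCHAR(50),
    description NVARCHAR(MAX),
    embedding   VECTOR(1536)   -- native vector data type; holds an Azure OpenAI embedding of description
);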
For example, a standard SQL query for retrieving monitors based on brand and size would look like this:
SELECT * FROM items
WHERE category = 'monitors'
  AND brand IN ('Samsung', 'LG')
  AND size = '27 inch';
With LLM-powered vector search, you can get the same result from a natural language prompt (the flow behind this is sketched after the sample results):
"Please recommend me a 27-inch monitor for office use. Just show Korean brands."
This will return results such as:
• Samsung 27" Odyssey G5 G55C LS27CG552EUXUF – 1ms, 165Hz, (HDMI+DP), QHD Curved Gaming Monitor
• Samsung M5 LS27BM500EUXUF – 27", 4ms, Full HD LED Monitor, Black
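Under the hood, the prompt is turned into an embedding and compared against the stored vectors. A rough T-SQL sketch of that flow is below; the Azure OpenAI endpoint, deployment name, api-version, database scoped credential, and response JSON paths are placeholders and assumptions, not values from this project:

-- 1) Embed the user's prompt by calling Azure OpenAI from inside the database.
DECLARE @prompt   NVARCHAR(MAX) = N'Please recommend me a 27-inch monitor for office use. Just show Korean brands.';
DECLARE @payload  NVARCHAR(MAX) = JSON_OBJECT('input': @prompt);
DECLARE @response NVARCHAR(MAX);

EXEC sp_invoke_external_rest_endpoint
     @url        = N'https://<your-openai-resource>.openai.azure.com/openai/deployments/<embedding-deployment>/embeddings?api-version=2023-05-15',
     @method     = N'POST',
     @credential = [https://<your-openai-resource>.openai.azure.com],  -- database scoped credential holding the api-key
     @payload    = @payload,
     @response   = @response OUTPUT;

-- 2) Cast the returned JSON array to the native vector type.
DECLARE @query_vector VECTOR(1536) =
    CAST(JSON_QUERY(@response, '$.result.data[0].embedding') AS VECTOR(1536));

-- 3) Combine semantic similarity with an ordinary structured filter.
SELECT TOP (5) id, brand, description,
       VECTOR_DISTANCE('cosine', embedding, @query_vector) AS distance
FROM dbo.items
WHERE brand IN (N'Samsung', N'LG')   -- "Korean brands" resolved to a structured filter
ORDER BY distance;                   -- smaller cosine distance = more similar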
Document Search with Vector Embeddings in Azure SQL
Another key capability is vector-based document search within Azure SQL. For instance, you can store PDF documents as vector embeddings in SQL and query them using prompts instead of keyword-based searches.
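A sketch of how that might look follows; the table, column names, and chunking scheme are illustrative assumptions, and @question_vector would be produced by embedding the user's question as in the previous sketch:

-- Each PDF is split into chunks; each chunk's text and its embedding are stored side by side.
CREATE TABLE dbo.document_chunks
(
    chunk_id      INT IDENTITY PRIMARY KEY,
    document_name NVARCHAR(260),   -- original PDF file name
    chunk_text    NVARCHAR(MAX),   -- extracted text of one chunk
    embedding     VECTOR(1536)     -- Azure OpenAI embedding of chunk_text
);
GO

-- Return the chunks closest in meaning to the embedded question.
CREATE PROCEDURE dbo.search_documents
    @question_vector VECTOR(1536)
AS
BEGIN
    SELECT TOP (3) document_name, chunk_text,
           VECTOR_DISTANCE('cosine', embedding, @question_vector) AS distance
    FROM dbo.document_chunks
    ORDER BY distance;
END;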
LLM Orchestration in SQL Server
LLM orchestration, which is commonly implemented in Python using LangChain and LangGraph, can now be replicated in SQL Server.
• In Python, LLM orchestration involves multiple methods or agents that handle specific tasks.
• Similarly, in SQL Server, we can define stored procedures for different tasks.
• Based on the user's prompt, the system can dynamically select the appropriate stored procedure, effectively enabling LLM orchestration within T-SQL (see the sketch below).
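One way that routing could be expressed in T-SQL is sketched below. The intent labels, the chat deployment, the endpoint details, and the two task procedures (dbo.search_items_by_prompt and dbo.search_documents_by_prompt) are hypothetical names used only to illustrate the pattern:

CREATE PROCEDURE dbo.orchestrate_prompt
    @prompt NVARCHAR(MAX)
AS
BEGIN
    -- Ask the chat model to classify the request into one of the known tasks.
    DECLARE @payload NVARCHAR(MAX) = JSON_OBJECT(
        'messages': JSON_ARRAY(
            JSON_OBJECT('role': 'system',
                        'content': 'Classify the request as product_search or document_search. Reply with one word.'),
            JSON_OBJECT('role': 'user', 'content': @prompt)));
    DECLARE @response NVARCHAR(MAX), @intent NVARCHAR(50);

    EXEC sp_invoke_external_rest_endpoint
         @url        = N'https://<your-openai-resource>.openai.azure.com/openai/deployments/<chat-deployment>/chat/completions?api-version=2024-02-01',
         @method     = N'POST',
         @credential = [https://<your-openai-resource>.openai.azure.com],
         @payload    = @payload,
         @response   = @response OUTPUT;

    SET @intent = JSON_VALUE(@response, '$.result.choices[0].message.content');

    -- Route to the task-specific procedure, much like choosing a tool or agent in LangChain.
    IF @intent = N'product_search'
        EXEC dbo.search_items_by_prompt @prompt;      -- hypothetical: embeds @prompt and runs the item vector search
    ELSE IF @intent = N'document_search'
        EXEC dbo.search_documents_by_prompt @prompt;  -- hypothetical: embeds @prompt and searches the document chunks
    ELSE
        SELECT N'No matching task for: ' + ISNULL(@intent, N'(none)') AS message;
END;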
Why This Approach Matters
This approach is highly valuable because:
• Ensures Real-Time and Accurate Data – Unlike traditional LLM pipelines that rely on external APIs or standalone vector stores, SQL-based orchestration returns real-time, up-to-date, verified results.
• Reduces Hallucination – Because the LLM is grounded in structured SQL data, it answers from real, factual records, lowering the risk of hallucinated responses.
• Enhances Security – Sensitive business data remains within Azure SQL rather than being copied to an external vector database.
By leveraging Azure SQL, Azure OpenAI, and vector search, this project enables intelligent querying, document retrieval, and LLM orchestration, all within a secure and structured SQL environment.
