Discussion - Topic: Function calling with LLMs

Hosted By
Ken D.

Details

This week's topic: Function calling with LLMs

As described in Thoughtworks Technology Radar Vol. 31.

Function calling with LLMs refers to the ability to integrate LLMs with external functions, APIs or tools by determining and invoking the appropriate function based on a given query and associated documentation. This extends the utility of LLMs beyond text generation, allowing them to perform specific tasks such as information retrieval, code execution and API interaction. By triggering external functions or APIs, LLMs can perform actions that were previously outside their standalone capabilities. This technique enables LLMs to act on their outputs, effectively bridging the gap between thought and action, similar to how humans use tools to accomplish various tasks.

By introducing function calling, LLMs add determinism and factuality to the generation process, striking a balance between creativity and logic. This method allows LLMs to connect to internal systems and databases or even perform internet searches via connected browsers. Models like OpenAI's GPT series support function calling, and fine-tuned models like Gorilla are specifically designed to enhance the accuracy and consistency of generating executable API calls from natural language instructions.

As a technique, function calling fits within retrieval-augmented generation (RAG) and agent architectures. It should be viewed as an abstract pattern of use, emphasizing its potential as a foundational tool in diverse implementations rather than a specific solution.
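
For context ahead of the discussion, here is a minimal sketch of that loop using the OpenAI Python SDK's Chat Completions tools interface. The get_current_weather function, its JSON schema, and the model name are illustrative assumptions, not part of the Radar blip; the general pattern is: the model proposes a function call, the application executes it, and the result is fed back so the model can produce a grounded final answer.

```python
# Minimal function-calling sketch (OpenAI Python SDK, openai>=1.x).
# get_current_weather and its schema are hypothetical examples.
import json
from openai import OpenAI

client = OpenAI()

def get_current_weather(city: str) -> dict:
    # Local function the model can ask the application to run.
    return {"city": city, "temperature_c": 21, "conditions": "clear"}

tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What's the weather in Los Angeles?"}]

# 1. The model decides whether a tool call is needed.
response = client.chat.completions.create(
    model="gpt-4o-mini", messages=messages, tools=tools, tool_choice="auto"
)
msg = response.choices[0].message

# 2. If it requested a tool, run it and return the result to the model.
if msg.tool_calls:
    messages.append(msg)
    for call in msg.tool_calls:
        args = json.loads(call.function.arguments)
        result = get_current_weather(**args)  # only one tool registered here
        messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": json.dumps(result),
        })
    # 3. The model turns the tool output into the final answer.
    final = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    print(final.choices[0].message.content)
else:
    print(msg.content)
```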

Zoom link will be added about 5 min before the event starts.

Discussion Resources:

TBD

DevTalk LA
Online event
Link visible for attendees
FREE