How does function calling with tools really work?
AI Nuggets
Based on the provided YouTube video transcript, here are the extracted detailed instructions, CLI commands, website URLs, and tips, organized in an easy-to-follow outline form:
- Function Calling in AI Models:
- Function calling in AI models like OpenAI and Ollama does not actually call functions; instead, the model outputs JSON that adheres to a predefined schema.
- To make a model output JSON, specify `format: JSON` in the API call and instruct the model to output as JSON in the prompt.
- Creating Functions for Weather and Location Queries:
- weatherFromLatLon Function:
- Takes latitude and longitude as parameters.
- Calls the Open-Meteo API to get the temperature.
- Outputs the temperature to the console.
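The `weatherFromLatLon` function described above might be sketched as follows in TypeScript. The endpoint and the `current_weather` field follow Open-Meteo's public forecast API; the transcript gives no URLs or code, so treat the details as assumptions rather than the video's exact implementation.

```typescript
// Build the Open-Meteo forecast URL for a pair of coordinates.
function openMeteoUrl(latitude: number, longitude: number): string {
  const params = new URLSearchParams({
    latitude: String(latitude),
    longitude: String(longitude),
    current_weather: "true",
  });
  return `https://api.open-meteo.com/v1/forecast?${params}`;
}

// Fetch the current temperature and print it to the console.
async function weatherFromLatLon(latitude: number, longitude: number): Promise<void> {
  const response = await fetch(openMeteoUrl(latitude, longitude));
  const data = await response.json();
  // Open-Meteo returns the reading under current_weather.temperature.
  console.log(`Temperature: ${data.current_weather.temperature} C`);
}
```

For example, `weatherFromLatLon(52.52, 13.41)` would print the current temperature for Berlin.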
- weatherFromLocation Function:
- Uses the Nominatim API to convert a place name to latitude and longitude.
- Calls `weatherFromLatLon` with the obtained coordinates.
- Web Search Function:
- Uses a search engine (SearXNG) to find information.
- Outputs the first result from the search.
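A minimal sketch of the `webSearch` function, assuming a locally hosted SearXNG instance with its JSON output enabled. The host, port, and result fields are assumptions, not details from the video.

```typescript
// Hypothetical local SearXNG instance; SearXNG can return JSON when
// `format=json` is enabled in its settings.
const SEARXNG_BASE = "http://localhost:8080/search";

// Build the query URL for the search engine.
function searchUrl(query: string): string {
  const params = new URLSearchParams({ q: query, format: "json" });
  return `${SEARXNG_BASE}?${params}`;
}

// Fetch results and print the first hit.
async function webSearch(query: string): Promise<void> {
  const response = await fetch(searchUrl(query));
  const data = await response.json();
  const first = data.results?.[0]; // SearXNG's JSON output has a results array
  console.log(first ? `${first.title}: ${first.content}` : "no results");
}
```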
- CityFromLatLon Function:
- Uses Nominatim API to convert coordinates to a place name.
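Both Nominatim conversions above, place name to coordinates (used by `weatherFromLocation`) and coordinates to place name (`CityFromLatLon`), can be sketched against Nominatim's public `/search` and `/reverse` endpoints. The exact code in the video may differ; the function and header names here are illustrative.

```typescript
const NOMINATIM = "https://nominatim.openstreetmap.org";

// Forward geocoding: place name -> search URL.
function geocodeUrl(place: string): string {
  const params = new URLSearchParams({ q: place, format: "json", limit: "1" });
  return `${NOMINATIM}/search?${params}`;
}

// Reverse geocoding: coordinates -> reverse-lookup URL.
function reverseUrl(lat: number, lon: number): string {
  const params = new URLSearchParams({
    lat: String(lat),
    lon: String(lon),
    format: "json",
  });
  return `${NOMINATIM}/reverse?${params}`;
}

// Place name -> { lat, lon }, as needed by weatherFromLocation.
async function latLonFromLocation(place: string): Promise<{ lat: string; lon: string }> {
  // Nominatim's usage policy asks for an identifying User-Agent.
  const response = await fetch(geocodeUrl(place), {
    headers: { "User-Agent": "function-calling-demo" },
  });
  const [first] = await response.json();
  return { lat: first.lat, lon: first.lon };
}

// Coordinates -> human-readable place name (CityFromLatLon).
async function cityFromLatLon(lat: number, lon: number): Promise<string> {
  const response = await fetch(reverseUrl(lat, lon), {
    headers: { "User-Agent": "function-calling-demo" },
  });
  const data = await response.json();
  return data.display_name;
}
```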
- Describing Functions to the Model:
- Manually describe the functions with their names and parameters.
- Create tool descriptions for `weatherFromLatLon`, `weatherFromLocation`, `webSearch`, and `CityFromLatLon`.
- Convert the descriptions into an array and stringify it for the model prompt.
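The tool descriptions and the stringified prompt might look like the sketch below. The field names and prompt wording are assumptions; the video only says each tool is described by its name and parameters, collected into an array, and stringified.

```typescript
// Hypothetical tool descriptions; the exact fields are assumed.
const tools = [
  {
    name: "weatherFromLatLon",
    description: "Get the current temperature for a latitude/longitude pair",
    parameters: { latitude: "number", longitude: "number" },
  },
  {
    name: "weatherFromLocation",
    description: "Get the current temperature for a named place",
    parameters: { location: "string" },
  },
  {
    name: "webSearch",
    description: "Search the web and return the first result",
    parameters: { query: "string" },
  },
  {
    name: "CityFromLatLon",
    description: "Get the place name for a latitude/longitude pair",
    parameters: { latitude: "number", longitude: "number" },
  },
];

// Stringify the array so it can be embedded in the system prompt.
const toolList = JSON.stringify(tools);
const systemPrompt =
  "You can call these tools. Respond ONLY with JSON naming one tool and " +
  `its parameters.\nTools: ${toolList}`;
console.log(systemPrompt);
```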
- Creating Prompts and Calling the Model:
- Include instructions and a list of tools in the system prompt.
- Create a `promptAndAnswer` function that takes a prompt and calls the model using `ollama.generate`.
- Ensure `format: JSON` is specified and instruct the model to respond as JSON.
- Executing Functions Based on Model Output:
- Replace the console log line with a call to `executeFunction`, which takes a function name and parameters.
- Use a switch statement in `executeFunction` to call the appropriate function and obtain results.
- Testing with Different AI Models:
- Test the function calling with various models such as Llama 3, Qwen 2, Gemma 2, and Llama 2.
- Verify that the functions work correctly with each model.
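Putting the preceding sections together, the `promptAndAnswer` and `executeFunction` flow and the multi-model loop might look like the sketch below, posted against Ollama's REST `/api/generate` endpoint. The tool functions are trivial stand-ins (the real ones call the APIs described earlier), and the `{functionName, parameters}` JSON shape is an assumption about the video's schema.

```typescript
// Trivial stand-ins for the tool functions defined earlier.
async function weatherFromLatLon(lat: number, lon: number): Promise<string> {
  return `temperature at ${lat},${lon}`;
}
async function weatherFromLocation(location: string): Promise<string> {
  return `temperature in ${location}`;
}
async function webSearch(query: string): Promise<string> {
  return `first result for ${query}`;
}
async function cityFromLatLon(lat: number, lon: number): Promise<string> {
  return `city at ${lat},${lon}`;
}

// Hypothetical system prompt; the video's exact wording is not in the transcript.
const SYSTEM_PROMPT =
  "Answer by picking one tool. Respond ONLY with JSON like " +
  '{"functionName": "...", "parameters": {...}}.';

// Dispatch the model's chosen tool with a switch statement.
async function executeFunction(
  name: string,
  params: Record<string, string>,
): Promise<string> {
  switch (name) {
    case "weatherFromLatLon":
      return weatherFromLatLon(Number(params.latitude), Number(params.longitude));
    case "weatherFromLocation":
      return weatherFromLocation(params.location);
    case "webSearch":
      return webSearch(params.query);
    case "CityFromLatLon":
      return cityFromLatLon(Number(params.latitude), Number(params.longitude));
    default:
      return `Unknown function: ${name}`;
  }
}

// Ask a model, parse its JSON answer, and execute the chosen function.
async function promptAndAnswer(model: string, prompt: string): Promise<void> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    body: JSON.stringify({
      model,
      system: SYSTEM_PROMPT,
      prompt,
      format: "json", // constrain the model's output to valid JSON
      stream: false,
    }),
  });
  const data = await res.json();
  const { functionName, parameters } = JSON.parse(data.response);
  console.log(await executeFunction(functionName, parameters));
}

// Try the same prompt against several models. Requires a running Ollama
// server, so this is defined but not invoked here.
async function testModels(): Promise<void> {
  for (const model of ["llama3", "qwen2", "gemma2", "llama2"]) {
    await promptAndAnswer(model, "What is the weather in Berlin?");
  }
}
```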
- Upcoming Changes:
- Mention of PR (Pull Request) #5284 by Mike from the core Ollama team.
- The PR suggests a more rigid format for messages and may affect the generate endpoint.
- No specific timeframe for the PR’s implementation; it could be next week, next year, or never.
- Final Thoughts:
- The video encourages viewers to apply these concepts to their own applications and share their experiences in the comments.
Note: The video does not provide any direct URLs or CLI commands beyond the API names (Open-Meteo API, Nominatim API) and the function names (`weatherFromLatLon`, `weatherFromLocation`, `webSearch`, `CityFromLatLon`). The actual URLs for these APIs and the exact calls to them are not included in the transcript.