LangChain Tool Calling feature just changed everything
AI Summary
- Overview of LangChain’s Tool Calling Feature
- Underrated but offers significant flexibility.
- Allows switching between models supporting function calling.
- Previous Limitations
- Function calling was tailored to OpenAI’s API.
- Other vendors, such as Google’s Vertex AI, Gemini, and Anthropic’s Claude 3 Sonnet, had different schemas.
- Integration with non-OpenAI models was troublesome.
- LangChain’s Solution
- Created a unified interface for tool calling.
- Supports multiple models: OpenAI, Vertex AI, Gemini, Mistral, Fireworks, and Anthropic’s Claude 3 Sonnet.
- Responds to community requests for vendor flexibility.
- Interface Components
- `bind_tools` method: lets an LLM use custom functions/tools.
- `tool_calls` attribute: populated with the tool invocations the LLM requests.
- Tool Calling Agent: creates a function calling agent for any supported vendor.
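The components above can be sketched as follows. This is a minimal, hedged sketch: the commented LangChain calls require an installed integration package and an API key, so only the illustrative `tool_calls` entry below is executable, and the call id shown is made up.

```python
# Sketch of the unified interface: bind tools to a model, then read the
# vendor-neutral `tool_calls` off the response message.

# from langchain_openai import ChatOpenAI  # or ChatAnthropic, ChatVertexAI, ...
# llm_with_tools = ChatOpenAI(model="gpt-4").bind_tools([multiply])
# msg = llm_with_tools.invoke("What is 6 times 7?")
# msg.tool_calls  # same shape regardless of vendor

# Representative shape of one entry in `msg.tool_calls`:
tool_call = {
    "name": "multiply",        # which tool the model chose to invoke
    "args": {"a": 6, "b": 7},  # parsed arguments, vendor schema hidden
    "id": "call_abc123",       # provider-assigned call id (illustrative)
}
```

Because every provider's response is normalized into this one shape, downstream code that dispatches on `tool_call["name"]` does not change when the vendor does.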
- Example Usage
- Define tools or use OpenAI format for function calling.
- Bind tools to the LLM.
- The LLM decides which functions to invoke during a call.
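For the "OpenAI format" option mentioned above, a tool can be described as a JSON schema dict. A hedged example, with illustrative names, of what such a definition looks like:

```python
# A tool described in OpenAI's function-calling format. LangChain also
# accepts dicts of this shape when binding tools to a model.
multiply_tool = {
    "type": "function",
    "function": {
        "name": "multiply",
        "description": "Multiply two integers.",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "integer", "description": "First factor"},
                "b": {"type": "integer", "description": "Second factor"},
            },
            "required": ["a", "b"],
        },
    },
}
# Bound to any supported model the same way (requires an API key):
#   llm_with_tools = llm.bind_tools([multiply_tool])
```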
- Agent Creation
- A function calling agent can be created with different models, such as Anthropic’s Claude 3 Sonnet.
- Demo Explanation
- Defined tools: multiply and Tavily Search for real-time data.
- Initialized the tool calling agent with OpenAI’s GPT-4 and Anthropic’s Claude 3 Sonnet.
- Asked for a weather comparison between Dubai and San Francisco, in Celsius.
- Results traced with LangSmith for examination.
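The demo steps above can be reconstructed roughly as follows. This is a sketch based on the summary, not the exact demo code: running it requires API keys for the model provider, Tavily, and LangSmith, so the calls are commented out, and `multiply`, `llm`, and `prompt` are assumed to be defined as in the earlier steps.

```python
import os

# Enable LangSmith tracing so each tool invocation is recorded:
os.environ["LANGCHAIN_TRACING_V2"] = "true"
# os.environ["LANGCHAIN_API_KEY"] = "..."  # supply a real key to trace

# from langchain_community.tools.tavily_search import TavilySearchResults
# from langchain.agents import AgentExecutor, create_tool_calling_agent
# tools = [multiply, TavilySearchResults(max_results=1)]
# agent = create_tool_calling_agent(llm, tools, prompt)
# executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
# executor.invoke({"input": "Compare the weather in Dubai and "
#                           "San Francisco, in Celsius."})
```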
- Results
- OpenAI’s GPT-4 and Anthropic’s Claude 3 Sonnet provided similar weather results.
- Traces showed the process of invoking tools and summarizing results.
- Conclusion
- LangChain’s implementation simplifies switching between models with function calling.
- It’s a significant advancement in machine learning commoditization.
- Fulfills community demand for working with various models.