Mixture of Predictive Agents (MoPA) - The Wisdom of Many AI Agents Architecture
AI Summary
Video Summary: Mixture of Predictive Agents (MoPA) Architecture
- Sponsor: HubSpot
- Topic: MoPA architecture for predicting Bitcoin prices
- API Used: CoinGecko (free API for Bitcoin price data)
- Data Handling:
- Fetch last 30 days of Bitcoin prices
- Cache data in JSON to avoid API rate limits
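A minimal sketch of the fetch-and-cache step, assuming Python with the `requests` library. The endpoint and response shape are CoinGecko's public `market_chart` API; the cache filename is hypothetical, and the 24-hour refresh matches the check described in the code walkthrough below.

```python
import json
import time
from pathlib import Path

import requests

CACHE_FILE = Path("btc_prices.json")  # hypothetical cache filename
CACHE_MAX_AGE = 24 * 60 * 60          # refresh if the cache is older than 24 h

def get_btc_prices(days: int = 30) -> list[float]:
    """Return the last `days` days of BTC/USD prices, cached in JSON so
    repeated runs stay under CoinGecko's free-tier rate limits."""
    if CACHE_FILE.exists() and time.time() - CACHE_FILE.stat().st_mtime < CACHE_MAX_AGE:
        return json.loads(CACHE_FILE.read_text())

    resp = requests.get(
        "https://api.coingecko.com/api/v3/coins/bitcoin/market_chart",
        params={"vs_currency": "usd", "days": days},
        timeout=30,
    )
    resp.raise_for_status()
    # "prices" is a list of [timestamp_ms, price] pairs; the free endpoint
    # picks the granularity automatically for the requested day range
    prices = [price for _, price in resp.json()["prices"]]
    CACHE_FILE.write_text(json.dumps(prices))
    return prices
```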
- Architecture:
- Custom prompt for models
- Use models from Ollama, OpenAI, Anthropic (Claude), and Groq
- Easy model switching with a list system
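The "list system" might be as simple as a list of (provider, model) pairs, so adding or removing a model is a one-line change. The provider grouping follows the models tested later in this summary; the exact model ID strings are assumptions based on each provider's published naming, not taken from the video.

```python
# One entry per agent in the mixture: (provider, model id).
# Swapping models in or out is just editing this list.
MODELS = [
    ("ollama", "llama3"),                          # local via Ollama
    ("ollama", "deepseek-coder-v2"),
    ("openai", "gpt-3.5-turbo"),
    ("openai", "gpt-4"),
    ("anthropic", "claude-3-5-sonnet-20240620"),
    ("anthropic", "claude-3-opus-20240229"),
    ("groq", "llama3-70b-8192"),                   # hosted by Groq
    ("groq", "mixtral-8x7b-32768"),
]
```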
- Process:
- For loop sends prompt to each model
- Extract predicted prices from responses
- Aggregate predictions by averaging them
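A sketch of that loop: `models` is the list from the previous sketch, `ask_model` is the provider dispatch sketched after the model list below, and the prompt wording and price regex are illustrative guesses rather than the video's exact strings.

```python
import re
from statistics import mean
from typing import Callable

PRICE_RE = re.compile(r"\$?(\d[\d,]*(?:\.\d+)?)")

def extract_price(reply: str) -> float | None:
    """Pull the first dollar-like number out of a model's free-text reply."""
    m = PRICE_RE.search(reply)
    return float(m.group(1).replace(",", "")) if m else None

def predict_price(
    prices: list[float],
    models: list[tuple[str, str]],               # e.g. the MODELS list above
    ask_model: Callable[[str, str, str], str],   # provider dispatch, sketched below
) -> float:
    """Send the same prompt to every model and average the parsable answers."""
    prompt = (
        f"Here are recent Bitcoin prices in USD: {prices}\n"
        "Predict the next day's price. Reply with a single number only."
    )
    predictions = []
    for provider, model in models:
        reply = ask_model(provider, model, prompt)
        price = extract_price(reply)
        if price is not None:        # skip replies the regex cannot parse
            predictions.append(price)
    return mean(predictions)         # the "mixture": a plain average
```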
- Code Walkthrough:
- Fetch and store Bitcoin prices in JSON
- Check cache age and refresh if older than 24 hours
- Exclude current day for backtesting
- Load Bitcoin prices into MoPA
- Set up clients for the different AI providers (a dispatch sketch follows the model list below)
- Extract prices using regex
- Backtest by comparing predictions with actual prices
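The backtest step could look like the following, reusing the helpers sketched above: the most recent cached price is held out as ground truth and the mixture predicts it from the remaining history.

```python
def backtest(
    prices: list[float],
    models: list[tuple[str, str]],
    ask_model,
) -> tuple[float, float]:
    """Hold out the most recent price and predict it from the rest."""
    history, actual = prices[:-1], prices[-1]   # exclude the current day
    predicted = predict_price(history, models, ask_model)
    return predicted, actual
```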
- Models Tested:
- Ollama: Llama 3, DeepSeek Coder V2
- OpenAI: GPT-3.5 Turbo, GPT-4
- Anthropic: Claude 3.5 Sonnet, Claude 3 Opus
- Groq: Llama 70B, Mixtral 8x7B
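The "set up clients" step from the code walkthrough might look like the dispatch below. All four calls use each SDK's standard chat interface (Groq's SDK mirrors OpenAI's); reading API keys from the usual environment variables is each client's default behavior, and Ollama serves the local models.

```python
import anthropic
import ollama
from groq import Groq
from openai import OpenAI

# Keys come from OPENAI_API_KEY, ANTHROPIC_API_KEY, and GROQ_API_KEY.
openai_client = OpenAI()
anthropic_client = anthropic.Anthropic()
groq_client = Groq()

def ask_model(provider: str, model: str, prompt: str) -> str:
    """Route one prompt to the right provider and return the text reply."""
    messages = [{"role": "user", "content": prompt}]
    if provider == "openai":
        r = openai_client.chat.completions.create(model=model, messages=messages)
        return r.choices[0].message.content
    if provider == "groq":   # same request/response shape as OpenAI
        r = groq_client.chat.completions.create(model=model, messages=messages)
        return r.choices[0].message.content
    if provider == "anthropic":
        r = anthropic_client.messages.create(model=model, max_tokens=200, messages=messages)
        return r.content[0].text
    if provider == "ollama":  # local model served by Ollama
        return ollama.chat(model=model, messages=messages)["message"]["content"]
    raise ValueError(f"unknown provider: {provider}")
```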
- Results:
- Predictions aggregated and displayed in terminal
- Corrected the averaged prediction by trimming 4% to offset model optimism (sketched below)
- Showed accuracy and delta from current price
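A sketch of the correction and reporting step: trim 4% off the averaged prediction, then show the delta and accuracy. Computing accuracy as 100% minus the absolute percentage error is an assumption about how the video arrives at figures like 98.2%.

```python
OPTIMISM_DISCOUNT = 0.04   # the video trims 4% off for model optimism

def report(predicted: float, actual: float) -> None:
    """Apply the 4% correction, then print delta and accuracy."""
    corrected = predicted * (1 - OPTIMISM_DISCOUNT)
    delta = corrected - actual
    accuracy = 100 - abs(delta) / actual * 100   # assumed accuracy metric
    print(f"raw prediction:  ${predicted:,.2f}")
    print(f"after -4% trim:  ${corrected:,.2f}")
    print(f"delta vs actual: ${delta:+,.2f}")
    print(f"accuracy:        {accuracy:.1f}%")
```

Chaining the sketches end to end, one run would be `report(*backtest(get_btc_prices(), MODELS, ask_model))`, which produces the terminal output described above.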
- Conclusion:
- Emphasized ease of testing multiple models
- Mentioned potential for “wisdom of the crowd” with more models
- Achieved 98.2% accuracy in the example
- Access to Code:
- Available for channel members via community GitHub
- Sponsor Content:
- HubSpot’s ebook on enhancing productivity with AI
- Upcoming Content:
- Live stream announcement
To access the full code and participate in the community, viewers are encouraged to become channel members and join the Discord. The sponsor, HubSpot, offers an ebook on improving workday productivity with AI, which is available for free through a link in the video description.