Setting up & using local LLMs in SuperAGI
SuperAGI Tutorial Summary
- Install an open-source large language model (LLM) in GGUF format.
- Example source: Hugging Face (see the download sketch below).
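As an illustration, one way to fetch a GGUF model is with the `huggingface-cli` tool from the `huggingface_hub` package. The repository and file names below are placeholders, not from the tutorial; any GGUF model works:

```sh
# Illustrative example: download a quantized GGUF model from Hugging Face.
# Repo and file names are placeholders; substitute the model you want.
pip install -U huggingface_hub
huggingface-cli download TheBloke/Llama-2-7B-Chat-GGUF \
  llama-2-7b-chat.Q4_K_M.gguf \
  --local-dir ~/models
```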
- Open `docker-compose-gpu.yml` in the SuperAGI folder.
- In the backend service's volumes section, add the LLM file path: `"<file_path>:/<local_model_path>"`.
- Repeat the same entry under the celery service's volumes; a sketch of both edits follows below.
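As a sketch, the edited portions of `docker-compose-gpu.yml` might look like the following. The host path and the container-side path `/app/local_model_path` are assumptions based on the `"<file_path>:/<local_model_path>"` pattern above; only the added `volumes` lines matter, and the rest of each service's configuration stays unchanged:

```yaml
services:
  backend:
    # ...existing backend settings unchanged...
    volumes:
      # Mount the downloaded GGUF file into the container (paths are examples).
      - "/home/user/models/llama-2-7b-chat.Q4_K_M.gguf:/app/local_model_path"
  celery:
    # ...existing celery settings unchanged...
    volumes:
      # Same mapping repeated for the celery worker.
      - "/home/user/models/llama-2-7b-chat.Q4_K_M.gguf:/app/local_model_path"
```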
- Run the Docker command: `docker-compose -f docker-compose-gpu.yml up --build`.
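Note that newer Docker installations ship Compose as a plugin rather than a standalone `docker-compose` binary; both spellings build and start the stack:

```sh
# Standalone binary, as used in the tutorial:
docker-compose -f docker-compose-gpu.yml up --build

# Compose v2 plugin spelling, if `docker-compose` is not installed:
docker compose -f docker-compose-gpu.yml up --build
```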
- Access SuperAGI locally:
  - Go to `localhost:3000` in a browser.
  - Navigate to Models and add a new model with the required details.
- The token limit (maximum context length) can be found on the model's source page, e.g. its Hugging Face model card.
- Click “Test Model” to load the model into GPU memory.
  - This step verifies that your hardware can run the model.
- After a successful test, click “Add Model” to finalize.
- The model is now ready to run autonomous agents in SuperAGI.