Install Hugging Face Candle Locally to Run Blazing Fast Models
AI Nuggets
Candle Installation and Usage Instructions
Prerequisites
- Ensure NVIDIA CUDA version 12.3 is installed.
- If unsure how to install CUDA, refer to a separate video on the channel for instructions.
- Verify that the compute capability of your NVIDIA card is over 8.
- Modern NVIDIA cards from the current or previous year should suffice.
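- To double-check both prerequisites from the shell (this assumes nvcc is on your PATH and that your NVIDIA driver is recent enough to report compute_cap; older drivers may not support that query):
nvcc --version
nvidia-smi --query-gpu=name,compute_cap --format=csv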
Required Libraries and Environment Variables
- Install Rust:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
- Set environment variables for GCC and the C++ compiler driver:
export PATH=$PATH:/path/to/gcc
- Install OpenSSL and set the library path:
sudo apt-get install libssl-dev
export LD_LIBRARY_PATH=/path/to/openssl:$LD_LIBRARY_PATH
- Find the paths for cc1plus and the OpenSSL libraries:
sudo find / -name cc1plus
sudo find / -name libssl.so
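- As an illustration of how these pieces fit together, the exports might end up looking like the lines below; the directories are purely examples, so substitute whatever the find commands print on your system:
export PATH=$PATH:/usr/libexec/gcc/x86_64-linux-gnu/12
export LD_LIBRARY_PATH=/usr/lib/x86_64-linux-gnu:$LD_LIBRARY_PATH
- The first export points PATH at the directory containing cc1plus; the second points LD_LIBRARY_PATH at the directory containing libssl.so.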
Installation Steps
- Create a new Rust application using Cargo:
cargo new my_app
- Add the Candle core library to the Rust project:
cargo add candle-core --features cuda
- Replace cuda with cpu if you are not using an NVIDIA GPU.
- Build the Rust project (a minimal main.rs sketch that exercises candle-core appears after this list):
cargo build
- Clone the Candle repository:
git clone https://github.com/huggingface/candle.git
- Change into the examples directory of the cloned repository:
cd candle/examples
- Run an example inference with the Phi-2 model:
cargo run --example phi --features cuda --release -- --model 2 --prompt="Your prompt here"
- Replace cuda with cpu if necessary.
- Change "Your prompt here" to the desired input for the model.
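- For reference, here is a minimal sketch of what my_app's src/main.rs could contain to confirm that candle-core works on your device. It mirrors the basic example from the Candle README and assumes the crate exposes Device::cuda_if_available, which falls back to the CPU when no GPU is usable:
use candle_core::{Device, Tensor};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Use the first GPU when built with the cuda feature and a GPU is present, otherwise fall back to the CPU.
    let device = Device::cuda_if_available(0)?;

    // Two small random matrices, created directly on the chosen device.
    let a = Tensor::randn(0f32, 1.0, (2, 3), &device)?;
    let b = Tensor::randn(0f32, 1.0, (3, 4), &device)?;

    // Matrix multiplication runs on that device; the result is a 2x4 tensor.
    let c = a.matmul(&b)?;
    println!("{c}");
    Ok(())
}
- Running cargo run --release inside my_app should then print the resulting tensor.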
Additional Tips
- If you encounter issues, ensure all dependencies are correctly installed and environment variables are set.
- The video also mentions a coupon code for a 50% discount on GPU rentals from Mast Compute, but the code is not provided in the transcript.
Resources
- Mast Compute GPU rental service: Mast Compute website (link in the video description).
- Use the provided coupon code for a discount (code not specified in the transcript).
Blog and Commands
- The commands used in the video will be available in a blog post linked in the video description.
Video Acknowledgements
- Thanks to Mast Compute for sponsoring the VM and GPU used in the video.
(Note: The exact URLs for the commands and the Mast Compute website are not provided in the transcript. Visit the video description for the blog link and additional resources.)