I built my own Free VS Code Copilot!!!



AI Summary

Summary of Video Transcript

  • The video demonstrates how to use a local AI copilot for coding in Visual Studio Code without a subscription, using Ollama and the DeepSeek Coder model.
  • Instructions are provided for installing Ollama, downloading the DeepSeek Coder model, and testing it with a simple Python addition snippet (a minimal sketch of such a test appears after this list).
  • The video guides viewers through installing the Visual Studio Code extension ‘Continue’, which allows the use of various large language models (LLMs).
  • Detailed steps are given for configuring Continue to use the DeepSeek Coder model by editing its config.json file.
  • The video shows how to use keyboard shortcuts within Visual Studio Code to interact with the Continue extension and how to switch between different models.
  • Tips are provided on how to download and use other models such as Stable LM Zero, and how to make them accessible locally through ollama serve.
  • The video also covers privacy settings related to telemetry data sharing with Continue.dev.
  • The process of writing, editing, and running Python code using the AI models is demonstrated, highlighting the differences in output quality based on the model used.
  • The tutorial concludes with an emphasis on the benefits of using local models for sensitive data and the flexibility of using different models.
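
The transcript does not include the exact test code, so the following is only a hypothetical sketch of the kind of simple Python addition test described above; the add function name and values are assumed.

    # Hypothetical sketch: the kind of simple addition code used in the video
    # to confirm that the locally generated suggestions actually run.
    def add(a: int, b: int) -> int:
        """Return the sum of two numbers."""
        return a + b

    if __name__ == "__main__":
        print(add(2, 3))  # expected output: 5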

Detailed Instructions and Tips

  • Install Ollama.
  • Download the DeepSeek Coder model with the command: ollama run deepseek-coder.
  • Install the Visual Studio Code extension ‘Continue’ from the marketplace or by searching for it in the Extensions tab.
  • Configure config.json to use the DeepSeek Coder model by adding a block with the title, provider (ollama), and model name (see the sketch after this list).
  • Use keyboard shortcuts in Visual Studio Code with Continue:
    • Control/Command + M to select code.
    • Control/Command + Shift + M for a follow-up code selection.
    • Control/Command + Shift + L for a quick edit.
    • Control/Command + Shift + R to debug the terminal.
  • Download other models like Stable LM Zero with the command: ollama run stable-lm-zero.
  • Ensure models are accessible locally with ollama serve.
  • Disable telemetry data sharing in Continue’s extension settings if desired.
  • Edit and run code using the AI models, and switch between them as needed.
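
As a rough sketch of the configuration step above (the exact schema can vary between Continue versions, and the "DeepSeek Coder (local)" title is just an arbitrary label), the block added to config.json might look like this:

    {
      "models": [
        {
          "title": "DeepSeek Coder (local)",
          "provider": "ollama",
          "model": "deepseek-coder"
        }
      ]
    }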

URLs and Commands

  • No specific URLs are given in the summary. The CLI commands referenced are ollama run deepseek-coder, ollama run stable-lm-zero, and ollama serve (recapped below).
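
For convenience, here is a recap of the shell commands referenced above, assuming the standard Ollama CLI (the stable-lm-zero tag is transcribed from the video and may differ from the model's actual tag on Ollama):

    # Download and run the DeepSeek Coder model locally
    ollama run deepseek-coder

    # Download and run the alternative model mentioned in the video
    ollama run stable-lm-zero

    # Serve downloaded models locally so the Continue extension can reach them
    ollama serve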