Writing Better Code with Ollama



AI Summary

  • The Ollama team released an official Node.js library for Ollama, available on GitHub.
  • The author plans to use both Node.js and Python libraries.
  • Demonstrated using the chat endpoint with a system prompt and initial question.
  • Endpoint response is a JSON blob with output and performance metrics, not streamed by default.
  • Streaming can be enabled by setting stream to true, which requires handling an async generator.
  • Llama Coder, a VS Code extension, assists in writing code, such as completing the change to set stream to true.
  • Llama Coder can generate code from comments but may not be perfect.
  • To print streamed tokens without inserting newlines, the author uses process.stdout.write instead of console.log.
  • Continue.Dev is another VS Code extension that helps by answering coding questions.
  • Both Llama Coder and Continue.Dev are free, work offline, and serve as local alternatives to GitHub Copilot.
  • The author lives on an island near Seattle with no internet on the ferry, making offline tools useful.
  • Setup involves installing Ollama, Llama Coder (using the model DeepSeek Coder 1.3B q4), and Continue.Dev.
  • Continue.Dev allows model selection and has a setting to disable telemetry for offline use.
  • Other VS Code extensions mentioned are CodeGPT and Ollama Autocoder, but Llama Coder and Continue.Dev are preferred.
  • Interested in trying Cody from Sourcegraph, which integrates Ollama.
  • Asks viewers whether they’ve replaced Copilot with local tools, and invites them to share other configurations or setups.
  • Invites comments and thanks viewers for watching.
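For the offline setup, Continue.Dev's telemetry toggle lives in its JSON configuration. A minimal sketch, assuming a config.json-based version of Continue with a DeepSeek Coder model served through Ollama; exact field names and the config format vary between Continue versions:

```json
{
  "models": [
    {
      "title": "DeepSeek Coder (local)",
      "provider": "ollama",
      "model": "deepseek-coder:1.3b"
    }
  ],
  "allowAnonymousTelemetry": false
}
```

With telemetry disabled and the model pulled locally, the extension works without a network connection.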
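The chat-endpoint call summarized above can be sketched as follows. The helper function, model name, and prompts are illustrative placeholders rather than details from the original video; the commented-out request assumes a local Ollama server on its default port, 11434.

```javascript
// Sketch of the request body sent to Ollama's /api/chat endpoint,
// with a system prompt and an initial user question.
function buildChatRequest(model, systemPrompt, question) {
  return {
    model,
    messages: [
      { role: 'system', content: systemPrompt },
      { role: 'user', content: question },
    ],
    // Without stream: true, the client receives a single JSON blob
    // containing the answer plus performance metrics.
    stream: false,
  };
}

const body = buildChatRequest(
  'llama2', // placeholder model name
  'You are a concise coding assistant.',
  'Why is the sky blue?'
);
console.log(JSON.stringify(body, null, 2));

// With Ollama running locally, the body could be POSTed like so:
// const res = await fetch('http://localhost:11434/api/chat', {
//   method: 'POST',
//   body: JSON.stringify(body),
// });
// const json = await res.json(); // answer + metrics in one object
```

The non-streaming shape is convenient for one-shot questions; streaming (shown separately) is better for interactive use.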
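Setting stream to true makes the library hand back an async generator of chunks instead of one JSON blob. The sketch below uses a stand-in generator so the consumption loop runs without a live server; the chunk shape is an assumption modeled on the summary's description, and the token values are invented.

```javascript
// Stand-in for the async generator a streaming chat call would return.
async function* fakeStream() {
  for (const token of ['The ', 'sky ', 'is ', 'blue.']) {
    yield { message: { content: token } };
  }
}

// Consume the stream token by token. console.log would append a newline
// after every token; process.stdout.write prints them contiguously.
async function printStream(stream) {
  let full = '';
  for await (const part of stream) {
    process.stdout.write(part.message.content);
    full += part.message.content;
  }
  process.stdout.write('\n');
  return full;
}

printStream(fakeStream()); // prints "The sky is blue."
```

With the real library, the same for await loop would iterate the generator returned by the chat call when stream is enabled.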