Category: Code

  • Setting Up a Local Ollama Copilot via LSP

    I am quite interested in running AI offline, so I really like Ollama, and I have added automatic failover from ChatGPT to a local AI to my little terminal LLM tool cll (get it on GitHub at akirk/cll). As a developer, an important local gap for me was GitHub Copilot. Its function of autocomplete on steroids…

  • cll Now Works With Local Files And Improves Output Formatting

    I’ve written about my cll tool before, and it is still my go-to way of communicating with LLMs. See the GitHub repo. As a developer, having LLMs available in the terminal is very helpful to me. Write a file to disk A lot of my prompts ask the LLM to create a file for me.…