Title: Code – Page 3 – Alex Kirk

---

# Category: Code

 * 
   ## [Setting Up a Local Ollama Copilot via LSP](https://alex.kirk.at/2024/11/15/setting-up-a-local-ollama-copilot-via-lsp/)
   
 * November 15, 2024
   I am quite interested in running AI offline. Thus I really like Ollama, and have
   added automatic failover from ChatGPT to a local AI to my little terminal LLM
   tool cll (get it on GitHub at akirk/cll). As a developer, an important local
   gap for me was GitHub Copilot. Its function of autocomplete on steroids…
 * [AI](https://alex.kirk.at/category/ai/), [Code](https://alex.kirk.at/category/code/),
   [Workflow](https://alex.kirk.at/category/workflow/)
 * 
   ## [cll Now Works With Local Files And Improves Output Formatting](https://alex.kirk.at/2024/08/26/cll-now-works-with-local-files-and-improves-output-formatting/)
   
 * August 26, 2024
   I’ve written about my cll tool before, and it is still my go-to way of communicating
   with LLMs. See the GitHub repo. As a developer, having LLMs available in the
   terminal is very helpful to me. Write a file to disk: a lot of my prompts ask
   the LLM to create a file for me.…
 * [AI](https://alex.kirk.at/category/ai/), [Code](https://alex.kirk.at/category/code/)

 [Previous Page](https://alex.kirk.at/category/code/page/2/?output_format=md) [Next Page](https://alex.kirk.at/category/code/page/4/?output_format=md)