chat-cli: renamed and added Ollama support

I have added support for Ollama to my chat-cli tool (formerly named cli-chatgpt, see previous posts). Ollama is a very easy way to run llama2 locally; it exposes a local HTTP server, and chat-cli talks to that.
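To give an idea of what that means in practice, here is a minimal sketch of streaming a completion from Ollama's local endpoint (default port 11434). This is just the general shape of the API, not the actual chat-cli code:

```python
import json
import urllib.request

# Stream a completion from Ollama's local HTTP server.
# /api/generate returns newline-delimited JSON chunks by default.
def ollama_generate(prompt, model="llama2", host="http://localhost:11434"):
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps({"model": model, "prompt": prompt}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        for line in resp:
            chunk = json.loads(line)
            if chunk.get("done"):
                break
            yield chunk["response"]

for token in ollama_generate("where is gerolstein"):
    print(token, end="", flush=True)
```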

Depending on whether an OpenAI key or Ollama is available, the corresponding models are offered; cgt -h lists them. (Using cgt as the command here follows my recommendation to set up a shell alias.)
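A simple way to probe that availability, assuming a check of the OPENAI_API_KEY environment variable plus a request to Ollama's root URL (which answers when the server is running); the function names here are hypothetical:

```python
import os
import urllib.request

def ollama_running(host="http://localhost:11434"):
    # Ollama's root URL responds when the server is up.
    try:
        urllib.request.urlopen(host, timeout=1)
        return True
    except OSError:
        return False

def available_models():
    models = []
    if os.environ.get("OPENAI_API_KEY"):
        models.append("gpt-3.5-turbo")
    if ollama_running():
        models.append("llama2:latest")
    return models
```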

When you’re offline, OpenAI is deemed unavailable, so Ollama/llama2 is used automatically, provided it is installed and running. The tool doesn’t currently switch models mid-conversation when your online state changes, but you can simply exit and resume the conversation with another model using cgt -l.
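The offline fallback could be implemented with a cheap connectivity probe against api.openai.com. This is a sketch under that assumption, reusing ollama_running() from above, not the tool's actual logic:

```python
import os
import socket

def openai_reachable(timeout=2):
    # Treat OpenAI as unavailable when its API host can't be reached.
    try:
        socket.create_connection(("api.openai.com", 443), timeout=timeout).close()
        return True
    except OSError:
        return False

def pick_model():
    # First available model wins; the local one serves as the offline fallback.
    if os.environ.get("OPENAI_API_KEY") and openai_reachable():
        return "gpt-3.5-turbo"
    if ollama_running():  # from the availability sketch above
        return "llama2:latest"
    raise RuntimeError("no model available")
```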

Both backends stream their responses as they are generated.
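On the OpenAI side, streaming looks roughly like this with the official Python client (v1-style API assumed; again, not the actual chat-cli source):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Print tokens as they arrive instead of waiting for the full reply.
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "where is gerolstein"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```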

❯ cgt where is gerolstein
Model: gpt-3.5-turbo
> where is gerolstein

Gerolstein is a town in the Bitburg-Prüm district in Rhineland-Palatinate, Germany. It is located in the Eifel mountain range, approximately 15 kilometers southeast of Bitburg.
> ^c

### Went offline here.
❯ cgt where is gerolstein
Model: llama2:latest
> where is gerolstein

 Gerolstein is a town located in the state of Rhineland-Palatinate, Germany. It is situated in the northern part of the state, approximately 20 kilometers (12 miles) northwest of the city of Mainz. The exact address of Gerolstein is:

Gerolstein, Germany

If you are planning to visit Gerolstein or need more detailed information, please let me know and I will be happy to help.
> 

There is no configuration file yet, but you can change the priority of models in the source.
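If that priority lives in a simple ordered list (a hypothetical name, for illustration), changing it is a one-line edit:

```python
# Hypothetical: models are tried in this order; reorder to change priority.
MODEL_PRIORITY = ["gpt-3.5-turbo", "llama2:latest"]
```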
