cll: Adding unix pipe support

Today I’ve added a little feature to my cll tool (which I renamed again, but I think I’ll stick with this name now): it can now process stdin, so you can do things like this:

echo 'a presentation about the friends plugin for wordpress' | cgt -s 'Please create a reveal.js presentation based on the following notes. Ensure to use short titles and short few words on each list item. Please load the moon theme and scripts from the cdn.jsdelivr.net domain, dont use any reveal.js plugins. Respond with just the HTML, no outside comments.' > presentation.html

This creates a presentation.html file that contains a full Reveal.js presentation on the topic.

The system prompt tells the model how to behave, so you can also ask it to output JSON (granted, it doesn’t always work, but it has gotten better) and then parse the result:

echo hello | cll -mllama2 -s 'please respond only in valid json' | jq .
{
  "message": "hello",
  "type": "text"
}
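
Since the output lands on stdout as plain JSON, you can pipe it further. As a small illustration (the "message" field name is just whatever I ask for in the system prompt, and the model may not always comply), you could pull a single field out with jq:

echo hello | cll -mllama2 -s 'please respond only in valid json with a "message" field' | jq -r .message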

In “stdin mode” it only outputs the response from the LLM; you can turn the extra output back on with -v (as in verbose), although it will then go to stderr.
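
That keeps pipelines clean while still letting you capture the diagnostics. For example (the file names here are just placeholders), you can send the verbose output to a log file with a normal stderr redirect:

echo 'summarize this' | cll -v -s 'you are a summarizer' 2>cll-debug.log > summary.txt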

chat-cli: renamed and added Ollama support

I have added support for Ollama to my chat-cli tool (formerly named cli-chatgpt, see previous posts). Ollama is a very easy way to run llama2 locally; it exposes a local HTTP server, which chat-cli talks to.
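
To give an idea of what that local server looks like (if I recall Ollama’s API correctly, it listens on port 11434 by default), you can poke at it directly with curl; this is the same server chat-cli uses:

curl http://localhost:11434/api/generate -d '{"model": "llama2", "prompt": "why is the sky blue?"}'

The response comes back as a stream of newline-delimited JSON objects, which is what makes the streaming output below possible.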

Depending on whether an OpenAI key or a running Ollama instance is available, the corresponding models are offered, and you can see this in cgt -h. (Using cgt as the command here follows my recommendation of setting up an alias in your shell.)
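
For reference, that alias is a one-liner in your shell config (assuming the binary is installed under its new name, chat-cli):

alias cgt=chat-cli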

When you’re offline, OpenAI is deemed unavailable, so Ollama/llama2 (if it is installed and running) will be used automatically. It currently doesn’t switch models mid-conversation based on your online state, but you can simply exit and continue the conversation with another model using cgt -l.
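
For example, to pick an offline conversation back up on the OpenAI model once you’re online again, something like this should work (combining -m and -l is my assumption of how the flags compose):

cgt -mgpt-3.5-turbo -l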

Both backends stream their responses as they arrive.

❯ cgt where is gerolstein
Model: gpt-3.5-turbo
> where is gerolstein

Gerolstein is a town in the Bitburg-Prüm district in Rhineland-Palatinate, Germany. It is located in the Eifel mountain range, approximately 15 kilometers southeast of Bitburg.
> ^c

### Went offline here.
❯ cgt where is gerolstein
Model: llama2:latest
> where is gerolstein

 Gerolstein is a town located in the state of Rhineland-Palatinate, Germany. It is situated in the northern part of the state, approximately 20 kilometers (12 miles) northwest of the city of Mainz. The exact address of Gerolstein is:

Gerolstein, Germany

If you are planning to visit Gerolstein or need more detailed information, please let me know and I will be happy to help.
> 

There is no config file yet, but you can change the priority of the models in the source.