Help Make (the) Friends (Plugin) Pretty

While I keep maintaining my Friends plugin for WordPress, there is one area where it could be better because I am not great at it: visual appeal.

From pretty early on, I designed the plugin in a way that it

  1. can be mainly used on the frontend (i.e. not in wp-admin),
  2. is based on granular template files,
  3. and each template file can be overridden.

Over time, some of this has changed or improved. For example, you can now see your friends’ posts in widgets on your wp-admin dashboard, and I’m making progress on providing a block theme for Friends so that it can be customized with the site editor.

This is the current Friends default theme, based on the spectre.css framework
This is a user view with full posts in the current design
Dashboard filled with Friends’ posts, Netvibes style

But even though I wrote in the Friends Wiki about how new themes can be created, I think this has remained largely unknown to users.

So, a year ago, as a demo, I created the Mastodon-Like Interface plugin. This is what the Friends plugin can also look like:

Unfortunately, this flew under people’s radar. So in the latest version of the Friends plugin, I have now made it more obvious that the theme can be changed, by slightly adjusting how themes are loaded and giving the user more control over which one is used:

In the course of this, I have updated the Friends Wiki with a more specific guide on how to write themes for the Friends plugin.

There have been some case studies about what a network between WordPresses could look like. For example, Mike McAlister has designed some screens for what he called OpenPress at the time. And the mockups look great:

Finding a new home for the WordPress community

I would love it if the community could help by creating some more themes to give people a choice. Recently, Livia Gouvêa contributed some layout improvements for the current sidebar. This is a great start, thank you, Livia! If you have created a new theme, you can add it with a pull request to the THEMES.md file. As soon as there are some themes, we’ll make it easier to install them.

It is quite likely that people were deterred from getting started with the Friends plugin because they don’t like the current theme (hat tip to Robert Windisch and his talk at WordCamp Karlsruhe). That would be too bad, because I believe it is an empowering tool, allowing you to become less dependent on third-party vendors, even if you’re “just” using it to make your own WordPress your full-featured, personal Mastodon instance.

I’m looking forward to more Friends themes; they would be an awesome addition!


Setting Up a Local Ollama Copilot via LSP

I am quite interested in running AI offline. Thus I really like Ollama, and I have added automatic failover from ChatGPT to a local AI to my little terminal LLM tool cll (get it on GitHub at akirk/cll).

As a developer, an important gap in my local setup was GitHub Copilot. Its autocomplete-on-steroids functionality is really powerful in my day-to-day work and speeds up my development a lot.

Now, how can you get this offline? Mostly, search engines point to solutions that involve Visual Studio Code extensions, for example Continue, and lots of other dependencies.

LSPs are independent of IDEs

But why should this involve IDE extensions? Given the concept of LSPs (read LSP: the good, the bad, and the ugly to learn how LSPs work) and the existence of LSP-copilot, this should be independent of IDEs. And I personally use Sublime Text.

And indeed, it does work on just that basis: using the Go proxy ollama-copilot by Bernardo de Oliveira Bruning.

But for me it didn’t work out of the box. Thus, I’d like to share the steps that got this working for me. I use macOS.

Steps to get it running

First, follow the install instructions for Ollama and ollama-copilot. This puts the Go binary at ~/go/bin/ollama-copilot.

Then, change the settings for LSP-copilot and add "proxy": "127.0.0.1:11435" (this is ollama-copilot’s default local port).
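For reference, this is roughly what the setting looks like in the LSP-copilot settings file (Preferences → Package Settings → LSP → Servers → LSP-copilot); I’m writing the nesting from memory, so check the package’s README if it doesn’t take effect:

```json
// LSP-copilot.sublime-settings — sketch; the exact key layout
// may differ between package versions.
{
	"proxy": "127.0.0.1:11435"
}
```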

Now, you also need to address the certificate situation. I use mkcert which you can install with homebrew using

brew install mkcert

Follow the instructions to install its root cert. We need a certificate that covers two (Edit: three) hosts, so run

cd ~/go/bin/; mkcert api.github.com copilot-proxy.githubusercontent.com proxy.individual.githubcopilot.com

which gives you two files with which you can now start the proxy:

~/go/bin/ollama-copilot -cert ~/go/bin/api.github.com+2.pem -key ~/go/bin/api.github.com+2-key.pem

Finally, you need to add one more thing to the LSP-copilot config JSON. First, find the location of the root cert with echo $(mkcert -CAROOT)/rootCA.pem, then add an env section there (see this FAQ); for me it’s:

"env": {
	"NODE_EXTRA_CA_CERTS": "~/Library/Application Support/mkcert/rootCA.pem"
},
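Putting both pieces together, the complete LSP-copilot settings end up looking roughly like this (a sketch from my setup; the nesting of the keys may differ between package versions):

```json
{
	// route Copilot traffic through the local ollama-copilot proxy
	"proxy": "127.0.0.1:11435",
	// let Node trust the mkcert root certificate
	"env": {
		"NODE_EXTRA_CA_CERTS": "~/Library/Application Support/mkcert/rootCA.pem"
	}
}
```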

This made it work for me. Edit: It seems a bit erratic. For me it works most reliably if you start ollama-copilot first, and only then Sublime Text. You can see the proxy at work through its output in the terminal.

2024/11/15 16:04:08 request: POST /v1/engines/copilot-codex/completions
2024/11/15 16:04:12 response: POST /v1/engines/copilot-codex/completions 200 4.744932083s

And this is from the LSP log panel:

:: [16:04:07.967]  -> LSP-copilot textDocument/didChange: {'textDocument': {'uri': 'file:///...', 'version': 42}, 'contentChanges': [{'range': {'start': {'line': 2860, 'character': 53}, 'end': {'line': 2860, 'character': 53}}, 'rangeLength': 0, 'text': 'c'}]}
:: [16:04:08.013] --> LSP-copilot getCompletions (6): <params with 147614 characters>
:: [16:04:08.027] --> LSP-copilot getCompletionsCycling (7): <params with 147614 characters>
:: [16:04:08.133] <-  LSP-copilot statusNotification: {'status': 'InProgress', 'message': ''}
:: [16:04:08.156] <-  LSP-copilot statusNotification: {'status': 'InProgress', 'message': ''}
:: [16:04:12.447] <-  LSP-copilot window/logMessage: {'type': 3, 'message': '[fetchCompletions] request.response: [https://copilot-proxy.githubusercontent.com/v1/engines/copilot-codex/completions] took 4288 ms'}
:: [16:04:12.920] <-  LSP-copilot window/logMessage: {'type': 3, 'message': '[streamChoices] solution 0 returned. finish reason: [Iteration Done]'}
:: [16:04:12.920] <-  LSP-copilot window/logMessage: {'type': 3, 'message': '[streamChoices] request done: headerRequestId: [] model deployment ID: []'}
:: [16:04:12.920] <-  LSP-copilot statusNotification: {'status': 'Normal', 'message': ''}
:: [16:04:12.920] <<< LSP-copilot (7) (duration: 4892ms): {'completions': [{'uuid': '4224f736-39f9-402e-b80e-027700892012', 'text': '\t\t\t\t\'title\'  => \'<span class="ab-icon dashicons dashicons-groups"></span>...', {'line': 2860, 'character': 54}, 'docVersion': 42, 'point': 105676, 'region': (105622, 105676)}]}

Verdict

So far, it has shown to be neither better nor faster than GitHub Copilot: in the log above, you can see that a completion took almost 5 seconds. But ollama-copilot works offline, which is better than no Copilot at all. And it works with only a few moving parts.