Feed the Birds

Ollama Clients

I've been playing with my new LLM toys on and off over the past couple of days, and had a chance to try out a few of those client applications. The long and short of it is that I'm still using Open WebUI, but I've moved it onto the Mac Pro, given that it wouldn't interfere with anything there, and it's only of use to me when that system's up and running anyway.

Speedy Installation Instructions

Create a Quadlet that fires up the Open WebUI container on boot:

[Container]
ContainerName=open-webui
Image=ghcr.io/open-webui/open-webui:main
AutoUpdate=registry
PublishPort=8080:8080
Volume=open-webui:/app/backend/data
Network=host

[Unit]
Description=Open WebUI

[Install]
WantedBy=default.target

I'm happy with it using port 8080. Note that Network=host is required, or the container won't be able to see Ollama on localhost; with host networking the container shares the host's network stack and listens on port 8080 directly, which makes the PublishPort line effectively redundant, but I've kept it for documentation.
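For reference, user-level Quadlets live in a specific directory; a quick sketch (the filename is my choice, the directory is where systemd's Quadlet generator looks for user units):

```shell
# Quadlet user units live here; the systemd generator picks up any
# *.container file in this directory on daemon-reload.
mkdir -p ~/.config/containers/systemd
# Save the unit above as open-webui.container inside it:
#   ~/.config/containers/systemd/open-webui.container
```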

Get systemd to see it and fire it up:

systemctl --user daemon-reload
systemctl --user start open-webui

Check everything's happy:

systemctl --user status open-webui
journalctl --user -fu open-webui

If it doesn't fire up on reboot until you log in, enable lingering so your user services start at boot:

loginctl enable-linger $USER
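You can check whether lingering took; a small sketch (falls back to a placeholder where systemd-logind isn't available):

```shell
# Print the Linger property for the current user; "Linger=yes" means
# user services start at boot without a login session.
loginctl show-user "$USER" --property=Linger 2>/dev/null || echo "Linger=unknown"
```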

It's certainly feature-rich and I really don't understand most of the options just yet, but it does a good job of keeping all that out of your way when it's in use, so I can worry about it if and when I decide I need to.

Apps

I'd listed Chatbox AI, Enchanted, Ollamac, and Ollamac Pro in the previous article. I did have a muck about with them, and Chatbox AI is probably the most flexible and complete in my opinion. There's plenty of noise when it first launches, but it's pretty quickly cleaned up, and you'll find Ollama as a Model Provider once you've clicked through the first couple of prompts hoping you've got a subscription to one thing or another.

Screenshot of Chatbox AI's user interface, with a four panel design and plenty of "example content" over on the left bar, where the chats live.

One big Chatbox AI face-plant is that the iPad app isn't requesting local network access permissions, and doesn't even appear in the Settings > Privacy & Security > Local Network list to grant it manually. Therefore, it can't see my Ollama server from the iPad. I'm not sure whose fault that is.

My next best for desktop was Ollamac. It's quick, it's dead simple, and it's open source. It runs on the Mac, but there's no mobile version. It took a quit-and-restart to get it seeing the Ollama server, but it worked perfectly after that, and it has a few basic per-chat settings, including model selection, system prompt, temperature, and so on. The chat window is clear, text is easy to select, and there's a copy button. However, sometimes it wouldn't scroll the content, leaving my prompt hidden behind the "Write your message here" text box, and the response even lower. Despite that, I may well keep this one around, because it's nice to flick over to a quick app interface rather than a web page sometimes.

Screenshot of Ollamac. The layout is a classic two panel design with chats list on the left, current chat on the right.
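Those per-chat settings map more or less directly onto Ollama's /api/chat endpoint; here's a minimal sketch of the request body (model name and values are made up for illustration):

```shell
# Build a chat request matching the per-chat settings: model,
# system prompt, and temperature (values here are illustrative).
cat > /tmp/chat-request.json <<'EOF'
{
  "model": "llama3.2",
  "messages": [
    {"role": "system", "content": "You are a terse assistant."},
    {"role": "user", "content": "Hello there."}
  ],
  "options": {"temperature": 0.7},
  "stream": false
}
EOF
# Send it to a local Ollama server (assumes the default port):
#   curl http://localhost:11434/api/chat -d @/tmp/chat-request.json
```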

For me, Enchanted and Ollamac Pro (unrelated to Ollamac, I discover) both suffered from low-contrast user interfaces, and committed the worst sin of not returning the caret to the text input field after the response had been received. Every time, I had to move my hand back to the mouse to click on the input, or mash the TAB key and hope for the best.

Screenshot of Enchanted for macOS, which is... very grey. There's no differentiation between the chat and the prompt area. And it's all grey.

At least Enchanted's mobile version (for iPad and iPhone no less) asked for the correct permissions and connected to the server, so that was a plus. It would be nice if someone offered iCloud sync of chat history. That would be pretty cool.

For reasons unknown, Ollamac Pro defaulted to white text on a grey background. It's easily changed in the settings, but why? Then it did the no-caret-in-the-text-box trick, and I hit CMD-Q to be done for the night. Not a comprehensive review, I'll admit.

Screenshot of Ollamac Pro, with the bizarre default of white text on light grey background. I can make out the emojis, but that's about all.

Irony

Now that I have this resource, I discover that I've pretty much finished the task I was planning to use it for. But it does feel very curious having this condensed, offline, queryable, statistical knowledge-in-a-jar system next to me, and I know it's only a button-press away when I next think it may be useful.

September 01, 2025