Mammal is an LLM client built on Tauri. It supports a bunch of providers (especially OpenAI-API-compatible ones). It's a side project born out of my desire to keep a local store of my LLM interactions, and I took it as a chance to experiment with building on Tauri.
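For anyone unfamiliar, "OpenAI-API-compatible" means the provider exposes the same `/v1/chat/completions` request shape as OpenAI, so one client can talk to many backends by swapping the base URL. A minimal sketch of such a request (the base URL, model name, and `chat` helper here are placeholders, not Mammal's actual code):

```typescript
// Placeholder provider base URL and model id -- any OpenAI-compatible
// provider can be substituted here.
const BASE_URL = "https://api.example.com/v1";
const MODEL = "some-model";

// Build the request body shared by OpenAI-compatible chat endpoints.
function buildChatRequest(userMessage: string) {
  return {
    model: MODEL,
    messages: [{ role: "user" as const, content: userMessage }],
  };
}

// Hypothetical helper: POST the request and return the assistant's reply.
async function chat(userMessage: string, apiKey: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildChatRequest(userMessage)),
  });
  const data = await res.json();
  // Standard response shape: choices[0].message.content holds the reply.
  return data.choices[0].message.content;
}
```

Because only `BASE_URL` changes between providers, supporting "a bunch of providers" mostly comes down to per-provider configuration rather than per-provider code.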
Much of my experimentation with prompting is done with productionizing prompts in mind, so I have plenty of thoughts about moving in that direction. But the basic client is useful right now (if not entirely bug-free).
Feedback welcome :)