Mar 29
Using litellm with Gemini in Open WebUI

Like so many others, I've grown accustomed to using the local model runner Ollama as well as proxy servers like litellm. Occasionally, I use these with the Open WebUI frontend. Read More »