• by kesor on 10/14/2024, 10:40:02 AM

    A Dockerized Nginx server configured to act as a reverse proxy for Ollama, a local AI model serving platform. The proxy includes built-in authentication using a custom Authorization header and exposes the Ollama service over the internet using a Cloudflare Tunnel.
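    Roughly, the auth check amounts to an Nginx block like the sketch below. The token, upstream name, and paths here are placeholder assumptions, not the repo's actual values; Ollama's default port 11434 is the only fixed piece.

        # Sketch only: goes in Nginx's http context (e.g. /etc/nginx/conf.d/default.conf)
        # "Bearer changeme-token" and the "ollama" upstream name are hypothetical placeholders.
        map $http_authorization $auth_ok {
            default                  0;
            "Bearer changeme-token"  1;
        }

        server {
            listen 80;

            location / {
                # Reject requests that don't carry the expected Authorization header
                if ($auth_ok = 0) {
                    return 401;
                }
                proxy_pass http://ollama:11434;      # assumed docker-compose service name + Ollama's default port
                proxy_set_header Host $host;
                proxy_set_header Authorization "";   # Ollama doesn't need the token; don't forward it
                proxy_http_version 1.1;
                proxy_read_timeout 300s;             # streamed model responses can be slow
            }
        }

    The Cloudflare Tunnel then points at this Nginx container, so nothing on the host needs to be exposed directly.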

    Ollama exposes an OpenAI-compatible API, so this proxy lets you use the Cursor.ai IDE with the models running on your own computer by pointing Cursor's custom OpenAI base URL + API key at the proxy.
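
    A quick way to sanity-check the setup is to hit Ollama's OpenAI-compatible endpoint through the proxy; the hostname, token, and model name below are placeholders for whatever you configured and pulled. The same base URL and key pair are what you'd enter in Cursor's OpenAI override settings.

        # Hypothetical hostname/token/model; substitute your own values.
        curl https://ollama.example.com/v1/chat/completions \
          -H "Authorization: Bearer changeme-token" \
          -H "Content-Type: application/json" \
          -d '{"model": "llama3.1", "messages": [{"role": "user", "content": "Hello"}]}'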