Browsers block HTTP requests from HTTPS pages. Your local Ollama can't be reached. This 175KB proxy fixes it — with hardware-locked licensing and zero dependencies.
Modern web apps are served over HTTPS. Browsers enforce mixed content blocking — they refuse to make HTTP requests from HTTPS pages. Your Ollama server at http://localhost:11434 is invisible to any website served over HTTPS.
That means gbrain, Open WebUI, and any LLM playground hosted on a real domain can't reach your local Ollama. Your AI models are stranded.
Without Ollama Proxy:
With Ollama Proxy:
Ollama Proxy sits between your browser and Ollama. It listens on a local proxy port, forwards each request to Ollama, and returns the response with the CORS headers browsers require. Your AI models become accessible from any HTTPS website.
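The mechanism can be sketched in a few dozen lines. This is an illustrative Python stand-in, not the actual 175KB C binary; the ports (11435 listening, 11434 upstream) come from the docs, everything else is an assumption. It answers CORS preflight requests itself and streams Ollama's responses through chunk by chunk:

```python
# Illustrative sketch of the forward-and-add-CORS-headers idea.
# Not the real implementation; ports are taken from the product docs.
import http.server
import urllib.request

UPSTREAM = "http://localhost:11434"  # where Ollama listens
LISTEN_PORT = 11435                  # where the proxy listens

class CORSProxy(http.server.BaseHTTPRequestHandler):
    def _cors_headers(self):
        # The headers browsers need to allow cross-origin requests.
        self.send_header("Access-Control-Allow-Origin", "*")
        self.send_header("Access-Control-Allow-Headers", "Content-Type, Authorization")
        self.send_header("Access-Control-Allow-Methods", "GET, POST, OPTIONS")

    def do_OPTIONS(self):
        # CORS preflight: answered by the proxy, never forwarded to Ollama.
        self.send_response(204)
        self._cors_headers()
        self.end_headers()

    def do_POST(self):
        # Forward the request body to Ollama, then relay the reply.
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        req = urllib.request.Request(
            UPSTREAM + self.path, data=body, method="POST",
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as upstream:
            self.send_response(upstream.status)
            self._cors_headers()
            self.end_headers()
            # Relay in chunks, so streaming chat tokens arrive as they are produced.
            while chunk := upstream.read(4096):
                self.wfile.write(chunk)
    # GET endpoints (e.g. /api/tags) follow the same forwarding pattern.

def run(port=LISTEN_PORT):
    http.server.HTTPServer(("127.0.0.1", port), CORSProxy).serve_forever()
```

The key detail is the `OPTIONS` handler: the browser's preflight never reaches Ollama at all, so the proxy can grant cross-origin access even though Ollama itself knows nothing about CORS.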
Your data never leaves your machine. The proxy runs locally. No cloud, no external servers, no data collection.
Pure native C. No Python, no Node.js, no Java, no runtime. Download, run, done. Smaller than a favicon.
License key is tied to your machine's CPU, disk, and hostname via SHA-256 fingerprint. Can't be shared or pirated.
Automatically injects Access-Control-Allow-Origin: * headers so browsers allow cross-origin requests.
Full support for Ollama's streaming chat completions. Token-by-token responses forwarded in real time.
Windows x64, Linux x64/ARM64, macOS x64/ARM64. One codebase, compiled natively for each platform.
No OpenSSL, no Python, no npm. Just the OS network stack. Runs on any machine that can run Ollama.
Download the proxy and run ollama-proxy --fingerprint. This generates a unique hardware identifier from your CPU, disk, and hostname. Copy this — you'll need it for checkout.
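The idea behind the fingerprint looks roughly like this sketch. The docs only say that CPU, disk, and hostname are hashed with SHA-256; which exact attributes the real binary reads, and how the result is formatted, are assumptions here:

```python
# Sketch of a hardware fingerprint: hash stable machine attributes,
# then format a short, copyable prefix. The specific inputs and the
# XXXX-XXXX-XXXX-XXXX layout are assumptions, not the product's format.
import hashlib
import platform
import shutil
import socket

def machine_fingerprint() -> str:
    parts = [
        platform.processor() or platform.machine(),  # CPU identifier
        str(shutil.disk_usage("/").total),           # a disk attribute (stand-in)
        socket.gethostname(),                        # hostname
    ]
    digest = hashlib.sha256("|".join(parts).encode()).hexdigest()
    # Take the first 16 hex chars and group them for easy copy-paste.
    return "-".join(digest[i:i + 4].upper() for i in range(0, 16, 4))
```

Because the inputs are stable properties of the machine, the same host always produces the same fingerprint, while any other machine produces a different one.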
Paste your machine fingerprint during checkout. We generate a license key signed with HMAC-SHA256, locked to your hardware. You receive it instantly by email.
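Vendor-side, "signed with HMAC-SHA256" means deriving the key from the fingerprint with a secret only the vendor holds. A minimal sketch, where the secret value and the key's grouping format are made-up placeholders:

```python
# Vendor-side license issuance sketch: HMAC-SHA256 over the fingerprint.
# VENDOR_SECRET and the 5x5 key grouping are hypothetical placeholders.
import hashlib
import hmac

VENDOR_SECRET = b"vendor-signing-secret"  # held by the vendor only

def issue_license(fingerprint: str) -> str:
    mac = hmac.new(VENDOR_SECRET, fingerprint.encode(), hashlib.sha256).hexdigest()
    # Format a prefix of the MAC as a human-friendly license key.
    return "-".join(mac[i:i + 5].upper() for i in range(0, 25, 5))
```

Because the key is a pure function of the fingerprint and the secret, it is valid only on the machine that produced that fingerprint, which is what makes the hardware lock work.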
Run ollama-proxy --license YOUR-KEY. The proxy validates your key offline (no phone-home), starts on port 11435, and begins forwarding to Ollama on 11434.
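Offline validation then amounts to recomputing the same HMAC locally and comparing in constant time; no network round-trip is needed. A sketch under the same assumed key format as above (function and parameter names are illustrative):

```python
# Client-side offline validation sketch: recompute the expected key from
# the local fingerprint and compare without leaking timing information.
import hashlib
import hmac

def license_valid(fingerprint: str, license_key: str, secret: bytes) -> bool:
    mac = hmac.new(secret, fingerprint.encode(), hashlib.sha256).hexdigest()
    expected = "-".join(mac[i:i + 5].upper() for i in range(0, 25, 5))
    return hmac.compare_digest(expected, license_key)
```

The trade-off of a fully offline shared-secret check is that the verifier must carry the secret too; real products mitigate that by obfuscating it, or by switching to asymmetric signatures so the binary only holds a public key.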
Point your web app at http://localhost:11435 instead of 11434. The proxy adds CORS headers and forwards everything. Your local AI just works.
One binary. One license. One machine. No subscriptions, no cloud fees, no data collection.
Download the proxy binary for your platform. 175KB, zero dependencies, runs instantly.
A license key is required. Purchase below, then enter it when starting the proxy.
Your license key will be emailed within 60 seconds of payment. 30-day money-back guarantee. No questions asked.