An Nginx proxy server in a Docker container to authenticate and proxy requests to Ollama from the public internet via a Cloudflare Tunnel.
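A minimal sketch of exercising such a setup from the public internet. The hostname, endpoint, and bearer-token scheme below are assumptions for illustration; the repository may use a different auth header or path.

```sh
TOKEN="your-secret-token"          # assumed: token the Nginx proxy is configured to accept
HOST="https://ollama.example.com"  # assumed: public hostname routed through the Cloudflare Tunnel

# Without the token, the proxy should reject the request (e.g. 401).
curl -s -o /dev/null -w "%{http_code}\n" "$HOST/api/tags"

# With the token, the request is proxied through to the Ollama server.
curl -s -H "Authorization: Bearer $TOKEN" "$HOST/api/tags"
```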
A one-file Ollama CLI client written in Bash.
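Not the repository's actual script, but a minimal sketch of what a one-file Bash client looks like, assuming a local Ollama server on the default port 11434 and `jq` for JSON handling:

```sh
#!/usr/bin/env bash
# Usage: ./ollama-cli.sh <model> <prompt...>
set -euo pipefail

MODEL="${1:?usage: $0 <model> <prompt>}"; shift
PROMPT="$*"

# Build the request body safely with jq, call the non-streaming
# /api/generate endpoint, and print just the model's response text.
curl -s http://localhost:11434/api/generate \
  -d "$(jq -n --arg m "$MODEL" --arg p "$PROMPT" \
        '{model: $m, prompt: $p, stream: false}')" \
| jq -r '.response'
```

Example: `./ollama-cli.sh llama3 "Explain cgroups in one sentence"`.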
Ollama with Let's Encrypt Using Docker Compose
This repository provides a centralized, easy way to run your favorite models locally using Docker.
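The repository's own scripts may differ, but the common upstream pattern for running Ollama itself in Docker looks like this (the named volume keeps downloaded models across container restarts):

```sh
# Start the official Ollama image, exposing the default API port.
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Pull and chat with a model inside the container.
docker exec -it ollama ollama run llama3
```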
A Bash library for Ollama: run LLM prompts straight from your shell, and more.
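A sketch of the kind of helper such a library might expose; the function name and interface here are hypothetical, not the library's actual API. It requires `curl` and `jq` and assumes Ollama on `localhost:11434`:

```sh
# Hypothetical helper: send one user message to /api/chat and print the reply.
ollama_chat() {
  local model="$1"; shift
  curl -s http://localhost:11434/api/chat \
    -d "$(jq -n --arg m "$model" --arg c "$*" \
          '{model: $m, messages: [{role: "user", content: $c}], stream: false}')" \
  | jq -r '.message.content'
}

# Source this file, then run prompts straight from your shell:
#   ollama_chat llama3 "Summarize what a Cloudflare Tunnel does."
```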
Runpod-LLM provides ready-to-use container scripts for running large language models (LLMs) easily on RunPod.
Store your knowledge privately, guide LLMs with it, and reduce hallucinations.
Specifics of interacting with various neural networks using the Ollama framework.