
# LLM Proxy

HTTP proxy for LLM APIs with streaming support and chunk processing.

## Usage

```sh
./llm-proxy
```

## Configuration

| Variable | Description | Default |
|----------|-------------|---------|
| `UPSTREAM_URL` | Upstream LLM API URL | `https://api.openai.com/v1/chat/completions` |
| `LISTEN_ADDR` | Listen address | `:8080` |
| `API_KEY` | Upstream API key | - |
| `INSECURE` | Skip TLS verification | `false` |

## Example

```sh
UPSTREAM_URL=https://api.openai.com/v1/chat/completions \
API_KEY=sk-... \
LISTEN_ADDR=:8080 \
./llm-proxy
```

## Endpoints

- `GET /health` - Health check
- `/*` - Proxies all other requests to the upstream

## Streaming

Supports SSE (`text/event-stream`) and NDJSON (`application/x-ndjson`) streaming responses. Each chunk is passed through `processChunk()` before being forwarded to the client.