feat: add DeepSeek, Moonshot, xAI, Groq, Ollama; drop v1; migrate TUI to v2

Five OpenAI-compatible providers join the library as first-class
constructors (llm.DeepSeek, llm.Moonshot, llm.XAI, llm.Groq, llm.Ollama).
Their wire-level implementation is shared via the new v2/openaicompat
package, which is the guts of the old v2/openai provider extracted into a
reusable form. Each provider supplies its own Rules value to declare
per-model constraints (e.g., DeepSeek Reasoner rejects tools and
temperature, Moonshot/xAI accept images only on *-vision* models, and
Groq rejects audio input). v2/openai itself becomes a thin wrapper that
sets RestrictTemperature for the o-series and gpt-5 models.
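As a rough sketch of that pattern (neither the constructor signature nor
the Rules fields appear in this message, so every name below is an
assumption, not the package's real API):

    // Hypothetical sketch only; openaicompat's actual Config/Rules
    // shape may differ. It illustrates "each provider supplies its
    // own Rules value" on top of the shared wire implementation.
    func DeepSeek(apiKey string) Provider {
        return openaicompat.New(openaicompat.Config{
            BaseURL: "https://api.deepseek.com", // OpenAI-compatible endpoint
            APIKey:  apiKey,
            Rules: openaicompat.Rules{
                // DeepSeek Reasoner rejects tools and temperature.
                NoTools: func(model string) bool {
                    return model == "deepseek-reasoner"
                },
                NoTemperature: func(model string) bool {
                    return model == "deepseek-reasoner"
                },
            },
        })
    }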

A new provider registry (v2/registry.go) exposes llm.Providers() and
drives the TUI's provider picker, so adding a provider in the future is a
single-file change.
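Roughly, the picker can iterate the registry like this (the entry's
fields are guesses; only llm.Providers() itself is named in this commit):

    for _, p := range llm.Providers() {
        // p.Name is an assumed field on the registry entry; the
        // TUI lists these and constructs the chosen provider.
        fmt.Println(p.Name)
    }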

The TUI at cmd/llm was migrated from v1 to v2 and moved to v2/cmd/llm. With
nothing else depending on v1, the v1 code at the repo root (all .go files,
schema/, internal/, provider/, root go.mod/go.sum) is deleted.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
@@ -0,0 +1,27 @@
# go-llm CLI environment variables
# Copy this file to .env and fill in the keys for providers you use.
# OpenAI API Key (https://platform.openai.com/api-keys)
OPENAI_API_KEY=
# Anthropic API Key (https://console.anthropic.com/settings/keys)
ANTHROPIC_API_KEY=
# Google AI API Key (https://aistudio.google.com/apikey)
GOOGLE_API_KEY=
# DeepSeek API Key (https://platform.deepseek.com)
DEEPSEEK_API_KEY=
# Moonshot / Kimi API Key (https://platform.moonshot.ai)
MOONSHOT_API_KEY=
# xAI / Grok API Key (https://x.ai/api)
XAI_API_KEY=
# Groq API Key (https://console.groq.com/keys)
GROQ_API_KEY=
# Ollama runs locally with no API key required.
# Override the endpoint if you're not using localhost:11434.
# OLLAMA_BASE_URL=http://localhost:11434/v1
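For reference, honoring that override only takes a plain env lookup with
a fallback (an illustration, not the CLI's actual loading code):

    baseURL := os.Getenv("OLLAMA_BASE_URL")
    if baseURL == "" {
        // Default matches the comment above: the local Ollama
        // daemon's OpenAI-compatible endpoint.
        baseURL = "http://localhost:11434/v1"
    }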