# CLAUDE.md for go-llm
All Go code now lives under `v2/`. The module path is
`gitea.stevedudenhoeffer.com/steve/go-llm/v2`. There is no module at the
repository root anymore; the v1 code at the root was deleted after all
consumers migrated to v2.
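Because the module path carries the `/v2` major-version suffix, a consumer's `go.mod` must require it under that path (a sketch; the module name and version shown are illustrative):

```
module example.com/myapp

go 1.22

require gitea.stevedudenhoeffer.com/steve/go-llm/v2 v2.0.0
```

Imports in consumer code likewise use the `/v2` path, even though the package is still referred to as `llm`.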
See v2/CLAUDE.md for build/test commands and per-package guidance.
## CLI
The interactive TUI lives at `v2/cmd/llm`:

```sh
cd v2 && go run ./cmd/llm
```
It iterates `llm.Providers()` so every registered provider (OpenAI, Anthropic,
Google, DeepSeek, Moonshot, xAI, Groq, Ollama) appears in the picker
automatically. Status is derived from each provider's env var; Ollama shows as
"(local)" because it needs no key.
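The registry pattern behind the picker can be sketched as self-contained Go. The `Provider` struct, the env-var names, and the `Providers()` signature below are assumptions for illustration; the real definitions live in `v2/registry.go`:

```go
package main

import "fmt"

// Provider is a hypothetical registry entry: a display name plus the
// API-key environment variable it needs (empty means a local provider).
type Provider struct {
	Name   string
	EnvKey string
}

// providers is the single place a new provider would be added.
var providers = []Provider{
	{Name: "OpenAI", EnvKey: "OPENAI_API_KEY"},
	{Name: "Anthropic", EnvKey: "ANTHROPIC_API_KEY"},
	{Name: "Ollama", EnvKey: ""}, // local, no key required
}

// Providers returns the registry slice the TUI iterates to build its picker.
func Providers() []Provider { return providers }

func main() {
	for _, p := range Providers() {
		status := "needs " + p.EnvKey
		if p.EnvKey == "" {
			status = "(local)"
		}
		fmt.Printf("%-10s %s\n", p.Name, status)
	}
}
```

With this shape, adding a provider is one new element in the slice, and the picker and status display pick it up with no further changes.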
## Key bindings
- `Enter` — Send message
- `Ctrl+I` — Add image
- `Ctrl+T` — Toggle tools panel
- `Ctrl+P` — Change provider
- `Ctrl+M` — Change model
- `Ctrl+S` — Settings
- `Ctrl+N` — New conversation
- `Esc` — Exit/Cancel