Ollama Cloud now appears in Providers() alongside local Ollama, with its
own EnvKey (OLLAMA_API_KEY) and a curated cloud-model list for picker UIs.
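
As a rough sketch, the new registry entry might look like the following;
the Provider struct and its field names are assumptions, and the model IDs
are illustrative. Only Providers(), the OLLAMA_API_KEY env var, the
OllamaCloud() constructor, and the existence of a curated cloud-model list
are confirmed here.

    // Hypothetical shape of the Ollama Cloud entry in v2/registry.go;
    // every field name below is an assumption about the registry type.
    {
        Name:   "Ollama Cloud",
        EnvKey: "OLLAMA_API_KEY",
        // Curated cloud models surfaced by picker UIs (illustrative IDs).
        Models: []string{"gpt-oss:120b-cloud", "qwen3-coder:480b-cloud"},
        New:    func() (llm.Model, error) { return llm.OllamaCloud() },
    },
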
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

The Ollama provider now targets /api/chat directly via the native provider
introduced in the previous commits. Public API is unchanged for callers
that go through llm.Ollama() (and is extended by Task 5's OllamaCloud()
constructor).
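
For orientation, a minimal, self-contained sketch of a native /api/chat
call follows. The wire format (model, messages, stream) and the default
localhost:11434 endpoint come from Ollama's documented API; the Go types
are illustrative and are not the provider's internal representation.

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "io"
        "net/http"
    )

    // Illustrative wire types for Ollama's native /api/chat endpoint.
    type chatMessage struct {
        Role    string `json:"role"`
        Content string `json:"content"`
    }

    type chatRequest struct {
        Model    string        `json:"model"`
        Messages []chatMessage `json:"messages"`
        Stream   bool          `json:"stream"`
    }

    func main() {
        body, err := json.Marshal(chatRequest{
            Model:    "llama3.2", // any locally pulled model
            Messages: []chatMessage{{Role: "user", Content: "hello"}},
            Stream:   false, // one JSON object back instead of NDJSON chunks
        })
        if err != nil {
            panic(err)
        }
        resp, err := http.Post("http://localhost:11434/api/chat",
            "application/json", bytes.NewReader(body))
        if err != nil {
            panic(err)
        }
        defer resp.Body.Close()
        out, _ := io.ReadAll(resp.Body)
        fmt.Println(string(out))
    }
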
DefaultBaseURL was renamed to DefaultLocalBaseURL (without the trailing
/v1 segment used by the OpenAI-compat path). registry.go is updated
correspondingly; no other callers referenced the old name.
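
The effect of the rename, assuming the constant lives in the Ollama
provider package (the old value carried the /v1 suffix the OpenAI-compat
path needed; the new one is the native API root):

    // Old, OpenAI-compat base URL:
    //   const DefaultBaseURL = "http://localhost:11434/v1"
    // New, native API root used for /api/chat:
    const DefaultLocalBaseURL = "http://localhost:11434"
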
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

Five OpenAI-compatible providers join the library as first-class constructors
(llm.DeepSeek, llm.Moonshot, llm.XAI, llm.Groq, llm.Ollama). Their wire-level
implementation is shared via a new v2/openaicompat package, which contains
the extracted guts of the old v2/openai provider. Each provider supplies its own
Rules value to declare per-model constraints (e.g., DeepSeek Reasoner rejects
tools and temperature, Moonshot/xAI accept images only on *-vision* models,
Groq rejects audio input). v2/openai itself becomes a thin wrapper that sets
RestrictTemperature for o-series and gpt-5 models.
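
One hypothetical sketch of the per-provider Rules idea follows; the
openaicompat.Rules type and these field names are assumptions, and only
the behaviour described above comes from this change.

    // Hypothetical Rules value for DeepSeek; field names are invented to
    // illustrate the per-model constraint idea, not the actual API.
    var deepseekRules = openaicompat.Rules{
        RejectToolsFor:       []string{"deepseek-reasoner"},
        RejectTemperatureFor: []string{"deepseek-reasoner"},
    }
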
A new provider registry (v2/registry.go) exposes llm.Providers() and drives
the TUI's provider picker, so adding a provider in the future is a single-file
change.
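
A sketch of how the picker might consume the registry; the entry fields
(Name, EnvKey, Models) are assumptions about what Providers() returns.

    // Hypothetical picker-side loop over the registry entries.
    for _, p := range llm.Providers() {
        fmt.Printf("%-14s env=%-16s models=%d\n", p.Name, p.EnvKey, len(p.Models))
    }
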
The TUI at cmd/llm was migrated from v1 to v2 and moved to v2/cmd/llm. With
nothing else depending on v1, the v1 code at the repo root (all .go files,
schema/, internal/, provider/, root go.mod/go.sum) is deleted.
Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>