feat: add DeepSeek, Moonshot, xAI, Groq, Ollama; drop v1; migrate TUI to v2
CI / Root Module (push) Failing after 30s
CI / Lint (push) Failing after 50s
CI / V2 Module (push) Successful in 2m14s

Five OpenAI-compatible providers join the library as first-class constructors
(llm.DeepSeek, llm.Moonshot, llm.XAI, llm.Groq, llm.Ollama). Their wire-level
implementation is shared via a new v2/openaicompat package, which is the
extracted guts of the old v2/openai provider; each provider supplies its own
Rules value to declare per-model constraints (e.g., DeepSeek Reasoner rejects
tools and temperature, Moonshot/xAI accept images only on *-vision* models,
Groq rejects audio input). v2/openai itself becomes a thin wrapper that sets
RestrictTemperature for o-series and gpt-5 models.

A new provider registry (v2/registry.go) exposes llm.Providers() and drives
the TUI's provider picker, so adding a provider in the future is a
single-file change.

The TUI at cmd/llm was migrated from v1 to v2 and moved to v2/cmd/llm. With
nothing else depending on v1, the v1 code at the repo root (all .go files,
schema/, internal/, provider/, root go.mod/go.sum) is deleted.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-24 13:34:39 +00:00
parent 9b91b2f794
commit 34119e5a00
58 changed files with 1921 additions and 4242 deletions
@@ -0,0 +1,33 @@
// Package groq implements the go-llm v2 provider interface for Groq
// (https://console.groq.com). Groq hosts open-source models behind an OpenAI
// Chat Completions-compatible endpoint, so this package is a thin wrapper over
// openaicompat with its own defaults and per-model Rules.
package groq

import (
	"strings"

	"gitea.stevedudenhoeffer.com/steve/go-llm/v2/openaicompat"
)

// DefaultBaseURL is the public Groq OpenAI-compatible endpoint.
const DefaultBaseURL = "https://api.groq.com/openai/v1"

// Provider is a type alias over openaicompat.Provider.
type Provider = openaicompat.Provider

// New creates a new Groq provider. An empty baseURL uses DefaultBaseURL.
func New(apiKey, baseURL string) *Provider {
	if baseURL == "" {
		baseURL = DefaultBaseURL
	}
	return openaicompat.New(apiKey, baseURL, openaicompat.Rules{
		// Only Groq-hosted vision variants (e.g. *-vision-preview) accept images.
		SupportsVision: func(m string) bool {
			return strings.Contains(m, "vision")
		},
		// The chat completions endpoint does not accept audio input; audio goes
		// through dedicated transcription endpoints, which go-llm doesn't cover here.
		SupportsAudio: func(string) bool { return false },
	})
}
@@ -0,0 +1,33 @@
package groq_test

import (
	"context"
	"errors"
	"testing"

	"gitea.stevedudenhoeffer.com/steve/go-llm/v2/groq"
	"gitea.stevedudenhoeffer.com/steve/go-llm/v2/openaicompat"
	"gitea.stevedudenhoeffer.com/steve/go-llm/v2/provider"
)

func TestNew_Basic(t *testing.T) {
	if p := groq.New("key", ""); p == nil {
		t.Fatal("New returned nil")
	}
}

func TestRules_AudioRejected(t *testing.T) {
	p := groq.New("key", "")
	req := provider.Request{
		Model: "llama-3.3-70b-versatile",
		Messages: []provider.Message{{
			Role:  "user",
			Audio: []provider.Audio{{Base64: "AAA=", ContentType: "audio/wav"}},
		}},
	}

	_, err := p.Complete(context.Background(), req)

	var fue *openaicompat.FeatureUnsupportedError
	if !errors.As(err, &fue) || fue.Feature != "audio" {
		t.Fatalf("want FeatureUnsupportedError(audio), got %v", err)
	}
}