go-llm/v2/openai/openai.go
steve 34119e5a00
feat: add DeepSeek, Moonshot, xAI, Groq, Ollama; drop v1; migrate TUI to v2
Five OpenAI-compatible providers join the library as first-class constructors
(llm.DeepSeek, llm.Moonshot, llm.XAI, llm.Groq, llm.Ollama). Their wire-level
implementation is shared via a new v2/openaicompat package which is the
extracted guts of the old v2/openai provider; each provider supplies its own
Rules value to declare per-model constraints (e.g., DeepSeek Reasoner rejects
tools and temperature, Moonshot/xAI accept images only on *-vision* models,
Groq rejects audio input). v2/openai itself becomes a thin wrapper that sets
RestrictTemperature for o-series and gpt-5 models.
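The per-provider Rules idea can be sketched as follows. This is a minimal self-contained illustration, not the actual openaicompat API: the Rules field names other than RestrictTemperature, and the deepSeekRules helper, are hypothetical, modeled on the constraints the commit message describes (DeepSeek Reasoner rejecting tools and temperature).

```go
package main

import (
	"fmt"
	"strings"
)

// Rules is a stand-in for a per-provider constraint set; the real
// openaicompat.Rules may have different fields and shapes.
type Rules struct {
	RejectTools         func(model string) bool // e.g. DeepSeek Reasoner
	RestrictTemperature func(model string) bool
}

// deepSeekRules is a hypothetical example of a provider declaring its
// constraints: the Reasoner model rejects both tools and temperature.
func deepSeekRules() Rules {
	isReasoner := func(model string) bool {
		return strings.Contains(model, "reasoner")
	}
	return Rules{
		RejectTools:         isReasoner,
		RestrictTemperature: isReasoner,
	}
}

func main() {
	r := deepSeekRules()
	fmt.Println(r.RejectTools("deepseek-reasoner"))         // true
	fmt.Println(r.RestrictTemperature("deepseek-chat"))     // false
}
```

Keeping the constraints as predicate functions (rather than static model lists) lets a provider match families of models by prefix or substring, which is how the real v2/openai wrapper handles o-series and gpt-5 models below.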

A new provider registry (v2/registry.go) exposes llm.Providers() and drives
the TUI's provider picker, so adding a provider in the future is a single-file
change.
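A registry like this might look roughly as follows. This is a self-contained sketch under assumed names: ProviderInfo and the entry fields are hypothetical, since the commit excerpt does not show v2/registry.go itself.

```go
package main

import "fmt"

// ProviderInfo is a hypothetical registry entry; the real shape of
// v2/registry.go is not shown in this commit.
type ProviderInfo struct {
	Name string
}

// providers is the single table a new provider would be appended to,
// making provider addition a one-file change.
var providers = []ProviderInfo{
	{"OpenAI"},
	{"DeepSeek"},
	{"Moonshot"},
	{"xAI"},
	{"Groq"},
	{"Ollama"},
}

// Providers returns the registered providers; a TUI picker can range
// over this slice to build its menu.
func Providers() []ProviderInfo {
	return providers
}

func main() {
	for _, p := range Providers() {
		fmt.Println(p.Name)
	}
}
```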

The TUI at cmd/llm was migrated from v1 to v2 and moved to v2/cmd/llm. With
nothing else depending on v1, the v1 code at the repo root (all .go files,
schema/, internal/, provider/, root go.mod/go.sum) is deleted.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
2026-04-24 13:34:39 +00:00


// Package openai implements the go-llm v2 provider interface for OpenAI.
//
// The actual wire-protocol logic lives in the shared openaicompat package;
// this file encodes OpenAI-specific Rules (temperature is rejected on o-series
// and gpt-5* models) and supplies the default base URL.
package openai

import (
	"strings"

	"gitea.stevedudenhoeffer.com/steve/go-llm/v2/openaicompat"
)

// DefaultBaseURL is the public OpenAI Chat Completions endpoint.
const DefaultBaseURL = "https://api.openai.com/v1"

// Provider is the OpenAI chat-completion provider. It's a type alias over
// openaicompat.Provider so existing callers using openai.Provider keep compiling.
type Provider = openaicompat.Provider

// New creates a new OpenAI provider. An empty baseURL uses DefaultBaseURL.
func New(apiKey string, baseURL string) *Provider {
	if baseURL == "" {
		baseURL = DefaultBaseURL
	}
	return openaicompat.New(apiKey, baseURL, openaicompat.Rules{
		RestrictTemperature: restrictTemperature,
	})
}

// restrictTemperature reports whether OpenAI rejects a user-supplied
// temperature for this model. o-series reasoning models and gpt-5* both do.
func restrictTemperature(model string) bool {
	return strings.HasPrefix(model, "o") || strings.HasPrefix(model, "gpt-5")
}