Files
go-llm/v2/ollama/ollama.go
steve a3e9982d49 refactor(v2/ollama): drop openaicompat shim, use native provider
The Ollama provider now targets /api/chat directly via the native provider
introduced in the previous commits. Public API is unchanged for callers
that go through llm.Ollama() (and is extended by Task 5's OllamaCloud()
constructor).

DefaultBaseURL was renamed to DefaultLocalBaseURL (without the trailing
/v1 segment used by the OpenAI-compat path). registry.go is updated
correspondingly; no other callers referenced the old name.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-05-01 18:29:59 +00:00

23 lines
827 B
Go

// Package ollama implements the go-llm v2 provider interface for Ollama,
// targeting Ollama's native /api/chat endpoint. Supports both local Ollama
// instances (no API key) and Ollama Cloud (https://ollama.com, requires an
// API key sent as a Bearer token).
package ollama

// New creates a new Ollama provider over the native /api/chat API. An empty
// apiKey means local mode (no Authorization header is sent). A non-empty
// apiKey is sent as `Authorization: Bearer <key>` for Ollama Cloud.
//
// An empty baseURL defaults to DefaultLocalBaseURL when apiKey is empty, or
// DefaultCloudBaseURL when apiKey is set.
func New(apiKey, baseURL string) *Provider {
	if baseURL == "" {
		if apiKey == "" {
			baseURL = DefaultLocalBaseURL
		} else {
			baseURL = DefaultCloudBaseURL
		}
	}
	return newNative(apiKey, baseURL)
}
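
The base-URL resolution above can be exercised in isolation. The sketch below mirrors that defaulting logic as a standalone function; the constant values are assumptions (the standard local Ollama address and the cloud host named in the package comment), since the real constants are defined elsewhere in the package:

```go
package main

import "fmt"

// Assumed stand-ins for the package's constants; the actual values live
// alongside the provider and may differ.
const (
	DefaultLocalBaseURL = "http://localhost:11434"
	DefaultCloudBaseURL = "https://ollama.com"
)

// resolveBaseURL mirrors New's defaulting: an explicit baseURL always wins;
// an empty one falls back to the local default without an API key, or the
// cloud default with one.
func resolveBaseURL(apiKey, baseURL string) string {
	if baseURL != "" {
		return baseURL
	}
	if apiKey == "" {
		return DefaultLocalBaseURL
	}
	return DefaultCloudBaseURL
}

func main() {
	fmt.Println(resolveBaseURL("", ""))                      // local default
	fmt.Println(resolveBaseURL("sk-example", ""))            // cloud default
	fmt.Println(resolveBaseURL("", "http://10.0.0.5:11434")) // explicit wins
}
```

Keeping the two defaults distinct is what lets the same constructor back both `llm.Ollama()` and the cloud-oriented `OllamaCloud()` mentioned in the commit message.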