refactor(v2/ollama): drop openaicompat shim, use native provider

The Ollama provider now targets /api/chat directly via the native provider
introduced in the previous commits. The public API is unchanged for
callers that go through llm.Ollama(), and is extended by Task 5's
OllamaCloud() constructor.
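
For illustration, a minimal sketch of the JSON body Ollama's native
/api/chat endpoint accepts; the struct, field layout, and buildChatBody
helper here are assumptions for the example, not the provider's actual
types:

```go
package main

import (
	"encoding/json"
	"fmt"
)

type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// chatRequest mirrors the request body of Ollama's native /api/chat
// endpoint. This is an illustrative sketch, not the provider's real type.
type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
	Stream   bool          `json:"stream"`
}

// buildChatBody marshals a single-turn chat request for the given model.
func buildChatBody(model, prompt string) ([]byte, error) {
	return json.Marshal(chatRequest{
		Model:    model,
		Messages: []chatMessage{{Role: "user", Content: prompt}},
		Stream:   false,
	})
}

func main() {
	// The native provider POSTs this body to <base URL>/api/chat,
	// e.g. http://localhost:11434/api/chat — no /v1 prefix involved.
	body, err := buildChatBody("llama3.2", "hello")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(body))
}
```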

DefaultBaseURL was renamed to DefaultLocalBaseURL (without the trailing
/v1 segment used by the OpenAI-compat path). registry.go is updated
correspondingly; no other callers referenced the old name.
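
The path difference can be sketched as follows; the constant value
assumes Ollama's default port (11434), and oldCompatBaseURL is shown
only for contrast — it does not appear in this diff:

```go
package main

import "fmt"

const (
	// DefaultLocalBaseURL points at the Ollama daemon itself; the
	// native provider appends /api/chat to it. Value assumed from
	// Ollama's default listen address.
	DefaultLocalBaseURL = "http://localhost:11434"

	// The old OpenAI-compat shim carried a trailing /v1 segment and
	// hit the compat chat-completions route instead.
	oldCompatBaseURL = "http://localhost:11434/v1"
)

func main() {
	fmt.Println(DefaultLocalBaseURL + "/api/chat")      // native endpoint
	fmt.Println(oldCompatBaseURL + "/chat/completions") // former compat endpoint
}
```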

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
commit a3e9982d49
parent f70c7c0842
Date: 2026-05-01 18:29:59 +00:00
3 changed files with 50 additions and 42 deletions
registry.go
@@ -129,7 +129,7 @@ var providerRegistry = []ProviderInfo{
 		Name:        "ollama",
 		DisplayName: "Ollama (local)",
 		EnvKey:      "", // no key needed
-		DefaultURL:  ollama.DefaultBaseURL,
+		DefaultURL:  ollama.DefaultLocalBaseURL,
 		Models: []string{
 			"llama3.2", "llama3.1", "qwen2.5", "mistral", "gemma2", "phi4",
 		},