Add go-llm v2: redesigned API for simpler LLM abstraction
v2 is a new Go module (v2/) with a dramatically simpler API:

- Unified Message type (no more Input marker interface)
- Define[T] for ergonomic tool creation with standard context.Context
- Chat session with automatic tool-call loop (agent loop)
- Streaming via pull-based StreamReader
- MCP one-call connect (MCPStdioServer, MCPHTTPServer, MCPSSEServer)
- Middleware support (logging, retry, timeout, usage tracking)
- Decoupled JSON Schema (map[string]any, no provider coupling)
- Sample tools: WebSearch, Browser, Exec, ReadFile, WriteFile, HTTP
- Providers: OpenAI, Anthropic, Google (all with streaming)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
v2/response.go (normal file, 34 lines added)
@@ -0,0 +1,34 @@
package llm

// Response represents the result of a completion request.
type Response struct {
	// Text is the assistant's text content. Empty if only tool calls.
	Text string

	// ToolCalls contains any tool invocations the assistant requested.
	ToolCalls []ToolCall

	// Usage contains token usage information (if available from provider).
	Usage *Usage

	// message is the full assistant message for this response.
	message Message
}

// Message returns the full assistant Message for this response,
// suitable for appending to the conversation history.
func (r Response) Message() Message {
	return r.message
}

// HasToolCalls returns true if the response contains tool call requests.
func (r Response) HasToolCalls() bool {
	return len(r.ToolCalls) > 0
}

// Usage captures token consumption.
type Usage struct {
	InputTokens  int
	OutputTokens int
	TotalTokens  int
}
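The Response helpers in this file support the automatic tool-call loop mentioned in the commit message: after each completion, the caller appends resp.Message() to the history and branches on HasToolCalls. Below is a self-contained sketch of that usage; ToolCall and Message are defined elsewhere in v2/, so the stub declarations and the example conversation here are placeholders, not the package's real types.

```go
package main

import "fmt"

// Stubs standing in for the package's ToolCall and Message types,
// which live in other files of v2/.
type ToolCall struct{ Name string }
type Message struct{ Role, Content string }

type Usage struct{ InputTokens, OutputTokens, TotalTokens int }

type Response struct {
	Text      string
	ToolCalls []ToolCall
	Usage     *Usage
	message   Message
}

func (r Response) Message() Message   { return r.message }
func (r Response) HasToolCalls() bool { return len(r.ToolCalls) > 0 }

func main() {
	history := []Message{{Role: "user", Content: "What's 2+2?"}}

	// Pretend a provider returned this completion.
	resp := Response{
		Text:    "4",
		Usage:   &Usage{InputTokens: 8, OutputTokens: 1, TotalTokens: 9},
		message: Message{Role: "assistant", Content: "4"},
	}

	// Append the full assistant message to the conversation,
	// which is what the Message accessor is documented for.
	history = append(history, resp.Message())

	if resp.HasToolCalls() {
		fmt.Println("would dispatch", len(resp.ToolCalls), "tool calls")
	} else {
		fmt.Println(resp.Text, "| history length:", len(history))
	}
}
// prints: 4 | history length: 2
```

Keeping message unexported while exposing it via Message() lets the package guarantee the stored value is exactly the assistant turn for this response.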