go-llm/v2/response.go
Steve Dudenhoeffer a4cb4baab5 Add go-llm v2: redesigned API for simpler LLM abstraction
v2 is a new Go module (v2/) with a dramatically simpler API:
- Unified Message type (no more Input marker interface)
- Define[T] for ergonomic tool creation with standard context.Context
- Chat session with automatic tool-call loop (agent loop)
- Streaming via pull-based StreamReader
- MCP one-call connect (MCPStdioServer, MCPHTTPServer, MCPSSEServer)
- Middleware support (logging, retry, timeout, usage tracking)
- Decoupled JSON Schema (map[string]any, no provider coupling)
- Sample tools: WebSearch, Browser, Exec, ReadFile, WriteFile, HTTP
- Providers: OpenAI, Anthropic, Google (all with streaming)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-07 20:00:08 -05:00


package llm

// Response represents the result of a completion request.
type Response struct {
	// Text is the assistant's text content. Empty if only tool calls.
	Text string

	// ToolCalls contains any tool invocations the assistant requested.
	ToolCalls []ToolCall

	// Usage contains token usage information (if available from provider).
	Usage *Usage

	// message is the full assistant message for this response.
	message Message
}

// Message returns the full assistant Message for this response,
// suitable for appending to the conversation history.
func (r Response) Message() Message {
	return r.message
}

// HasToolCalls reports whether the response contains tool call requests.
func (r Response) HasToolCalls() bool {
	return len(r.ToolCalls) > 0
}

// Usage captures token consumption.
type Usage struct {
	InputTokens  int
	OutputTokens int
	TotalTokens  int
}