# CLAUDE.md for go-llm
## Build and Test Commands

- Build project: `go build ./...`
- Run all tests: `go test ./...`
- Run specific test: `go test -v -run <TestName> ./...`
- Tidy dependencies: `go mod tidy`

## Code Style Guidelines

- **Indentation**: Use standard Go tabs for indentation.
- **Naming**:
  - Use `camelCase` for internal/private variables and functions.
  - Use `PascalCase` for exported types, functions, and struct fields.
  - Interface names should be concise (e.g., `LLM`, `ChatCompletion`).
- **Error Handling** (see the sketch after this list):
  - Always check and handle errors immediately.
  - Wrap errors with context using `fmt.Errorf("%w: ...", err)`.
  - Use the project's internal `Error` struct in `error.go` when you need to differentiate between error types.
- **Project Structure**:
  - `llm.go`: Contains core interfaces (`LLM`, `ChatCompletion`) and shared types (`Message`, `Role`, `Image`).
  - Provider implementations are in `openai.go`, `anthropic.go`, and `google.go`.
  - Schema definitions for tool calling are in the `schema/` directory.
  - `mcp.go`: MCP (Model Context Protocol) client integration for connecting to MCP servers.
- **Imports**: Organize imports into groups: standard library first, then third-party libraries.
- **Documentation**: Use standard Go doc comments for exported symbols.
- **README.md**: Keep README.md up to date with any significant changes to the project.

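A minimal sketch of the import-grouping and error-wrapping conventions above; the package name, helper, and error message are illustrative only and not taken from the codebase:

```go
package llm

import (
	// Standard library packages first.
	"encoding/json"
	"fmt"
	// Third-party packages would follow in a separate group.
)

// decodeToolArgs is a hypothetical helper used only to illustrate the style:
// check the error immediately and wrap it with context using %w.
func decodeToolArgs(raw []byte, dst any) error {
	if err := json.Unmarshal(raw, dst); err != nil {
		return fmt.Errorf("%w: decoding tool arguments", err)
	}
	return nil
}
```
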
## CLI Tool

- Build CLI: `go build ./cmd/llm`
- Run CLI: `./llm` (or `llm.exe` on Windows)
- Run without building: `go run ./cmd/llm`

### CLI Features

- Interactive TUI for testing all go-llm features
- Support for OpenAI, Anthropic, and Google providers
- Image input (file path, URL, or base64)
- Tool/function calling with demo tools
- Temperature control and settings

### Key Bindings

- `Enter` - Send message
- `Ctrl+I` - Add image
- `Ctrl+T` - Toggle tools panel
- `Ctrl+P` - Change provider
- `Ctrl+M` - Change model
- `Ctrl+S` - Settings
- `Ctrl+N` - New conversation
- `Esc` - Exit/Cancel

## MCP (Model Context Protocol) Support

The library supports connecting to MCP servers to use their tools. MCP servers can be connected via:

- **stdio**: Run a command as a subprocess
- **sse**: Connect to an SSE endpoint
- **http**: Connect to a streamable HTTP endpoint

### Usage Example

```go
ctx := context.Background()

// Create and connect to an MCP server
server := &llm.MCPServer{
	Name:    "my-server",
	Command: "my-mcp-server",
	Args:    []string{"--some-flag"},
}
if err := server.Connect(ctx); err != nil {
	log.Fatal(err)
}
defer server.Close()

// Add the server to a toolbox
toolbox := llm.NewToolBox().WithMCPServer(server)

// Use the toolbox in requests - MCP tools are automatically available
req := llm.Request{
	Messages: []llm.Message{{Role: llm.RoleUser, Text: "Use the MCP tool"}},
	Toolbox:  toolbox,
}
```

### MCPServer Options

- `Name`: Friendly name for logging
- `Command`: Command to run (for stdio transport)
- `Args`: Command arguments
- `Env`: Additional environment variables
- `URL`: Endpoint URL (for sse/http transport)
- `Transport`: "stdio" (default), "sse", or "http"

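For a remote server, set `URL` and `Transport` instead of `Command`. A minimal sketch in the same style as the usage example above, assuming the fields behave as described in this list (the server name and URL are placeholders):

```go
ctx := context.Background()

// Connect to a streamable HTTP endpoint instead of spawning a subprocess
remote := &llm.MCPServer{
	Name:      "remote-server",
	URL:       "https://example.com/mcp",
	Transport: "http", // or "sse" for an SSE endpoint
}
if err := remote.Connect(ctx); err != nil {
	log.Fatal(err)
}
defer remote.Close()

// MCP tools from the remote server join the toolbox like any other tool
toolbox := llm.NewToolBox().WithMCPServer(remote)
```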