LLM Provider Compatibility
agentful works with any LLM provider that supports function (tool) calling and a 128K+ token context window.
Supported Providers
- Claude (default) - Anthropic API, best quality
- GLM-4.7 - roughly 10x cheaper than Claude, native Anthropic-compatible API
- Local Models - Ollama, vLLM, or other self-hosted servers
When to Use What
- Cheaper alternative → GLM-4.7
- Privacy → Local models
- Best quality → Claude (default)
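Since GLM-4.7 exposes an Anthropic-compatible API, switching providers can be as simple as repointing the client at a different endpoint. The sketch below assumes agentful honors the standard Anthropic SDK environment variables (`ANTHROPIC_BASE_URL`, `ANTHROPIC_API_KEY`); the endpoint URLs are placeholders, so check your provider's and agentful's documentation for the actual values and configuration mechanism.

```shell
# Sketch: pointing agentful at a non-default provider (assumes agentful
# reads the standard Anthropic SDK environment variables).

# GLM-4.7 via its Anthropic-compatible API:
export ANTHROPIC_BASE_URL="https://your-glm-endpoint.example/api/anthropic"  # replace with provider's endpoint
export ANTHROPIC_API_KEY="your-glm-api-key"

# Local model served by Ollama (assumes an Anthropic-compatible proxy in front):
export ANTHROPIC_BASE_URL="http://localhost:11434"  # Ollama's default port
export ANTHROPIC_API_KEY="ollama"  # placeholder; local servers often ignore the key
```

Unsetting both variables falls back to the default Claude configuration.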