# LLM Providers

Radkit supports multiple LLM providers through a unified `BaseLlm` trait. This allows you to switch between models from different providers with minimal code changes.
All providers follow a similar pattern:

- Initialization: Create an LLM client, usually from an environment variable containing the API key.
- Configuration: Optionally chain builder methods to configure parameters like `max_tokens` or `temperature`.
- Execution: Call `generate_content` (or the simpler `generate` for text-only) with a `Thread`; a provider-agnostic sketch follows this list.
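Because every client implements the same `BaseLlm` trait, the driving code can be written once against the trait and reused across providers. Below is a minimal, hypothetical sketch: the `ask` helper and its error type are illustrative, and the `generate(prompt, None)` signature is assumed from the provider examples that follow.

```rust
use radkit::models::BaseLlm;

// Hypothetical helper: works with any provider client, since all of
// them implement BaseLlm. The `generate` call mirrors the provider
// examples below; treat its exact signature as an assumption.
async fn ask(llm: &impl BaseLlm, prompt: &str) -> Result<(), Box<dyn std::error::Error>> {
    let response = llm.generate(prompt, None).await?;
    // Inspect or log `response` here as needed.
    let _ = response;
    Ok(())
}
```

Swapping providers then only changes the line that constructs the client passed to `ask`; the calling code stays the same.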
## Anthropic (Claude)

```rust
use radkit::models::providers::AnthropicLlm;
use radkit::models::{BaseLlm, Thread};

// From environment variable (ANTHROPIC_API_KEY)
let llm = AnthropicLlm::from_env("claude-3-sonnet-20240229")?;

// With configuration
let llm = AnthropicLlm::from_env("claude-3-opus-20240229")?
    .with_max_tokens(4096)
    .with_temperature(0.7);

// Generate content
let thread = Thread::from_user("Explain quantum computing");
let response = llm.generate_content(thread, None).await?;

println!("Response: {}", response.content().first_text().unwrap());
```

## OpenAI (GPT)
```rust
use radkit::models::providers::OpenAILlm;
use radkit::models::BaseLlm;

// From environment variable (OPENAI_API_KEY)
let llm = OpenAILlm::from_env("gpt-4o")?;

// With configuration
let llm = OpenAILlm::from_env("gpt-4o-mini")?
    .with_max_tokens(2000)
    .with_temperature(0.5);

let response = llm.generate("What is machine learning?", None).await?;
```

## Google Gemini
```rust
use radkit::models::providers::GeminiLlm;
use radkit::models::BaseLlm;

// From environment variable (GEMINI_API_KEY)
let llm = GeminiLlm::from_env("gemini-1.5-flash-latest")?;

let response = llm.generate("Explain neural networks", None).await?;
```

## Grok (xAI)
```rust
use radkit::models::providers::GrokLlm;
use radkit::models::BaseLlm;

// From environment variable (XAI_API_KEY)
let llm = GrokLlm::from_env("grok-1.5-flash")?;

let response = llm.generate("What is the meaning of life?", None).await?;
```

## DeepSeek
```rust
use radkit::models::providers::DeepSeekLlm;
use radkit::models::BaseLlm;

// From environment variable (DEEPSEEK_API_KEY)
let llm = DeepSeekLlm::from_env("deepseek-chat")?;

let response = llm.generate("Code review best practices", None).await?;
```
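Putting the pattern together, selecting a provider at runtime is just a matter of constructing a different client before the same `generate` call. A hedged sketch, reusing the model IDs and calls from the examples above; the `LLM_PROVIDER` environment variable is made up for illustration:

```rust
use radkit::models::providers::{AnthropicLlm, OpenAILlm};
use radkit::models::BaseLlm;

// Hypothetical dispatch on a made-up LLM_PROVIDER environment variable.
// Each arm mirrors an example from this page; only the client differs.
async fn run(prompt: &str) -> Result<(), Box<dyn std::error::Error>> {
    match std::env::var("LLM_PROVIDER").as_deref() {
        Ok("openai") => {
            let llm = OpenAILlm::from_env("gpt-4o")?;
            let _response = llm.generate(prompt, None).await?;
        }
        // Default to Anthropic, as in the first example above.
        _ => {
            let llm = AnthropicLlm::from_env("claude-3-sonnet-20240229")?;
            let _response = llm.generate(prompt, None).await?;
        }
    }
    Ok(())
}
```

Because the calling code is identical in both arms, supporting another provider is one more match arm constructing a different client.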