LLM Providers

Overview

LLM providers are the AI model connections that power your prompt testing and evaluation. Configure a provider to connect to OpenAI, Anthropic, AWS Bedrock, or Groq.

LLM providers list page

Creating a Provider

For OpenAI, Anthropic, or Groq

  1. Click + New Provider
  2. Enter a descriptive name
  3. Select the provider from the dropdown
  4. Enter your API Key
  5. Optionally enter a custom Base URL
  6. Optionally toggle Set as default provider
  7. Click Test Connection to verify
  8. Click Create Provider
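Under the hood, a connection test for an OpenAI-compatible provider typically issues a GET request to the models endpoint with the bearer key. A minimal sketch, assuming an OpenAI-style API (ProviderConfig and build_test_request are illustrative names, not the product's actual internals):

```python
# Hypothetical sketch of what "Test Connection" might do for an
# OpenAI-compatible provider: request the model list with the supplied key.
from dataclasses import dataclass
from urllib.request import Request

@dataclass
class ProviderConfig:
    name: str
    api_key: str
    base_url: str = "https://api.openai.com/v1"  # a custom Base URL replaces this

def build_test_request(cfg: ProviderConfig) -> Request:
    """Listing models is a common, cheap way to verify a key works."""
    url = cfg.base_url.rstrip("/") + "/models"
    return Request(url, headers={"Authorization": f"Bearer {cfg.api_key}"})

cfg = ProviderConfig(name="Production OpenAI GPT-4", api_key="sk-...")
req = build_test_request(cfg)
print(req.full_url)  # https://api.openai.com/v1/models
```

A custom Base URL (step 5) simply swaps the host, which is how OpenAI-compatible proxies and self-hosted gateways are supported.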

For AWS Bedrock

  1. Click + New Provider
  2. Enter a descriptive name
  3. Select AWS Bedrock from the dropdown
  4. Enter your AWS Access Key ID
  5. Enter your AWS Secret Access Key
  6. Enter your AWS Region (e.g., us-east-1)
  7. Optionally toggle Set as default provider
  8. Click Test Connection
  9. Click Create Provider
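The Bedrock fields can be sanity-checked locally before saving. A minimal sketch; the helper name and the exact checks are illustrative, not the product's actual validation:

```python
# Illustrative pre-save checks for the AWS fields. Region names follow a
# <partition>-<direction>-<number> pattern (e.g. us-east-1).
import re

REGION_RE = re.compile(r"^[a-z]{2}(-[a-z]+)+-\d$")

def validate_bedrock_fields(access_key_id: str, secret_key: str, region: str) -> list[str]:
    errors = []
    # Long-term keys start with AKIA; temporary (STS) keys start with ASIA.
    if not access_key_id.startswith(("AKIA", "ASIA")):
        errors.append("Access Key ID should start with AKIA or ASIA")
    if not secret_key:
        errors.append("Secret Access Key is required")
    if not REGION_RE.match(region):
        errors.append(f"'{region}' does not look like an AWS region")
    return errors

print(validate_bedrock_fields("AKIAEXAMPLE", "secret", "us-east-1"))  # []
```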

Supported Providers

OpenAI

Use for: GPT-4, GPT-4o, GPT-3.5 Turbo models

Get API Key: platform.openai.com/api-keys

Anthropic

Use for: Claude 3, Claude 2 models

Get API Key: console.anthropic.com/settings/keys

Groq

Use for: Fast inference with open-source models

Get API Key: console.groq.com/keys

AWS Bedrock

Use for: Amazon Bedrock models (Claude, Titan, Llama, etc.)

Setup: Configure IAM user with Bedrock permissions in AWS Console
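The IAM user needs permission to list and invoke Bedrock models. A minimal example policy (in production, scope Resource to the specific model ARNs you use rather than "*"):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:ListFoundationModels",
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```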

Managing Providers

  • View Models - Click the Models button to see available models
  • Set as Default - Click the star icon to make a provider default
  • Delete - Click the trash icon to remove a provider

Best Practices

  • Use descriptive names (e.g., "Production OpenAI GPT-4")
  • Secure your credentials - never commit to version control
  • Test connections before saving
  • Set a default provider for quick testing
  • Rotate keys regularly for security
  • Delete unused providers
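On keeping credentials out of version control: a common pattern is to read keys from environment variables at runtime instead of hard-coding them. A small sketch (the helper and variable names are illustrative):

```python
# Reading credentials from the environment keeps them out of source files.
# load_key and the variable names here are illustrative.
import os

def load_key(var: str = "OPENAI_API_KEY") -> str:
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"Set {var} before configuring the provider")
    return key

# Demo with a throwaway variable so the sketch runs anywhere:
os.environ["DEMO_KEY"] = "sk-demo"
print(load_key("DEMO_KEY"))  # sk-demo
```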