credentials.schema.json Configuration Guide
This guide explains every field in docs/schemas/credentials.schema.json.
Purpose
credentials.json stores the credentials used to authenticate against LLM providers/endpoints. It is a per-provider credential map, and the structure is intentionally simple.
Full Example
```json
{
  "$schema": "https://ponybunny.dho.ai/schemas/credentials.schema.json",
  "providers": {
    "anthropic": {
      "apiKey": "sk-ant-...",
      "baseUrl": ""
    },
    "openai": {
      "apiKey": "sk-...",
      "baseUrl": ""
    },
    "aws-bedrock": {
      "accessKeyId": "AKIA...",
      "secretAccessKey": "...",
      "region": "us-east-1",
      "baseUrl": ""
    },
    "azure-openai": {
      "apiKey": "...",
      "endpoint": "https://<resource>.openai.azure.com",
      "baseUrl": ""
    },
    "openai-compatible": {
      "apiKey": "...",
      "baseUrl": "http://localhost:8000"
    },
    "google-ai-studio": {
      "apiKey": "...",
      "baseUrl": ""
    },
    "google-vertex-ai": {
      "projectId": "my-project",
      "region": "us-central1",
      "baseUrl": ""
    }
  }
}
```
Top-Level Fields
- `$schema`: Schema URI for editor/tool validation.
- `providers`: Provider ID -> endpoint credential object.
providers.<providerId> Field Reference
All fields below are optional in the schema, but practical requirements depend on the provider protocol.
- `apiKey` (string): API key for providers that use token auth (Anthropic/OpenAI/Google AI Studio/Azure).
- `accessKeyId` (string): AWS Access Key ID (AWS Bedrock).
- `secretAccessKey` (string): AWS Secret Access Key (AWS Bedrock).
- `region` (string): AWS region (Bedrock) or Google region (Vertex AI).
- `endpoint` (string): Azure OpenAI endpoint URL.
- `projectId` (string): Google Cloud project ID (Vertex AI).
- `baseUrl` (string): Provider base URL override.
Provider Usage Notes
- `anthropic` / `openai` / `google-ai-studio`: usually require `apiKey`.
- `aws-bedrock`: typically needs `accessKeyId`, `secretAccessKey`, `region`.
- `azure-openai`: typically needs `apiKey` + `endpoint`.
- `google-vertex-ai`: typically needs `projectId` + `region`.
- `openai-compatible`: usually needs `baseUrl`, and commonly `apiKey` (or a placeholder if the endpoint ignores auth).
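For example, a minimal credentials.json for a local OpenAI-compatible server might look like this (the placeholder key is illustrative; some local servers ignore authentication entirely):

```json
{
  "$schema": "https://ponybunny.dho.ai/schemas/credentials.schema.json",
  "providers": {
    "openai-compatible": {
      "apiKey": "placeholder",
      "baseUrl": "http://localhost:8000"
    }
  }
}
```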
/v1 URL Rules and Runtime Handling
For OpenAI-style providers, the URL fields in credentials.json participate in composing the final request URL.
Runtime base URL precedence:
1. credentials.json `providers.<providerId>.baseUrl`
2. credentials.json `providers.<providerId>.endpoint` (mainly Azure)
3. llm-config.json provider `baseUrl`
This means credentials file values override llm-config.json URL values.
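The precedence order can be sketched as follows. This is an illustrative sketch, not the actual runtime code: the function name is hypothetical, and the assumed llm-config.json layout (providers keyed by ID with a `baseUrl` field) is an assumption.

```python
def resolve_base_url(credentials: dict, llm_config: dict, provider_id: str):
    """Pick the effective base URL following the documented precedence."""
    cred = credentials.get("providers", {}).get(provider_id, {})
    # 1. credentials.json baseUrl wins when set to a non-empty string
    if cred.get("baseUrl"):
        return cred["baseUrl"]
    # 2. credentials.json endpoint (mainly Azure OpenAI)
    if cred.get("endpoint"):
        return cred["endpoint"]
    # 3. fall back to the llm-config.json provider baseUrl (assumed layout)
    return llm_config.get("providers", {}).get(provider_id, {}).get("baseUrl")
```

Note that an empty-string `baseUrl` (as in the full example above) falls through to the next source rather than overriding it.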
/v1 rule:
- The final request path should contain exactly one version segment where required.
Valid patterns:
- credentials `baseUrl` without `/v1` + endpoint path with `/v1/...`
- credentials `baseUrl` with `/v1` + endpoint path without `/v1`
Avoid:
- base URL and endpoint path both lacking `/v1` on APIs that require versioned paths
System behavior:
- The runtime avoids a duplicate `/v1` when both components include the version prefix.
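The duplicate-`/v1` guard described above can be sketched like this (illustrative only; the function name is hypothetical and this is not the actual runtime implementation):

```python
def join_url(base_url: str, endpoint_path: str) -> str:
    """Join a base URL and endpoint path, keeping exactly one /v1 segment."""
    base = base_url.rstrip("/")
    path = "/" + endpoint_path.lstrip("/")
    # If both the base URL and the endpoint path carry the version segment,
    # drop it from the path so the final URL contains /v1 exactly once.
    if base.endswith("/v1") and path.startswith("/v1/"):
        path = path[len("/v1"):]
    return base + path
```

All three valid combinations resolve to the same final URL, e.g. `http://localhost:8000/v1/chat/completions`.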
Azure note:
- Azure OpenAI uses deployment-style endpoint paths and `api-version`, so do not force generic `/v1` assumptions for Azure.
Safety and Operations
- Treat this file as secret material.
- Avoid committing real credentials.
- Prefer environment-variable injection in CI/production when possible.
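As an illustration of environment-variable injection, a small sketch that writes credentials.json from environment variables at deploy time (the variable names are assumptions, not a convention the runtime defines; adapt them to your CI's secret store):

```python
import json
import os

# Hypothetical env var names -- map these to your CI/CD secret store.
credentials = {
    "$schema": "https://ponybunny.dho.ai/schemas/credentials.schema.json",
    "providers": {
        "anthropic": {"apiKey": os.environ.get("ANTHROPIC_API_KEY", "")},
        "openai": {"apiKey": os.environ.get("OPENAI_API_KEY", "")},
    },
}

# Write the file at deploy time so real keys never live in the repository.
with open("credentials.json", "w") as f:
    json.dump(credentials, f, indent=2)
```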