# Interface: AiProviderConfig
Defined in: src/ai/types.ts:141
Configuration for an AI provider connection.
## Intent

Supply connection details to LLM service factories.

## Capability

AI provider selection, model routing.
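Model routing typically branches on the `provider` discriminant. A minimal sketch of that pattern follows; the interface is restated so the snippet is self-contained, and `resolveBaseUrl` plus the non-Azure base URLs are illustrative assumptions, not part of this library.

```typescript
interface AiProviderConfig {
  readonly provider: 'openai' | 'azure-openai' | 'anthropic';
  readonly model: string;
  readonly temperature: number;
  readonly apiKey?: string;
  readonly anthropicApiKey?: string;
  readonly endpoint?: string;
  readonly maxTokens?: number;
}

// Hypothetical helper: pick a base URL from the provider discriminant.
function resolveBaseUrl(config: AiProviderConfig): string {
  if (config.provider === 'openai') return 'https://api.openai.com/v1';
  if (config.provider === 'anthropic') return 'https://api.anthropic.com';
  // azure-openai: the endpoint property is required for Azure deployments.
  if (!config.endpoint) {
    throw new Error('azure-openai requires an explicit endpoint');
  }
  return config.endpoint;
}
```

A factory would call such a helper once per configuration, then hand the resolved URL to its HTTP client.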
## Example

```typescript
const config: AiProviderConfig = {
  provider: 'azure-openai',
  model: 'gpt-4o',
  endpoint: 'https://my-resource.openai.azure.com/',
  temperature: 0.2,
  maxTokens: 4096,
};
```
## Properties
### anthropicApiKey?

`readonly` `optional` **anthropicApiKey**: `string`
Defined in: src/ai/types.ts:147
API key for Anthropic.
### apiKey?

`readonly` `optional` **apiKey**: `string`
Defined in: src/ai/types.ts:145
API key for OpenAI / Azure OpenAI.
### endpoint?

`readonly` `optional` **endpoint**: `string`
Defined in: src/ai/types.ts:151
Base endpoint URL (required for Azure OpenAI deployments).
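Since the key fields and `endpoint` are optional at the type level but required for specific providers, a pre-flight check can surface misconfiguration early. A sketch under the pairings documented on this page; `AiProviderConfig` is restated for self-containment and `missingFields` is a hypothetical helper, not part of the library.

```typescript
interface AiProviderConfig {
  readonly provider: 'openai' | 'azure-openai' | 'anthropic';
  readonly model: string;
  readonly temperature: number;
  readonly apiKey?: string;
  readonly anthropicApiKey?: string;
  readonly endpoint?: string;
  readonly maxTokens?: number;
}

// Report which provider-specific required fields are absent.
function missingFields(config: AiProviderConfig): string[] {
  const missing: string[] = [];
  // Anthropic uses its own key field; OpenAI and Azure OpenAI share apiKey.
  if (config.provider === 'anthropic' && !config.anthropicApiKey) missing.push('anthropicApiKey');
  if (config.provider !== 'anthropic' && !config.apiKey) missing.push('apiKey');
  // endpoint is only mandatory for Azure OpenAI deployments.
  if (config.provider === 'azure-openai' && !config.endpoint) missing.push('endpoint');
  return missing;
}
```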
### maxTokens?

`readonly` `optional` **maxTokens**: `number`
Defined in: src/ai/types.ts:155
Maximum tokens for the completion (provider default if omitted).
### model

`readonly` **model**: `string`
Defined in: src/ai/types.ts:149
Target model name (e.g. `gpt-4o`, `claude-3-5-sonnet-20241022`).
### provider

`readonly` **provider**: `"openai" | "azure-openai" | "anthropic"`
Defined in: src/ai/types.ts:143
LLM provider identifier.
### temperature

`readonly` **temperature**: `number`
Defined in: src/ai/types.ts:153
Sampling temperature (0.0–2.0). Lower values produce more deterministic output.
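Because the type is plain `number`, nothing enforces the 0.0–2.0 range at compile time. One option is to clamp before building the config; `clampTemperature` is a hypothetical helper, not part of the API.

```typescript
// Clamp a user-supplied value into the documented 0.0–2.0 range.
function clampTemperature(value: number): number {
  return Math.min(2.0, Math.max(0.0, value));
}
```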