Interface: AiProviderConfig

Defined in: src/ai/types.ts:141

Configuration for an AI provider connection.

Intent

Supply connection details to LLM service factories.

Capability

Selects the AI provider and routes requests to the target model.
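A minimal sketch of how a factory might consume this configuration: the interface shape below is reconstructed from the properties documented on this page, and `selectApiKey` is a hypothetical helper, not part of the library's API. It picks the credential field that matches the selected provider.

```typescript
// Interface shape reconstructed from the documented properties (sketch only).
interface AiProviderConfig {
  readonly provider: 'openai' | 'azure-openai' | 'anthropic';
  readonly model: string;
  readonly temperature: number;
  readonly apiKey?: string;
  readonly anthropicApiKey?: string;
  readonly endpoint?: string;
  readonly maxTokens?: number;
}

// Hypothetical helper: pick the credential that matches the provider.
function selectApiKey(config: AiProviderConfig): string {
  const key =
    config.provider === 'anthropic' ? config.anthropicApiKey : config.apiKey;
  if (!key) {
    throw new Error(`Missing API key for provider "${config.provider}"`);
  }
  return key;
}
```

A real factory would pass the selected key and `config.endpoint` to the corresponding SDK client; the routing decision itself reduces to a switch on `config.provider`.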

Example

```ts
const config: AiProviderConfig = {
  provider: 'azure-openai',
  model: 'gpt-4o',
  endpoint: 'https://my-resource.openai.azure.com/',
  temperature: 0.2,
  maxTokens: 4096,
};
```

Properties

anthropicApiKey?

readonly optional anthropicApiKey: string

Defined in: src/ai/types.ts:147

API key for Anthropic.


apiKey?

readonly optional apiKey: string

Defined in: src/ai/types.ts:145

API key for OpenAI / Azure OpenAI.


endpoint?

readonly optional endpoint: string

Defined in: src/ai/types.ts:151

Base endpoint URL (required for Azure OpenAI deployments).


maxTokens?

readonly optional maxTokens: number

Defined in: src/ai/types.ts:155

Maximum number of tokens to generate in the completion. If omitted, the provider's default is used.


model

readonly model: string

Defined in: src/ai/types.ts:149

Target model name (e.g. gpt-4o or claude-3-5-sonnet-20241022).


provider

readonly provider: "openai" | "azure-openai" | "anthropic"

Defined in: src/ai/types.ts:143

LLM provider identifier.


temperature

readonly temperature: number

Defined in: src/ai/types.ts:153

Sampling temperature in the range 0.0–2.0; lower values produce more deterministic output.
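The constraints documented above can be checked up front before constructing a client. The validator below is a hypothetical sketch (not library API) that enforces the two documented invariants: temperature must fall within 0.0–2.0, and Azure OpenAI deployments require an endpoint.

```typescript
// Interface shape reconstructed from the documented properties (sketch only).
interface AiProviderConfig {
  readonly provider: 'openai' | 'azure-openai' | 'anthropic';
  readonly model: string;
  readonly temperature: number;
  readonly apiKey?: string;
  readonly anthropicApiKey?: string;
  readonly endpoint?: string;
  readonly maxTokens?: number;
}

// Hypothetical validator: returns a list of constraint violations (empty if valid).
function validateConfig(config: AiProviderConfig): string[] {
  const errors: string[] = [];
  if (config.temperature < 0 || config.temperature > 2) {
    errors.push('temperature must be between 0.0 and 2.0');
  }
  if (config.provider === 'azure-openai' && !config.endpoint) {
    errors.push('endpoint is required for azure-openai');
  }
  return errors;
}
```

Collecting violations into an array rather than throwing on the first one lets a caller report every configuration problem at once.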