Praman — AI-First SAP UI5 Test Automation Platform - v1.0.1

    Interface AiProviderConfig

    Configuration for an AI provider connection.

    Supply connection details to LLM service factories.

    Used for AI provider selection and model routing.

    const config: AiProviderConfig = {
        provider: 'azure-openai',
        model: 'gpt-4o',
        endpoint: 'https://my-resource.openai.azure.com/',
        temperature: 0.2,
        maxTokens: 4096,
    };
    interface AiProviderConfig {
        anthropicApiKey?: string;
        apiKey?: string;
        endpoint?: string;
        maxTokens?: number;
        model: string;
        provider: "openai" | "azure-openai" | "anthropic";
        temperature: number;
    }
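
    Several of the optional properties are conditionally required depending on provider (for example endpoint for Azure OpenAI and anthropicApiKey for Anthropic, per the property docs below). A minimal validation sketch; validateAiProviderConfig is a hypothetical helper name, not part of the documented API:

    ```typescript
    // Sketch only: validateAiProviderConfig is a hypothetical helper,
    // not part of the Praman API. It checks the provider-specific
    // requirements described in the property documentation.
    interface AiProviderConfig {
        anthropicApiKey?: string;
        apiKey?: string;
        endpoint?: string;
        maxTokens?: number;
        model: string;
        provider: "openai" | "azure-openai" | "anthropic";
        temperature: number;
    }

    function validateAiProviderConfig(config: AiProviderConfig): string[] {
        const errors: string[] = [];
        // Sampling temperature must stay inside the documented 0.0–2.0 range.
        if (config.temperature < 0 || config.temperature > 2) {
            errors.push("temperature must be in [0.0, 2.0]");
        }
        // Azure OpenAI deployments need an explicit base endpoint.
        if (config.provider === "azure-openai" && !config.endpoint) {
            errors.push("endpoint is required for Azure OpenAI deployments");
        }
        // Anthropic uses its own key field.
        if (config.provider === "anthropic" && !config.anthropicApiKey) {
            errors.push("anthropicApiKey is required for Anthropic");
        }
        return errors;
    }
    ```

    Returning a list of errors rather than throwing lets a caller surface all configuration problems at once.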

    Properties

    anthropicApiKey?: string

    API key for Anthropic.

    apiKey?: string

    API key for OpenAI / Azure OpenAI.

    endpoint?: string

    Base endpoint URL (required for Azure OpenAI deployments).

    maxTokens?: number

    Maximum tokens for the completion (provider default if omitted).

    model: string

    Target model name (e.g. gpt-4o, claude-3-5-sonnet-20241022).

    provider: "openai" | "azure-openai" | "anthropic"

    LLM provider identifier.

    temperature: number

    Sampling temperature (0.0–2.0). Lower = more deterministic.
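
    Because endpoint is optional for every provider except Azure OpenAI, a consumer typically resolves a base URL per provider before creating a client. A sketch under stated assumptions; resolveEndpoint and the default base URLs are illustrative, not part of the documented API:

    ```typescript
    // Sketch only: resolveEndpoint is a hypothetical helper and the
    // fallback base URLs are assumptions, not part of the Praman API.
    interface AiProviderConfig {
        endpoint?: string;
        model: string;
        provider: "openai" | "azure-openai" | "anthropic";
        temperature: number;
    }

    function resolveEndpoint(config: AiProviderConfig): string {
        switch (config.provider) {
            case "openai":
                // Fall back to an assumed public base URL when unset.
                return config.endpoint ?? "https://api.openai.com/v1";
            case "anthropic":
                return config.endpoint ?? "https://api.anthropic.com";
            case "azure-openai":
                // Azure deployments are resource-specific: no global default exists.
                if (!config.endpoint) {
                    throw new Error("endpoint is required for Azure OpenAI");
                }
                return config.endpoint;
        }
    }
    ```

    The exhaustive switch over the provider union lets the TypeScript compiler flag any provider added later without a matching branch.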