Praman — AI-First SAP UI5 Test Automation Platform - v1.0.1

    Interface LlmService

    LLM provider service interface.

    All methods return AiResponse envelopes — never throw on API errors. Only createLlmService() throws (when the provider is not configured).

    A uniform interface over Azure OpenAI, OpenAI, and Anthropic.

    interface LlmService {
        chat(
            messages: { content: string; role: "system" | "user" | "assistant" }[],
            schema: ZodType,
        ): Promise<AiResponse<unknown>>;
        close(): Promise<void>;
        complete(prompt: string, schema: ZodType): Promise<AiResponse<unknown>>;
        isConfigured(): boolean;
    }
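The never-throw contract above means callers branch on the envelope rather than wrapping calls in try/catch. A minimal sketch of that contract, assuming an envelope with `ok`/`data`/`error` fields and a Zod-like `safeParse` schema (both illustrative; Praman's actual `AiResponse` shape may differ):

```typescript
// Sketch of the AiResponse envelope contract. The field names "ok",
// "data", and "error" are illustrative, not the library's actual shape.
type AiResponse<T> =
  | { ok: true; data: T }
  | { ok: false; error: string };

// Minimal stand-in for a Zod schema: anything with safeParse.
interface SchemaLike<T> {
  safeParse(input: unknown): { success: true; data: T } | { success: false };
}

// Wrap a raw model reply in an envelope instead of throwing, mirroring
// the "never throw on API errors" rule described above.
function toEnvelope<T>(raw: unknown, schema: SchemaLike<T>): AiResponse<T> {
  const parsed = schema.safeParse(raw);
  return parsed.success
    ? { ok: true, data: parsed.data }
    : { ok: false, error: "response failed schema validation" };
}
```

A caller then narrows on `ok` and handles both arms explicitly, which is why `chat` and `complete` can return errors without ever rejecting the promise.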

    Methods

    • chat: Send a multi-turn conversation and receive a structured response.

      Parameters

      • messages: { content: string; role: "system" | "user" | "assistant" }[]

        Ordered list of conversation turns

      • schema: ZodType

        Zod schema to validate the JSON response

      Returns Promise<AiResponse<unknown>>

      Validated response or error envelope

    • close: Close the LLM connection and release resources.

      Returns Promise<void>

      Called automatically in the pramanAI fixture teardown. Safe to call multiple times — idempotent.
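Idempotent close is typically achieved with a guard flag so a second call is a no-op. A sketch under that assumption (the internals are illustrative; only the idempotency guarantee comes from the docs above):

```typescript
// Sketch of an idempotent close(): a "closed" flag makes repeated calls
// harmless. releaseCount is illustrative, standing in for real resource
// release (HTTP agents, sockets).
class ClosableService {
  private closed = false;
  releaseCount = 0;

  async close(): Promise<void> {
    if (this.closed) return; // repeated calls are no-ops
    this.closed = true;
    this.releaseCount += 1; // stand-in for releasing resources once
  }
}
```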

    • complete: Send a single prompt and receive a structured response.

      Parameters

      • prompt: string

        Natural language instruction for the LLM

      • schema: ZodType

        Zod schema to validate the JSON response

      Returns Promise<AiResponse<unknown>>

      Validated response or error envelope

      Internally constructs a single user message and delegates to chat.
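The delegation described above can be sketched as a one-line message builder; the `Msg` type and helper name are illustrative, not Praman's actual internals:

```typescript
// complete() builds a single user turn and forwards it to chat().
// This helper shows only the message construction step.
type Msg = { role: "system" | "user" | "assistant"; content: string };

function completeToChatMessages(prompt: string): Msg[] {
  return [{ role: "user", content: prompt }];
}
```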

    • isConfigured: Return whether the AI provider is configured.

      Returns boolean

      Returns false when config.ai is undefined; never throws. Use this to degrade gracefully when AI is unavailable.
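Graceful degradation then amounts to gating AI-dependent steps on this check. A sketch assuming a hypothetical caller (`runAiCheck` is not part of Praman; only `isConfigured()` comes from the interface above):

```typescript
// Gate an AI-dependent step on isConfigured() instead of letting it fail.
interface ConfigProbe {
  isConfigured(): boolean;
}

function runAiCheck(service: ConfigProbe): string {
  if (!service.isConfigured()) {
    return "skipped: AI provider not configured";
  }
  return "ran AI check";
}
```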