BaseLLMProvider

Abstract base class for LLM providers in CompositeVoice.

Defined in: src/providers/base/BaseLLMProvider.ts:73

Remarks

BaseLLMProvider extends BaseProvider and implements the LLMProvider interface. It provides shared helpers for building message arrays and merging generation options, while requiring subclasses to implement the two generation methods.

All LLM providers communicate over REST (type = 'rest') and follow a Receive Text -> Send Text contract:

  • Input: a plain-text prompt or an array of LLMMessage objects.
  • Output: an AsyncIterable<string> that yields text chunks (supports both streaming and non-streaming implementations).
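The contract can be sketched in isolation. The `generate` stand-in below is hypothetical (not part of CompositeVoice); it only shows that a `Promise<AsyncIterable<string>>` lets callers consume streaming and single-chunk providers the same way:

```typescript
// Hypothetical stand-in for a provider's generate(): it resolves to an
// AsyncIterable<string>, so the consuming loop is identical whether the
// provider streams many chunks or yields one complete response.
async function generate(prompt: string): Promise<AsyncIterable<string>> {
  async function* stream(): AsyncGenerator<string> {
    // A real provider would yield chunks as they arrive from the API.
    yield 'Hello';
    yield ', ';
    yield 'world';
  }
  return stream();
}

async function consume(): Promise<string> {
  const parts: string[] = [];
  for await (const chunk of await generate('Say hello')) {
    parts.push(chunk);
  }
  return parts.join('');
}

consume().then((text) => console.log(text)); // prints "Hello, world"
```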

Inheritance hierarchy:

BaseProvider
 +-- BaseLLMProvider          <-- you are here
      +-- AnthropicLLM        (streaming SSE)
      +-- OpenAILLM           (non-streaming / streaming)
      +-- GroqLLM             (streaming)
      +-- WebLLMLLM           (in-browser inference)

Example

import { BaseLLMProvider } from 'composite-voice';
import type { LLMProviderConfig, LLMGenerationOptions, LLMMessage } from 'composite-voice';

class MyLLMProvider extends BaseLLMProvider {
  constructor(config: LLMProviderConfig) {
    super(config);
  }

  protected async onInitialize(): Promise<void> { }
  protected async onDispose(): Promise<void> { }

  async generate(prompt: string, options?: LLMGenerationOptions) {
    const messages = this.promptToMessages(prompt);
    return this.generateFromMessages(messages, options);
  }

  async generateFromMessages(messages: LLMMessage[], options?: LLMGenerationOptions) {
    const merged = this.mergeOptions(options);
    // myApi is a placeholder for your own backend client.
    const response = await myApi.chat(messages, merged);
    // Wrap the single response in an async generator so the return type
    // matches the declared Promise<AsyncIterable<string>>.
    return (async function* () {
      yield response.text;
    })();
  }
}

See

Extends

BaseProviderClass

Extended by

AnthropicLLM
OpenAILLM
GroqLLM
WebLLMLLM

Implements

LLMProvider

Constructors

Constructor

new BaseLLMProvider(config, logger?): BaseLLMProvider;

Defined in: src/providers/base/BaseLLMProvider.ts:87

Create a new LLM provider.

Parameters

  • config (LLMProviderConfig): LLM provider configuration including model name, temperature, system prompt, and other generation defaults.
  • logger? (Logger): Optional parent logger; a child will be derived.

Returns

BaseLLMProvider

Overrides

BaseProviderClass.constructor

Properties

  • config (public, LLMProviderConfig): LLM-specific provider configuration. Overrides LLMProvider.config and BaseProviderClass.config. Defined in src/providers/base/BaseLLMProvider.ts:78
  • initialized (protected, boolean, default: false): Tracks whether initialize has completed successfully. Inherited from BaseProviderClass.initialized. Defined in src/providers/base/BaseProvider.ts:97
  • logger (protected, Logger): Scoped logger instance for this provider. Inherited from BaseProviderClass.logger. Defined in src/providers/base/BaseProvider.ts:94
  • roles (readonly, readonly ProviderRole[]): LLM providers cover the 'llm' pipeline role by default. Overrides LLMProvider.roles and BaseProviderClass.roles. Defined in src/providers/base/BaseLLMProvider.ts:75
  • type (readonly, ProviderType): Communication transport this provider uses ('rest' or 'websocket'). Inherited from LLMProvider.type and BaseProviderClass.type. Defined in src/providers/base/BaseProvider.ts:74

Methods

assertReady()

protected assertReady(): void;

Defined in: src/providers/base/BaseProvider.ts:255

Guard that throws if the provider has not been initialized.

Returns

void

Remarks

Call at the start of any method that requires the provider to be ready.

Throws

Error Thrown with a descriptive message when initialized is false.
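A minimal sketch of this guard pattern (class internals are assumptions, not the library source):

```typescript
// Hypothetical sketch of the assertReady guard: methods that require a
// ready provider call it first and get a descriptive error otherwise.
class GuardedProvider {
  protected initialized = false;

  initialize(): void {
    this.initialized = true;
  }

  protected assertReady(): void {
    if (!this.initialized) {
      throw new Error('GuardedProvider is not initialized; call initialize() first.');
    }
  }

  doWork(): string {
    this.assertReady(); // guard at the start of any ready-only method
    return 'ok';
  }
}

const provider = new GuardedProvider();
// provider.doWork() here would throw; after initialize() it succeeds.
provider.initialize();
console.log(provider.doWork()); // prints "ok"
```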

Inherited from

BaseProviderClass.assertReady


dispose()

dispose(): Promise<void>;

Defined in: src/providers/base/BaseProvider.ts:154

Clean up resources and dispose of the provider.

Returns

Promise<void>

Remarks

Delegates to the subclass hook onDispose and resets the initialized flag. If the provider is not initialized, the call is a no-op.

Throws

Re-throws any error raised by onDispose.

Implementation of

LLMProvider.dispose

Inherited from

BaseProviderClass.dispose


generate()

abstract generate(prompt, options?): Promise<AsyncIterable<string, any, any>>;

Defined in: src/providers/base/BaseLLMProvider.ts:108

Generate a response from a single user prompt.

Parameters

  • prompt (string): The user’s input text.
  • options? (LLMGenerationOptions): Optional generation overrides (temperature, max tokens, stop sequences, abort signal, etc.).

Returns

Promise<AsyncIterable<string, any, any>>

An AsyncIterable that yields text chunks as they arrive.

Remarks

Interface: Receive Text -> Send Text. This is the simplest way to get a completion. Implementations typically convert the prompt into a messages array (optionally prepending a system message) and delegate to generateFromMessages.

Implementation of

LLMProvider.generate


generateFromMessages()

abstract generateFromMessages(messages, options?): Promise<AsyncIterable<string, any, any>>;

Defined in: src/providers/base/BaseLLMProvider.ts:126

Generate a response from an array of conversation messages.

Parameters

  • messages (LLMMessage[]): Ordered array of LLMMessage objects representing the conversation history.
  • options? (LLMGenerationOptions): Optional generation overrides.

Returns

Promise<AsyncIterable<string, any, any>>

An AsyncIterable that yields text chunks as they arrive.

Remarks

Interface: Receive Text -> Send Text. Use this method when you need multi-turn conversation support. The messages array can include system, user, and assistant roles to provide full conversational context.
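For example, a multi-turn call might pass a history like the one below. The `LLMMessage` shape here is assumed from the role names mentioned above:

```typescript
// Assumed LLMMessage shape: a role plus text content.
type LLMMessage = { role: 'system' | 'user' | 'assistant'; content: string };

const messages: LLMMessage[] = [
  { role: 'system', content: 'You are a concise assistant.' },
  { role: 'user', content: 'Summarize our roadmap.' },
  { role: 'assistant', content: 'Q1: streaming; Q2: new providers.' },
  { role: 'user', content: 'What ships in Q2?' },
];

// The full history gives the model conversational context:
// const stream = await provider.generateFromMessages(messages);
console.log(messages.length); // prints 4
```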

Implementation of

LLMProvider.generateFromMessages


getConfig()

getConfig(): LLMProviderConfig;

Defined in: src/providers/base/BaseLLMProvider.ts:192

Get a shallow copy of the current LLM configuration.

Returns

LLMProviderConfig

A new LLMProviderConfig object.

Overrides

BaseProviderClass.getConfig


initialize()

initialize(): Promise<void>;

Defined in: src/providers/base/BaseProvider.ts:127

Initialize the provider, making it ready for use.

Returns

Promise<void>

Remarks

Calls the subclass hook onInitialize. If the provider has already been initialized the call is a no-op.

Throws

ProviderInitializationError Thrown when onInitialize rejects. The original error is wrapped with the provider class name for diagnostics.

Implementation of

LLMProvider.initialize

Inherited from

BaseProviderClass.initialize


isReady()

isReady(): boolean;

Defined in: src/providers/base/BaseProvider.ts:178

Check whether the provider has been initialized and is ready.

Returns

boolean

true when initialize has completed successfully and dispose has not yet been called.

Implementation of

LLMProvider.isReady

Inherited from

BaseProviderClass.isReady


mergeOptions()

protected mergeOptions(options?): LLMGenerationOptions;

Defined in: src/providers/base/BaseLLMProvider.ts:170

Merge per-call generation options with the provider’s config defaults.

Parameters

  • options? (LLMGenerationOptions): Optional per-call overrides.

Returns

LLMGenerationOptions

A merged LLMGenerationOptions object.

Remarks

Values supplied in options take precedence over values in config. Only defined values are included in the result, allowing providers to distinguish “not set” from explicit values.
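A sketch of defined-only merging (this is an illustration of the behavior described above, not the library source; the option names are assumptions):

```typescript
// Per-call overrides win over config defaults; keys whose value is
// undefined are skipped, so "not set" stays distinguishable from an
// explicitly provided value.
interface GenOptions {
  temperature?: number;
  maxTokens?: number;
}

function mergeDefined(defaults: GenOptions, overrides: GenOptions = {}): GenOptions {
  const merged: GenOptions = {};
  for (const source of [defaults, overrides]) {
    for (const [key, value] of Object.entries(source)) {
      if (value !== undefined) {
        (merged as Record<string, unknown>)[key] = value;
      }
    }
  }
  return merged;
}

const merged = mergeDefined(
  { temperature: 0.7, maxTokens: 256 },
  { temperature: 0.2, maxTokens: undefined },
);
console.log(merged); // { temperature: 0.2, maxTokens: 256 }
```

Note that `maxTokens: undefined` in the overrides does not erase the configured default; only defined values participate in the merge.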


onConfigUpdate()

protected onConfigUpdate(_config): void;

Defined in: src/providers/base/BaseProvider.ts:242

Hook called after updateConfig merges new values.

Parameters

  • _config (Partial<BaseProviderConfig>): The partial configuration that was merged.

Returns

void

Remarks

The default implementation is a no-op. Override in subclasses to react to runtime configuration changes (e.g. reconnect with a new API key).

Inherited from

BaseProviderClass.onConfigUpdate


onDispose()

abstract protected onDispose(): Promise<void>;

Defined in: src/providers/base/BaseProvider.ts:229

Provider-specific disposal logic.

Returns

Promise<void>

Remarks

Subclasses must implement this method to release any resources acquired during onInitialize (e.g. close connections, free memory).

Inherited from

BaseProviderClass.onDispose


onInitialize()

abstract protected onInitialize(): Promise<void>;

Defined in: src/providers/base/BaseProvider.ts:217

Provider-specific initialization logic.

Returns

Promise<void>

Remarks

Subclasses must implement this method to perform any setup required before the provider can be used (e.g. validate credentials, open connections, load models).

Inherited from

BaseProviderClass.onInitialize


promptToMessages()

protected promptToMessages(prompt): LLMMessage[];

Defined in: src/providers/base/BaseLLMProvider.ts:141

Convert a plain-text prompt into an LLMMessage array.

Parameters

  • prompt (string): The user’s input text.

Returns

LLMMessage[]

A messages array suitable for generateFromMessages.

Remarks

If the provider’s config includes a systemPrompt, it is prepended as a system message. The prompt itself becomes a user message.
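A sketch of that conversion (the `LLMMessage` shape is assumed; the real helper reads `systemPrompt` from the provider config):

```typescript
// Hypothetical free-function version of promptToMessages.
type LLMMessage = { role: 'system' | 'user' | 'assistant'; content: string };

function promptToMessages(prompt: string, systemPrompt?: string): LLMMessage[] {
  const messages: LLMMessage[] = [];
  if (systemPrompt) {
    // A configured system prompt is prepended as a system message.
    messages.push({ role: 'system', content: systemPrompt });
  }
  // The prompt itself becomes the user message.
  messages.push({ role: 'user', content: prompt });
  return messages;
}

console.log(promptToMessages('Hi there', 'Be brief'));
// [ { role: 'system', content: 'Be brief' },
//   { role: 'user', content: 'Hi there' } ]
```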


updateConfig()

updateConfig(config): void;

Defined in: src/providers/base/BaseProvider.ts:201

Merge partial configuration updates into the current config.

Parameters

  • config (Partial<BaseProviderConfig>): A partial configuration object whose keys will overwrite existing values.

Returns

void

Remarks

After merging, the subclass hook onConfigUpdate is called so providers can react to changed values at runtime.
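A sketch of the updateConfig -> onConfigUpdate flow; the `Config` shape and hook body are assumptions for illustration, not the library source:

```typescript
// Hypothetical provider showing the merge-then-notify pattern.
type Config = { apiKey?: string; model?: string };

class SketchProvider {
  private reconnects = 0;

  constructor(protected config: Config) {}

  updateConfig(partial: Partial<Config>): void {
    this.config = { ...this.config, ...partial }; // shallow merge
    this.onConfigUpdate(partial);                 // let subclasses react
  }

  protected onConfigUpdate(partial: Partial<Config>): void {
    if (partial.apiKey !== undefined) {
      this.reconnects += 1; // e.g. reconnect with the new API key
    }
  }

  get model(): string | undefined {
    return this.config.model;
  }

  get reconnectCount(): number {
    return this.reconnects;
  }
}

const p = new SketchProvider({ apiKey: 'k1', model: 'small' });
p.updateConfig({ model: 'large' });     // no reconnect: apiKey unchanged
p.updateConfig({ apiKey: 'k2' });       // hook reacts to the new key
console.log(p.model, p.reconnectCount); // prints "large 1"
```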

Inherited from

BaseProviderClass.updateConfig

© 2026 CompositeVoice. All rights reserved.