BaseLLMProvider
Abstract base class for LLM providers in CompositeVoice.
Defined in: src/providers/base/BaseLLMProvider.ts:73
Remarks
BaseLLMProvider extends BaseProvider and implements the LLMProvider interface. It provides shared helpers for building message arrays and merging generation options, while requiring subclasses to implement the two generation methods.
All LLM providers communicate over REST (type = 'rest') and follow a Receive Text -> Send Text contract:
- Input: a plain-text prompt or an array of LLMMessage objects.
- Output: an AsyncIterable<string> that yields text chunks (supports both streaming and non-streaming implementations).
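The consuming side of this contract can be sketched as follows. This is a minimal, self-contained example; `fakeStream` is a stand-in for what a concrete provider's `generate` call would return, not part of the library.

```typescript
// Collect a provider's streamed chunks into one string.
async function collectResponse(chunks: AsyncIterable<string>): Promise<string> {
  let text = '';
  for await (const chunk of chunks) {
    text += chunk; // each chunk is a partial piece of the reply
  }
  return text;
}

// Stand-in stream so the sketch is runnable without a real provider.
async function* fakeStream(): AsyncGenerator<string> {
  yield 'Hello, ';
  yield 'world!';
}
```

A non-streaming implementation simply yields one chunk containing the full response; the consumer code above works unchanged in both cases.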
Inheritance hierarchy:
```text
BaseProvider
 +-- BaseLLMProvider          <-- you are here
      +-- AnthropicLLM        (streaming SSE)
      +-- OpenAILLM           (non-streaming / streaming)
      +-- GroqLLM             (streaming)
      +-- WebLLMLLM           (in-browser inference)
```
Example
```ts
import { BaseLLMProvider } from 'composite-voice';
import type { LLMProviderConfig, LLMGenerationOptions, LLMMessage } from 'composite-voice';

class MyLLMProvider extends BaseLLMProvider {
  constructor(config: LLMProviderConfig) {
    super(config);
  }

  protected async onInitialize(): Promise<void> {}
  protected async onDispose(): Promise<void> {}

  async generate(prompt: string, options?: LLMGenerationOptions) {
    const messages = this.promptToMessages(prompt);
    return this.generateFromMessages(messages, options);
  }

  async *generateFromMessages(messages: LLMMessage[], options?: LLMGenerationOptions) {
    const merged = this.mergeOptions(options);
    const response = await myApi.chat(messages, merged); // myApi: your backend client
    yield response.text;
  }
}
```
See
- BaseProvider for the root provider lifecycle
- LLMProvider for the interface contract
Extends
- BaseProvider
Extended by
- AnthropicLLM
- OpenAILLM
- GroqLLM
- WebLLMLLM
Implements
- LLMProvider
Constructors
Constructor
new BaseLLMProvider(config, logger?): BaseLLMProvider;
Defined in: src/providers/base/BaseLLMProvider.ts:87
Create a new LLM provider.
Parameters
| Parameter | Type | Description |
|---|---|---|
| config | LLMProviderConfig | LLM provider configuration including model name, temperature, system prompt, and other generation defaults. |
| logger? | Logger | Optional parent logger; a child will be derived. |
Returns
BaseLLMProvider
Overrides
BaseProviderClass.constructor
Properties
| Property | Modifier | Type | Default value | Description | Overrides | Inherited from | Defined in |
|---|---|---|---|---|---|---|---|
config | public | LLMProviderConfig | undefined | LLM-specific provider configuration. | LLMProvider.config BaseProviderClass.config | - | src/providers/base/BaseLLMProvider.ts:78 |
initialized | protected | boolean | false | Tracks whether initialize has completed successfully. | - | BaseProviderClass.initialized | src/providers/base/BaseProvider.ts:97 |
logger | protected | Logger | undefined | Scoped logger instance for this provider. | - | BaseProviderClass.logger | src/providers/base/BaseProvider.ts:94 |
roles | readonly | readonly ProviderRole[] | undefined | LLM providers cover the 'llm' pipeline role by default. | LLMProvider.roles BaseProviderClass.roles | - | src/providers/base/BaseLLMProvider.ts:75 |
type | readonly | ProviderType | undefined | Communication transport this provider uses ('rest' or 'websocket'). | - | LLMProvider.type BaseProviderClass.type | src/providers/base/BaseProvider.ts:74 |
Methods
assertReady()
protected assertReady(): void;
Defined in: src/providers/base/BaseProvider.ts:255
Guard that throws if the provider has not been initialized.
Returns
void
Remarks
Call at the start of any method that requires the provider to be ready.
Throws
Error: thrown with a descriptive message when initialized is false.
Inherited from
BaseProviderClass.assertReady
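The guard pattern this method enables can be sketched with a standalone class. `SketchProvider` and its members are hypothetical stand-ins, not the real BaseProvider implementation; only the pattern (check a flag, throw with a descriptive message) comes from the docs above.

```typescript
// Hypothetical standalone sketch of the assertReady guard pattern.
class SketchProvider {
  protected initialized = false;

  protected assertReady(): void {
    if (!this.initialized) {
      throw new Error('SketchProvider is not initialized. Call initialize() first.');
    }
  }

  initialize(): void {
    this.initialized = true;
  }

  doWork(): string {
    this.assertReady(); // guard at the start of any method that needs readiness
    return 'ok';
  }
}
```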
dispose()
dispose(): Promise<void>;
Defined in: src/providers/base/BaseProvider.ts:154
Clean up resources and dispose of the provider.
Returns
Promise<void>
Remarks
Delegates to the subclass hook onDispose and resets the initialized flag. If the provider is not initialized, the call is a no-op.
Throws
Re-throws any error raised by onDispose.
Implementation of
LLMProvider.dispose
Inherited from
BaseProviderClass.dispose
generate()
abstract generate(prompt, options?): Promise<AsyncIterable<string, any, any>>;
Defined in: src/providers/base/BaseLLMProvider.ts:108
Generate a response from a single user prompt.
Parameters
| Parameter | Type | Description |
|---|---|---|
prompt | string | The user’s input text. |
options? | LLMGenerationOptions | Optional generation overrides (temperature, max tokens, stop sequences, abort signal, etc.). |
Returns
Promise<AsyncIterable<string, any, any>>
An AsyncIterable that yields text chunks as they arrive.
Remarks
Interface: Receive Text -> Send Text. This is the simplest way to get a completion. Implementations typically convert the prompt into a messages array (optionally prepending a system message) and delegate to generateFromMessages.
Implementation of
LLMProvider.generate
generateFromMessages()
abstract generateFromMessages(messages, options?): Promise<AsyncIterable<string, any, any>>;
Defined in: src/providers/base/BaseLLMProvider.ts:126
Generate a response from an array of conversation messages.
Parameters
| Parameter | Type | Description |
|---|---|---|
messages | LLMMessage[] | Ordered array of LLMMessage objects representing the conversation history. |
options? | LLMGenerationOptions | Optional generation overrides. |
Returns
Promise<AsyncIterable<string, any, any>>
An AsyncIterable that yields text chunks as they arrive.
Remarks
Interface: Receive Text -> Send Text. Use this method when you need multi-turn conversation support. The messages array can include system, user, and assistant roles to provide full conversational context.
Implementation of
LLMProvider.generateFromMessages
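A multi-turn messages array can look like the sketch below. The `ChatMessage` shape is an assumption standing in for the real LLMMessage type, inferred from the roles listed in the remarks; the trailing comment shows where the hypothetical provider call would go.

```typescript
// Assumed message shape, standing in for LLMMessage.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Full conversational context: system instructions plus prior turns.
const conversation: ChatMessage[] = [
  { role: 'system', content: 'You are a concise assistant.' },
  { role: 'user', content: 'What is TypeScript?' },
  { role: 'assistant', content: 'A typed superset of JavaScript.' },
  { role: 'user', content: 'Does it compile to JavaScript?' },
];

// const stream = await provider.generateFromMessages(conversation); // hypothetical call
```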
getConfig()
getConfig(): LLMProviderConfig;
Defined in: src/providers/base/BaseLLMProvider.ts:192
Get a shallow copy of the current LLM configuration.
Returns
LLMProviderConfig
A new LLMProviderConfig object.
Overrides
BaseProviderClass.getConfig
initialize()
initialize(): Promise<void>;
Defined in: src/providers/base/BaseProvider.ts:127
Initialize the provider, making it ready for use.
Returns
Promise<void>
Remarks
Calls the subclass hook onInitialize. If the provider has already been initialized the call is a no-op.
Throws
ProviderInitializationError: thrown when onInitialize rejects. The original error is wrapped with the provider class name for diagnostics.
Implementation of
LLMProvider.initialize
Inherited from
BaseProviderClass.initialize
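The initialize contract (idempotent, with hook failures wrapped) can be sketched as follows. `LifecycleSketch` is a hypothetical standalone class, and the local `ProviderInitializationError` is a stand-in for the library's real error type.

```typescript
// Stand-in for the library's ProviderInitializationError.
class ProviderInitializationError extends Error {}

class LifecycleSketch {
  initCalls = 0;
  private initialized = false;

  async initialize(): Promise<void> {
    if (this.initialized) return; // already initialized: no-op
    try {
      await this.onInitialize();
      this.initialized = true;
    } catch (err) {
      // Wrap the original error with the class name for diagnostics.
      throw new ProviderInitializationError(
        `LifecycleSketch failed to initialize: ${(err as Error).message}`
      );
    }
  }

  protected async onInitialize(): Promise<void> {
    this.initCalls += 1; // subclass setup would go here
  }
}
```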
isReady()
isReady(): boolean;
Defined in: src/providers/base/BaseProvider.ts:178
Check whether the provider has been initialized and is ready.
Returns
boolean
true when initialize has completed successfully and dispose has not yet been called.
Implementation of
LLMProvider.isReady
Inherited from
BaseProviderClass.isReady
mergeOptions()
protected mergeOptions(options?): LLMGenerationOptions;
Defined in: src/providers/base/BaseLLMProvider.ts:170
Merge per-call generation options with the provider’s config defaults.
Parameters
| Parameter | Type | Description |
|---|---|---|
options? | LLMGenerationOptions | Optional per-call overrides. |
Returns
LLMGenerationOptions
A merged LLMGenerationOptions object.
Remarks
Values supplied in options take precedence over values in config. Only defined values are included in the result, allowing providers to distinguish “not set” from explicit values.
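The merge rule described above can be sketched as a standalone helper. `mergeDefined` and the trimmed `GenOptions` shape are assumptions for illustration, not the real mergeOptions implementation; the point is that per-call values win and undefined values are dropped, so "not set" stays distinguishable from an explicit value.

```typescript
// Assumed, trimmed-down option shape for the sketch.
interface GenOptions {
  temperature?: number;
  maxTokens?: number;
}

// Later sources win; keys whose value is undefined are skipped entirely.
function mergeDefined(defaults: GenOptions, overrides?: GenOptions): GenOptions {
  const merged: GenOptions = {};
  for (const source of [defaults, overrides ?? {}]) {
    for (const [key, value] of Object.entries(source)) {
      if (value !== undefined) {
        (merged as Record<string, unknown>)[key] = value;
      }
    }
  }
  return merged;
}
```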
onConfigUpdate()
protected onConfigUpdate(_config): void;
Defined in: src/providers/base/BaseProvider.ts:242
Hook called after updateConfig merges new values.
Parameters
| Parameter | Type | Description |
|---|---|---|
_config | Partial<BaseProviderConfig> | The partial configuration that was merged. |
Returns
void
Remarks
The default implementation is a no-op. Override in subclasses to react to runtime configuration changes (e.g. reconnect with a new API key).
Inherited from
BaseProviderClass.onConfigUpdate
onDispose()
abstract protected onDispose(): Promise<void>;
Defined in: src/providers/base/BaseProvider.ts:229
Provider-specific disposal logic.
Returns
Promise<void>
Remarks
Subclasses must implement this method to release any resources acquired during onInitialize (e.g. close connections, free memory).
Inherited from
BaseProviderClass.onDispose
onInitialize()
abstract protected onInitialize(): Promise<void>;
Defined in: src/providers/base/BaseProvider.ts:217
Provider-specific initialization logic.
Returns
Promise<void>
Remarks
Subclasses must implement this method to perform any setup required before the provider can be used (e.g. validate credentials, open connections, load models).
Inherited from
BaseProviderClass.onInitialize
promptToMessages()
protected promptToMessages(prompt): LLMMessage[];
Defined in: src/providers/base/BaseLLMProvider.ts:141
Convert a plain-text prompt into an LLMMessage array.
Parameters
| Parameter | Type | Description |
|---|---|---|
prompt | string | The user’s input text. |
Returns
LLMMessage[]
A messages array suitable for generateFromMessages.
Remarks
If the provider’s config includes a systemPrompt, it is prepended as a system message. The prompt itself becomes a user message.
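The conversion described above can be sketched as a pure function. `toMessages` and the `SketchMessage` shape are hypothetical stand-ins for the protected method and the real LLMMessage type; only the rule (optional system message first, then the prompt as a user message) comes from the docs.

```typescript
type Role = 'system' | 'user' | 'assistant';

// Assumed message shape, standing in for LLMMessage.
interface SketchMessage {
  role: Role;
  content: string;
}

function toMessages(prompt: string, systemPrompt?: string): SketchMessage[] {
  const messages: SketchMessage[] = [];
  if (systemPrompt !== undefined) {
    // A configured system prompt is prepended as a system message.
    messages.push({ role: 'system', content: systemPrompt });
  }
  // The prompt itself becomes a user message.
  messages.push({ role: 'user', content: prompt });
  return messages;
}
```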
updateConfig()
updateConfig(config): void;
Defined in: src/providers/base/BaseProvider.ts:201
Merge partial configuration updates into the current config.
Parameters
| Parameter | Type | Description |
|---|---|---|
config | Partial<BaseProviderConfig> | A partial configuration object whose keys will overwrite existing values. |
Returns
void
Remarks
After merging, the subclass hook onConfigUpdate is called so providers can react to changed values at runtime.
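The merge-then-notify behaviour can be sketched with a standalone class. `ConfigurableSketch`, its config shape, and `lastUpdate` are hypothetical illustration aids; the sequence (shallow-merge the partial, then pass that same partial to the onConfigUpdate hook) is what the remarks describe.

```typescript
// Assumed, trimmed-down config shape for the sketch.
interface Config {
  model: string;
  temperature: number;
}

class ConfigurableSketch {
  config: Config = { model: 'demo-model', temperature: 0.7 };
  lastUpdate: Partial<Config> | null = null; // records what the hook saw

  updateConfig(partial: Partial<Config>): void {
    this.config = { ...this.config, ...partial }; // keys overwrite existing values
    this.onConfigUpdate(partial);                 // hook sees the delta, not the full config
  }

  protected onConfigUpdate(partial: Partial<Config>): void {
    this.lastUpdate = partial;
  }
}
```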