LLMProvider
Large language model provider interface.
Defined in: src/core/types/providers.ts:800
Remarks
Defines the contract for LLM providers in the CompositeVoice pipeline. The provider receives transcribed text (or a message history) and produces a text response, either as a complete string or as a stream of tokens via an async iterable.
For voice applications, streaming (stream: true) is strongly recommended as it allows TTS synthesis to begin before the full response is generated, significantly reducing end-to-end latency.
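The latency benefit can be sketched as follows. This is a minimal illustration with assumed shapes, not the real CompositeVoice classes: a fake provider streams three tokens, and the consumer can hand each chunk to TTS as it arrives instead of waiting for the full reply.

```ts
// Hypothetical sketch: fakeGenerate stands in for a streaming provider.
async function* fakeGenerate(): AsyncIterable<string> {
  for (const token of ['Hello', ', ', 'world']) {
    yield token; // in a real provider, tokens arrive from the network
  }
}

async function collectStream(stream: AsyncIterable<string>): Promise<string[]> {
  const chunks: string[] = [];
  for await (const chunk of stream) {
    chunks.push(chunk); // TTS synthesis could begin here, on the first chunk
  }
  return chunks;
}

collectStream(fakeGenerate()).then((chunks) => console.log(chunks.join('')));
// prints "Hello, world"
```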
Example
```ts
class MyLLMProvider implements LLMProvider {
  readonly type = 'rest';
  readonly roles = ['llm'] as const;
  config: LLMProviderConfig;

  async initialize() { /* ... */ }
  async dispose() { /* ... */ }
  isReady() { return true; }

  async generate(prompt: string, options?: LLMGenerationOptions) {
    const response = await callMyAPI(prompt, options);
    // Wrap the single response in an async iterable to satisfy the
    // Promise<AsyncIterable<string>> return type.
    return (async function* () { yield response; })();
  }

  async generateFromMessages(messages: LLMMessage[], options?: LLMGenerationOptions) {
    const response = await callMyAPI(messages, options);
    return (async function* () { yield response; })();
  }
}
```
See
- BaseProvider for lifecycle methods
- LLMProviderConfig for configuration options
- LLMGenerationOptions for per-request options
- LLMMessage for conversation message format
Extends
- BaseProvider
Properties
| Property | Modifier | Type | Default value | Description | Inherited from | Defined in |
|---|---|---|---|---|---|---|
| `config` | `public` | `LLMProviderConfig` | `undefined` | Configuration for this LLM provider. See LLMProviderConfig. | - | src/core/types/providers.ts:806 |
| `roles` | `readonly` | readonly `ProviderRole[]` | `[]` | Pipeline roles this provider covers. Each provider declares which stages of the 5-role audio pipeline it can fulfil: single-role providers list one role (e.g. `['stt']`), while multi-role providers list every role they handle (e.g. `['input', 'stt']` for NativeSTT, which manages its own microphone access). The provider resolution algorithm reads this property to assign providers to pipeline slots. Base provider classes set sensible defaults: BaseSTTProvider `['stt']`, BaseLLMProvider `['llm']`, BaseTTSProvider `['tts']`. See ProviderRole for the possible role values and ResolvedPipeline for the resolved pipeline slots. | BaseProvider.roles | src/core/types/providers.ts:166 |
| `type` | `readonly` | `ProviderType` | `undefined` | The communication type this provider uses. See ProviderType. | BaseProvider.type | src/core/types/providers.ts:144 |
Methods
dispose()
```ts
dispose(): Promise<void>;
```
Defined in: src/core/types/providers.ts:186
Clean up resources and dispose of the provider.
Returns
Promise<void>
Remarks
Called by CompositeVoice during agent shutdown. The provider should close any open connections, clear buffers, and release resources.
Inherited from
BaseProvider.dispose
generate()
```ts
generate(prompt, options?): Promise<AsyncIterable<string, any, any>>;
```
Defined in: src/core/types/providers.ts:823
Generate a response from a single user prompt.
Parameters
| Parameter | Type | Description |
|---|---|---|
| `prompt` | `string` | The user's text input (typically transcribed speech) |
| `options?` | `LLMGenerationOptions` | Optional generation parameters that override provider defaults |
Returns
Promise<AsyncIterable<string, any, any>>
An async iterable of text chunks
Remarks
Returns an async iterable that yields text chunks. When streaming is enabled, multiple chunks are yielded as tokens arrive. When streaming is disabled, a single chunk containing the full response is yielded.
Throws
Error if the provider is not initialized
Throws
AbortError if the generation is cancelled via options.signal
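Cancellation via `options.signal` can be sketched as follows. The generator below is a hypothetical stand-in that simulates a provider honoring the signal as described above; `slowTokens` and `demo` are not part of this API.

```ts
// Hypothetical provider-side loop: check the signal between yielded tokens.
async function* slowTokens(signal: AbortSignal): AsyncIterable<string> {
  for (const token of ['a', 'b', 'c']) {
    if (signal.aborted) {
      throw new DOMException('Generation cancelled', 'AbortError');
    }
    yield token;
  }
}

async function demo(): Promise<string> {
  const controller = new AbortController();
  const out: string[] = [];
  try {
    for await (const t of slowTokens(controller.signal)) {
      out.push(t);
      if (out.length === 2) controller.abort(); // cancel mid-stream
    }
  } catch (err) {
    if ((err as Error).name === 'AbortError') return out.join('') + '|aborted';
    throw err;
  }
  return out.join('');
}

demo().then((result) => console.log(result)); // prints "ab|aborted"
```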
generateFromMessages()
```ts
generateFromMessages(messages, options?): Promise<AsyncIterable<string, any, any>>;
```
Defined in: src/core/types/providers.ts:844
Generate a response from a multi-turn conversation.
Parameters
| Parameter | Type | Description |
|---|---|---|
| `messages` | `LLMMessage[]` | Array of conversation messages including history |
| `options?` | `LLMGenerationOptions` | Optional generation parameters that override provider defaults |
Returns
Promise<AsyncIterable<string, any, any>>
An async iterable of text chunks
Remarks
Used when ConversationHistoryConfig is enabled. The messages array includes the system prompt, previous conversation turns, and the latest user input. Returns an async iterable of text chunks, same as generate.
Throws
Error if the provider is not initialized
Throws
AbortError if the generation is cancelled via options.signal
See
- LLMMessage for the message format
- ConversationHistoryConfig for history configuration
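A typical messages array, as described in the remarks above, might look like this. The `role`/`content` shape is an assumption based on the LLMMessage format referenced here, not a verified definition:

```ts
// Hypothetical LLMMessage shape for illustration.
type Role = 'system' | 'user' | 'assistant';
interface Message { role: Role; content: string }

// With conversation history enabled, the array includes the system prompt,
// previous turns, and the latest user input, in order:
const messages: Message[] = [
  { role: 'system', content: 'You are a helpful voice assistant.' },
  { role: 'user', content: 'What time is it?' },       // earlier turn
  { role: 'assistant', content: 'It is 3 PM.' },       // earlier reply
  { role: 'user', content: 'And in Tokyo?' },          // latest transcribed input
];

console.log(messages.length); // prints 4
```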
initialize()
```ts
initialize(): Promise<void>;
```
Defined in: src/core/types/providers.ts:177
Initialize the provider and allocate any required resources.
Returns
Promise<void>
Remarks
Called by CompositeVoice during agent startup. The provider should be ready to process requests after this method resolves.
Throws
Error if initialization fails (e.g., invalid API key, network error)
Inherited from
BaseProvider.initialize
isReady()
```ts
isReady(): boolean;
```
Defined in: src/core/types/providers.ts:193
Check if the provider is initialized and ready to process requests.
Returns
boolean
true if the provider has been initialized and is operational
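The lifecycle contract (initialize, isReady, dispose) can be sketched with a stub. `StubProvider` is hypothetical; real providers implement the full interface and do real work in these methods:

```ts
// Hypothetical stub tracking readiness across the lifecycle.
class StubProvider {
  private ready = false;

  async initialize(): Promise<void> {
    // e.g. validate API keys, open connections
    this.ready = true;
  }

  async dispose(): Promise<void> {
    // e.g. close connections, clear buffers
    this.ready = false;
  }

  isReady(): boolean {
    return this.ready;
  }
}

async function lifecycle(): Promise<boolean[]> {
  const provider = new StubProvider();
  const states = [provider.isReady()]; // false before initialize()
  await provider.initialize();
  states.push(provider.isReady()); // true once operational
  await provider.dispose();
  states.push(provider.isReady()); // false again after shutdown
  return states;
}

lifecycle().then((states) => console.log(states)); // prints [ false, true, false ]
```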