# ToolAwareLLMProvider

Defined in: src/core/types/providers.ts:925

LLM provider with optional tool use support.

## Remarks

Extends `LLMProvider` with an optional `generateWithTools` method. Providers that don't support tool use simply don't implement this method, and the orchestrator falls back to text-only generation.
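The optional-method fallback described above can be sketched as follows. The type names here are simplified stand-ins for the library's actual interfaces (`LLMProvider`, `LLMStreamChunk`), not the real definitions:

```typescript
// Simplified stand-ins for the library's types — illustrative only.
type Chunk =
  | { type: "text"; text: string }
  | { type: "tool_call"; name: string };

interface Message {
  role: string;
  content: string;
}

interface LLMProviderLike {
  generateFromMessages(messages: Message[]): Promise<AsyncIterable<string>>;
  // Optional: only tool-aware providers implement this.
  generateWithTools?(messages: Message[]): Promise<AsyncIterable<Chunk>>;
}

// The orchestrator probes for the optional method and falls back to
// text-only generation when it is absent.
async function run(provider: LLMProviderLike, messages: Message[]): Promise<string> {
  if (provider.generateWithTools) {
    let out = "";
    for await (const chunk of await provider.generateWithTools(messages)) {
      if (chunk.type === "text") out += chunk.text;
      // tool_call chunks would be dispatched to the tool runtime here
    }
    return out;
  }
  // Fallback: plain text streaming.
  let text = "";
  for await (const piece of await provider.generateFromMessages(messages)) {
    text += piece;
  }
  return text;
}
```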
## Extends

- `LLMProvider`

## Properties
| Property | Modifier | Type | Default value | Description | Inherited from | Defined in |
|---|---|---|---|---|---|---|
| `config` | `public` | `LLMProviderConfig` | `undefined` | Configuration for this LLM provider. See `LLMProviderConfig`. | `LLMProvider.config` | src/core/types/providers.ts:806 |
| `roles` | `readonly` | readonly `ProviderRole[]` | `[]` | Pipeline roles this provider covers. **Remarks:** Each provider declares which stages of the 5-role audio pipeline it can fulfil. Single-role providers list one role (e.g. `['stt']`); multi-role providers list every role they handle (e.g. `['input', 'stt']` for NativeSTT, which manages its own microphone access). The provider resolution algorithm reads this property to assign providers to pipeline slots. Base provider classes set sensible defaults: `BaseSTTProvider`: `['stt']`, `BaseLLMProvider`: `['llm']`, `BaseTTSProvider`: `['tts']`. **See:** `ProviderRole` for the possible role values; `ResolvedPipeline` for the resolved pipeline slots. | `LLMProvider.roles` | src/core/types/providers.ts:166 |
| `type` | `readonly` | `ProviderType` | `undefined` | The communication type this provider uses. See `ProviderType`. | `LLMProvider.type` | src/core/types/providers.ts:144 |
## Methods

### dispose()

```ts
dispose(): Promise<void>;
```

Defined in: src/core/types/providers.ts:186

Clean up resources and dispose of the provider.

#### Returns

`Promise<void>`

#### Remarks

Called by `CompositeVoice` during agent shutdown. The provider should close any open connections, clear buffers, and release resources.

#### Inherited from

`LLMProvider.dispose`
### generate()

```ts
generate(prompt, options?): Promise<AsyncIterable<string, any, any>>;
```

Defined in: src/core/types/providers.ts:823

Generate a response from a single user prompt.

#### Parameters

| Parameter | Type | Description |
|---|---|---|
| `prompt` | `string` | The user's text input (typically transcribed speech) |
| `options?` | `LLMGenerationOptions` | Optional generation parameters that override provider defaults |

#### Returns

`Promise<AsyncIterable<string, any, any>>`

An async iterable of text chunks.

#### Remarks

Returns an async iterable that yields text chunks. When streaming is enabled, multiple chunks are yielded as tokens arrive. When streaming is disabled, a single chunk containing the full response is yielded.

#### Throws

`Error` if the provider is not initialized

#### Throws

`AbortError` if the generation is cancelled via `options.signal`

#### Inherited from

`LLMProvider.generate`
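The same consumption loop handles both the streamed and the single-chunk case. A sketch, using an illustrative mock in place of a real provider:

```typescript
// Illustrative mock of a provider's generate(); a real provider streams
// tokens from a model, but the consumption pattern is identical.
async function generate(prompt: string): Promise<AsyncIterable<string>> {
  async function* stream() {
    // With streaming enabled these arrive as separate chunks;
    // with streaming disabled a single chunk holds the full response.
    yield "Echo: ";
    yield prompt;
  }
  return stream();
}

// Consume the async iterable, accumulating the full response.
async function collect(prompt: string): Promise<string> {
  let full = "";
  for await (const chunk of await generate(prompt)) {
    full += chunk; // e.g. forward each chunk to TTS as it arrives
  }
  return full;
}
```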
### generateFromMessages()

```ts
generateFromMessages(messages, options?): Promise<AsyncIterable<string, any, any>>;
```

Defined in: src/core/types/providers.ts:844

Generate a response from a multi-turn conversation.

#### Parameters

| Parameter | Type | Description |
|---|---|---|
| `messages` | `LLMMessage[]` | Array of conversation messages including history |
| `options?` | `LLMGenerationOptions` | Optional generation parameters that override provider defaults |

#### Returns

`Promise<AsyncIterable<string, any, any>>`

An async iterable of text chunks.

#### Remarks

Used when `ConversationHistoryConfig` is enabled. The messages array includes the system prompt, previous conversation turns, and the latest user input. Returns an async iterable of text chunks, same as `generate`.

#### Throws

`Error` if the provider is not initialized

#### Throws

`AbortError` if the generation is cancelled via `options.signal`

#### See

- `LLMMessage` for the message format
- `ConversationHistoryConfig` for history configuration

#### Inherited from

`LLMProvider.generateFromMessages`
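Assembling the messages array follows the order described in the remarks: system prompt, then prior turns, then the newest user input. A sketch, assuming a `{ role, content }` message shape (check the library's `LLMMessage` type for the authoritative definition):

```typescript
// Assumed shape of LLMMessage — illustrative, not the library's definition.
interface LLMMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the array generateFromMessages() expects when history is enabled:
// system prompt first, then previous turns, then the latest user input.
function buildMessages(
  systemPrompt: string,
  history: LLMMessage[],
  userInput: string,
): LLMMessage[] {
  return [
    { role: "system", content: systemPrompt },
    ...history,
    { role: "user", content: userInput },
  ];
}
```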
### generateWithTools()

```ts
generateWithTools(messages, options?): Promise<AsyncIterable<LLMStreamChunk, any, any>>;
```

Defined in: src/core/types/providers.ts:930

Generate with tool support. Returns a richer streaming type that separates text from tool invocations.

#### Parameters

| Parameter | Type |
|---|---|
| `messages` | `LLMMessage[]` |
| `options?` | `LLMGenerationOptions & { tools?: LLMToolDefinition[]; }` |

#### Returns

`Promise<AsyncIterable<LLMStreamChunk, any, any>>`
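Consuming this richer stream means branching on the chunk kind: text chunks flow onward (e.g. to TTS), tool-call chunks get dispatched. A sketch, assuming a discriminated-union shape for `LLMStreamChunk` (the library's actual type may differ):

```typescript
// Assumed shape of LLMStreamChunk — illustrative only; consult the
// library's type for the real discriminants and fields.
type LLMStreamChunk =
  | { type: "text"; text: string }
  | { type: "tool_call"; name: string; arguments: Record<string, unknown> };

// Separate speakable text from tool invocations as chunks arrive.
async function handleToolStream(
  stream: AsyncIterable<LLMStreamChunk>,
  runTool: (name: string, args: Record<string, unknown>) => Promise<string>,
): Promise<{ text: string; toolResults: string[] }> {
  let text = "";
  const toolResults: string[] = [];
  for await (const chunk of stream) {
    if (chunk.type === "text") {
      text += chunk.text; // speakable output, e.g. forwarded to TTS
    } else {
      toolResults.push(await runTool(chunk.name, chunk.arguments));
    }
  }
  return { text, toolResults };
}
```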
### initialize()

```ts
initialize(): Promise<void>;
```

Defined in: src/core/types/providers.ts:177

Initialize the provider and allocate any required resources.

#### Returns

`Promise<void>`

#### Remarks

Called by `CompositeVoice` during agent startup. The provider should be ready to process requests after this method resolves.

#### Throws

`Error` if initialization fails (e.g., invalid API key, network error)

#### Inherited from

`LLMProvider.initialize`
### isReady()

```ts
isReady(): boolean;
```

Defined in: src/core/types/providers.ts:193

Check if the provider is initialized and ready to process requests.

#### Returns

`boolean`

`true` if the provider has been initialized and is operational.
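The lifecycle methods above fit together as initialize → isReady → dispose. A minimal sketch of a provider honouring that contract (the class and its internals are illustrative, not the library's base implementation):

```typescript
// Minimal lifecycle sketch. A real provider would validate credentials
// and open connections in initialize(), and close them in dispose().
class MockProvider {
  private ready = false;

  async initialize(): Promise<void> {
    // e.g. validate API key, warm up connections; throw on failure
    this.ready = true;
  }

  isReady(): boolean {
    return this.ready;
  }

  async dispose(): Promise<void> {
    // close connections, clear buffers, release resources
    this.ready = false;
  }
}
```

`CompositeVoice` drives this sequence itself during agent startup and shutdown; callers normally only consult `isReady()` before sending requests.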