Mistral

Use Mistral models as the LLM provider in a CompositeVoice pipeline.

Use MistralLLM when you need strong multilingual support, especially for European languages, or want a cost-effective alternative to larger models.

Prerequisites

  • A Mistral API key or a CompositeVoice proxy server
  • Install the peer dependency:
npm install openai

Mistral’s API is OpenAI-compatible, so the openai package handles all communication.
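Because the wire format matches OpenAI's chat-completions API, the request body MistralLLM sends has the standard OpenAI shape. A minimal sketch of that payload (the `buildChatRequest` helper below is illustrative, not part of the library):

```typescript
// Illustrative only: shows the OpenAI-compatible request shape that
// Mistral's chat endpoint accepts. Not the library's actual internals.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

// Build an OpenAI-style chat-completions payload targeting Mistral.
function buildChatRequest(
  messages: ChatMessage[],
  model = 'mistral-small-latest',
) {
  return {
    url: 'https://api.mistral.ai/v1/chat/completions',
    body: { model, messages, stream: true },
  };
}

const req = buildChatRequest([
  { role: 'system', content: 'You are a concise voice assistant.' },
  { role: 'user', content: 'Hello!' },
]);
```

Any OpenAI-compatible tooling that lets you override the base URL can talk to this endpoint directly.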

Basic setup

import { CompositeVoice, MistralLLM, NativeSTT, NativeTTS } from '@lukeocodes/composite-voice';

const agent = new CompositeVoice({
  stt: new NativeSTT({ language: 'en-US' }),
  llm: new MistralLLM({
    proxyUrl: '/api/proxy/mistral',
    model: 'mistral-small-latest',
    systemPrompt: 'You are a concise voice assistant. Keep answers under two sentences.',
  }),
  tts: new NativeTTS(),
});

await agent.start();
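The `proxyUrl` above points at a route you host yourself, so the Mistral API key stays server-side and never reaches the browser. A minimal sketch of such a route, assuming a fetch-style handler on Node 18+ (your framework's handler signature will differ):

```typescript
// Hypothetical proxy route sketch (not the library's actual server code):
// forward the browser's OpenAI-style request body to Mistral, attaching
// the server-side API key.
async function handleMistralProxy(request: Request): Promise<Response> {
  const body = await request.text();
  return fetch('https://api.mistral.ai/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.MISTRAL_API_KEY ?? ''}`,
    },
    body,
  });
}
```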

Configuration options

Option         Type     Default                  Description
model          string   'mistral-small-latest'   Model identifier. See model variants below.
systemPrompt   string   -                        System-level instructions for the assistant.
temperature    number   -                        Randomness (0 = deterministic, 2 = creative).
maxTokens      number   -                        Maximum tokens per response.
topP           number   -                        Nucleus sampling threshold (0-1).
stream         boolean  true                     Stream tokens incrementally.
proxyUrl       string   -                        CompositeVoice proxy endpoint. Recommended for browsers.
mistralApiKey  string   -                        Mistral API key. Convenience alias for apiKey.
apiKey         string   -                        Direct API key. mistralApiKey takes precedence if both are set.
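The precedence between the two key options can be sketched as follows (`resolveApiKey` is a hypothetical helper written for illustration, not a library export):

```typescript
// Illustrative sketch of the documented precedence: mistralApiKey wins
// over apiKey when both are supplied.
interface KeyOptions {
  mistralApiKey?: string;
  apiKey?: string;
}

function resolveApiKey(opts: KeyOptions): string | undefined {
  return opts.mistralApiKey ?? opts.apiKey;
}
```

In practice you should set only one of the two; the alias exists so multi-provider configs can keep per-provider key names.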

Model variants

Model                  Speed     Notes
mistral-small-latest   Fast      Default. Good speed-to-quality ratio for voice.
mistral-medium-latest  Moderate  Balanced capability.
mistral-large-latest   Slower    Most capable Mistral model.

Complete example

import {
  CompositeVoice,
  MistralLLM,
  DeepgramSTT,
  DeepgramTTS,
} from '@lukeocodes/composite-voice';

const agent = new CompositeVoice({
  stt: new DeepgramSTT({
    proxyUrl: '/api/proxy/deepgram',
    language: 'en',
    options: { model: 'nova-3', smartFormat: true },
  }),
  llm: new MistralLLM({
    proxyUrl: '/api/proxy/mistral',
    model: 'mistral-small-latest',
    temperature: 0.7,
    maxTokens: 256,
    systemPrompt: 'Tu es un assistant vocal amical. Réponds brièvement.',
  }),
  tts: new DeepgramTTS({
    proxyUrl: '/api/proxy/deepgram',
    voice: 'aura-2-thalia-en',
  }),
  conversationHistory: { enabled: true, maxTurns: 10 },
});

await agent.start();

Tips

  • Mistral excels at multilingual tasks. French and other European languages produce especially good results.
  • mistral-small-latest is best for voice. It provides the fastest responses while maintaining quality for conversational use cases.
  • Mistral uses the openai peer dependency. You do not need to install @mistralai/mistralai.
  • Model names use the -latest suffix. This always points to the most recent version of that model tier.

© 2026 CompositeVoice. All rights reserved.
