
Framework integrations

Wordcab plugs into the voice-agent orchestration frameworks your team already runs. Pipecat and LiveKit Agents get native adapters; every other integration is OpenAI-compatible at the LLM layer and WebSocket-based at the audio layer, so "change the URL" is usually the whole integration.
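Concretely, "OpenAI-compatible" means the standard `/v1/chat/completions` request shape works unchanged. A minimal stdlib sketch (the hostname is a placeholder; swap in your deployment URL):

```python
import json
import os
import urllib.request

# Build a standard OpenAI-style chat completion request against a Wordcab
# deployment. Only the base URL differs from a stock OpenAI integration.
BASE_URL = "https://wordcab.apps.example.com/v1"

payload = {
    "model": "qwen3.5-4b",
    "messages": [{"role": "user", "content": "Summarize this call."}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {os.environ.get('WORDCAB_API_KEY', '')}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# resp = urllib.request.urlopen(req)  # uncomment against a live deployment
```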

Pipecat (native)

Wordcab Voice and Think ship as first-class Pipecat services. Tested against Pipecat 1.0 (Apr 2026). Full async pipeline support with VAD, turn detection, and interruption handling.

```python
from pipecat.pipeline import Pipeline
from pipecat_wordcab import WordcabSTT, WordcabLLM, WordcabTTS

pipeline = Pipeline([
    WordcabSTT(model="voxtral-realtime", language="en"),
    WordcabLLM(model="qwen3.5-4b", system="You are a helpful agent."),
    WordcabTTS(voice="ember", model="qwen3-tts"),
])
```

LiveKit Agents (native)

Drop-in provider plugin for LiveKit Agents. Works with LiveKit Cloud or a self-hosted SFU. Keeps audio inside your LiveKit instance — no round-trip to a hosted STT vendor.

```python
from livekit import agents
from livekit_plugins_wordcab import WordcabSTT, WordcabLLM, WordcabTTS

async def entrypoint(ctx: agents.JobContext):
    assistant = agents.VoiceAssistant(
        stt=WordcabSTT(model="voxtral-realtime"),
        llm=WordcabLLM(model="qwen3.5-4b"),
        tts=WordcabTTS(voice="ember"),
    )
    assistant.start(ctx.room)
```

Daily.co

Use Wordcab via Pipecat's Daily transport or directly against the Daily Bots API — a common pattern for meeting capture and bot-assisted workflows. Point the bot's LLM/STT/TTS endpoints at Wordcab; Daily handles the media.
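The Daily Bots route reduces to a start request whose service endpoints point at Wordcab. The sketch below is illustrative only — the field names are hypothetical placeholders, not Daily's exact schema; check the Daily Bots API reference before using:

```python
import json

# Illustrative Daily Bots start-request body with service endpoints pointed
# at a Wordcab deployment. Field names are hypothetical, not Daily's schema.
bot_config = {
    "services": {"stt": "wordcab", "llm": "wordcab", "tts": "wordcab"},
    "service_options": {
        "llm": {
            "base_url": "https://wordcab.apps.example.com/v1",
            "model": "qwen3.5-4b",
        }
    },
}
body = json.dumps(bot_config)
```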

Vapi

Configure a Wordcab endpoint as a custom STT / LLM / TTS provider inside Vapi's agent builder. The OpenAI-compatible chat endpoint plugs in without custom glue.
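For reference, an assistant definition with the model routed to Wordcab might look like the following — `custom-llm` is Vapi's OpenAI-compatible provider type, but treat the field names as a sketch and verify them against Vapi's current API reference:

```python
import json

# Vapi assistant payload with the LLM routed to a Wordcab deployment.
# Field names are a sketch of Vapi's schema, not a guaranteed contract.
assistant = {
    "name": "wordcab-agent",
    "model": {
        "provider": "custom-llm",  # Vapi's OpenAI-compatible provider type
        "url": "https://wordcab.apps.example.com/v1",
        "model": "qwen3.5-4b",
    },
}
payload = json.dumps(assistant)
```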

Retell AI

Use Retell's Custom LLM URL feature and point it at a Wordcab Think deployment. Retell handles the telephony glue; reasoning stays inside your boundary.
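The core of a Retell custom-LLM handler is translating Retell's transcript (roles `"agent"`/`"user"`) into an OpenAI-style messages list for the Wordcab endpoint. A sketch of that step, assuming Retell's documented event shape — verify field names against Retell's custom-LLM protocol docs:

```python
# Map Retell transcript roles onto OpenAI chat roles. Unknown roles
# default to "user" so malformed events still produce a valid payload.
ROLE_MAP = {"agent": "assistant", "user": "user"}

def to_chat_payload(event: dict, model: str = "qwen3.5-4b") -> dict:
    """Turn a Retell response_required event into an OpenAI chat request."""
    messages = [
        {"role": ROLE_MAP.get(turn["role"], "user"), "content": turn["content"]}
        for turn in event.get("transcript", [])
    ]
    return {"model": model, "messages": messages, "stream": True}
```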

LangGraph / LangChain

Use the OpenAI-compatible chat endpoint directly (no custom adapter needed):

```python
import os

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://wordcab.apps.example.com/v1",
    api_key=os.environ["WORDCAB_API_KEY"],
    model="qwen3.5-4b",
)
```

LlamaIndex

```python
import os

from llama_index.llms.openai_like import OpenAILike

llm = OpenAILike(
    api_base="https://wordcab.apps.example.com/v1",
    api_key=os.environ["WORDCAB_API_KEY"],
    model="qwen3.5-4b",
    is_chat_model=True,
)
```

Flowise / n8n / Zapier

These platforms ship generic "OpenAI-compatible" connectors. Point the connector's base URL (`api_base` or `base_url`, depending on the platform) at the Wordcab endpoint and keep going.