Prerequisites
- Node.js 18+
- Express
- A Layercode account and agent (sign up here)
- (Optional) An API key for your LLM provider (we recommend Google Gemini)
Setup
Set the following environment variables:
- GOOGLE_GENERATIVE_AI_API_KEY - Your Google AI API key
- LAYERCODE_WEBHOOK_SECRET - Your Layercode agent's webhook secret, found in the Layercode dashboard (go to your agent, click Edit in the Your Backend box and copy the webhook secret shown)
- LAYERCODE_API_KEY - Your Layercode API key, found in the Layercode dashboard settings
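For local development these typically live in a .env file loaded with a package such as dotenv; the values below are placeholders.

```
GOOGLE_GENERATIVE_AI_API_KEY=your-google-ai-api-key
LAYERCODE_WEBHOOK_SECRET=your-layercode-webhook-secret
LAYERCODE_API_KEY=your-layercode-api-key
```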
Create Your Express Webhook Endpoint
Here’s an example of our Layercode webhook endpoint, which generates responses using Google Gemini and streams them back to the frontend as SSE events. See the GitHub repo for the full example.
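The full listing is in the GitHub repo; the sketch below shows the general shape, assuming the Vercel AI SDK (the ai and @ai-sdk/google packages) for the Gemini call. The webhook field names (text, session_id) and the SSE event types used here are illustrative assumptions, not the definitive schema — use the exact payload and event format from the Layercode docs.

```typescript
import express from "express";
import { google } from "@ai-sdk/google";
import { streamText, type CoreMessage } from "ai";

const app = express();
app.use(express.json());

// In-memory conversation history keyed by session id.
// Fine for a demo; swap in a persistent store for production.
const sessions = new Map<string, CoreMessage[]>();

const SYSTEM_PROMPT =
  "You are a helpful voice assistant. Keep answers short and conversational.";

app.post("/agent", async (req, res) => {
  // In production, verify the request really came from Layercode using
  // LAYERCODE_WEBHOOK_SECRET before handling it.

  // Field names are assumptions based on the payload described in this guide
  // (transcribed text plus session and turn identifiers).
  const { text, session_id } = req.body;

  const history = sessions.get(session_id) ?? [];
  history.push({ role: "user", content: text });

  // Stream the reply back to Layercode as Server-Sent Events.
  res.writeHead(200, {
    "Content-Type": "text/event-stream",
    "Cache-Control": "no-cache",
    Connection: "keep-alive",
  });

  const result = await streamText({
    model: google("gemini-2.0-flash"), // reads GOOGLE_GENERATIVE_AI_API_KEY
    system: SYSTEM_PROMPT,
    messages: history,
  });

  let reply = "";
  for await (const chunk of result.textStream) {
    reply += chunk;
    // Placeholder event shape — use the exact SSE event schema from the
    // Layercode docs so each chunk can be converted to speech.
    res.write(`data: ${JSON.stringify({ type: "response.tts", content: chunk })}\n\n`);
  }

  history.push({ role: "assistant", content: reply });
  sessions.set(session_id, history);

  res.write(`data: ${JSON.stringify({ type: "response.end" })}\n\n`);
  res.end();
});

app.listen(3000, () => console.log("Listening on http://localhost:3000"));
```

Keeping history in a Map keyed by session id mirrors the in-memory session management described below; replace it with Redis or a database when you move beyond a demo.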
How It Works
- /agent endpoint: Receives POST requests from Layercode with the user’s transcribed message, session, and turn info.
- Session management: Keeps track of conversation history per session (in-memory for demo; use a store for production).
- LLM call: Calls Google Gemini (or your own agent) with the conversation history and streams the response.
- SSE streaming: Streams the agent’s response back to Layercode as Server-Sent Events, which are then converted to speech and played to the user.
- /authorize endpoint: Your Layercode API key should never be exposed to the frontend. Instead, your backend acts as a secure proxy: it receives the frontend's request, calls the Layercode authorization API using your secret API key, and returns the client_session_key (and optionally a conversation_id) to the frontend. This key is required for the frontend to establish a secure WebSocket connection to Layercode. A sketch of this endpoint follows this list.
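A minimal sketch of that proxy, added to the same Express app as the /agent route above and using Node 18's built-in fetch. The authorization URL shown is a placeholder assumption — use the endpoint given in the Layercode docs for your agent.

```typescript
app.post("/authorize", async (req, res) => {
  // Placeholder URL — replace with the authorization endpoint from the Layercode docs.
  const response = await fetch("https://api.layercode.com/v1/agents/web/authorize_session", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.LAYERCODE_API_KEY}`,
      "Content-Type": "application/json",
    },
    // Forward the frontend's request body (agent id and optional conversation_id).
    body: JSON.stringify(req.body),
  });

  if (!response.ok) {
    return res.status(response.status).json({ error: await response.text() });
  }

  // Return the client_session_key (and conversation_id, if present) to the frontend.
  return res.json(await response.json());
});
```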
Running Your Backend
Start your Express server:
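For example, if the sketch above is saved as src/server.ts and run with tsx (adjust to your project's entry point and npm scripts):

```bash
npx tsx src/server.ts
```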