
Friendli

Friendli enhances AI application performance and optimizes cost savings with scalable, efficient deployment options, tailored for high-demand AI workloads.

This tutorial guides you through integrating Friendli with LangChain.

Setup

Ensure the @langchain/community package is installed, along with @langchain/core:

  • npm: npm install @langchain/community @langchain/core
  • Yarn: yarn add @langchain/community @langchain/core
  • pnpm: pnpm add @langchain/community @langchain/core

Sign in to Friendli Suite to create a Personal Access Token, and set it as the FRIENDLI_TOKEN environment variable. Optionally, set your team ID as the FRIENDLI_TEAM environment variable.
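For local development, one option is to keep these values in a .env file and load them at startup. A minimal sketch, assuming the third-party dotenv package is installed:

import "dotenv/config";

// After this import, FRIENDLI_TOKEN (and FRIENDLI_TEAM, if present in .env)
// are available on process.env for the examples below.
if (!process.env.FRIENDLI_TOKEN) {
  throw new Error("FRIENDLI_TOKEN is not set");
}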

You can initialize a Friendli model by selecting the model you want to use. The default model is mixtral-8x7b-instruct-v0-1. You can check the available models at docs.friendli.ai.

Usage

import { Friendli } from "@langchain/community/llms/friendli";

const model = new Friendli({
  model: "mixtral-8x7b-instruct-v0-1", // Default value
  friendliToken: process.env.FRIENDLI_TOKEN,
  friendliTeam: process.env.FRIENDLI_TEAM,
  maxTokens: 18,
  temperature: 0.75,
  topP: 0.25,
  frequencyPenalty: 0,
  stop: [],
});

const response = await model.invoke(
  "Check the Grammar: She dont like to eat vegetables, but she loves fruits."
);

console.log(response);

/*
Correct: She doesn't like to eat vegetables, but she loves fruits
*/
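Because the Friendli model is a standard LangChain runnable, you can also send several prompts in one call with batch. A minimal sketch, reusing the model configured above:

const responses = await model.batch([
  "Check the Grammar: She dont like to eat vegetables, but she loves fruits.",
  "Check the Grammar: He go to school every days.",
]);

console.log(responses); // One completion per input prompt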

const stream = await model.stream(
  "Check the Grammar: She dont like to eat vegetables, but she loves fruits."
);

for await (const chunk of stream) {
  console.log(chunk);
}

/*
Cor
rect
:
She
doesn
...
she
loves
fruits
*/
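You can also compose the model with other runnables, such as a prompt template. A minimal sketch, assuming the same environment variables as above:

import { PromptTemplate } from "@langchain/core/prompts";
import { Friendli } from "@langchain/community/llms/friendli";

// Pipe a prompt template into the model to build a small chain.
const prompt = PromptTemplate.fromTemplate("Check the Grammar: {sentence}");

const chain = prompt.pipe(
  new Friendli({
    friendliToken: process.env.FRIENDLI_TOKEN,
  })
);

const corrected = await chain.invoke({
  sentence: "She dont like to eat vegetables, but she loves fruits.",
});

console.log(corrected);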

API Reference:

  • Friendli from @langchain/community/llms/friendli

