Note: These docs will be deprecated and no longer maintained with the release of LangChain v1.0 in October 2025; see the v1.0 alpha docs instead.

DeepInfra

LangChain supports LLMs hosted by Deep Infra through the DeepInfraLLM wrapper. First, you'll need to install the @langchain/community package:

  • npm: npm install @langchain/community @langchain/core
  • Yarn: yarn add @langchain/community @langchain/core
  • pnpm: pnpm add @langchain/community @langchain/core

You'll need to obtain an API key and set it as an environment variable named DEEPINFRA_API_TOKEN (or pass it into the constructor), then call the model as shown below:

import { DeepInfraLLM } from "@langchain/community/llms/deepinfra";

const apiKey = process.env.DEEPINFRA_API_TOKEN;
const model = "meta-llama/Meta-Llama-3-70B-Instruct";

const llm = new DeepInfraLLM({
  temperature: 0.7,
  maxTokens: 20,
  model,
  apiKey,
  maxRetries: 5,
});

const res = await llm.invoke(
  "What is the next step in the process of making a good game?"
);

console.log({ res });

