As AI technology continues to advance, developers are constantly looking for ways to make their applications more powerful and differentiated. This is where LangChain comes in, a framework designed to develop applications powered by language models.
But what exactly is LangChain, and why do you need it to develop AI tools? In this post, we'll explore the key features of LangChain and provide examples of how it can be used to create language model applications.
LangChain is designed to not only call out to a language model via an API but also be data-aware and agentic. This means that it can connect to other sources of data and interact with its environment. The framework is built with these principles in mind, making it a powerful tool for creating advanced AI applications.
LangChain started as a Python framework but has since been ported to JavaScript, which means you can use it with Node.js. The project raised $10M in seed funding from Benchmark.
To get started with LangChain, there are two quickstart guides available: one for LLMs and one for Chat Models. From there, the framework is divided into several components, each with its own set of examples and API documentation.
Components Overview
The first component is the Schema, which includes the interfaces and base classes used throughout the library, such as Chat Messages and Documents.
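As a quick illustration, here is what chat messages and documents look like in code (a minimal sketch using the same import paths as the examples later in this post; paths can differ between LangChain versions):
import { HumanChatMessage, SystemChatMessage } from "langchain/schema";
import { Document } from "langchain/document";
// A chat exchange is an array of typed messages.
const messages = [
  new SystemChatMessage("You are a helpful assistant."),
  new HumanChatMessage("What is LangChain?"),
];
// A Document wraps a piece of text together with arbitrary metadata.
const doc = new Document({
  pageContent: "LangChain is a framework for building LLM applications.",
  metadata: { source: "blog" },
});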
The next component is Models, which includes integrations with a variety of LLMs, Chat Models, and Embeddings models.
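For instance, an LLM maps a string to a string, while an embeddings model maps text to a vector (a rough sketch; the constructor options are illustrative):
import { OpenAI } from "langchain/llms/openai";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
// A plain LLM: text in, text out.
const llm = new OpenAI({ temperature: 0 });
const answer = await llm.call("Suggest a name for a sock company.");
// An embeddings model: text in, vector of numbers out.
const embeddings = new OpenAIEmbeddings();
const vector = await embeddings.embedQuery("Hello world");
console.log(answer, vector.length);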
Then comes the Prompts component, which includes Prompt Templates and functionality for working with prompts, such as Output Parsers and Example Selectors.
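A prompt template, for example, turns a parameterized string into a final prompt (a sketch of the standard PromptTemplate usage; output parsers and example selectors are not shown here):
import { PromptTemplate } from "langchain/prompts";
// A template with one input variable.
const prompt = new PromptTemplate({
  template: "What is a good name for a company that makes {product}?",
  inputVariables: ["product"],
});
// format() fills in the variables and returns the final prompt string.
const formatted = await prompt.format({ product: "colorful socks" });
console.log(formatted);
// "What is a good name for a company that makes colorful socks?"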
The Indexes component includes patterns and functionality for working with your own data and making it ready to interact with language models. It includes Document Loaders, Text Splitters, Vector Stores, and Retrievers.
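As an example, a text splitter chops long text into overlapping chunks before it is embedded or passed to a model (a sketch; the chunk sizes here are arbitrary):
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
// Split long text into chunks that fit a model's context window.
const splitter = new RecursiveCharacterTextSplitter({
  chunkSize: 500,
  chunkOverlap: 50,
});
const chunks = await splitter.createDocuments([
  "A very long text that would not fit into a single prompt...",
]);
console.log(chunks.length);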
The Memory component is responsible for persisting state between calls of a chain or agent.
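For example, combining a chat model with BufferMemory in a ConversationChain keeps the conversation history around between calls (a minimal sketch):
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ConversationChain } from "langchain/chains";
import { BufferMemory } from "langchain/memory";
// BufferMemory stores the raw conversation history between calls.
const chain = new ConversationChain({
  llm: new ChatOpenAI({ temperature: 0 }),
  memory: new BufferMemory(),
});
await chain.call({ input: "Hi, my name is Ada." });
const res = await chain.call({ input: "What is my name?" });
// The model can answer because the first message is still in memory.
console.log(res);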
Chains go beyond a single LLM call and are sequences of calls (whether to an LLM or a different utility). LangChain provides a standard interface for chains, lots of integrations with other tools, and end-to-end chains for common applications.
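The simplest example is an LLMChain, which wires a prompt template to a model (a sketch; the prompt and temperature are just for illustration):
import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
import { LLMChain } from "langchain/chains";
// An LLMChain combines a prompt template and a model into one reusable unit.
const chain = new LLMChain({
  llm: new OpenAI({ temperature: 0.9 }),
  prompt: new PromptTemplate({
    template: "What is a good name for a company that makes {product}?",
    inputVariables: ["product"],
  }),
});
const res = await chain.call({ product: "colorful socks" });
console.log(res);
// { text: "..." }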
Lastly, the Agents component involves an LLM making decisions about which Actions to take, taking that Action, seeing an Observation, and repeating that until done.
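Here is a sketch of what that loop looks like with a single calculator tool (assuming the agent helpers available in recent LangChain.js releases; the agent type string may vary by version):
import { OpenAI } from "langchain/llms/openai";
import { initializeAgentExecutorWithOptions } from "langchain/agents";
import { Calculator } from "langchain/tools/calculator";
// The agent picks a tool, runs it, observes the result, and repeats until it can answer.
const executor = await initializeAgentExecutorWithOptions(
  [new Calculator()],
  new OpenAI({ temperature: 0 }),
  { agentType: "zero-shot-react-description" }
);
const result = await executor.call({ input: "What is 15% of 243?" });
console.log(result.output);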
One of the major benefits of LangChain is its API reference, which provides documentation for all exported classes and functions. As you move from prototyping into production, there are resources available to help you do so, such as deployment resources, events/callbacks, and tracing.
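For instance, most models and chains accept a verbose flag that logs intermediate steps, and tracing can typically be switched on through the environment (a sketch; the exact callback and tracing setup depends on your LangChain version, and LANGCHAIN_TRACING is an assumption here):
import { OpenAI } from "langchain/llms/openai";
// verbose: true prints prompts, responses, and chain steps to the console.
const model = new OpenAI({ temperature: 0, verbose: true });
// Assumption: tracing is enabled via an environment variable
// (e.g. LANGCHAIN_TRACING=true) before the process starts.
await model.call("Tell me a joke about tracing.");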
Simple Examples
A simple example of text generation using OpenAI is shown below:
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanChatMessage, SystemChatMessage } from "langchain/schema";
const model = new ChatOpenAI({
  temperature: 0.9,
  // In Node.js defaults to process.env.OPENAI_API_KEY
  openAIApiKey: "YOUR-API-KEY",
});
const response = await model.call([
  new SystemChatMessage(
    "Translate the following from English to French."
  ),
  new HumanChatMessage("I love programming."),
]);
console.log(response);
// { text: "J'aime programmer." }
Here is another example, which summarizes a PDF:
import { OpenAI } from "langchain/llms/openai";
import { loadSummarizationChain } from "langchain/chains";
import { PDFLoader } from "langchain/document_loaders/fs/pdf";
const model = new OpenAI({ temperature: 0 });
// Load a PDF from the filesystem.
// Convert to one document per page by default
const loader = new PDFLoader("example.pdf");
const docs = await loader.load();
// This convenience function creates a document chain
// prompted to summarize a set of documents.
const chain = loadSummarizationChain(model, {
  type: "map_reduce",
});
const response = await chain.call({
  input_documents: docs,
});
console.log(response);
// { text: "The quick brown fox jumped over the lazy dog." }
Finally, let's build a simple semantic search engine:
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
// Create a vector store in memory from a set of texts and ids.
const vectorStore = await MemoryVectorStore.fromTexts(
  [
    "The quick brown fox jumped over the lazy elephant.",
    "The quick brown fox jumped over the lazy dog.",
    "The quick brown fox jumped over the lazy cat.",
  ],
  [{ id: 1 }, { id: 2 }, { id: 3 }],
  // Use OpenAI's embeddings model
  new OpenAIEmbeddings()
);
// Search for the most similar text to "puppy".
const results = await vectorStore.similaritySearchWithScore("puppy", 10);
// Format the results as a list of {id, score} pairs.
const data = results.map(([doc, score]) => ({
  id: doc.metadata.id,
  score,
}));
console.log(data);
// [
// { id: 2, score: 0.7958034021930033 },
// { id: 3, score: 0.7781644633698838 },
// { id: 1, score: 0.7715042243151194 }
// ]
Conclusion
In conclusion, LangChain is a powerful framework for developing AI applications powered by language models. Its ability to be data-aware and agentic makes it a valuable tool for developers looking to create advanced AI applications.
If you're looking for a consultant to build your MVP, I'm a strong choice thanks to my extensive knowledge of and experience with using LangChain to develop language model applications.
This article was generated with the assistance of AI and refined using proofing tools. While AI technologies were used, the content and ideas expressed in this article are the result of human curation and authorship.
Read more about this topic at: Importance is All You Need