
langchain-mistralai[major]: Add MistralAI chat and embed #3623

Merged 27 commits on Dec 14, 2023
45 changes: 45 additions & 0 deletions docs/core_docs/docs/integrations/chat/mistral.mdx
Original file line number Diff line number Diff line change
@@ -0,0 +1,45 @@
---
sidebar_label: Mistral AI
---

import CodeBlock from "@theme/CodeBlock";

# ChatMistralAI

[Mistral AI](https://mistral.ai/) is a research organization and hosting platform for LLMs.
They're best known for their family of 7B models ([`mistral7b` // `mistral-tiny`](https://mistral.ai/news/announcing-mistral-7b/), [`mixtral8x7b` // `mistral-small`](https://mistral.ai/news/mixtral-of-experts/)).

The LangChain implementation of Mistral's models uses their hosted generation API, making it easier to access their models without needing to run them locally.

## Models

Mistral's API offers access to two of their models, `mistral7b` and `mixtral8x7b`, under the following API model names:

- `mistral7b` -> `mistral-tiny`
- `mixtral8x7b` -> `mistral-small`
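The mapping above can be sketched as a simple lookup table. Note that `MODEL_ALIASES` is purely illustrative here — it is not exported by `@langchain/mistralai`:

```typescript
// Illustrative model-family -> API-name mapping; not part of the package.
const MODEL_ALIASES: Record<string, string> = {
  mistral7b: "mistral-tiny",
  mixtral8x7b: "mistral-small",
};

// Resolve the API name you would pass as `modelName`:
console.log(MODEL_ALIASES["mixtral8x7b"]); // "mistral-small"
```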

## Setup

In order to use the Mistral API you'll need an API key. You can sign up for a Mistral account and create an API key [here](https://console.mistral.ai/).
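The examples below read the key from the `MISTRAL_API_KEY` environment variable, so one way to provide it is to export it in your shell first:

```shell
# Set the API key before running the examples.
# Replace the placeholder with your real key from https://console.mistral.ai/
export MISTRAL_API_KEY="your-api-key"
```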

## Usage

When sending chat messages to Mistral, there are a few requirements to follow:

- The first message can **not** be an assistant (ai) message.
- Messages **must** alternate between user and assistant (ai) messages.
- Messages can **not** end with an assistant (ai) or system message.
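As a rough illustration, the ordering rules above could be checked with a helper like the one below. This is hypothetical — `@langchain/mistralai` does not ship such a function, and roles are simplified to `"system"`, `"human"`, and `"ai"`:

```typescript
type Role = "system" | "human" | "ai";

// Hypothetical validator for the ordering rules above; not part of the package.
function isValidMistralSequence(roles: Role[]): boolean {
  // Optionally skip a single leading system message.
  const chat = roles[0] === "system" ? roles.slice(1) : roles;
  if (chat.length === 0) return false;
  // Rule 1: the first message can not be an assistant (ai) message.
  if (chat[0] === "ai") return false;
  // Rule 2: messages must alternate between human and ai.
  for (let i = 1; i < chat.length; i += 1) {
    if (chat[i] === "system" || chat[i] === chat[i - 1]) return false;
  }
  // Rule 3: the sequence can not end with an ai (or system) message.
  return chat[chat.length - 1] === "human";
}

console.log(isValidMistralSequence(["system", "human", "ai", "human"])); // true
console.log(isValidMistralSequence(["ai", "human"])); // false: starts with ai
console.log(isValidMistralSequence(["human", "ai"])); // false: ends with ai
```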

import ChatMistralAIExample from "@examples/models/chat/chat_mistralai.ts";

<CodeBlock language="typescript">{ChatMistralAIExample}</CodeBlock>

You can see a simple LangSmith trace of this here:

### Streaming

Mistral's API also supports streaming token responses. The example below demonstrates how to use this feature.

import ChatStreamMistralAIExample from "@examples/models/chat/chat_stream_mistralai.ts";

<CodeBlock language="typescript">{ChatStreamMistralAIExample}</CodeBlock>
14 changes: 14 additions & 0 deletions docs/core_docs/docs/integrations/text_embedding/mistralai.mdx
@@ -0,0 +1,14 @@
---
sidebar_label: Mistral AI
---

# Mistral AI

The `MistralAIEmbeddings` class uses the Mistral AI API to generate embeddings for a given text.

## Usage

import CodeBlock from "@theme/CodeBlock";
import MistralExample from "@examples/models/embeddings/mistral.ts";

<CodeBlock language="typescript">{MistralExample}</CodeBlock>
2 changes: 2 additions & 0 deletions examples/package.json
@@ -29,7 +29,9 @@
"@gomomento/sdk": "^1.51.1",
"@google/generative-ai": "^0.1.0",
"@langchain/community": "workspace:*",
"@langchain/core": "workspace:*",
"@langchain/google-genai": "workspace:*",
"@langchain/mistralai": "workspace:*",
"@opensearch-project/opensearch": "^2.2.0",
"@pinecone-database/pinecone": "^1.1.0",
"@planetscale/database": "^1.8.0",
44 changes: 44 additions & 0 deletions examples/src/models/chat/chat_mistralai.ts
@@ -0,0 +1,44 @@
import { ChatMistralAI } from "@langchain/mistralai";
import { ChatPromptTemplate } from "langchain/prompts";

const model = new ChatMistralAI({
apiKey: process.env.MISTRAL_API_KEY,
modelName: "mistral-small",
});
const prompt = ChatPromptTemplate.fromMessages([
["system", "You are a helpful assistant"],
["human", "{input}"],
]);
const chain = prompt.pipe(model);
const response = await chain.invoke({
input: "Hello",
});
console.log("response", response);
/**
response AIMessage {
lc_namespace: [ 'langchain_core', 'messages' ],
content: "Hello! I'm here to help answer any questions you might have or provide information on a variety of topics. How can I assist you today?\n" +
'\n' +
'Here are some common tasks I can help with:\n' +
'\n' +
'* Setting alarms or reminders\n' +
'* Sending emails or messages\n' +
'* Making phone calls\n' +
'* Providing weather information\n' +
'* Creating to-do lists\n' +
'* Offering suggestions for restaurants, movies, or other local activities\n' +
'* Providing definitions and explanations for words or concepts\n' +
'* Translating text into different languages\n' +
'* Playing music or podcasts\n' +
'* Setting timers\n' +
'* Providing directions or traffic information\n' +
'* And much more!\n' +
'\n' +
"Let me know how I can help you specifically, and I'll do my best to make your day easier and more productive!\n" +
'\n' +
'Best regards,\n' +
'Your helpful assistant.',
name: undefined,
additional_kwargs: {}
}
*/
32 changes: 32 additions & 0 deletions examples/src/models/chat/chat_stream_mistralai.ts
@@ -0,0 +1,32 @@
import { ChatMistralAI } from "@langchain/mistralai";
import { ChatPromptTemplate } from "langchain/prompts";
import { StringOutputParser } from "langchain/schema/output_parser";

const model = new ChatMistralAI({
apiKey: process.env.MISTRAL_API_KEY,
modelName: "mistral-small",
});
const prompt = ChatPromptTemplate.fromMessages([
["system", "You are a helpful assistant"],
["human", "{input}"],
]);
const outputParser = new StringOutputParser();
const chain = prompt.pipe(model).pipe(outputParser);
const response = await chain.stream({
input: "Hello",
});
for await (const item of response) {
console.log("stream item:", item);
}
/**
stream item:
stream item: Hello! I'm here to help answer any questions you
stream item: might have or assist you with any task you'd like to
stream item: accomplish. I can provide information
stream item: on a wide range of topics
stream item: , from math and science to history and literature. I can
stream item: also help you manage your schedule, set reminders, and
stream item: much more. Is there something specific you need help with? Let
stream item: me know!
stream item:
*/
11 changes: 11 additions & 0 deletions examples/src/models/embeddings/mistral.ts
@@ -0,0 +1,11 @@
import { MistralAIEmbeddings } from "@langchain/mistralai";

/* Embed queries */
const embeddings = new MistralAIEmbeddings({
apiKey: process.env.MISTRAL_API_KEY,
});
const res = await embeddings.embedQuery("Hello world");
console.log(res);
/* Embed documents */
const documentRes = await embeddings.embedDocuments(["Hello world", "Bye bye"]);
console.log({ documentRes });
1 change: 1 addition & 0 deletions langchain-core/.eslintrc.cjs
@@ -51,6 +51,7 @@ module.exports = {
"no-await-in-loop": 0,
"no-bitwise": 0,
"no-console": 0,
"no-empty-function": 0,
"no-restricted-syntax": 0,
"no-shadow": 0,
"no-continue": 0,
2 changes: 1 addition & 1 deletion langchain-core/src/language_models/chat_models.ts
@@ -2,7 +2,7 @@ import {
AIMessage,
BaseMessage,
BaseMessageChunk,
BaseMessageLike,
type BaseMessageLike,
HumanMessage,
coerceMessageLikeToMessage,
} from "../messages/index.js";
2 changes: 1 addition & 1 deletion langchain-core/src/messages/tests/base_message.test.ts
@@ -1,6 +1,6 @@
import { test } from "@jest/globals";
import { ChatPromptTemplate } from "../../prompts/chat.js";
import { HumanMessage } from "../../messages/index.js";
import { HumanMessage } from "../index.js";

test("Test ChatPromptTemplate can format OpenAI content image messages", async () => {
const message = new HumanMessage({
66 changes: 66 additions & 0 deletions libs/langchain-mistralai/.eslintrc.cjs
@@ -0,0 +1,66 @@
module.exports = {
extends: [
"airbnb-base",
"eslint:recommended",
"prettier",
"plugin:@typescript-eslint/recommended",
],
parserOptions: {
ecmaVersion: 12,
parser: "@typescript-eslint/parser",
project: "./tsconfig.json",
sourceType: "module",
},
plugins: ["@typescript-eslint", "no-instanceof"],
ignorePatterns: [
".eslintrc.cjs",
"scripts",
"node_modules",
"dist",
"dist-cjs",
"*.js",
"*.cjs",
"*.d.ts",
],
rules: {
"no-process-env": 2,
"no-instanceof/no-instanceof": 2,
"@typescript-eslint/explicit-module-boundary-types": 0,
"@typescript-eslint/no-empty-function": 0,
"@typescript-eslint/no-shadow": 0,
"@typescript-eslint/no-empty-interface": 0,
"@typescript-eslint/no-use-before-define": ["error", "nofunc"],
"@typescript-eslint/no-unused-vars": ["warn", { args: "none" }],
"@typescript-eslint/no-floating-promises": "error",
"@typescript-eslint/no-misused-promises": "error",
camelcase: 0,
"class-methods-use-this": 0,
"import/extensions": [2, "ignorePackages"],
"import/no-extraneous-dependencies": [
"error",
{ devDependencies: ["**/*.test.ts"] },
],
"import/no-unresolved": 0,
"import/prefer-default-export": 0,
"keyword-spacing": "error",
"max-classes-per-file": 0,
"max-len": 0,
"no-await-in-loop": 0,
"no-bitwise": 0,
"no-console": 0,
"no-restricted-syntax": 0,
"no-shadow": 0,
"no-continue": 0,
"no-void": 0,
"no-underscore-dangle": 0,
"no-use-before-define": 0,
"no-useless-constructor": 0,
"no-return-await": 0,
"consistent-return": 0,
"no-else-return": 0,
"func-names": 0,
"no-lonely-if": 0,
"prefer-rest-params": 0,
"new-cap": ["error", { properties: false, capIsNew: false }],
},
};
6 changes: 6 additions & 0 deletions libs/langchain-mistralai/.gitignore
@@ -0,0 +1,6 @@
index.cjs
index.js
index.d.ts
node_modules
dist
.yarn
21 changes: 21 additions & 0 deletions libs/langchain-mistralai/LICENSE
@@ -0,0 +1,21 @@
The MIT License

Copyright (c) 2023 LangChain

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
19 changes: 19 additions & 0 deletions libs/langchain-mistralai/jest.config.cjs
@@ -0,0 +1,19 @@
/** @type {import('ts-jest').JestConfigWithTsJest} */
module.exports = {
preset: "ts-jest/presets/default-esm",
testEnvironment: "./jest.env.cjs",
modulePathIgnorePatterns: ["dist/", "docs/"],
moduleNameMapper: {
"^(\\.{1,2}/.*)\\.js$": "$1",
},
transform: {
"^.+\\.tsx?$": ["@swc/jest"],
},
transformIgnorePatterns: [
"/node_modules/",
"\\.pnp\\.[^\\/]+$",
"./scripts/jest-setup-after-env.js",
],
setupFiles: ["dotenv/config"],
testTimeout: 20_000,
};
12 changes: 12 additions & 0 deletions libs/langchain-mistralai/jest.env.cjs
@@ -0,0 +1,12 @@
const { TestEnvironment } = require("jest-environment-node");

class AdjustedTestEnvironmentToSupportFloat32Array extends TestEnvironment {
constructor(config, context) {
// Make `instanceof Float32Array` return true in tests
// to avoid https://github.com/xenova/transformers.js/issues/57 and https://github.com/jestjs/jest/issues/2549
super(config, context);
this.global.Float32Array = Float32Array;
}
}

module.exports = AdjustedTestEnvironmentToSupportFloat32Array;
79 changes: 79 additions & 0 deletions libs/langchain-mistralai/package.json
@@ -0,0 +1,79 @@
{
"name": "@langchain/mistralai",
"version": "0.0.1",
"description": "MistralAI integration for LangChain.js",
"type": "module",
"engines": {
"node": ">=18"
},
"main": "./index.js",
"types": "./index.d.ts",
"repository": {
"type": "git",
"url": "[email protected]:langchain-ai/langchainjs.git"
},
"scripts": {
"build": "yarn clean && yarn build:deps && yarn build:esm && yarn build:cjs && yarn build:scripts",
"build:deps": "yarn run turbo:command build --filter=@langchain/core",
"build:esm": "NODE_OPTIONS=--max-old-space-size=4096 tsc --outDir dist/ && rm -rf dist/tests dist/**/tests",
"build:cjs": "NODE_OPTIONS=--max-old-space-size=4096 tsc --outDir dist-cjs/ -p tsconfig.cjs.json && node scripts/move-cjs-to-dist.js && rm -rf dist-cjs",
"build:watch": "node scripts/create-entrypoints.js && tsc --outDir dist/ --watch",
"build:scripts": "node scripts/create-entrypoints.js && node scripts/check-tree-shaking.js",
"lint": "NODE_OPTIONS=--max-old-space-size=4096 eslint src && dpdm --exit-code circular:1 --no-warning --no-tree src/*.ts src/**/*.ts",
"lint:fix": "yarn lint --fix",
"clean": "rm -rf dist/ && NODE_OPTIONS=--max-old-space-size=4096 node scripts/create-entrypoints.js pre",
"prepack": "yarn build",
"release": "release-it --only-version --config .release-it.json",
"test": "NODE_OPTIONS=--experimental-vm-modules jest --testPathIgnorePatterns=\\.int\\.test.ts --testTimeout 30000 --maxWorkers=50%",
"test:watch": "NODE_OPTIONS=--experimental-vm-modules jest --watch --testPathIgnorePatterns=\\.int\\.test.ts",
"test:single": "NODE_OPTIONS=--experimental-vm-modules yarn run jest --config jest.config.cjs --testTimeout 100000",
"test:int": "NODE_OPTIONS=--experimental-vm-modules jest --testPathPattern=\\.int\\.test.ts --testTimeout 100000 --maxWorkers=50%",
"format": "prettier --write \"src\"",
"format:check": "prettier --check \"src\""
},
"author": "LangChain",
"license": "MIT",
"dependencies": {
"@langchain/core": "~0.1.0",
"@mistralai/mistralai": "^0.0.7"
},
"devDependencies": {
"@jest/globals": "^29.5.0",
"@swc/core": "^1.3.90",
"@swc/jest": "^0.2.29",
"@tsconfig/recommended": "^1.0.3",
"@typescript-eslint/eslint-plugin": "^6.12.0",
"@typescript-eslint/parser": "^6.12.0",
"dotenv": "^16.3.1",
"dpdm": "^3.12.0",
"eslint": "^8.33.0",
"eslint-config-airbnb-base": "^15.0.0",
"eslint-config-prettier": "^8.6.0",
"eslint-plugin-import": "^2.27.5",
"eslint-plugin-no-instanceof": "^1.0.1",
"eslint-plugin-prettier": "^4.2.1",
"jest": "^29.5.0",
"jest-environment-node": "^29.6.4",
"prettier": "^2.8.3",
"rollup": "^4.5.2",
"ts-jest": "^29.1.0",
"typescript": "<5.2.0"
},
"publishConfig": {
"access": "public"
},
"exports": {
".": {
"types": "./index.d.ts",
"import": "./index.js",
"require": "./index.cjs"
},
"./package.json": "./package.json"
},
"files": [
"dist/",
"index.cjs",
"index.js",
"index.d.ts"
]
}