
Commit

✨ feat: Add custom top-p (#107)
Mingholy authored Jul 8, 2024
1 parent 2ebe92d commit 0ed03c2
Showing 4 changed files with 23 additions and 16 deletions.
3 changes: 2 additions & 1 deletion packages/lobe-i18n/README.md
@@ -59,7 +59,7 @@ English ・ [简体中文](./README.zh-CN.md) ・ [Changelog](./CHANGELOG.md) ·
- [x] ♻️ Support incremental i18n updates, automatically extract new content based on `entry` files.
- [x] 🗂️ Support single file mode `en_US.json` and folder mode `en_US/common.json` to work perfectly with `i18next`.
- [x] 🌲 Support `flat` and `tree` structure for locale files.
- [x] 🛠️ Support customizing OpenAI models, API proxies, and temperature.
- [x] 🛠️ Support customizing OpenAI models, API proxies, temperature, and topP.
- [x] 📝 Support automated i18n translation of `Markdown` files.

<div align="right">
@@ -154,6 +154,7 @@ This project provides some additional configuration items set with environment variables
| reference | | `string` | - | Provide some context for more accurate translations |
| splitToken | | `number` | - | Split the localized JSON file by tokens, automatically calculated by default |
| temperature | | `number` | `0` | Sampling temperature to use |
| topP | | `number` | `1` | Nucleus sampling threshold, controls the diversity of generated text |
| concurrency | | `number` | `5` | Number of concurrently pending promises returned |
| experimental | | `experimental` | `{}` | Experimental features, see below |
| markdown | | `markdown` | `{}` | See `markdown` configuration below |
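For context, here is a minimal sketch of how the new `topP` option might sit alongside the existing options in a config file passed via the `--config` flag. The option names come from the table above; the file name, paths, locale codes, and values are illustrative assumptions only.

```js
// .i18nrc.js — hypothetical config; option names are taken from the documented table above
module.exports = {
  entry: 'locales/en_US.json',       // entry file (assumed path)
  entryLocale: 'en_US',              // language used as the translation reference
  output: 'locales',                 // where localized files are written (assumed path)
  outputLocales: ['zh_CN', 'ja_JP'], // assumed target locales
  modelName: 'gpt-3.5-turbo',
  temperature: 0,                    // documented default
  topP: 1,                           // new in this commit: nucleus sampling threshold
  concurrency: 5,
};
```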
29 changes: 15 additions & 14 deletions packages/lobe-i18n/README.zh-CN.md
@@ -59,7 +59,7 @@ Lobe i18n is a CLI workflow tool that automates i18n using ChatGPT
- [x] ♻️ Support incremental i18n updates, automatically extracting new content based on `entry` files
- [x] 🗂️ Support single-file mode `en_US.json` and folder mode `en_US/common.json`, working perfectly with `i18next`
- [x] 🌲 Support `flat` and `tree` structures for locale files
- [x] 🛠️ Support customizing OpenAI models, API proxies, and temperature
- [x] 🛠️ Support customizing OpenAI models, API proxies, temperature, and topP
- [x] 📝 Support automated i18n translation of `Markdown` files

<div align="right">
@@ -145,19 +145,20 @@ $ lobe-i18n -c './custom-config.js' # or use the full flag --config

## 🌏 Locale Configuration

| Property      | Required | Type           | Default         | Description                                                                  |
| ------------- | -------- | -------------- | --------------- | ---------------------------------------------------------------------------- |
| entry         | `*`      | `string`       | -               | Entry file or folder                                                          |
| entryLocale   | `*`      | `string`       | -               | The language used as the translation reference                                |
| modelName     |          | `string`       | `gpt-3.5-turbo` | Model to use                                                                  |
| output        | `*`      | `string`       | -               | Where to store the localized files                                            |
| outputLocales | `*`      | `string[]`     | `[]`            | All languages that need to be translated                                      |
| reference     |          | `string`       | -               | Provide some context for more accurate translations                          |
| splitToken    |          | `number`       | -               | Split the localized JSON file by tokens, automatically calculated by default |
| temperature   |          | `number`       | `0`             | Sampling temperature to use                                                   |
| concurrency   |          | `number`       | `5`             | Number of concurrently pending promises returned                              |
| experimental  |          | `experimental` | `{}`            | Experimental features, see below                                              |
| markdown      |          | `markdown`     | `{}`            | See the `markdown` configuration below                                        |
| Property      | Required | Type           | Default         | Description                                                                                            |
| ------------- | -------- | -------------- | --------------- | ------------------------------------------------------------------------------------------------------ |
| entry         | `*`      | `string`       | -               | Entry file or folder                                                                                    |
| entryLocale   | `*`      | `string`       | -               | The language used as the translation reference                                                          |
| modelName     |          | `string`       | `gpt-3.5-turbo` | Model to use                                                                                            |
| output        | `*`      | `string`       | -               | Where to store the localized files                                                                      |
| outputLocales | `*`      | `string[]`     | `[]`            | All languages that need to be translated                                                                |
| reference     |          | `string`       | -               | Provide some context for more accurate translations                                                    |
| splitToken    |          | `number`       | -               | Split the localized JSON file by tokens, automatically calculated by default                           |
| temperature   |          | `number`       | `0`             | Sampling temperature to use                                                                             |
| topP          |          | `number`       | `1`             | Nucleus sampling probability threshold used during generation; larger values yield more random output |
| concurrency   |          | `number`       | `5`             | Number of concurrently pending promises returned                                                        |
| experimental  |          | `experimental` | `{}`            | Experimental features, see below                                                                        |
| markdown      |          | `markdown`     | `{}`            | See the `markdown` configuration below                                                                  |

#### `experimental`

3 changes: 2 additions & 1 deletion packages/lobe-i18n/src/core/TranslateLocale.ts
@@ -23,6 +23,7 @@ export class TranslateLocale {
modelName: config.modelName,
openAIApiKey,
temperature: config.temperature,
topP: config.topP,
});
this.promptJson = promptJsonTranslate(config.reference);
this.promptString = promptStringTranslate(config.reference);
@@ -71,7 +72,7 @@ export class TranslateLocale {
to,
});

const res = await this.model.call(
const res = await this.model.invoke(
formattedChatPrompt,
this.isJsonMode
? {
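For readers unfamiliar with the API being called here, below is a small self-contained sketch of the pattern this hunk changes, assuming the model is LangChain's `ChatOpenAI` (the import path, API-key handling, and prompt string are assumptions, not taken from the repository). The config's `topP` is now forwarded to the model constructor, and `invoke` replaces the older `call` method.

```ts
// Sketch only: mirrors the constructor options and the call-site change shown in the diff above.
import { ChatOpenAI } from '@langchain/openai';

const model = new ChatOpenAI({
  modelName: 'gpt-3.5-turbo',
  openAIApiKey: process.env.OPENAI_API_KEY, // assumed to come from the environment
  temperature: 0, // deterministic sampling by default
  topP: 1,        // nucleus sampling threshold, newly forwarded from the i18n config
});

async function translateSample() {
  // `invoke` is the current LangChain entry point; `call` (used before this commit) is deprecated.
  const res = await model.invoke('Translate {"hello": "Hello"} from en_US to zh_CN and return JSON.');
  console.log(res.content);
}

translateSample().catch(console.error);
```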
4 changes: 4 additions & 0 deletions packages/lobe-i18n/src/types/config.ts
@@ -37,6 +37,10 @@ export interface I18nConfigLocale {
* @description Sampling temperature to use
*/
temperature?: number;
/**
* @description Nucleus sampling threshold
*/
topP?: number;
}

export enum MarkdownModeType {
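To illustrate the extended type, here is a hypothetical snippet using the new optional field. The interface name and file path come from the diff above; the relative import path, defaulting comment, and values are assumptions.

```ts
// Hypothetical usage of the extended interface; not code from the repository.
import type { I18nConfigLocale } from './types/config';

const sampling: Pick<I18nConfigLocale, 'temperature' | 'topP'> = {
  temperature: 0, // documented default
  topP: 0.9,      // optional; omitting it falls back to the documented default of 1
};
```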
