feat(messages): add support for image inputs (#303)
stainless-bot authored and RobertCraigie committed Mar 4, 2024
1 parent 1852a80 commit 1b87d9e
Showing 14 changed files with 228 additions and 126 deletions.
44 changes: 24 additions & 20 deletions README.md
@@ -30,8 +30,8 @@ const anthropic = new Anthropic({
 async function main() {
   const message = await anthropic.messages.create({
     max_tokens: 1024,
-    messages: [{ role: 'user', content: 'How does a court case get to the supreme court?' }],
-    model: 'claude-2.1',
+    messages: [{ role: 'user', content: 'Hello, Claude' }],
+    model: 'claude-3-opus-20240229',
   });

   console.log(message.content);
@@ -51,8 +51,8 @@ const anthropic = new Anthropic();

 const stream = await anthropic.messages.create({
   max_tokens: 1024,
-  messages: [{ role: 'user', content: 'your prompt here' }],
-  model: 'claude-2.1',
+  messages: [{ role: 'user', content: 'Hello, Claude' }],
+  model: 'claude-3-opus-20240229',
   stream: true,
 });
 for await (const messageStreamEvent of stream) {
@@ -78,8 +78,8 @@ const anthropic = new Anthropic({
 async function main() {
   const params: Anthropic.MessageCreateParams = {
     max_tokens: 1024,
-    messages: [{ role: 'user', content: 'Where can I get a good coffee in my neighbourhood?' }],
-    model: 'claude-2.1',
+    messages: [{ role: 'user', content: 'Hello, Claude' }],
+    model: 'claude-3-opus-20240229',
   };
   const message: Anthropic.Message = await anthropic.messages.create(params);
 }
@@ -91,9 +91,13 @@ Documentation for each method, request param, and response field are available i

 ## Counting Tokens

-We provide a [separate package](https://github.com/anthropics/anthropic-tokenizer-typescript) for counting how many tokens a given piece of text contains.
+You can see the exact usage for a given request through the `usage` response property, e.g.

-See the [repository documentation](https://github.com/anthropics/anthropic-tokenizer-typescript) for more details.
+```ts
+const message = await client.messages.create(...)
+console.log(message.usage)
+// { input_tokens: 25, output_tokens: 13 }
+```
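The `usage` objects introduced above can be aggregated across several responses. A minimal sketch, where `Usage` is a local stand-in for the documented `input_tokens`/`output_tokens` shape rather than an SDK export:

```ts
// Local sketch of the `usage` shape shown in the diff above.
interface Usage {
  input_tokens: number;
  output_tokens: number;
}

// Sum token usage over a list of responses.
function totalUsage(usages: Usage[]): Usage {
  return usages.reduce(
    (acc, u) => ({
      input_tokens: acc.input_tokens + u.input_tokens,
      output_tokens: acc.output_tokens + u.output_tokens,
    }),
    { input_tokens: 0, output_tokens: 0 },
  );
}

const total = totalUsage([
  { input_tokens: 25, output_tokens: 13 },
  { input_tokens: 40, output_tokens: 7 },
]);
console.log(total); // { input_tokens: 65, output_tokens: 20 }
```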

## Streaming Helpers

@@ -107,7 +111,7 @@ const anthropic = new Anthropic();
 async function main() {
   const stream = anthropic.messages
     .stream({
-      model: 'claude-2.1',
+      model: 'claude-3-opus-20240229',
       max_tokens: 1024,
       messages: [
         {
@@ -143,8 +147,8 @@ async function main() {
 const message = await anthropic.messages
   .create({
     max_tokens: 1024,
-    messages: [{ role: 'user', content: 'your prompt here' }],
-    model: 'claude-2.1',
+    messages: [{ role: 'user', content: 'Hello, Claude' }],
+    model: 'claude-3-opus-20240229',
   })
   .catch((err) => {
     if (err instanceof Anthropic.APIError) {
@@ -189,7 +193,7 @@ const anthropic = new Anthropic({
 });

 // Or, configure per-request:
-await anthropic.messages.create({ max_tokens: 1024, messages: [{ role: 'user', content: 'Can you help me effectively ask for a raise at work?' }], model: 'claude-2.1' }, {
+await anthropic.messages.create({ max_tokens: 1024, messages: [{ role: 'user', content: 'Hello, Claude' }], model: 'claude-3-opus-20240229' }, {
   maxRetries: 5,
 });
 ```
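The `maxRetries` option above makes the SDK retry failed requests with exponential backoff. As an illustration only (not the SDK's internal implementation, which also inspects status codes and adds jitter), the retry loop looks roughly like:

```ts
// Illustrative sketch of retry with exponential backoff.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxRetries: number,
  baseDelayMs = 100,
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxRetries) throw err; // out of retries, surface the error
      const delay = baseDelayMs * 2 ** attempt; // 100ms, 200ms, 400ms, ...
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}

// Demo: an operation that fails twice, then succeeds.
let attempts = 0;
const flaky = async () => {
  attempts += 1;
  if (attempts < 3) throw new Error('transient failure');
  return 'ok';
};

withRetries(flaky, 5, 1).then((result) => console.log(`${result} after ${attempts} attempts`)); // ok after 3 attempts
```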
@@ -206,7 +210,7 @@ const anthropic = new Anthropic({
 });

 // Override per-request:
-await anthropic.messages.create({ max_tokens: 1024, messages: [{ role: 'user', content: 'Where can I get a good coffee in my neighbourhood?' }], model: 'claude-2.1' }, {
+await anthropic.messages.create({ max_tokens: 1024, messages: [{ role: 'user', content: 'Hello, Claude' }], model: 'claude-3-opus-20240229' }, {
   timeout: 5 * 1000,
 });
 ```
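What a per-request `timeout` effectively does can be sketched with `Promise.race` (illustration only; the SDK also aborts the underlying connection rather than just abandoning the promise):

```ts
// Race a request against a timer; reject if the timer wins.
function withTimeout<T>(promise: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`Request timed out after ${ms}ms`)), ms);
  });
  // Always clear the timer so the process can exit promptly.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// A response that takes 50ms loses a 10ms race:
const slow = new Promise<string>((resolve) => setTimeout(() => resolve('done'), 50));
withTimeout(slow, 10).catch((err) => console.log(err.message)); // Request timed out after 10ms
```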
Expand All @@ -231,8 +235,8 @@ const anthropic = new Anthropic();
const message = await anthropic.messages.create(
{
max_tokens: 1024,
messages: [{ role: 'user', content: 'Where can I get a good coffee in my neighbourhood?' }],
model: 'claude-2.1',
messages: [{ role: 'user', content: 'Hello, Claude' }],
model: 'claude-3-opus-20240229',
},
{ headers: { 'anthropic-version': 'My-Custom-Value' } },
);
@@ -253,8 +257,8 @@ const anthropic = new Anthropic();
 const response = await anthropic.messages
   .create({
     max_tokens: 1024,
-    messages: [{ role: 'user', content: 'Where can I get a good coffee in my neighbourhood?' }],
-    model: 'claude-2.1',
+    messages: [{ role: 'user', content: 'Hello, Claude' }],
+    model: 'claude-3-opus-20240229',
   })
   .asResponse();
 console.log(response.headers.get('X-My-Header'));
@@ -263,8 +267,8 @@ console.log(response.statusText); // access the underlying Response object
 const { data: message, response: raw } = await anthropic.messages
   .create({
     max_tokens: 1024,
-    messages: [{ role: 'user', content: 'Where can I get a good coffee in my neighbourhood?' }],
-    model: 'claude-2.1',
+    messages: [{ role: 'user', content: 'Hello, Claude' }],
+    model: 'claude-3-opus-20240229',
   })
   .withResponse();
 console.log(raw.headers.get('X-My-Header'));
@@ -326,7 +330,7 @@ const anthropic = new Anthropic({
 });

 // Override per-request:
-await anthropic.messages.create({ max_tokens: 1024, messages: [{ role: 'user', content: 'Where can I get a good coffee in my neighbourhood?' }], model: 'claude-2.1' }, {
+await anthropic.messages.create({ max_tokens: 1024, messages: [{ role: 'user', content: 'Hello, Claude' }], model: 'claude-3-opus-20240229' }, {
   baseURL: 'http://localhost:8080/test-api',
   httpAgent: new http.Agent({ keepAlive: false }),
 })
11 changes: 1 addition & 10 deletions api.md
@@ -1,15 +1,5 @@
 # Anthropic

-# Completions
-
-Types:
-
-- <code><a href="./src/resources/completions.ts">Completion</a></code>
-
-Methods:
-
-- <code title="post /v1/complete">client.completions.<a href="./src/resources/completions.ts">create</a>({ ...params }) -> Completion</code>
-
 # Messages

 Types:
@@ -18,6 +8,7 @@ Types:
 - <code><a href="./src/resources/messages.ts">ContentBlockDeltaEvent</a></code>
 - <code><a href="./src/resources/messages.ts">ContentBlockStartEvent</a></code>
 - <code><a href="./src/resources/messages.ts">ContentBlockStopEvent</a></code>
+- <code><a href="./src/resources/messages.ts">ImageBlockParam</a></code>
 - <code><a href="./src/resources/messages.ts">Message</a></code>
 - <code><a href="./src/resources/messages.ts">MessageDeltaEvent</a></code>
 - <code><a href="./src/resources/messages.ts">MessageDeltaUsage</a></code>
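The `ImageBlockParam` addition above is the core of this commit's image-input support. A sketch of the request payload shape it enables, built with plain local types so it runs standalone; the base64 data and question are placeholders, and in real use these params would go to `anthropic.messages.create`:

```ts
// Local stand-ins for the block shapes; not the SDK's exported types.
type ImageBlock = {
  type: 'image';
  source: { type: 'base64'; media_type: string; data: string };
};
type TextBlock = { type: 'text'; text: string };

// Build a Messages request body whose user turn contains an image followed by a question.
function buildImageMessage(imageData: string, question: string) {
  const content: Array<ImageBlock | TextBlock> = [
    { type: 'image', source: { type: 'base64', media_type: 'image/jpeg', data: imageData } },
    { type: 'text', text: question },
  ];
  return {
    model: 'claude-3-opus-20240229',
    max_tokens: 1024,
    messages: [{ role: 'user' as const, content }],
  };
}

const params = buildImageMessage('<base64-image-data>', 'What is in this image?');
console.log(params.messages[0].content[0].type); // image
```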
2 changes: 1 addition & 1 deletion examples/cancellation.ts
@@ -17,7 +17,7 @@ async function main() {

 const stream = await client.completions.create({
   prompt: `${Anthropic.HUMAN_PROMPT}${question}${Anthropic.AI_PROMPT}:`,
-  model: 'claude-2.1',
+  model: 'claude-3-opus-20240229',
   stream: true,
   max_tokens_to_sample: 500,
 });
2 changes: 1 addition & 1 deletion examples/demo.ts
@@ -7,7 +7,7 @@ const client = new Anthropic(); // gets API Key from environment variable ANTHRO
 async function main() {
   const result = await client.completions.create({
     prompt: `${Anthropic.HUMAN_PROMPT} how does a court case get to the Supreme Court? ${Anthropic.AI_PROMPT}`,
-    model: 'claude-2.1',
+    model: 'claude-3-opus-20240229',
     max_tokens_to_sample: 300,
   });
   console.log(result.completion);
2 changes: 1 addition & 1 deletion examples/raw-streaming.ts
@@ -9,7 +9,7 @@ async function main() {

 const stream = await client.completions.create({
   prompt: `${Anthropic.HUMAN_PROMPT}${question}${Anthropic.AI_PROMPT}:`,
-  model: 'claude-2.1',
+  model: 'claude-3-opus-20240229',
   stream: true,
   max_tokens_to_sample: 500,
 });
2 changes: 1 addition & 1 deletion examples/streaming.ts
@@ -13,7 +13,7 @@ async function main() {
         content: `Hey Claude! How can I recursively list all files in a directory in Rust?`,
       },
     ],
-    model: 'claude-2.1',
+    model: 'claude-3-opus-20240229',
     max_tokens: 1024,
   })
   // Once a content block is fully streamed, this event will fire
2 changes: 1 addition & 1 deletion packages/bedrock-sdk/README.md
@@ -30,7 +30,7 @@ const anthropic = new AnthropicBedrock();

 async function main() {
   const completion = await anthropic.completions.create({
-    model: 'anthropic.claude-instant-v1',
+    model: 'anthropic.claude-3-opus-20240229-v1:0',
     prompt: `${Anthropic.HUMAN_PROMPT} how does a court case get to the Supreme Court? ${Anthropic.AI_PROMPT}`,
     stop_sequences: [Anthropic.HUMAN_PROMPT],
     max_tokens_to_sample: 800,
2 changes: 1 addition & 1 deletion packages/bedrock-sdk/examples/demo.ts
@@ -12,7 +12,7 @@ const anthropic = new AnthropicBedrock();

 async function main() {
   const completion = await anthropic.completions.create({
-    model: 'anthropic.claude-instant-v1',
+    model: 'anthropic.claude-3-opus-20240229-v1:0',
     prompt: `${Anthropic.HUMAN_PROMPT} how does a court case get to the Supreme Court? ${Anthropic.AI_PROMPT}`,
     stop_sequences: [Anthropic.HUMAN_PROMPT],
     max_tokens_to_sample: 800,
1 change: 1 addition & 0 deletions src/index.ts
@@ -241,6 +241,7 @@ export namespace Anthropic {
   export import ContentBlockDeltaEvent = API.ContentBlockDeltaEvent;
   export import ContentBlockStartEvent = API.ContentBlockStartEvent;
   export import ContentBlockStopEvent = API.ContentBlockStopEvent;
+  export import ImageBlockParam = API.ImageBlockParam;
   export import Message = API.Message;
   export import MessageDeltaEvent = API.MessageDeltaEvent;
   export import MessageDeltaUsage = API.MessageDeltaUsage;
29 changes: 19 additions & 10 deletions src/resources/completions.ts
@@ -69,6 +69,11 @@ export interface Completion {
    */
   stop_reason: string | null;

+  /**
+   * Object type.
+   *
+   * For Text Completions, this is always `"completion"`.
+   */
   type: 'completion';
 }
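The literal `type` field documented above lets TypeScript narrow a union of response shapes automatically. A sketch with simplified local interfaces (not the SDK's full types):

```ts
// Simplified local shapes for illustration only.
interface Completion {
  type: 'completion';
  completion: string;
}
interface Message {
  type: 'message';
  content: string[];
}

// The literal `type` property acts as the discriminant.
function describe(response: Completion | Message): string {
  switch (response.type) {
    case 'completion':
      return response.completion; // narrowed to Completion here
    case 'message':
      return response.content.join(''); // narrowed to Message here
  }
}

console.log(describe({ type: 'completion', completion: 'Hello!' })); // Hello!
```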

@@ -86,16 +91,10 @@ export interface CompletionCreateParamsBase {
   /**
    * The model that will complete your prompt.
    *
-   * As we improve Claude, we develop new versions of it that you can query. The
-   * `model` parameter controls which version of Claude responds to your request.
-   * Right now we offer two model families: Claude, and Claude Instant. You can use
-   * them by setting `model` to `"claude-2.1"` or `"claude-instant-1.2"`,
-   * respectively.
-   *
-   * See [models](https://docs.anthropic.com/claude/reference/selecting-a-model) for
+   * See [models](https://docs.anthropic.com/claude/docs/models-overview) for
    * additional details and options.
    */
-  model: (string & {}) | 'claude-2.1' | 'claude-instant-1';
+  model: (string & {}) | 'claude-3-opus-20240229' | 'claude-2.1' | 'claude-instant-1';

   /**
    * The prompt that you want Claude to complete.
@@ -141,8 +140,8 @@ export interface CompletionCreateParamsBase {
   /**
    * Amount of randomness injected into the response.
    *
-   * Defaults to 1. Ranges from 0 to 1. Use temp closer to 0 for analytical /
-   * multiple choice, and closer to 1 for creative and generative tasks.
+   * Defaults to `1.0`. Ranges from `0.0` to `1.0`. Use `temperature` closer to `0.0`
+   * for analytical / multiple choice, and closer to `1.0` for creative and
+   * generative tasks.
+   *
+   * Note that even with `temperature` of `0.0`, the results will not be fully
+   * deterministic.
    */
   temperature?: number;

@@ -151,6 +154,9 @@
    *
    * Used to remove "long tail" low probability responses.
    * [Learn more technical details here](https://towardsdatascience.com/how-to-sample-from-language-models-682bceb97277).
+   *
+   * Recommended for advanced use cases only. You usually only need to use
+   * `temperature`.
    */
   top_k?: number;

@@ -161,6 +167,9 @@
    * for each subsequent token in decreasing probability order and cut it off once it
    * reaches a particular probability specified by `top_p`. You should either alter
    * `temperature` or `top_p`, but not both.
+   *
+   * Recommended for advanced use cases only. You usually only need to use
+   * `temperature`.
    */
   top_p?: number;
 }
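The `top_p` description above (nucleus sampling) can be made concrete. A minimal sketch of cutting off a token distribution at cumulative probability `top_p` and renormalizing; this is an illustration of the idea, not the service's implementation:

```ts
// Nucleus (top_p) sampling filter: keep the smallest set of tokens whose
// cumulative probability reaches topP, then renormalize the survivors.
function nucleusFilter(probs: Record<string, number>, topP: number): Record<string, number> {
  // Sort tokens by decreasing probability.
  const sorted = Object.entries(probs).sort((a, b) => b[1] - a[1]);
  const kept: Array<[string, number]> = [];
  let cumulative = 0;
  for (const [token, p] of sorted) {
    kept.push([token, p]);
    cumulative += p;
    if (cumulative >= topP) break; // cut off once top_p is reached
  }
  // Renormalize the surviving probabilities so they sum to 1.
  const total = kept.reduce((sum, [, p]) => sum + p, 0);
  return Object.fromEntries(kept.map(([token, p]) => [token, p / total]));
}

const filtered = nucleusFilter({ a: 0.5, b: 0.25, c: 0.15, d: 0.1 }, 0.75);
console.log(Object.keys(filtered)); // [ 'a', 'b' ]
```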
1 change: 1 addition & 0 deletions src/resources/index.ts
@@ -12,6 +12,7 @@ export {
   ContentBlockDeltaEvent,
   ContentBlockStartEvent,
   ContentBlockStopEvent,
+  ImageBlockParam,
   Message,
   MessageDeltaEvent,
   MessageDeltaUsage,