feat(api): OpenAPI spec update via Stainless API (#167)
Stainless Bot committed Jul 29, 2024
1 parent c46a704 commit f4194ed
Showing 3 changed files with 23 additions and 23 deletions.
2 changes: 2 additions & 0 deletions .github/workflows/release-doctor.yml
@@ -1,6 +1,8 @@
name: Release Doctor
on:
  pull_request:
+    branches:
+      - main
  workflow_dispatch:

jobs:
2 changes: 1 addition & 1 deletion .stats.yml
@@ -1,2 +1,2 @@
configured_endpoints: 21
-openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/prompt-foundry%2Fprompt-foundry-sdk-0042044f00457ff0bf65c07207eea291e4df838e2bdab4dfc602eec8d3517c42.yml
+openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/prompt-foundry%2Fprompt-foundry-sdk-441451c27073e45d1bdc832c5b66c26d90bd185bd94bd461b91257fbf0987ef2.yml
42 changes: 20 additions & 22 deletions README.md
@@ -31,32 +31,30 @@ npm install openai
Import the OpenAI and Prompt Foundry SDKs

```js
import PromptFoundry from "@prompt-foundry/typescript-sdk";
import { Configuration, OpenAIApi } from "openai";
import PromptFoundry from '@prompt-foundry/typescript-sdk';
import { Configuration, OpenAIApi } from 'openai';

// Initialize Prompt Foundry SDK with your API key
const promptFoundry = new PromptFoundry({
apiKey: process.env["PROMPT_FOUNDRY_API_KEY"],
apiKey: process.env['PROMPT_FOUNDRY_API_KEY'],
});

// Initialize OpenAI SDK with your API key
const configuration = new Configuration({
apiKey: process.env["OPENAI_API_KEY"],
apiKey: process.env['OPENAI_API_KEY'],
});
const openai = new OpenAIApi(configuration);

async function main() {
// Retrieve model parameters for the prompt
const modelParameters = await promptFoundry.prompts.getParameters("1212121", {
variables: { hello: "world" },
const modelParameters = await promptFoundry.prompts.getParameters('1212121', {
variables: { hello: 'world' },
});

// check if provider is Open AI
if (modelParameters.provider === "openai") {
if (modelParameters.provider === 'openai') {
// Use the retrieved parameters to create a chat completion request
const modelResponse = await openai.chat.completions.create(
modelParameters.parameters
);
const modelResponse = await openai.chat.completions.create(modelParameters.parameters);

// Print the response from OpenAI
console.log(modelResponse.data);
@@ -77,27 +75,27 @@ npm install @anthropic-ai/sdk
Import the Anthropic and Prompt Foundry SDKs

```js
import PromptFoundry from "@prompt-foundry/typescript-sdk";
import Anthropic from "@anthropic-ai/sdk";
import PromptFoundry from '@prompt-foundry/typescript-sdk';
import Anthropic from '@anthropic-ai/sdk';

// Initialize Prompt Foundry SDK with your API key
const promptFoundry = new PromptFoundry({
apiKey: process.env["PROMPT_FOUNDRY_API_KEY"],
apiKey: process.env['PROMPT_FOUNDRY_API_KEY'],
});

// Initialize Anthropic SDK with your API key
const anthropic = new Anthropic({
apiKey: process.env["ANTHROPIC_API_KEY"],
apiKey: process.env['ANTHROPIC_API_KEY'],
});

async function main() {
// Retrieve model parameters for the prompt
const modelParameters = await promptFoundry.prompts.getParameters("1212121", {
variables: { hello: "world" },
const modelParameters = await promptFoundry.prompts.getParameters('1212121', {
variables: { hello: 'world' },
});

// check if provider is Open AI
if (modelParameters.provider === "anthropic") {
if (modelParameters.provider === 'anthropic') {
// Use the retrieved parameters to create a chat completion request
const message = await anthropic.messages.create(modelParameters.parameters);

@@ -117,7 +115,7 @@ This library includes TypeScript definitions for all request params and response fields.
```ts
import PromptFoundry from '@prompt-foundry/typescript-sdk';

-const promptFoundry = new PromptFoundry({
+const client = new PromptFoundry({
  apiKey: process.env['PROMPT_FOUNDRY_API_KEY'], // This is the default and can be omitted
});
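
Not shown in this hunk is what a typed call looks like end to end. A minimal sketch, reusing only calls that already appear elsewhere in this README; the response type is inferred by TypeScript rather than named explicitly here.

```ts
import PromptFoundry from '@prompt-foundry/typescript-sdk';

const client = new PromptFoundry({
  apiKey: process.env['PROMPT_FOUNDRY_API_KEY'],
});

async function main() {
  // Both the request params object and the resolved value are typed by the SDK;
  // the type of `modelParameters` is inferred from the method signature.
  const modelParameters = await client.prompts.getParameters('1212121', {
    variables: { hello: 'world' },
  });

  // Comparing `provider` mirrors the checks used in the examples above.
  if (modelParameters.provider === 'openai') {
    console.log(modelParameters.parameters);
  }
}

main();
```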

@@ -177,7 +175,7 @@ You can use the `maxRetries` option to configure or disable this:
<!-- prettier-ignore -->
```js
// Configure the default for all requests:
-const promptFoundry = new PromptFoundry({
+const client = new PromptFoundry({
  maxRetries: 0, // default is 2
});
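
The visible lines only cover the client-wide default. Stainless-generated clients usually also accept `maxRetries` as a per-request option; a minimal sketch under that assumption, reusing the `getParameters` call from the examples above:

```ts
import PromptFoundry from '@prompt-foundry/typescript-sdk';

const client = new PromptFoundry({
  maxRetries: 0, // client-wide default: never retry
});

async function main() {
  // Assumed per-request override: retry just this call up to 5 times.
  await client.prompts.getParameters('1212121', {
    maxRetries: 5,
  });
}

main();
```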

@@ -194,7 +192,7 @@ Requests time out after 1 minute by default. You can configure this with a `timeout` option:
<!-- prettier-ignore -->
```ts
// Configure the default for all requests:
-const promptFoundry = new PromptFoundry({
+const client = new PromptFoundry({
  timeout: 20 * 1000, // 20 seconds (default is 1 minute)
});
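
As with retries, the hunk shows only the client-wide default. A minimal sketch of a per-request `timeout` override, assuming the option is accepted per call as in other Stainless SDKs:

```ts
import PromptFoundry from '@prompt-foundry/typescript-sdk';

const client = new PromptFoundry({
  timeout: 20 * 1000, // client-wide default: 20 seconds
});

async function main() {
  // Assumed per-request override: allow only 5 seconds for this call.
  await client.prompts.getParameters('1212121', {
    timeout: 5 * 1000,
  });
}

main();
```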

@@ -218,7 +216,7 @@ You can also use the `.withResponse()` method to get the raw `Response` along with the parsed data.

<!-- prettier-ignore -->
```ts
-const promptFoundry = new PromptFoundry();
+const client = new PromptFoundry();

const response = await promptFoundry.prompts.getParameters('1212121').asResponse();
console.log(response.headers.get('X-My-Header'));
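
The visible lines demonstrate `.asResponse()` only. A minimal sketch of the `.withResponse()` variant the surrounding text refers to, assuming the usual Stainless shape in which it resolves to the parsed data together with the raw `Response`:

```ts
import PromptFoundry from '@prompt-foundry/typescript-sdk';

const client = new PromptFoundry();

async function main() {
  // Assumed shape: `{ data, response }`, where `data` is the parsed result
  // and `response` is the underlying fetch Response.
  const { data: modelParameters, response } = await client.prompts
    .getParameters('1212121')
    .withResponse();

  console.log(response.headers.get('X-My-Header'));
  console.log(modelParameters.provider);
}

main();
```
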
@@ -327,7 +325,7 @@ import http from 'http';
import { HttpsProxyAgent } from 'https-proxy-agent';

// Configure the default for all requests:
-const promptFoundry = new PromptFoundry({
+const client = new PromptFoundry({
  httpAgent: new HttpsProxyAgent(process.env.PROXY_URL),
});
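
The hunk configures the proxy agent client-wide. A minimal sketch of a per-request `httpAgent` override, assuming the option is also accepted per call as in other Stainless SDKs:

```ts
import http from 'http';
import { HttpsProxyAgent } from 'https-proxy-agent';
import PromptFoundry from '@prompt-foundry/typescript-sdk';

const client = new PromptFoundry({
  // Client-wide default, as in the hunk above.
  httpAgent: new HttpsProxyAgent(process.env.PROXY_URL),
});

async function main() {
  // Assumed per-request override: use a plain keep-alive agent for this call only.
  await client.prompts.getParameters('1212121', {
    httpAgent: new http.Agent({ keepAlive: true }),
  });
}

main();
```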
