Add examples for JS (#3)
* Add examples for JS

* add copyright banner

* update consume package version

* add link for react_native

* add link in main README
fs-eire authored May 17, 2021
1 parent ca25bb6 commit ae2c541
Showing 25 changed files with 648 additions and 0 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -13,6 +13,7 @@ Outline the examples in the repository.
| Example | Description |
|-------------------|--------------------------------------------|
|[Android Image Classifier](mobile/examples/image_classifications/android)| An example application for ONNX Runtime on Android. The app continuously classifies the objects seen by the device's camera in real time and displays the most probable inference result on the screen. |
|[JavaScript API examples](js)| Examples that demonstrate how to use the JavaScript API for ONNX Runtime. |

## Contributing

5 changes: 5 additions & 0 deletions js/.gitignore
@@ -0,0 +1,5 @@
dist
node_modules

!**/*.onnx
package-lock.json
27 changes: 27 additions & 0 deletions js/README.md
@@ -0,0 +1,27 @@
# ONNX Runtime JavaScript examples

## Summary

This folder contains several JavaScript examples. Unless noted otherwise, each example works with all of the NPM packages described below:

- [onnxruntime-node](https://github.com/microsoft/onnxruntime/tree/master/js/node): Node.js binding for ONNX Runtime. Can be used in Node.js applications and Node.js-compatible environments (e.g. Electron.js).
- [onnxruntime-web](https://github.com/microsoft/onnxruntime/tree/master/js/web): ONNX Runtime for browsers.
- [onnxruntime-react-native](https://github.com/microsoft/onnxruntime/tree/master/js/react_native): ONNX Runtime for React Native applications on Android and iOS.

## Usage

Click the links below to view the README of each example.

### Quick Start

* [Quick Start - Node.js Binding](quick-start_onnxruntime-node) - a demonstration of basic usage of the ONNX Runtime Node.js binding.

* [Quick Start - Web (using script tag)](quick-start_onnxruntime-web-script-tag) - a demonstration of basic usage of ONNX Runtime Web using a script tag.

* [Quick Start - Web (using bundler)](quick-start_onnxruntime-web-bundler) - a demonstration of basic usage of ONNX Runtime Web using a bundler.

### API usage

* [API usage - Tensor](api-usage_tensor) - a demonstration of basic usage of `Tensor`.

* [API usage - InferenceSession](api-usage_inference-session) - a demonstration of basic usage of `InferenceSession`.
20 changes: 20 additions & 0 deletions js/api-usage_inference-session/README.md
@@ -0,0 +1,20 @@
# API usage - InferenceSession

## Summary

This example is a demonstration of basic usage of `InferenceSession`.

- `inference-session-create.js`: In this example, we create `InferenceSession` in different ways.
- `inference-session-properties.js`: In this example, we get input/output names from an `InferenceSession` object.
- `inference-session-run.js`: In this example, we run model inference in different ways.

For more information about `SessionOptions` and `RunOptions`, please refer to other examples.

## Usage

```sh
npm install
node ./inference-session-create.js
node ./inference-session-properties.js
node ./inference-session-run.js
```
74 changes: 74 additions & 0 deletions js/api-usage_inference-session/inference-session-create.js
@@ -0,0 +1,74 @@
// Copyright (c) Microsoft Corporation.
// Licensed under the MIT license.

const fs = require('fs');
const util = require('util');
const ort = require('onnxruntime-node');

// The following code also works with onnxruntime-web.

const InferenceSession = ort.InferenceSession;

// use an async context to call onnxruntime functions.
async function main() {
try {
// create session option object
const options = createMySessionOptions();

//
// create inference session from an ONNX model file path or URL
//
const session01 = await InferenceSession.create('./model.onnx');
const session01_B = await InferenceSession.create('./model.onnx', options); // specify options

//
// create inference session from a Node.js Buffer (Uint8Array)
//
const buffer02 = await readMyModelDataFile('./model.onnx'); // buffer is Uint8Array
const session02 = await InferenceSession.create(buffer02);
const session02_B = await InferenceSession.create(buffer02, options); // specify options

//
// create inference session from an ArrayBuffer
//
const arrayBuffer03 = buffer02.buffer;
const offset03 = buffer02.byteOffset;
const length03 = buffer02.byteLength;
const session03 = await InferenceSession.create(arrayBuffer03, offset03, length03);
const session03_B = await InferenceSession.create(arrayBuffer03, offset03, length03, options); // specify options

// example for browser
//const arrayBuffer03_C = await fetchMyModel('./model.onnx');
//const session03_C = await InferenceSession.create(arrayBuffer03_C);
} catch (e) {
console.error(`failed to create inference session: ${e}`);
}
}

main();

function createMySessionOptions() {
// session options: please refer to the other example for detailed usage of session options

// example of a session option object in Node.js:
// set the number of intra-op threads to 1 and disable the CPU memory arena
return { intraOpNumThreads: 1, enableCpuMemArena: false };

// example of a session option object in a browser:
// specify the WebAssembly execution provider
//return { executionProviders: ['wasm'] };

}

async function readMyModelDataFile(filepathOrUri) {
// read model file content (Node.js) as Buffer (Uint8Array)
return await util.promisify(fs.readFile)(filepathOrUri);
}

async function fetchMyModel(filepathOrUri) {
// use fetch to read model file (browser) as ArrayBuffer
if (typeof fetch !== 'undefined') {
const response = await fetch(filepathOrUri);
return await response.arrayBuffer();
}
}
27 changes: 27 additions & 0 deletions js/api-usage_inference-session/inference-session-properties.js
@@ -0,0 +1,27 @@
// Copyright (c) Microsoft Corporation.
// Licensed under the MIT license.

const ort = require('onnxruntime-node');

// The following code also works with onnxruntime-web.

const InferenceSession = ort.InferenceSession;

// use an async context to call onnxruntime functions.
async function main() {
try {
// create session and load model.onnx
const session = await InferenceSession.create('./model.onnx');

//
// get input/output names from inference session object
//
const inputNames = session.inputNames;
const outputNames = session.outputNames;

} catch (e) {
console.error(`failed to create inference session: ${e}`);
}
}

main();
76 changes: 76 additions & 0 deletions js/api-usage_inference-session/inference-session-run.js
@@ -0,0 +1,76 @@
// Copyright (c) Microsoft Corporation.
// Licensed under the MIT license.

const ort = require('onnxruntime-node');

// The following code also works with onnxruntime-web.

const InferenceSession = ort.InferenceSession;
const Tensor = ort.Tensor;

// use an async context to call onnxruntime functions.
async function main() {
try {
// create session and load model.onnx
const session = await InferenceSession.create('./model.onnx');

// prepare inputs
const dataA = prepareDataA(); // Float32Array(12)
const dataB = prepareDataB(); // Float32Array(12)
const tensorA = new ort.Tensor('float32', dataA, [3, 4]);
const tensorB = new ort.Tensor('float32', dataB, [4, 3]);

// prepare feeds. use model input names as keys.
const feeds = { a: tensorA, b: tensorB };

// run options
const option = createRunOptions();

//
// feed inputs and run
//
const results_02 = await session.run(feeds);
const results_02_B = await session.run(feeds, option); // specify options

//
// run with specified names of fetches (outputs)
//
const results_03 = await session.run(feeds, ['c']);
const results_03_B = await session.run(feeds, ['c'], option); // specify options

//
// run with fetches (outputs) as nullable map
//
const results_04 = await session.run(feeds, { c: null });
const results_04_B = await session.run(feeds, { c: null }, option); // specify options

//
// run with fetches (outputs) as nullable map, with tensor as value
//
const preAllocatedTensorC = new Tensor(new Float32Array(9), [3, 3]);
const results_05 = await session.run(feeds, { c: preAllocatedTensorC });
const results_05_B = await session.run(feeds, { c: preAllocatedTensorC }, option); // specify options

} catch (e) {
console.error(`failed to run inference on the ONNX model: ${e}.`);
}
}

main();

function prepareDataA() {
return Float32Array.from([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]);
}
function prepareDataB() {
return Float32Array.from([10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 110, 120]);
}

function createRunOptions() {
// run options: please refer to the other example for detailed usage of run options

// set verbose logging for this inference run
return { logSeverityLevel: 0 };
}
16 changes: 16 additions & 0 deletions js/api-usage_inference-session/model.onnx
@@ -0,0 +1,16 @@
(binary ONNX model — a `MatMul` test graph `test_matmul_2d` with inputs `a`, `b` and output `c`; raw file contents not rendered)
10 changes: 10 additions & 0 deletions js/api-usage_inference-session/package.json
@@ -0,0 +1,10 @@
{
"name": "api-usage_inference-session",
"private": true,
"version": "1.0.0",
"description": "This example is a demonstration of basic usage of InferenceSession.",
"main": "index.js",
"dependencies": {
"onnxruntime-node": "^1.8.0"
}
}
16 changes: 16 additions & 0 deletions js/api-usage_tensor/README.md
@@ -0,0 +1,16 @@
# API usage - Create Tensor

## Summary

This example is a demonstration of basic usage of `Tensor`.

- `tensor-create.js`: In this example, we create tensors in different ways.
- `tensor-properties.js`: In this example, we read tensor properties from a `Tensor` object.

## Usage

```sh
npm install
node ./tensor-create.js
node ./tensor-properties.js
```
9 changes: 9 additions & 0 deletions js/api-usage_tensor/package.json
@@ -0,0 +1,9 @@
{
"name": "api-usage_create-tensor",
"private": true,
"version": "1.0.0",
"description": "This example is a demonstration of basic usage of Tensor.",
"dependencies": {
"onnxruntime-node": "^1.8.0"
}
}
68 changes: 68 additions & 0 deletions js/api-usage_tensor/tensor-create.js
@@ -0,0 +1,68 @@
// Copyright (c) Microsoft Corporation.
// Licensed under the MIT license.

const ort = require('onnxruntime-node');

// The following code also works with onnxruntime-web.

const Tensor = ort.Tensor;

//
// create a [2x3x4] float tensor
//
const buffer01 = new Float32Array(24);
buffer01[0] = 0.1; // fill buffer data
const tensor01 = new Tensor('float32', buffer01, [2, 3, 4]);
// type 'float32' can be omitted and the type will be inferred from the data
const tensor01_B = new Tensor(buffer01, [2, 3, 4]);

//
// create a [1x2] boolean tensor
//
const buffer02 = new Uint8Array(2);
buffer02[0] = 1; // true
buffer02[1] = 0; // false
const tensor02 = new Tensor('bool', buffer02, [1, 2]); // type 'bool' cannot be omitted because both 'bool' and 'uint8' use Uint8Array.

//
// create a scalar float64 tensor
//
const tensor03 = new Tensor(new Float64Array(1), []);
tensor03.data[0] = 1.0; // setting data after the tensor is created is allowed

//
// create a one-dimension tensor
//
const tensor04 = new Tensor(new Float32Array(100), [100]);
const tensor04_B = new Tensor(new Float32Array(100)); // dims can be omitted if it is a 1-D tensor. tensor04.dims = [100]

//
// create a [1x2] string tensor
//
const tensor05 = new Tensor('string', ['a', 'b'], [1, 2]);
const tensor05_B = new Tensor(['a', 'b'], [1, 2]); // type 'string' can be omitted

//
// !!! BAD USAGES !!!
// the following are bad usages that cause an error to be thrown. try not to make these mistakes.
//

// create from mismatched TypedArray
try {
const tensor = new Tensor('float64', new Float32Array(100)); // type 'float64' must be used with Float64Array data.
} catch { }

// bad dimension (negative value)
try {
const tensor = new Tensor(new Float32Array(100), [1, 2, -1]); // negative dims are not allowed.
} catch { }

// size mismatch (scalar size should be 1)
try {
const tensor = new Tensor(new Float32Array(0), []);
} catch { }

// size mismatch (5 * 6 != 40)
try {
const tensor = new Tensor(new Float32Array(40), [5, 6]);
} catch { }
