
[WinML] [C++/WinRT] Clarify how to share Ort::Env environments with WinRT/WinML instances #4971

wbudd opened this issue Aug 29, 2020 · 9 comments


wbudd commented Aug 29, 2020

This is actually a cross-post of an issue I posted over at WinML, but I thought it might be worth asking here too.

According to WinML documentation, the NuGet WinML solution provides "Direct access to the onnxruntime.dll".

However, all ONNX Runtime functionality requires creating an Ort::Env environment instance as its first order of business, and apparently only one such instance can exist per process.

A consequence of that seems to be that if I have WinML inference tasks running in thread A through the WinRT API, I am unable to, say, look up ONNX input/output tensor names and dimensions through the ONNX Runtime API in thread B (or vice versa), given that I can neither reference WinML's internal Ort::Env (or can I?) nor pass my own Ort::Env instance to WinML constructors (or can I?). Attempting to use separate instances anyway results in the following error, thrown either by WinML or by my code, depending simply on which side creates its instance first:

Only one instance of LoggingManager created with InstanceType::Default can exist at any point in time.

Is there any way to share the same onnxruntime.dll with WinRT/WinML, for example by accessing the WinML backend through the ONNX Runtime API instead?
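For context, the kind of ORT-side metadata lookup described above might look roughly like the sketch below. The model path is a placeholder, and `GetInputNameAllocated` is the current C++ API (older releases used `GetInputName`); this is an illustration, not a confirmed workaround for the shared-environment problem.

```cpp
// Sketch: querying input names/dimensions through the ONNX Runtime C++ API.
// Requires onnxruntime_cxx_api.h from an ONNX Runtime package.
#include <onnxruntime_cxx_api.h>
#include <iostream>

int main() {
    // This is exactly the step that clashed with WinML's internal
    // environment before the fix discussed below.
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "metadata-probe");
    Ort::SessionOptions opts;
    Ort::Session session(env, L"model.onnx", opts);  // placeholder path

    Ort::AllocatorWithDefaultOptions allocator;
    for (size_t i = 0; i < session.GetInputCount(); ++i) {
        auto name = session.GetInputNameAllocated(i, allocator);
        auto shape = session.GetInputTypeInfo(i)
                         .GetTensorTypeAndShapeInfo()
                         .GetShape();
        std::cout << name.get() << ": rank " << shape.size() << '\n';
    }
}
```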

@pranavsharma
Contributor

There was a bug in OrtEnv creation that caused the error you mentioned. It has been fixed and will be available as part of the upcoming 1.5 release.

wbudd commented Sep 9, 2020

Thanks! Some clarification would be nice, though. Does this mean that the statement "OrtEnv should be created only once, for each process" is inaccurate (or no longer accurate)? If so, does that mean that within the same process, an internal Ort::Env inside the WinML library can now live alongside another Ort::Env in an application that also calls ONNX Runtime directly?

@pranavsharma
Contributor

CreateEnv will always return the same instance of OrtEnv no matter how many times you call it. A bug prevented users from calling CreateEnv more than once; this bug has been fixed.
I'm not sure which version of WinML you're using, so I can't comment on that, but if you're on a version that calls CreateEnv, you should get the above-mentioned behavior. cc @martinb35 to comment on the WinML API.

stale bot commented Nov 16, 2020

This issue has been automatically marked as stale due to inactivity and will be closed in 7 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

@stale stale bot added the stale issues that have not been addressed in a while; categorized by a bot label Nov 16, 2020
wbudd commented Nov 16, 2020

cc @martinb35 to comment on the WinML api.

I wouldn't mind a comment here, @martinb35, before the stale bot sweeps yet another issue under the rug.

@stale stale bot removed the stale issues that have not been addressed in a while; categorized by a bot label Nov 16, 2020
deischi commented Dec 15, 2020

That CreateEnv should be called only once per process is not very well documented (should I create a separate issue for that?)

From the interface it looks like you can have multiple environments. In most cases this probably does not matter, but if you, for example, want to use the logging functions or configure a global thread pool, you might notice the problem only very late.

Updating the documentation (header file) would be helpful.

@martinb35
Contributor

@wbudd - you can see the Windows ML call to ORT::CreateEnv here, which is the same path @pranavsharma mentioned, so it should return the same instance. Having said that, our primary use cases are either all through the Windows ML interface or all through the ONNX Runtime interface, and we haven't designed or tested (yet) for the use case where you use both as you're describing.

@pranavsharma - did you address the comment from @deischi about documenting how to use CreateEnv?

wbudd commented Dec 15, 2020

@martinb35 Thanks for confirming that it should be the same instance.

In any case, I came to the conclusion that using the Microsoft.ML.OnnxRuntime.DirectML NuGet package instead of the WinML package makes a lot more sense when you also need ONNX Runtime functionality beyond what WinML offers; so that's what I recommend to anyone who finds themselves in a similar situation.
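For anyone taking this route, a minimal sketch of using the DirectML package directly might look like the following. The model path and adapter index are placeholders; the headers come from the Microsoft.ML.OnnxRuntime.DirectML NuGet package, and this is an untested illustration of the setup, not a vetted snippet from that package's docs.

```cpp
// Sketch: one Ort::Env owned by the application, with the DirectML
// execution provider enabled, so inference and metadata queries share
// a single onnxruntime.dll environment.
#include <onnxruntime_cxx_api.h>
#include <dml_provider_factory.h>

int main() {
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "dml-app");
    Ort::SessionOptions opts;
    // Enable the DirectML execution provider on adapter 0 (placeholder).
    Ort::ThrowOnError(OrtSessionOptionsAppendExecutionProvider_DML(opts, 0));
    Ort::Session session(env, L"model.onnx", opts);  // placeholder path
    // From here, both inference and shape/name lookups go through
    // the same environment, with no second env hiding inside WinML.
}
```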

smk2007 (Member) commented Mar 8, 2021

@wbudd Yep, the Microsoft.ML.OnnxRuntime.* packages make more sense when you need ONNX Runtime functionality beyond what WinML offers!

However, looking at your original question, it seemed like you wanted to "look up ONNX input/output tensor names/dimensions."

You should be able to look up input/output tensors with the LearningModel.Inputs and Outputs APIs in WinML. Was this not sufficient for you?
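For reference, enumerating model inputs through that WinML surface in C++/WinRT might look like the sketch below. The model path is a placeholder, and this is an untested, Windows-only illustration of the feature-descriptor APIs being suggested.

```cpp
// Sketch: reading input names and tensor shapes via the WinML
// (Windows.AI.MachineLearning) API instead of the raw ONNX Runtime API.
#include <winrt/Windows.AI.MachineLearning.h>
#include <iostream>

using namespace winrt::Windows::AI::MachineLearning;

int main() {
    winrt::init_apartment();
    LearningModel model = LearningModel::LoadFromFilePath(L"model.onnx");  // placeholder

    for (auto const& feature : model.InputFeatures()) {
        std::wcout << feature.Name().c_str() << L'\n';
        // Tensor inputs expose their shape via TensorFeatureDescriptor.
        if (auto tensor = feature.try_as<TensorFeatureDescriptor>()) {
            std::wcout << L"  rank: " << tensor.Shape().Size() << L'\n';
        }
    }
}
```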

@faxu faxu removed the type:support label Aug 18, 2021