New outputs API #1128

Merged 94 commits on Oct 4, 2022
Conversation

EddyGharbi (Contributor) commented Jun 2, 2022

Description

This change exposes the next evolution of our API, letting the client manage the persistence of the user settings. This PR follows previous ones and exposes the following: streaming / recording / replay buffer / global settings / audio settings.

Please note that:

  1. This new API is exposed through a TypeScript interface that can be found in js/module.ts.
  2. "Legacy settings" refers to the current system that manages OBS settings through NodeOBS.
  3. The creation of the different objects is handled by the client through the create methods of the interface factories; the release and destruction of said objects are handled in C++ (see the sketch after this list).
  4. The following settings cannot be changed while outputs are active: video / audio / output settings. The only exception is the video bitrate, which can be changed by updating the bitrate property of a video encoder. [WARNING] There is no way to check for this inside the C++ code; if the client does not handle it correctly, the application will crash.
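
To make the object lifecycle concrete, here is a minimal TypeScript sketch. The module name and the factory/property names (VideoEncoderFactory, update, bitrate) are assumptions inferred from the description above, not the authoritative API; the real definitions live in js/module.ts.

```typescript
// Minimal sketch, assuming the factories are exported by the module built from js/module.ts.
// Factory and property names below are illustrative and may differ from the real interface.
import * as osn from 'obs-studio-node';

// The client creates objects through the factories' create methods...
const videoEncoder = osn.VideoEncoderFactory.create('obs_x264', 'stream-video-encoder');

// ...while release and destruction of those objects are handled by the C++ side.

// Video / audio / output settings must not be changed while outputs are active.
// The one exception: the video bitrate can be updated on an active output by
// updating the encoder's bitrate property (hypothetical call shape).
videoEncoder.update({ bitrate: 6000 });
```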

- Stream Settings
This section refers to the following part of the settings:

[screenshot: Stream settings]

These settings are managed through the use of the IService interface.
Getting the legacySettings will return the legacy settings.
Setting the legacySettings will save the service back into the legacy system.
A service needs to be associated with a streaming output to be used; how to do this is covered later in this PR.
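
A hedged sketch of the service flow described above. ServiceFactory and its legacySettings accessor are assumptions inferred from this description; check js/module.ts for the actual names.

```typescript
// Hypothetical sketch of the IService flow; names are assumptions.
import * as osn from 'obs-studio-node';

// Getting legacySettings returns the service stored by the legacy (NodeOBS) system.
const service = osn.ServiceFactory.legacySettings;

// Setting legacySettings saves the service back into the legacy system.
osn.ServiceFactory.legacySettings = service;

// The service only takes effect once it is associated with a streaming output
// (shown later in this PR), e.g. streaming.service = service;
```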

- Delay Settings
This section refers to the following part of the settings:

[screenshot: Delay settings]

These settings are managed through the use of the IDelay interface. The mapping between the UI fields and the TypeScript interface should be self-explanatory. Getting / saving the legacy settings is done through the streaming object that holds a reference to this object. Keep in mind that after creating this object, it needs to be associated with the streaming output in use in order to take effect.

- Reconnect Settings
This section refers to the following part of the settings:

[screenshot: Reconnect settings]

These settings are managed through the use of the IReconnect interface. The mapping between the UI fields and the TypeScript interface should be self-explanatory. Getting / saving the legacy settings is done through the streaming object that holds a reference to this object. Keep in mind that after creating this object, it needs to be associated with the streaming output in use in order to take effect.

- Network Settings
This section refers to the following part of the settings:

[screenshot: Network settings]

These settings are managed through the use of the INetwork interface. The mapping between the UI fields and the TypeScript interface should be self-explanatory. Getting / saving the legacy settings is done through the streaming object that holds a reference to this object. Keep in mind that after creating this object, it needs to be associated with the streaming output in use in order to take effect.
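
Since the delay, reconnect, and network objects all follow the same create-then-associate pattern, here is one combined, hedged sketch. The factory names (DelayFactory, ReconnectFactory, NetworkFactory), the property names, and the streaming factory are assumptions; see js/module.ts for the real definitions.

```typescript
// Hypothetical sketch: create delay / reconnect / network objects and attach them
// to the streaming output so they take effect. All names are assumptions.
import * as osn from 'obs-studio-node';

const streaming = osn.SimpleStreamingFactory.create(); // assumed factory

const delay = osn.DelayFactory.create();         // IDelay
delay.enabled = true;
delay.delaySec = 20;

const reconnect = osn.ReconnectFactory.create(); // IReconnect
reconnect.enabled = true;
reconnect.retryDelay = 10;
reconnect.maxRetries = 20;

const network = osn.NetworkFactory.create();     // INetwork

// Association is what makes the objects take effect; their legacy settings are
// also read/written through the streaming object that holds the references.
streaming.delay = delay;
streaming.reconnect = reconnect;
streaming.network = network;
```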

- Simple Streaming Output Settings
This section refers to the following part of the settings:

[screenshot: Simple output mode, Streaming settings]

These settings are managed through the use of the ISimpleStreaming interface.
Getting / setting the legacySettings will return / save the settings from the legacy system.

The interface extends the IStreaming interface. Here are some explanations of what needs to be done to use it (a sketch follows the list):

  • A valid video encoder needs to be created through the IVideoEncoder interface and set on the streaming output. In simple mode, only the preset property is shown to the user.
  • Valid service / delay / reconnect / network objects need to be created and set.
  • enforceServiceBitrate is a boolean setting that forces the C++ code to apply the platform's bitrate limits before going live.
  • signalHandler is how the client registers a JS callback to receive signals from the output.
  • A valid audio encoder needs to be created through the IAudioEncoder interface and set on the streaming output. The Audio Bitrate field from the screenshot is the only thing that needs to be set on this audio encoder.
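
Putting the bullets above together, here is a hedged end-to-end sketch of setting up a simple streaming output. Factory names, encoder ids, property names, and the start call are assumptions inferred from the description, not the confirmed API surface.

```typescript
// Hypothetical end-to-end setup of a simple streaming output; names are assumptions.
import * as osn from 'obs-studio-node';

const streaming = osn.SimpleStreamingFactory.create();

// Video encoder: in simple mode only the preset is exposed to the user.
streaming.videoEncoder = osn.VideoEncoderFactory.create('obs_x264', 'stream-video');
streaming.videoEncoder.update({ preset: 'veryfast' });

// Audio encoder: the Audio Bitrate field is the only thing to set.
streaming.audioEncoder = osn.AudioEncoderFactory.create();
streaming.audioEncoder.bitrate = 160;

// Service / delay / reconnect / network objects, created and associated.
streaming.service = osn.ServiceFactory.legacySettings;
streaming.delay = osn.DelayFactory.create();
streaming.reconnect = osn.ReconnectFactory.create();
streaming.network = osn.NetworkFactory.create();

// Apply the platform's bitrate limits before going live.
streaming.enforceServiceBitrate = true;

// JS callback registered to receive signals emitted by the output.
streaming.signalHandler = (signal: unknown) => {
  console.log('streaming output signal:', signal);
};

streaming.start();
```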

- Simple Recording Output Settings
This section refers to the following part of the settings:

[screenshot: Simple output mode, Recording settings]

These settings are managed through the use of the ISimpleRecording interface.
Getting / setting the legacySettings will return / save the settings from the legacy system.

The interface extends the IRecording interface. Here are some explanations of what needs to be done to use it:

  • Most of the settings are self-explanatory and map directly to the fields in the screenshot.
  • The one case that needs special handling is when the quality is set to Stream. This means the recording output will use the video and audio encoders of the streaming output. Rather than forcing the client to wire up the encoder references, the only thing that needs to be done is to set the streaming member of the ISimpleRecording interface to the IStreaming instance it is associated with. The C++ code will then automatically know which encoders to use; they of course need to be created and valid on the streaming side. (See the sketch after this list.)
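
A hedged sketch of the quality = Stream case. The factory name, the property names, the quality value, and the start call are assumptions; the key point from the bullet above is the streaming member assignment.

```typescript
// Hypothetical sketch of a simple recording reusing the streaming encoders.
import * as osn from 'obs-studio-node';

const streaming = osn.SimpleStreamingFactory.create();
// ...streaming encoders created and configured as shown earlier...

const recording = osn.SimpleRecordingFactory.create();
recording.path = 'C:/Videos';    // assumed property name
recording.quality = 'Stream';    // illustrative; the real type may be an enum

// Instead of wiring encoder references manually, point the recording at the
// streaming object; the C++ side then knows which encoders to reuse.
recording.streaming = streaming;

recording.start();
```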

- Simple Replay Buffer Output Settings
This section refers to the following part of the settings:

[screenshot: Simple output mode, Replay Buffer settings]

These settings are managed through the use of the ISimpleReplayBuffer interface.
Getting / setting the legacySettings will return / save the settings from the legacy system.

The interface extends the IReplayBuffer interface. Here are some explanations of what needs to be done to use it (a sketch follows the screenshot below):

  • Most of the settings are self-explanatory and map directly to the fields in the screenshot.
  • The concept is the same as for the recording output. In most cases the replay buffer will use the recording output's encoders, so it needs to hold a reference to the recording output the user is currently using. Note that if the recording output uses the streaming encoders, you do not also need to hold a reference to the streaming output inside the replay buffer; it will automatically use whatever the recording uses.
  • The only time you need to set the streaming reference is when selective recording is on and the user chooses to use the streaming output instead of the recording one, through the setting below:

[screenshot of the corresponding setting]
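
A hedged sketch of the replay buffer references described above; factory and property names are assumptions.

```typescript
// Hypothetical sketch of the replay buffer reference rules; names are assumptions.
import * as osn from 'obs-studio-node';

const recording = osn.SimpleRecordingFactory.create();       // configured as above
const replayBuffer = osn.SimpleReplayBufferFactory.create();

// Usual case: the replay buffer reuses the recording encoders. If the recording
// itself reuses the streaming encoders, no streaming reference is needed here.
replayBuffer.recording = recording;

// Selective-recording case only: point the replay buffer at the streaming output.
// replayBuffer.streaming = streaming;

replayBuffer.start();
```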

- Advanced Output Settings
I will not expand much here since the concepts are very similar to the Simple output explained above. The main difference concerns the audio encoders: in this mode, audio tracks are used rather than audio encoders directly.

The IAudioTrack interface represents audio tracks. The C++ code has a static list of 6 audio tracks; the client is responsible for creating and setting these audio tracks. In the case of the streaming output, the index of the audio track to use needs to be passed, starting from 0. In the case of the recording output, the mixer value needs to be set; it is a bit mask.
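
A hedged sketch of the audio-track handling in advanced mode. The AudioTrackFactory calls and the audioTrack/mixer property names are assumptions built from the paragraph above.

```typescript
// Hypothetical sketch of advanced-mode audio tracks; names are assumptions.
import * as osn from 'obs-studio-node';

// The C++ side keeps a static list of 6 tracks; the client creates and sets them.
const track = osn.AudioTrackFactory.create(160, 'Track 1'); // assumed (bitrate, name)
osn.AudioTrackFactory.setAtIndex(track, 0);                 // assumed setter

const advStreaming = osn.AdvancedStreamingFactory.create();
advStreaming.audioTrack = 0;               // streaming: index of the track, starting at 0

const advRecording = osn.AdvancedRecordingFactory.create();
advRecording.mixer = (1 << 0) | (1 << 1);  // recording: bit mask selecting tracks 1 and 2
```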

- Video Settings
This section refers to the following parts of the settings:

  1. Video tab
     [screenshot: Video tab]

  2. Advanced tab
     [screenshot: Advanced tab]

These settings are managed through the use of the IVideo interface.
Getting / setting the legacySettings will return / save the settings from the legacy system.
Only one videoContext can be set, and it needs to be set in order for the app to run properly. Ideally it should be set directly after initializing the OBS API.

- Audio Settings
This section refers to the following part of the settings:
[screenshot: Audio settings]

These settings are managed through the use of the IAudio interface.
Getting / setting the legacySettings will return / save the settings from the legacy system.
Only one audioContext can be set, and it needs to be set in order for the app to run properly. Ideally it should be set directly after initializing the OBS API.
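
A hedged sketch of setting the single video and audio contexts right after initializing the OBS API. The factory names, property names, and context setters are assumptions; only the "one context, set it early" rule comes from the text above.

```typescript
// Hypothetical sketch; names are assumptions. Only one video and one audio
// context may exist, set ideally right after initializing the OBS API.
import * as osn from 'obs-studio-node';

// ...initialize the OBS API first (initialization call not shown here)...

const video = osn.VideoFactory.create();   // IVideo
video.baseWidth = 1920;
video.baseHeight = 1080;
video.outputWidth = 1280;
video.outputHeight = 720;
video.fpsNum = 30;
video.fpsDen = 1;
osn.VideoFactory.videoContext = video;     // assumed setter for the single context

const audio = osn.AudioFactory.create();   // IAudio
audio.sampleRate = 44100;
osn.AudioFactory.audioContext = audio;     // assumed setter for the single context
```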

- Browser source HW acceleration
This section refers to the following part of the settings:
[screenshot: Browser source hardware acceleration setting]

Two static functions are exposed to manage this setting (boolean):

  • SetBrowserAcceleration: sets the setting and saves it to the legacy settings automatically
  • GetBrowserAccelerationLegacy: gets the legacy value

- Enable media file caching
This section refers to the following part of the settings:
[screenshot: Enable media file caching setting]

Two static functions are exposed to manage this setting (boolean):

  • SetMediaFileCaching: sets the setting and saves it to the legacy settings automatically
  • GetMediaFileCachingLegacy: gets the legacy value

- Process priority
This section refers to the following part of the settings:
[screenshot: Process priority setting]

Two static functions are exposed to manage this setting (EProcessPriority):

  • SetProcessPriority: sets the setting and saves it to the legacy settings automatically. The possible values are described in the TypeScript enum EProcessPriority.
  • GetProcessPriorityLegacy: gets the legacy value
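
A hedged sketch of the three global-setting pairs. The function names come from the lists above; the module object they are exported from and the EProcessPriority member used are assumptions.

```typescript
// Hypothetical usage of the static global-setting functions; the export location
// and the EProcessPriority member are assumptions.
import * as osn from 'obs-studio-node';

// Browser source hardware acceleration (boolean)
osn.SetBrowserAcceleration(true);
const hwAccel: boolean = osn.GetBrowserAccelerationLegacy();

// Media file caching (boolean)
osn.SetMediaFileCaching(true);
const caching: boolean = osn.GetMediaFileCachingLegacy();

// Process priority (EProcessPriority)
osn.SetProcessPriority(osn.EProcessPriority.Normal); // 'Normal' member is assumed
const priority = osn.GetProcessPriorityLegacy();
```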

Motivation and Context

It comes from wanting to have the client manage the persistence of the settings and to provide a simpler-to-use API.

How Has This Been Tested?

Lots of manual tests with different scenarios and configurations: going live, recording, using the replay buffer, etc. The tests are also being migrated to use this new API as part of this PR.

Types of changes

  • New feature (non-breaking change which adds functionality) / API changes

Checklist:

  • The code has been tested.
  • All commit messages are properly formatted and commits squashed where appropriate.
  • I have included updates to all appropriate documentation.
