
New reva config #4015

Merged
merged 76 commits into cs3org:master from reva-config on Jul 3, 2023

Conversation

@gmgigi96 (Member) commented Jun 27, 2023

Context and Problem Statement

The configuration process for reva presents significant challenges due to its microservice architecture, which requires repeating configuration parameters across services. To enable multiple drivers of the same service, such as LDAP, OIDC, and public shares as authentication providers, a different TOML file is needed for each service.

Additionally, each microservice within reva requires the specification of a unique address, which demands a deep understanding of the reva architecture.

Collectively, these issues make it arduous for users to deploy a simplified (mono-process) version of reva.

Decision drivers

  • Repeated configurations across microservices
  • Different configuration files just to enable one service (and its dependencies)

Decision Outcome

The configuration will contain a new [vars] section holding all the common parameters used by different services.
These variables, declared as name = value pairs, can be referenced as values elsewhere in the configuration using the syntax {{ vars.name }}. The values can be of any type.
For example:

[vars]
db_hostname = "database.cern.ch"
db_password = "secret-password"
db_port     = 10

[grpc.services.usershareprovider]
db_hostname = "{{ vars.db_hostname }}"
db_password = "{{ vars.db_password }}"
db_port     = "{{ vars.db_port }}"
...

[grpc.services.publicshareprovider]
db_hostname = "{{ vars.db_hostname }}"
db_password = "{{ vars.db_password }}"
db_port     = "{{ vars.db_port }}"
...
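Assuming the template placeholders are substituted with each variable's value in its original type (so db_port becomes the integer 10 rather than the string "10"), the first section above would effectively resolve to:

[grpc.services.usershareprovider]
db_hostname = "database.cern.ch"
db_password = "secret-password"
db_port     = 10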

To enable different drivers of the same service, each provider configuration is a list (a TOML array of tables):

[[grpc.services.authprovider]]
driver = "oidc"

[[grpc.services.authprovider]]
driver = "machine"

It is still possible to declare a single provider as a plain table, if it is the only one in the configuration file.

[grpc.services.authprovider]
driver = "oidc"

Because two instances of the same provider cannot listen on the same port, each provider allows specifying an address in the form host:port.

[[grpc.services.authprovider]]
driver  = "oidc"
address = "localhost:9000"

[[grpc.services.authprovider]]
driver  = "machine"
address = "localhost:9001"

If the address variable is not provided, the runtime will allocate a random one. If other services need to refer to a provider's address, the template {{ grpc.services.<name>[<index>].address }} can be used. The list is 0-indexed, and its order matches the order of declaration in the configuration file.

[[grpc.services.authprovider]]
driver = "oidc"

[[grpc.services.authprovider]]
driver = "machine"

[grpc.services.authregistry.driver.static]
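# index 1 selects the second [[grpc.services.authprovider]] entry, i.e. the "machine" driver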
machine = "{{ grpc.services.authprovider[1].address }}"

A common address for all the services defined in the same TOML file can be provided in the [grpc] section:

[grpc]
address = "localhost:9142"

[grpc.services.storageprovider]
...

[grpc.services.authprovider]
...

Having a list of providers together with the grpc.address variable is not allowed, since every entry of the list would otherwise bind to the same address.
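As an illustration, this is the disallowed combination (hypothetical sketch; the shared address comes from the earlier example):

[grpc]
address = "localhost:9142"

# Not allowed: both entries of the list would have to
# listen on the same localhost:9142 address.
[[grpc.services.authprovider]]
driver = "oidc"

[[grpc.services.authprovider]]
driver = "machine"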

@gmgigi96 force-pushed the reva-config branch 2 times, most recently from 3f2f8b2 to 9a650a8 on June 28, 2023 14:55
@cs3org deleted a comment from update-docs bot on Jun 30, 2023
@gmgigi96 marked this pull request as ready for review on June 30, 2023 07:45
@gmgigi96 requested review from labkode, wkloucek, glpatcern and a team as code owners on June 30, 2023 07:45
@labkode merged commit bf70373 into cs3org:master on Jul 3, 2023
abaldacchino pushed a commit to abaldacchino/reva that referenced this pull request Aug 25, 2023