The DAS Protocol is a set of Protocol Buffers (protobuf) and gRPC service definitions that describe a Data Access Service (DAS) API used by RAW Labs. It provides a unified, schema-aware interface for querying and managing data across multiple data sources.
The DAS Protocol acts as a standard contract for both clients and servers to communicate with data sources via gRPC. It allows:
- Dynamic discovery of tables and functions.
- Schema-aware queries with typed columns and constraints.
- CRUD operations on tables (insert, update, delete).
- Function execution with typed parameters and return values.
- Health checks to monitor service availability.
By defining these capabilities in protobuf, we enable language-agnostic integration, so developers in different ecosystems can leverage the DAS functionality without being tied to a single framework or language.
- **Schema Discovery**: Query table or function definitions to understand the shape and semantics of the data.
- **Typed Data Model**: The protocol includes comprehensive type definitions in `types.proto` and `values.proto`, supporting everything from basic scalar types to complex records and lists.
- **Rich Query Language**: Build queries with operators like `EQUALS`, `LESS_THAN`, `LIKE`, etc. Define sorting, path keys, and retrieve row estimates.
- **CRUD Operations**: Perform create, read, update, and delete operations on DAS-managed tables.
- **Dynamic Function Invocation**: Fetch function definitions and execute them with named or positional arguments.
- **Configurable Environment**: Pass environment variables or metadata to the service through the `Environment` messages.
- **Health Checks**: Use the `HealthCheckService` to ensure the service is running properly.
### RegistrationService

- Purpose: Register or unregister a DAS instance.
- Key RPCs: `Register`, `Unregister`

```protobuf
service RegistrationService {
  rpc Register (RegisterRequest) returns (RegisterResponse);
  rpc Unregister (DASId) returns (UnregisterResponse);
}
```
- Example Use Case: Initialize a new DAS with configuration options, or tear it down from the registry.
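As a rough sketch of what registration might look like from Scala, assuming the grpc-java-style `RegistrationServiceGrpc` stub generated by this repo, a locally running DAS server, and placeholder values throughout (the address, the message's package location, and the actual `RegisterRequest` fields are assumptions here, not part of this document):

```scala
import io.grpc.ManagedChannelBuilder
import com.rawlabs.protocol.das.v1.services.{RegistrationServiceGrpc, RegisterRequest}

object RegisterSketch {
  def main(args: Array[String]): Unit = {
    // Placeholder address; point this at a running DAS server.
    val channel = ManagedChannelBuilder
      .forAddress("localhost", 50051)
      .usePlaintext()
      .build()

    val registration = RegistrationServiceGrpc.newBlockingStub(channel)

    // A real request would carry the DAS type and its configuration
    // options; see the messages under services/ and common/ for the
    // actual fields. An empty request is built here purely as a sketch.
    val request = RegisterRequest.newBuilder().build()
    val response = registration.register(request)
    println(s"Register response: $response")

    channel.shutdown()
  }
}
```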
### TablesService

- Purpose: Interact with data tables through a uniform interface.
- Key Operations:
  - `GetTableDefinitions` – retrieve metadata (columns, descriptions, etc.)
  - `ExecuteTable` – perform a query and stream back result rows
  - `InsertTable`, `UpdateTable`, `DeleteTable` – CRUD operations
  - `GetTableEstimate` – estimate row counts before running queries
- Example Protos:

```protobuf
service TablesService {
  rpc GetTableDefinitions (GetTableDefinitionsRequest) returns (GetTableDefinitionsResponse);
  rpc ExecuteTable (ExecuteTableRequest) returns (stream Rows);
  // ...
}
```
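Because `ExecuteTable` is a server-streaming RPC, a grpc-java-style blocking stub hands back an iterator of row batches. A minimal consumption sketch, assuming the generated `TablesServiceGrpc` stub, a placeholder server address, and an assumed package location for the request message (a real request would also name the table and carry qualifiers, projected columns, and sort keys):

```scala
import io.grpc.ManagedChannelBuilder
import com.rawlabs.protocol.das.v1.services.TablesServiceGrpc
import com.rawlabs.protocol.das.v1.tables.ExecuteTableRequest

object ExecuteTableSketch {
  def main(args: Array[String]): Unit = {
    val channel = ManagedChannelBuilder
      .forAddress("localhost", 50051) // placeholder address
      .usePlaintext()
      .build()

    val tables = TablesServiceGrpc.newBlockingStub(channel)

    // Sketch only: a real request names the target table and attaches
    // quals, columns, and sorting (see query.proto and tables.proto).
    val request = ExecuteTableRequest.newBuilder().build()

    // The blocking stub returns a java.util.Iterator over Rows batches,
    // so results can be consumed incrementally as the server streams them.
    val rows = tables.executeTable(request)
    while (rows.hasNext) {
      println(rows.next())
    }

    channel.shutdown()
  }
}
```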
### FunctionsService

- Purpose: Discover and execute user-defined functions available within DAS.
- Key Operations:
  - `GetFunctionDefinitions` – list function signatures and metadata
  - `ExecuteFunction` – call a function by name with typed parameters

```protobuf
service FunctionsService {
  rpc GetFunctionDefinitions (GetFunctionDefinitionsRequest) returns (GetFunctionDefinitionsResponse);
  rpc ExecuteFunction (ExecuteFunctionRequest) returns (ExecuteFunctionResponse);
}
```
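A function call from Scala might look like the sketch below. The stub class name `FunctionsServiceGrpc` follows the same generated-stub pattern as `TablesServiceGrpc`, and the request's package location and fields are assumptions; a real request would identify the function and supply its named or positional arguments as typed values from `values.proto`:

```scala
import io.grpc.ManagedChannelBuilder
import com.rawlabs.protocol.das.v1.services.FunctionsServiceGrpc
import com.rawlabs.protocol.das.v1.functions.ExecuteFunctionRequest

object ExecuteFunctionSketch {
  def main(args: Array[String]): Unit = {
    val channel = ManagedChannelBuilder
      .forAddress("localhost", 50051) // placeholder address
      .usePlaintext()
      .build()

    val functions = FunctionsServiceGrpc.newBlockingStub(channel)

    // Sketch only: a real request names the function and passes typed
    // arguments; the concrete fields live in functions.proto.
    val request = ExecuteFunctionRequest.newBuilder().build()
    val response = functions.executeFunction(request)
    println(response)

    channel.shutdown()
  }
}
```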
### HealthCheckService

- Purpose: Provide basic health status checks for a DAS instance.
- Key RPC: `Check` – returns a simple `SERVING` or `NOT_SERVING` status

```protobuf
service HealthCheckService {
  rpc Check (HealthCheckRequest) returns (HealthCheckResponse);
}
```
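A liveness probe built on this RPC could be as small as the sketch below, again assuming a grpc-java-style generated stub (the class name `HealthCheckServiceGrpc` and the server address are assumptions following the pattern of the other services):

```scala
import io.grpc.ManagedChannelBuilder
import com.rawlabs.protocol.das.v1.services.{HealthCheckServiceGrpc, HealthCheckRequest}

object HealthProbeSketch {
  def main(args: Array[String]): Unit = {
    val channel = ManagedChannelBuilder
      .forAddress("localhost", 50051) // placeholder address
      .usePlaintext()
      .build()

    val health = HealthCheckServiceGrpc.newBlockingStub(channel)
    val response = health.check(HealthCheckRequest.newBuilder().build())

    // The response carries a SERVING or NOT_SERVING status.
    println(response)

    channel.shutdown()
  }
}
```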
All protobuf definitions live under `src/main/protobuf/com/rawlabs/protocol/das/v1/`:

- `common/`: Common messages like `DASId` and `Environment`.
- `tables/`: Table-related messages (`tables.proto`).
- `functions/`: Function-related messages (`functions.proto`).
- `query/`: Query-related messages for operators, sorting, path keys, etc. (`query.proto`).
- `services/`: gRPC service definitions (`registration_service.proto`, `tables_service.proto`, etc.).
- `types/`: Core type system definitions (`types.proto` and `values.proto`).
This repository uses sbt to:
- Compile the protobuf files
- Generate Scala/Java gRPC stubs
- Optionally publish artifacts locally or to a repository
- sbt (1.x or later)
- Protobuf compiler (if you plan to manually compile `.proto` files in other languages)
- Clone this repository:

  ```bash
  git clone https://github.com/raw-labs/protocol-das.git
  cd protocol-das
  ```

- Publish locally:

  ```bash
  sbt publishLocal
  ```

  This compiles the protobuf files and generates Scala/Java classes. You can then reference the published artifacts from another sbt-based project by adding the corresponding coordinates to your dependencies.
If you publish the generated artifacts to your local or remote repository:
- Add a dependency in your `build.sbt` (example coordinates shown; adjust as needed):

  ```scala
  libraryDependencies ++= Seq(
    "com.raw-labs" %% "protocol-das" % "0.1.0-SNAPSHOT"
  )
  ```

- Once included, you can import classes like `com.rawlabs.protocol.das.v1.services.TablesServiceGrpc` and `com.rawlabs.protocol.das.v1.services.RegistrationServiceGrpc` in your Scala/Java code.
Because these services and messages are defined via protobuf, you can also generate client/server stubs in many other languages (e.g., Python, C#, Go). You’ll need:
- The `.proto` files found in this repo (or from your local build).
- The relevant protobuf/gRPC code generators in your language of choice.
Use of this software is governed by the Business Source License 1.1. As of the Change Date specified in the license, this software will be governed by the Apache License, Version 2.0.
For detailed information, see the BSL License file.
Questions?
If you have any questions or need support, please open an issue in this repository or reach out to us. We look forward to your feedback and contributions!