Commit

hi job
wangpatrick57 committed Nov 6, 2024
1 parent c4a4018 commit 1ed3375
Showing 2 changed files with 61 additions and 50 deletions.
108 changes: 59 additions & 49 deletions .github/workflows/tests.yaml
@@ -6,64 +6,74 @@ on:
    branches: [main]

jobs:
-  tests:
-    # The code for the self-hosted runners is at https://github.com/wangpatrick57/dbgym-runners.
+  hi:
    runs-on: self-hosted

    steps:
      - uses: actions/checkout@v2

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'
+      - name: hi
+        run: |
+          echo hi
-      # We could choose to set up dependencies manually in the GHA runner instead of installing them during the GHA.
-      #
-      # However, I think it's better to do them in the GHA itself so that we're testing our dependency installation step
-      # in addition to our actual code. It also removes the need to manually reinstall dependencies on the GHA runners
-      # every time we add a new dependency.
-      #
-      # Note that the GHA runners are stateful. Dependencies installed from previous runs will still be on the runner.
-      # This means this step will usually be pretty fast as most dependencies will already be cached. However, it also
-      # means that past runs might interfere with the current run, so you sometimes may need to restart the GHA runners.
+  # tests:
+  #   # The code for the self-hosted runners is at https://github.com/wangpatrick57/dbgym-runners.
+  #   runs-on: self-hosted

-      # We need to do `. "$HOME/.cargo/env"` in each step for it to work.
-      - name: Install dependencies
-        run: |
-          ./dependencies/install_dependencies.sh
+  #   steps:
+  #     - uses: actions/checkout@v2

-      - name: Check formatting
-        run: |
-          ./scripts/check_format.sh
+  #     - name: Set up Python
+  #       uses: actions/setup-python@v4
+  #       with:
+  #         python-version: '3.10'

-      - name: Static type checking
-        run: |
-          ./scripts/mypy.sh
+  #     # We could choose to set up dependencies manually in the GHA runner instead of installing them during the GHA.
+  #     #
+  #     # However, I think it's better to do them in the GHA itself so that we're testing our dependency installation step
+  #     # in addition to our actual code. It also removes the need to manually reinstall dependencies on the GHA runners
+  #     # every time we add a new dependency.
+  #     #
+  #     # Note that the GHA runners are stateful. Dependencies installed from previous runs will still be on the runner.
+  #     # This means this step will usually be pretty fast as most dependencies will already be cached. However, it also
+  #     # means that past runs might interfere with the current run, so you sometimes may need to restart the GHA runners.

-      - name: Run unit tests
-        # Unit tests are defined as tests which don't require any external systems to be running.
-        run: |
-          . "$HOME/.cargo/env"
-          ./scripts/run_unit_tests.sh
+  #     # We need to do `. "$HOME/.cargo/env"` in each step for it to work.
+  #     - name: Install dependencies
+  #       run: |
+  #         ./dependencies/install_dependencies.sh

-      - name: Run integration tests
-        # Integration tests do require external systems to be running (most commonly a database instance).
-        # Unlike end-to-end tests though, they test a specific module in a detailed manner, much like a unit test does.
-        env:
-          # We set `INTENDED_DBDATA_HARDWARE` so that it's seen when `integtest_pg_conn.py` executes `./tune/env/set_up_env_integtests.sh`.
-          INTENDED_DBDATA_HARDWARE: ssd
-        run: |
-          . "$HOME/.cargo/env"
-          export
-          ./scripts/run_integration_tests.sh
+  #     - name: Check formatting
+  #       run: |
+  #         ./scripts/check_format.sh

-      - name: Run end-to-end tests
-        # End-to-end tests are like integration tests in that they require external systems to be running.
-        # Unlike integration tests though, they don't perform detailed checks for any individual module.
-        #
-        # Note that we need to run with a non-root user in order to start Postgres. This is configured in the .yaml
-        # file for our self-hosted GHA runners.
-        run: |
-          . "$HOME/.cargo/env"
-          python -m scripts.run_protox_e2e_test ssd
+  #     - name: Static type checking
+  #       run: |
+  #         ./scripts/mypy.sh

+  #     - name: Run unit tests
+  #       # Unit tests are defined as tests which don't require any external systems to be running.
+  #       run: |
+  #         . "$HOME/.cargo/env"
+  #         ./scripts/run_unit_tests.sh

+  #     - name: Run integration tests
+  #       # Integration tests do require external systems to be running (most commonly a database instance).
+  #       # Unlike end-to-end tests though, they test a specific module in a detailed manner, much like a unit test does.
+  #       env:
+  #         # We set `INTENDED_DBDATA_HARDWARE` so that it's seen when `integtest_pg_conn.py` executes `./tune/env/set_up_env_integtests.sh`.
+  #         INTENDED_DBDATA_HARDWARE: ssd
+  #       run: |
+  #         . "$HOME/.cargo/env"
+  #         export
+  #         ./scripts/run_integration_tests.sh

+  #     - name: Run end-to-end tests
+  #       # End-to-end tests are like integration tests in that they require external systems to be running.
+  #       # Unlike integration tests though, they don't perform detailed checks for any individual module.
+  #       #
+  #       # Note that we need to run with a non-root user in order to start Postgres. This is configured in the .yaml
+  #       # file for our self-hosted GHA runners.
+  #       run: |
+  #         . "$HOME/.cargo/env"
+  #         python -m scripts.run_protox_e2e_test ssd
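Note on the commented-out steps above: GitHub Actions runs each `run:` block in its own shell process, which is why the old workflow had to source `. "$HOME/.cargo/env"` inside every step that needed Rust tooling rather than once at the top. Below is a minimal sketch of that behavior; the job name, step names, and the rustup install command are illustrative assumptions, not taken from this repository's install_dependencies.sh.

jobs:
  cargo-env-demo:
    runs-on: self-hosted
    steps:
      - name: Install Rust toolchain
        run: |
          # Illustrative install; this repo does its installs inside ./dependencies/install_dependencies.sh.
          curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
          . "$HOME/.cargo/env"   # visible only for the remainder of this step's shell
          cargo --version
      - name: Use the toolchain in a later step
        run: |
          . "$HOME/.cargo/env"   # must be re-sourced: this step starts a fresh shell
          cargo --version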
3 changes: 2 additions & 1 deletion scripts/run_protox_e2e_test.py
@@ -7,6 +7,7 @@

import yaml

+from benchmark.constants import DEFAULT_SCALE_FACTOR
from util.pg import get_is_postgres_running
from util.workspace import (
    default_embedder_path,
@@ -76,7 +77,7 @@ def run_e2e_for_benchmark(benchmark_name: str, intended_dbdata_hardware: str) ->
        embedding_train_args = "--iterations-per-epoch 1 --num-points-to-sample 1 --num-batches 1 --batch-size 64 --start-epoch 15 --num-samples 4 --train-max-concurrent 4 --num-curate 2"
        tune_hpo_args = "--num-samples 2 --max-concurrent 2 --workload-timeout 15 --query-timeout 1 --tune-duration-during-hpo 0.01"
    elif benchmark_name == "job":
-        scale_factor = 1
+        scale_factor = DEFAULT_SCALE_FACTOR
        query_subset = "demo"
        workload_name_suffix = query_subset
        embedding_datagen_args = ""
