From run https://argo.cildc6.org/workflows/default/e2e-tasmax-jobs-f8dkt

KIOST-ESM tasmax fails cleaning validation with a couple of different errors. This blocks progress on run #315.

Log from ssp245:
Validating gs://clean-b1dbca25/cmip6/ScenarioMIP/KIOST/KIOST-ESM/ssp245/r1i1p1f1/day/tasmax/gr1/v20191202.zarr
Traceback (most recent call last):
File "/argo/staging/script", line 18, in <module>
_test_timesteps(ds, data_type, time_period)
File "/opt/dodola/dodola/core.py", line 645, in _test_timesteps
assert (
AssertionError: projection cmip6 file is missing timesteps, only has 21535
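For context, the check that fails here is a timestep count on the cleaned store. Below is a minimal sketch of that kind of check, assuming the projection period should be 2015-2100 on a noleap (365-day) calendar and that the store is opened via fsspec/gcsfs; the exact bound that dodola's `_test_timesteps` uses may differ.

```python
# Hypothetical reproduction of the failing timestep check; not dodola's exact code.
import fsspec
import xarray as xr

url = (
    "gs://clean-b1dbca25/cmip6/ScenarioMIP/KIOST/KIOST-ESM/"
    "ssp245/r1i1p1f1/day/tasmax/gr1/v20191202.zarr"
)
ds = xr.open_zarr(fsspec.get_mapper(url))

# Assumption: daily projection data should cover 2015-2100 on a noleap calendar.
expected = 86 * 365  # 31390 daily steps
actual = ds.sizes["time"]
print(actual, expected)
assert actual >= expected, f"projection cmip6 file is missing timesteps, only has {actual}"
```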
From ssp126 (note the different timestep count):
Validating gs://clean-b1dbca25/cmip6/ScenarioMIP/KIOST/KIOST-ESM/ssp126/r1i1p1f1/day/tasmax/gr1/v20191202.zarr
Traceback (most recent call last):
File "/argo/staging/script", line 18, in <module>
_test_timesteps(ds, data_type, time_period)
File "/opt/dodola/dodola/core.py", line 645, in _test_timesteps
assert (
AssertionError: projection cmip6 file is missing timesteps, only has 4380
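If the expected projection period really is 2015-2100 on a noleap calendar (~31390 daily steps, an assumption on my part), then 21535 steps is about 59 years of ssp245 and 4380 is only 12 years of ssp126, so ssp126 looks like it is missing most of the projection period rather than just a few days.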
ssp585, on the other hand, bombs validation with invalid tasmax values:
Validating gs://clean-b1dbca25/cmip6/ScenarioMIP/KIOST/KIOST-ESM/ssp585/r1i1p1f1/day/tasmax/gr1/v20191202.zarr
Traceback (most recent call last):
File "/argo/staging/script", line 44, in <module>
tasks = dask.compute(*tasks)
File "/opt/conda/lib/python3.9/site-packages/dask/base.py", line 570, in compute
results = schedule(dsk, keys, **kwargs)
File "/opt/conda/lib/python3.9/site-packages/dask/threaded.py", line 79, in get
results = get_async(
File "/opt/conda/lib/python3.9/site-packages/dask/local.py", line 507, in get_async
raise_exception(exc, tb)
File "/opt/conda/lib/python3.9/site-packages/dask/local.py", line 315, in reraise
raise exc
File "/opt/conda/lib/python3.9/site-packages/dask/local.py", line 220, in execute_task
result = _execute_task(task, data)
File "/opt/conda/lib/python3.9/site-packages/dask/core.py", line 119, in _execute_task
return func(*(_execute_task(a, cache) for a in args))
File "/argo/staging/script", line 29, in clear_memory_intensive_tests
_test_temp_range(d, v)
File "/opt/dodola/dodola/core.py", line 684, in _test_temp_range
assert (ds[var].min() > 150) and (
AssertionError: tasmax values are invalid
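The range test is a plausibility check on the tasmax values in Kelvin. Here is a sketch of that kind of check; only the 150 K lower bound is visible in the traceback, the upper bound below is my assumption.

```python
# Hypothetical sketch of the failing range check; only the 150 K lower bound
# appears in the traceback, the upper bound is assumed.
import fsspec
import xarray as xr

url = (
    "gs://clean-b1dbca25/cmip6/ScenarioMIP/KIOST/KIOST-ESM/"
    "ssp585/r1i1p1f1/day/tasmax/gr1/v20191202.zarr"
)
ds = xr.open_zarr(fsspec.get_mapper(url))

vmin = float(ds["tasmax"].min())  # forces the dask computation
vmax = float(ds["tasmax"].max())
print(vmin, vmax)
assert vmin > 150 and vmax < 373, "tasmax values are invalid"
```

Printing the actual extrema would tell us whether this is a handful of fill values leaking through the cleaning step or a genuine units/offset problem.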
The historical cleaning fails when trying to crop the late end off of the ssp data before appending it to historical:
File "pandas/_libs/hashtable_class_helper.pxi", line 4562, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: cftime.DatetimeNoLeap(2025, 12, 31, 23, 59, 59, 999999, has_year_zero=True)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/argo/staging/script", line 17, in <module>
ds = ds.sel(time=slice(from_time, to_time))
File "/opt/conda/lib/python3.9/site-packages/xarray/core/dataset.py", line 2474, in sel
pos_indexers, new_indexes = remap_label_indexers(
File "/opt/conda/lib/python3.9/site-packages/xarray/core/coordinates.py", line 421, in remap_label_indexers
pos_indexers, new_indexes = indexing.remap_label_indexers(
File "/opt/conda/lib/python3.9/site-packages/xarray/core/indexing.py", line 117, in remap_label_indexers
idxr, new_idx = index.query(labels, method=method, tolerance=tolerance)
File "/opt/conda/lib/python3.9/site-packages/xarray/core/indexes.py", line 198, in query
indexer = _query_slice(index, label, coord_name, method, tolerance)
File "/opt/conda/lib/python3.9/site-packages/xarray/core/indexes.py", line 91, in _query_slice
indexer = index.slice_indexer(
File "/opt/conda/lib/python3.9/site-packages/pandas/core/indexes/base.py", line 5278, in slice_indexer
start_slice, end_slice = self.slice_locs(start, end, step=step, kind=kind)
File "/opt/conda/lib/python3.9/site-packages/pandas/core/indexes/base.py", line 5483, in slice_locs
end_slice = self.get_slice_bound(end, "right", kind)
File "/opt/conda/lib/python3.9/site-packages/pandas/core/indexes/base.py", line 5397, in get_slice_bound
raise err
File "/opt/conda/lib/python3.9/site-packages/pandas/core/indexes/base.py", line 5391, in get_slice_bound
slc = self.get_loc(label)
File "/opt/conda/lib/python3.9/site-packages/xarray/coding/cftimeindex.py", line 466, in get_loc
return pd.Index.get_loc(self, key, method=method, tolerance=tolerance)
File "/opt/conda/lib/python3.9/site-packages/pandas/core/indexes/base.py", line 3083, in get_loc
raise KeyError(key) from err
KeyError: cftime.DatetimeNoLeap(2025, 12, 31, 23, 59, 59, 999999, has_year_zero=True)
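In this pandas version a missing slice bound only surfaces as a KeyError when the index can't be searched monotonically, so the time index itself is worth inspecting. A hypothetical diagnostic, not part of the cleaning script (the ssp245 store is used only as an example; the log doesn't show which store this step reads):

```python
# Hypothetical diagnostics for the failing time slice; not part of the cleaning script.
import fsspec
import xarray as xr

url = (  # assumption: the log doesn't show which ssp store this step reads
    "gs://clean-b1dbca25/cmip6/ScenarioMIP/KIOST/KIOST-ESM/"
    "ssp245/r1i1p1f1/day/tasmax/gr1/v20191202.zarr"
)
ds = xr.open_zarr(fsspec.get_mapper(url))
idx = ds.indexes["time"]

print(type(idx[0]))                 # which cftime date type / calendar the index uses
print(idx.is_monotonic_increasing)  # a non-monotonic index makes pandas re-raise the KeyError
print(idx.min(), idx.max())         # does the store actually reach 2025-12-31?

# The failing call in the cleaning script, for reference:
# ds = ds.sel(time=slice(from_time, to_time))
```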
@brews I'd say we table this one for now - this is a lot of failures. The invalid tasmax values are cause for concern, and it looks like ssp126 is actually missing data. We might want to push this to our second round.