What happened:
If a 0d dask array is passed as an argument to isel(), an error occurs because dask arrays do not have a .item() method. I came across this when trying to use the result of da.argmax() on a dask-backed DataArray to select from that DataArray.
What you expected to happen: isel() returns the value at the index contained in the 0d dask array.
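Minimal Complete Verifiable Example: something along these lines reproduces it (a sketch, assuming a small random dask-backed DataArray; the exact data does not matter):

```python
import numpy as np
import xarray as xr

# Any chunked (dask-backed) DataArray reproduces the problem.
da = xr.DataArray(np.random.rand(10), dims="x").chunk({"x": 5})

# argmax() on a dask-backed DataArray yields a 0d dask-backed DataArray.
idx = da.argmax()

# This raises on xarray 0.16.0, because the underlying 0d dask array
# has no .item() method for isel() to call.
da.isel(x=idx)
```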
Anything else we need to know?:
I think the problem is in xarray/core/variable.py, lines 546 to 548 (at a198218), and k.values.item() or int(k.data) would fix my issue, but I don't know the reason for using .item() in the first place, so I'm not sure whether either of these would have some undesirable side effect.
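For reference, a 0d dask array behaves roughly like this (a standalone sketch, not the variable.py code itself):

```python
import numpy as np
import dask.array as dsa

# A 0d dask array, like the value a dask-backed argmax() produces.
k = dsa.from_array(np.array(3))

# On the dask in this report (2.19.0) there is no k.item(), which is the call
# isel() trips over; a 0d numpy array would have returned 3 here.
# k.item()  # -> AttributeError

# int() on a 0d dask array computes it and returns a plain Python int, so
# int(k) (or int(k.data) on the DataArray side) avoids the missing .item().
print(int(k))  # 3
```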
This may be related to #2511, but based on the snippet above, I think this is an issue specific to 0d dask arrays rather than a generic dask-indexing issue like #2511.
I'd like to fix this because it breaks the nice new features of argmin() and argmax() if the DataArray is dask-backed.
Environment:
Output of xr.show_versions()
INSTALLED VERSIONS
commit: None
python: 3.7.6 | packaged by conda-forge | (default, Jun 1 2020, 18:57:50)
[GCC 7.5.0]
python-bits: 64
OS: Linux
OS-release: 5.4.0-42-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_GB.UTF-8
LOCALE: en_GB.UTF-8
libhdf5: 1.10.5
libnetcdf: 4.7.4
xarray: 0.16.0
pandas: 1.0.5
numpy: 1.18.5
scipy: 1.4.1
netCDF4: 1.5.3
pydap: None
h5netcdf: None
h5py: 2.10.0
Nio: None
zarr: None
cftime: 1.2.1
nc_time_axis: None
PseudoNetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: None
dask: 2.19.0
distributed: 2.21.0
matplotlib: 3.2.2
cartopy: None
seaborn: None
numbagg: None
pint: 0.13
setuptools: 49.2.0.post20200712
pip: 20.1.1
conda: 4.8.3
pytest: 5.4.3
IPython: 7.15.0
sphinx: None
* Attempt to fix indexing for Dask
This is a naive attempt to make `isel` work with Dask
Known limitation: it triggers the computation.
* Works now.
* avoid importorskip
* More tests and fixes
* Raise nicer error when indexing with boolean dask array
* Annotate tests
* edit query tests
* [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
* Fixes #4276
Pass 0d dask arrays through for indexing.
* Add xfail notes.
* backcompat: vendor np.broadcast_shapes
* Small improvement
* fix: Handle scalars properly.
* fix bad test
* Check computes with setitem
* Better error
* Cleanup
* Raise nice error with VectorizedIndexer and dask.
* Add whats-new
---------
Co-authored-by: dcherian <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Deepak Cherian <[email protected]>
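With these changes, the pattern from #4276 goes through; roughly like this (a sketch; per the known limitation noted above, the dask-backed index may be computed in the process):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.random.rand(10), dims="x").chunk({"x": 5})

# The 0d dask-backed index returned by argmax() is now accepted by isel();
# as noted above, indexing can trigger computation of the dask index.
result = da.isel(x=da.argmax())
print(result.compute())
```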