Merge pull request #6 from pytroll/master
Bring my fork up to date
adybbroe authored Apr 11, 2019
2 parents 3e4f423 + 35fbc9f commit 8f6ffba
Showing 16 changed files with 328 additions and 58 deletions.
52 changes: 52 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,55 @@
## Version 0.14.0 (2019/04/09)

### Issues Closed

* [Issue 698](https://github.com/pytroll/satpy/issues/698) - Read WKT geotiff
* [Issue 692](https://github.com/pytroll/satpy/issues/692) - sdr_viirs_l1b reader fails in 0.13, recent master, Works with version 0.12.0 ([PR 693](https://github.com/pytroll/satpy/pull/693))
* [Issue 683](https://github.com/pytroll/satpy/issues/683) - Question: Change image size when saving with satpy.save_dataset ([PR 691](https://github.com/pytroll/satpy/pull/691))
* [Issue 681](https://github.com/pytroll/satpy/issues/681) - incorrect data offset in HSD files ([PR 689](https://github.com/pytroll/satpy/pull/689))
* [Issue 666](https://github.com/pytroll/satpy/issues/666) - Add drawing of lat lon graticules when saving dataset ([PR 668](https://github.com/pytroll/satpy/pull/668))
* [Issue 646](https://github.com/pytroll/satpy/issues/646) - Add 'demo' subpackage for accessing example data ([PR 686](https://github.com/pytroll/satpy/pull/686))
* [Issue 528](https://github.com/pytroll/satpy/issues/528) - Support dask version of PySpectral ([PR 529](https://github.com/pytroll/satpy/pull/529))
* [Issue 511](https://github.com/pytroll/satpy/issues/511) - Add/update documentation about composites and compositors ([PR 705](https://github.com/pytroll/satpy/pull/705))

In this release 8 issues were closed.

### Pull Requests Merged

#### Bugs fixed

* [PR 700](https://github.com/pytroll/satpy/pull/700) - Mask out invalid values in the precipitation probability product
* [PR 693](https://github.com/pytroll/satpy/pull/693) - Fix VIIRS SDR reading of visible channels at nighttime ([692](https://github.com/pytroll/satpy/issues/692))
* [PR 689](https://github.com/pytroll/satpy/pull/689) - Fix Himawari HSD reader's incorrect header information ([681](https://github.com/pytroll/satpy/issues/681))
* [PR 688](https://github.com/pytroll/satpy/pull/688) - Fix offset correction in seviri_l1b_hrit
* [PR 685](https://github.com/pytroll/satpy/pull/685) - Fix bug in Scene.resample causing AssertionError
* [PR 677](https://github.com/pytroll/satpy/pull/677) - Fix MultiScene save_animation when distributed isn't installed
* [PR 675](https://github.com/pytroll/satpy/pull/675) - Do not pass `filter_parameters` to the filehandler creation

#### Features added

* [PR 691](https://github.com/pytroll/satpy/pull/691) - Add Scene.aggregate method (python 3 only) ([683](https://github.com/pytroll/satpy/issues/683))
* [PR 686](https://github.com/pytroll/satpy/pull/686) - Add demo subpackage to simplify test data download ([646](https://github.com/pytroll/satpy/issues/646))
* [PR 676](https://github.com/pytroll/satpy/pull/676) - Add nightfog composite for MODIS
* [PR 674](https://github.com/pytroll/satpy/pull/674) - Use platform ID to choose the right reader for AVHRR GAC data
* [PR 671](https://github.com/pytroll/satpy/pull/671) - Add satellite position to dataset attributes (seviri_l1b_hrit)
* [PR 669](https://github.com/pytroll/satpy/pull/669) - Add ocean-color composites for VIIRS and MODIS
* [PR 668](https://github.com/pytroll/satpy/pull/668) - Add grid/graticules to add_overlay function. ([666](https://github.com/pytroll/satpy/issues/666))
* [PR 665](https://github.com/pytroll/satpy/pull/665) - Add reader for VIIRS Active Fires
* [PR 645](https://github.com/pytroll/satpy/pull/645) - Reader for the SAR OCN L2 wind product in SAFE format.
* [PR 565](https://github.com/pytroll/satpy/pull/565) - Add reader for FY-3 VIRR (virr_l1b)
* [PR 529](https://github.com/pytroll/satpy/pull/529) - Add dask support to NIRReflectance modifier ([528](https://github.com/pytroll/satpy/issues/528))

#### Documentation changes

* [PR 707](https://github.com/pytroll/satpy/pull/707) - Add ABI Meso demo data case and clean up documentation
* [PR 705](https://github.com/pytroll/satpy/pull/705) - Document composites ([511](https://github.com/pytroll/satpy/issues/511))
* [PR 701](https://github.com/pytroll/satpy/pull/701) - Clarify release instructions
* [PR 699](https://github.com/pytroll/satpy/pull/699) - Rename SatPy to Satpy throughout documentation
* [PR 673](https://github.com/pytroll/satpy/pull/673) - Add information about GDAL_CACHEMAX to FAQ

In this release 23 pull requests were closed.


## Version 0.13.0 (2019/03/18)

### Issues Closed
12 changes: 0 additions & 12 deletions doc/source/api/satpy.demo.rst
@@ -1,18 +1,6 @@
satpy.demo package
==================

Submodules
----------

satpy.demo.google\_cloud\_platform module
-----------------------------------------

.. automodule:: satpy.demo.google_cloud_platform
:members:
:undoc-members:
:show-inheritance:


Module contents
---------------

10 changes: 9 additions & 1 deletion doc/source/api/satpy.readers.rst
@@ -365,7 +365,7 @@ satpy.readers.viirs\_compact module
:show-inheritance:

satpy.readers.viirs\_edr\_active\_fires module
--------------------------------------
----------------------------------------------

.. automodule:: satpy.readers.viirs_edr_active_fires
:members:
@@ -396,6 +396,14 @@ satpy.readers.viirs\_sdr module
:undoc-members:
:show-inheritance:

satpy.readers.virr\_l1b module
------------------------------

.. automodule:: satpy.readers.virr_l1b
:members:
:undoc-members:
:show-inheritance:

satpy.readers.xmlformat module
------------------------------

3 changes: 2 additions & 1 deletion doc/source/conf.py
@@ -60,7 +60,8 @@ def __getattr__(cls, name):
# Add any Sphinx extension module names here, as strings. They can be extensions
# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ['sphinx.ext.autodoc', 'sphinx.ext.intersphinx', 'sphinx.ext.todo', 'sphinx.ext.coverage',
'sphinx.ext.doctest', 'sphinx.ext.napoleon', 'sphinx.ext.autosummary', 'doi_role']
'sphinx.ext.doctest', 'sphinx.ext.napoleon', 'sphinx.ext.autosummary', 'doi_role',
'sphinx.ext.viewcode']

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
7 changes: 5 additions & 2 deletions doc/source/data_download.rst
@@ -2,8 +2,11 @@ Downloading Data
================

One of the main features of Satpy is its ability to read various satellite
data formats. However, it does not currently provide any functionality for
downloading data from any remote sources. Satpy assumes all data is available
data formats. However, it currently only provides limited methods for
downloading data from remote sources and these methods are limited to demo
data for `Pytroll examples <https://github.com/pytroll/pytroll-examples>`_.
See the examples and the :mod:`~satpy.demo` API documentation for details.
Otherwise, Satpy assumes all data is available
through the local system, either as a local directory or network
mounted file systems. Certain readers that use ``xarray`` to open data files
may be able to load files from remote systems by using OpenDAP or similar
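
The documentation change above points readers at the new `satpy.demo` helpers. A minimal usage sketch, assuming `gcsfs` is installed and the public GOES-16 bucket is reachable (the `abi_l1b` reader name matches the subdirectory the demo code in this commit downloads into):

```python
from satpy import Scene
from satpy.demo import get_us_midlatitude_cyclone_abi

# Download the 16 ABI L1b CONUS granules (~110MB) into ./abi_l1b/...
filenames = get_us_midlatitude_cyclone_abi(base_dir='.')

# Build a Scene from the downloaded files and load one channel
scn = Scene(reader='abi_l1b', filenames=filenames)
scn.load(['C01'])
print(scn['C01'].shape)
```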
25 changes: 16 additions & 9 deletions satpy/composites/__init__.py
Expand Up @@ -49,6 +49,16 @@
from satpy.writers import get_enhanced_image
from satpy import CHUNK_SIZE

try:
from pyspectral.near_infrared_reflectance import Calculator
except ImportError:
Calculator = None
try:
from pyorbital.astronomy import sun_zenith_angle
except ImportError:
sun_zenith_angle = None


LOG = logging.getLogger(__name__)


@@ -550,7 +560,7 @@ def __call__(self, projectables, optional_datasets=None, **info):
self._init_refl3x(projectables)
_nir, _ = projectables
refl = self._get_reflectance(projectables, optional_datasets) * 100
proj = xr.DataArray(refl.filled(np.nan), dims=_nir.dims,
proj = xr.DataArray(refl, dims=_nir.dims,
coords=_nir.coords, attrs=_nir.attrs)

proj.attrs['units'] = '%'
@@ -560,12 +570,9 @@ def __call__(self, projectables, optional_datasets=None, **info):

def _init_refl3x(self, projectables):
"""Initiate the 3.x reflectance derivations."""
try:
from pyspectral.near_infrared_reflectance import Calculator
except ImportError:
if not Calculator:
LOG.info("Couldn't load pyspectral")
raise

raise ImportError("No module named pyspectral.near_infrared_reflectance")
_nir, _tb11 = projectables
self._refl3x = Calculator(_nir.attrs['platform_name'], _nir.attrs['sensor'], _nir.attrs['name'])

@@ -588,9 +595,10 @@ def _get_reflectance(self, projectables, optional_datasets):

# Check if the sun-zenith angle was provided:
if sun_zenith is None:
from pyorbital.astronomy import sun_zenith_angle as sza
if sun_zenith_angle is None:
raise ImportError("No module named pyorbital.astronomy")
lons, lats = _nir.attrs["area"].get_lonlats_dask(CHUNK_SIZE)
sun_zenith = sza(_nir.attrs['start_time'], lons, lats)
sun_zenith = sun_zenith_angle(_nir.attrs['start_time'], lons, lats)

return self._refl3x.reflectance_from_tbs(sun_zenith, _nir, _tb11, tb_ir_co2=tb13_4)

@@ -610,7 +618,6 @@ def __call__(self, projectables, optional_datasets=None, **info):
# needs to be derived first in order to get the emissive part.
_ = self._get_reflectance(projectables, optional_datasets)
_nir, _ = projectables
raise NotImplementedError("This compositor wasn't fully converted to dask yet.")
proj = xr.DataArray(self._refl3x.emissive_part_3x(), attrs=_nir.attrs,
dims=_nir.dims, coords=_nir.coords)

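
The composites change above moves the `pyspectral` and `pyorbital` imports to module level, falling back to `None` when the packages are missing so the error surfaces only when the feature is actually used. A minimal sketch of that pattern (the helper function name is hypothetical, standing in for the real modifier methods):

```python
import logging

LOG = logging.getLogger(__name__)

# Optional dependencies: import once, degrade to None if unavailable so the
# module still imports without them.
try:
    from pyspectral.near_infrared_reflectance import Calculator
except ImportError:
    Calculator = None
try:
    from pyorbital.astronomy import sun_zenith_angle
except ImportError:
    sun_zenith_angle = None


def make_refl3x_calculator(platform_name, sensor, band_name):
    """Create the 3.x reflectance calculator, failing clearly without pyspectral."""
    if Calculator is None:
        LOG.info("Couldn't load pyspectral")
        raise ImportError("No module named pyspectral.near_infrared_reflectance")
    return Calculator(platform_name, sensor, band_name)
```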
83 changes: 73 additions & 10 deletions satpy/demo/__init__.py
Expand Up @@ -22,17 +22,22 @@
for how the data is downloaded (via the ``method`` keyword argument)
including:
- gcsfs: Download data from a public google cloud storage bucket using the
- gcsfs:
Download data from a public google cloud storage bucket using the
``gcsfs`` package.
- unidata_thredds: Access data using OpenDAP or similar method from Unidata's
- unidata_thredds:
Access data using OpenDAP or similar method from Unidata's
public THREDDS server
(https://thredds.unidata.ucar.edu/thredds/catalog.html).
- uwaos_thredds: Access data using OpenDAP or similar method from the
- uwaos_thredds:
Access data using OpenDAP or similar method from the
University of Wisconsin - Madison's AOS department's THREDDS server.
- http: A last resort download method when nothing else is available of a
- http:
A last resort download method when nothing else is available of a
tarball or zip file from one or more servers available to the Satpy
project.
- uw_arcdata: A network mount available on many servers at the Space Science
- uw_arcdata:
A network mount available on many servers at the Space Science
and Engineering Center (SSEC) at the University of Wisconsin - Madison.
This method is mainly meant for when tutorials are taught at the SSEC
using a Jupyter Hub server.
@@ -51,7 +56,7 @@
LOG = logging.getLogger(__name__)


def makedirs(directory, exist_ok=False):
def _makedirs(directory, exist_ok=False):
"""Python 2.7 friendly os.makedirs.
After Python 2.7 is dropped, just use `os.makedirs` with `exist_ok=True`.
@@ -64,7 +69,7 @@ def makedirs(directory, exist_ok=False):


def get_us_midlatitude_cyclone_abi(base_dir='.', method=None, force=False):
"""Get GOES-16 ABI (CONUS sector) data from March 14th 00:00Z.
"""Get GOES-16 ABI (CONUS sector) data from 2019-03-14 00:00Z.
Args:
base_dir (str): Base directory for downloaded files.
@@ -75,17 +80,75 @@ def get_us_midlatitude_cyclone_abi(base_dir='.', method=None, force=False):
the local system. Warning: May delete non-demo files stored in
download directory.
Total size: ~110MB
"""
if method is None:
method = 'gcsfs'
if method not in ['gcsfs']:
raise NotImplementedError("Demo data download method '{}' not "
"implemented yet.".format(method))

from .google_cloud_platform import get_bucket_files
from ._google_cloud_platform import get_bucket_files
patterns = ['gs://gcp-public-data-goes-16/ABI-L1b-RadC/2019/073/00/*0002*.nc']
subdir = os.path.join(base_dir, 'abi_l1b', '20190314_us_midlatitude_cyclone')
makedirs(subdir, exist_ok=True)
_makedirs(subdir, exist_ok=True)
filenames = get_bucket_files(patterns, subdir, force=force)
assert len(filenames) == 16, "Not all ABI files could be downloaded"
assert len(filenames) == 16, "Not all files could be downloaded"
return filenames


def get_hurricane_florence_abi(base_dir='.', method=None, force=False,
channels=range(1, 17), num_frames=10):
"""Get GOES-16 ABI (Meso sector) data from 2018-09-11 13:00Z to 17:00Z.
Args:
base_dir (str): Base directory for downloaded files.
method (str): Force download method for the data if not already cached.
Allowed options are: 'gcsfs'. Default of ``None`` will
choose the best method based on environment settings.
force (bool): Force re-download of data regardless of its existence on
the local system. Warning: May delete non-demo files stored in
download directory.
channels (list): Channels to include in download. Defaults to all
16 channels.
num_frames (int or slice): Number of frames to download. Maximum
240 frames. Default 10 frames.
Size per frame (all channels): ~15MB
Total size (default 10 frames, all channels): ~124MB
Total size (240 frames, all channels): ~3.5GB
"""
if method is None:
method = 'gcsfs'
if method not in ['gcsfs']:
raise NotImplementedError("Demo data download method '{}' not "
"implemented yet.".format(method))
if isinstance(num_frames, (int, float)):
frame_slice = slice(0, num_frames)
else:
frame_slice = num_frames

from ._google_cloud_platform import get_bucket_files

patterns = []
for channel in channels:
# patterns += ['gs://gcp-public-data-goes-16/ABI-L1b-RadM/2018/254/1[3456]/'
# '*C{:02d}*s20182541[3456]*.nc'.format(channel)]
patterns += [(
'gs://gcp-public-data-goes-16/ABI-L1b-RadM/2018/254/13/*RadM1*C{:02d}*s201825413*.nc'.format(channel),
'gs://gcp-public-data-goes-16/ABI-L1b-RadM/2018/254/14/*RadM1*C{:02d}*s201825414*.nc'.format(channel),
'gs://gcp-public-data-goes-16/ABI-L1b-RadM/2018/254/15/*RadM1*C{:02d}*s201825415*.nc'.format(channel),
'gs://gcp-public-data-goes-16/ABI-L1b-RadM/2018/254/16/*RadM1*C{:02d}*s201825416*.nc'.format(channel),
)]
subdir = os.path.join(base_dir, 'abi_l1b', '20180911_hurricane_florence_abi_l1b')
_makedirs(subdir, exist_ok=True)
filenames = get_bucket_files(patterns, subdir, force=force, pattern_slice=frame_slice)

actual_slice = frame_slice.indices(240) # 240 max frames
num_frames = int((actual_slice[1] - actual_slice[0]) / actual_slice[2])
assert len(filenames) == len(channels) * num_frames, "Not all files could be downloaded"
return filenames
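
A usage sketch of the helpers shown above; an integer `num_frames` is converted to `slice(0, num_frames)`, while an explicit slice passes through unchanged, so the stride used below is just an illustrative assumption:

```python
from satpy.demo import get_hurricane_florence_abi

# Default: first 10 Meso-sector frames, all 16 channels (~124MB)
files = get_hurricane_florence_abi(base_dir='.')

# Channels 1-3 only, sampling every 24th of the 240 available frames
files_subset = get_hurricane_florence_abi(base_dir='.',
                                          channels=[1, 2, 3],
                                          num_frames=slice(0, 240, 24))
```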
satpy/demo/_google_cloud_platform.py (renamed from satpy/demo/google_cloud_platform.py)
@@ -41,18 +41,24 @@ def is_google_cloud_instance():
return False


def get_bucket_files(glob_pattern, base_dir, force=False):
def get_bucket_files(glob_pattern, base_dir, force=False, pattern_slice=slice(None)):
"""Helper function to download files from Google Cloud Storage.
Args:
glob_pattern (str or list): Glob pattern string or series of patterns
used to search for on Google Cloud Storage. The pattern should
include the "gs://" protocol prefix.
include the "gs://" protocol prefix. If a list of lists, then the
results of each sublist pattern are concatenated and the result is
treated as one pattern result. This is important for things like
``pattern_slice`` and complicated glob patterns not supported by
GCP.
base_dir (str): Root directory to place downloaded files on the local
system.
force (bool): Force re-download of data regardless of its existence on
the local system. Warning: May delete non-demo files stored in
download directory.
pattern_slice (slice): Slice object to limit the number of files
returned by each glob pattern.
"""
if gcsfs is None:
@@ -67,16 +73,25 @@ def get_bucket_files(glob_pattern, base_dir, force=False):
fs = gcsfs.GCSFileSystem(token='anon')
filenames = []
for gp in glob_pattern:
for fn in fs.glob(gp):
# handle multiple glob patterns being treated as one pattern
# for complicated patterns that GCP can't handle
if isinstance(gp, str):
glob_results = list(fs.glob(gp))
else:
# flat list of results
glob_results = [fn for pat in gp for fn in fs.glob(pat)]

for fn in glob_results[pattern_slice]:
ondisk_fn = os.path.basename(fn)
ondisk_pathname = os.path.join(base_dir, ondisk_fn)
filenames.append(ondisk_pathname)
LOG.info("Downloading: {}".format(ondisk_pathname))

if force and os.path.isfile(ondisk_pathname):
os.remove(ondisk_pathname)
elif os.path.isfile(ondisk_pathname):
LOG.info("Found existing: {}".format(ondisk_pathname))
continue
LOG.info("Downloading: {}".format(ondisk_pathname))
fs.get('gs://' + fn, ondisk_pathname)

if not filenames:
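
A sketch of how the extended `get_bucket_files` behaves, based only on the signature and docstring shown above (the bucket patterns are illustrative, and `gcsfs` plus network access are assumed):

```python
from satpy.demo._google_cloud_platform import get_bucket_files

# A single pattern: glob the bucket, then keep only the first two matches
files = get_bucket_files(
    ['gs://gcp-public-data-goes-16/ABI-L1b-RadC/2019/073/00/*0002*.nc'],
    base_dir='.',
    pattern_slice=slice(0, 2))

# A nested tuple of patterns is globbed separately, concatenated, and then
# sliced as one result -- useful when a single GCS glob cannot express the match.
files = get_bucket_files(
    [('gs://gcp-public-data-goes-16/ABI-L1b-RadM/2018/254/13/*C01*.nc',
      'gs://gcp-public-data-goes-16/ABI-L1b-RadM/2018/254/14/*C01*.nc')],
    base_dir='.',
    pattern_slice=slice(0, 5))
```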
8 changes: 4 additions & 4 deletions satpy/etc/areas.yaml
@@ -399,8 +399,8 @@ seviri_0deg:
height: 3712
width: 3712
area_extent:
lower_left_xy: [-5570248.477339261, -5567248.074173444]
upper_right_xy: [5567248.074173444, 5570248.477339261]
lower_left_xy: [-5570248.686685662, -5567248.28340708]
upper_right_xy: [5567248.28340708, 5570248.686685662]
seviri_iodc:
description: Full globe MSG image 41.5 degrees
projection:
@@ -413,8 +413,8 @@ seviri_iodc:
height: 3712
width: 3712
area_extent:
lower_left_xy: [-5570248.477339261, -5567248.074173444]
upper_right_xy: [5567248.074173444, 5570248.477339261]
lower_left_xy: [-5570248.686685662, -5567248.28340708]
upper_right_xy: [5567248.28340708, 5570248.686685662]
msg_resample_area:
description: Full globe MSG image 20.75 degrees
projection:
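
The adjusted extents can be checked by loading the definition by name; a minimal sketch, assuming a checkout of the repository and `pyresample` installed:

```python
from pyresample.utils import load_area

# Load the full-disk SEVIRI area whose extents were updated above
area = load_area('satpy/etc/areas.yaml', 'seviri_0deg')
print(area.area_extent)   # (-5570248.686685662, -5567248.28340708, 5567248.28340708, 5570248.686685662)
print(area.pixel_size_x)  # ~3000.4 m for the 3712 x 3712 grid
```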
4 changes: 2 additions & 2 deletions satpy/readers/seviri_l1b_hrit.py
@@ -213,9 +213,9 @@ def _find_navigation_coefs(self):
# Find index of interval enclosing the nominal timestamp of the scan
time = np.datetime64(self.prologue['ImageAcquisition']['PlannedAcquisitionTime']['TrueRepeatCycleStart'])
intervals_tstart = self.prologue['SatelliteStatus']['Orbit']['OrbitPolynomial']['StartTime'][0].astype(
'datetime64')
'datetime64[us]')
intervals_tend = self.prologue['SatelliteStatus']['Orbit']['OrbitPolynomial']['EndTime'][0].astype(
'datetime64')
'datetime64[us]')
try:
return np.where(np.logical_and(time >= intervals_tstart, time < intervals_tend))[0][0]
except IndexError:
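
The reader fix above casts the orbit-polynomial interval bounds to microsecond precision (`datetime64[us]`) before comparing them with the nominal scan time. A self-contained numpy sketch of that interval lookup, with made-up times:

```python
import numpy as np

# Nominal repeat-cycle start time of the scan
time = np.datetime64('2019-04-09T12:00:00')

# Orbit polynomial validity intervals; an explicit unit keeps the comparison
# at microsecond precision instead of whatever a unit-less cast would pick.
intervals_tstart = np.array(['2019-04-09T11:45', '2019-04-09T12:00'],
                            dtype='datetime64[us]')
intervals_tend = np.array(['2019-04-09T12:00', '2019-04-09T12:15'],
                          dtype='datetime64[us]')

try:
    # Index of the interval enclosing the nominal scan time
    idx = np.where(np.logical_and(time >= intervals_tstart,
                                  time < intervals_tend))[0][0]
    print(idx)  # -> 1
except IndexError:
    print("No enclosing orbit interval found")
```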