Commit 89d423c
Merge remote-tracking branch 'ufs/develop' into uw_configlayer_integration
christinaholtNOAA committed Feb 21, 2025
2 parents ed96831 + b249d58
Showing 14 changed files with 95 additions and 116 deletions.
2 changes: 1 addition & 1 deletion .github/scripts/check_tech_doc.sh
@@ -5,7 +5,7 @@
set -eo pipefail

# Install prerequisites
-pip install sphinx
+pip install Sphinx==7.4.7
pip install sphinx-rtd-theme
pip install sphinxcontrib-bibtex

7 changes: 6 additions & 1 deletion .github/workflows/doc_tests.yaml
@@ -22,4 +22,9 @@ jobs:
- name: Build documentation
run: |
cd doc
-      make doc
+      make clean && make html
+  - name: Check links
+    if: ${{ !cancelled() }}
+    run: |
+      cd doc
+      make linkcheck
2 changes: 1 addition & 1 deletion doc/Makefile
@@ -1,6 +1,6 @@
# Makefile for Sphinx documentation

-SPHINXOPTS    = -a -n -W
+SPHINXOPTS    = -a -n -W --keep-going
SPHINXBUILD = sphinx-build
SOURCEDIR = .
BUILDDIR = build
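For reference on the Makefile change above: ``-a`` rewrites all output files, ``-n`` enables nit-picky warnings for missing references, ``-W`` turns warnings into errors, and the newly added ``--keep-going`` makes Sphinx report *every* warning before exiting nonzero, rather than aborting at the first one. A sketch of the direct ``sphinx-build`` call the Makefile assembles (the ``.`` source and ``build`` output paths mirror the ``SOURCEDIR``/``BUILDDIR`` settings in the hunk; this is illustrative, not the literal Makefile rule):

```shell
# Sketch only: the sphinx-build invocation implied by the updated SPHINXOPTS.
SPHINXOPTS="-a -n -W --keep-going"
# Print the command rather than running it (a real run requires Sphinx==7.4.7):
echo sphinx-build -M html . build $SPHINXOPTS
```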
18 changes: 7 additions & 11 deletions doc/UsersGuide/BuildingRunningTesting/AQM.rst
@@ -21,7 +21,7 @@ Quick Start Guide (SRW-AQM)

.. attention::

-These instructions should work smoothly on Hera and WCOSS2, but users on other systems may need to make additional adjustments.
+These instructions should work smoothly on Hera, Hercules, Derecho, and Orion, but users on other systems may need to make additional adjustments.

Download the Code
-------------------
@@ -51,7 +51,7 @@ On Hera and WCOSS2, users can build the SRW App AQM binaries with the following
./devbuild.sh -p=<machine> -a=ATMAQ
-where ``<machine>`` is ``hera``, or ``wcoss2``. The ``-a`` argument indicates the configuration/version of the application to build.
+where ``<machine>`` is ``hera``, ``hercules``, ``orion``, or ``derecho``. The ``-a`` argument indicates the configuration/version of the application to build.

Building the SRW App with AQM on other machines, including other :srw-wiki:`Level 1 <Supported-Platforms-and-Compilers>` platforms, is not currently guaranteed to work, and users may have to make adjustments to the modulefiles for their system.

@@ -79,13 +79,11 @@ Load the python environment for the workflow:

.. code-block:: console
-# On WCOSS2 (do not run on other systems):
-source ../versions/run.ver.wcoss2
-# On all systems (including WCOSS2):
+# On all systems:
module use /path/to/ufs-srweather-app/modulefiles
module load wflow_<machine>
-where ``<machine>`` is ``hera`` or ``wcoss2``. The workflow should load on other platforms listed under the ``MACHINE`` variable in :numref:`Section %s <user>`, but users may need to adjust other elements of the process when running on those platforms.
+where ``<machine>`` is ``hera``, ``hercules``, ``orion``, or ``derecho``. The workflow should load on other platforms listed under the ``MACHINE`` variable in :numref:`Section %s <user>`, but users may need to adjust other elements of the process when running on those platforms.

If the console outputs a message, the user should run the commands specified in the message. For example, if the output says:

@@ -106,10 +104,8 @@ Users will need to configure their experiment by setting parameters in the ``con
.. code-block:: console
cd ush
-cp config.aqm.community.yaml config.yaml
+cp config.aqm.yaml config.yaml
-Users may prefer to copy the ``config.aqm.nco.realtime.yaml`` for a default "nco" mode experiment instead.

Users will need to change the ``MACHINE`` and ``ACCOUNT`` variables in ``config.yaml`` to match their system. They may also wish to adjust other experiment settings. For more information on each task and variable, see :numref:`Section %s <ConfigWorkflow>`.

The community AQM configuration assumes that users have :term:`HPSS` access and attempts to download the data from HPSS. However, if users have the data on their system already, they may prefer to add the following lines to ``task_get_extrn_*:`` in their ``config.yaml`` file, adjusting the file path to point to the correct data locations:
@@ -125,7 +121,7 @@ The community AQM configuration assumes that users have :term:`HPSS` access and
On Level 1 systems, users can find :term:`ICs/LBCs` in the usual :ref:`input data locations <Data>` under ``FV3GFS/netcdf/2023021700`` and ``FV3GFS/netcdf/2023021706``. Users can also download the data required for the community experiment from the `UFS SRW App Data Bucket <https://noaa-ufs-srw-pds.s3.amazonaws.com/index.html#develop-20240618/input_model_data/FV3GFS/netcdf/>`__.

-Users may also wish to change :term:`cron`-related parameters in ``config.yaml``. In the ``config.aqm.community.yaml`` file, which was copied into ``config.yaml``, cron is used for automatic submission and resubmission of the workflow:
+Users may also wish to change :term:`cron`-related parameters in ``config.yaml``. In the ``config.aqm.yaml`` file, which was copied into ``config.yaml``, cron is used for automatic submission and resubmission of the workflow:

.. code-block:: console
@@ -135,7 +131,7 @@ Users may also wish to change :term:`cron`-related parameters in ``config.yaml``
This means that cron will submit the launch script every 3 minutes. Users may choose not to submit using cron or to submit at a different frequency. Note that users should create a crontab by running ``crontab -e`` the first time they use cron.

-When using the basic ``config.aqm.community.yaml`` experiment, the AQM pre-processing tasks are automatically turned on because ``"parm/wflow/aqm_prep.yaml"`` appears in the list of workflow files in the ``rocoto: tasks: taskgroups:`` section of ``config.yaml`` (see :numref:`Section %s <TasksPrepAQM>` for task descriptions). To turn on AQM *post*-processing tasks in the workflow, include ``"parm/wflow/aqm_post.yaml"`` in the ``rocoto: tasks: taskgroups:`` section, too (see :numref:`Section %s <TasksPostAQM>` for task descriptions).
+When using the basic ``config.aqm.yaml`` experiment, the AQM pre-processing tasks are automatically turned on because ``"parm/wflow/aqm_prep.yaml"`` appears in the list of workflow files in the ``rocoto: tasks: taskgroups:`` section of ``config.yaml`` (see :numref:`Section %s <TasksPrepAQM>` for task descriptions). To turn on AQM *post*-processing tasks in the workflow, include ``"parm/wflow/aqm_post.yaml"`` in the ``rocoto: tasks: taskgroups:`` section, too (see :numref:`Section %s <TasksPostAQM>` for task descriptions).

.. attention::

8 changes: 6 additions & 2 deletions doc/UsersGuide/BuildingRunningTesting/RunSRW.rst
@@ -310,9 +310,9 @@ On platforms where Rocoto and :term:`cron` are available, users can automate res
USE_CRON_TO_RELAUNCH: true
CRON_RELAUNCH_INTVL_MNTS: 3
-.. note::
+.. attention::

-   On Orion, *cron* is only available on the orion-login-1 node, so users will need to work on that node when running *cron* jobs on Orion.
+   Cron is not available on Derecho or Gaea. On Orion/Hercules, users must be logged into the [hercules/orion]-login-1 node to use cron.

When running with GNU compilers (i.e., if the modulefile used to set up the build environment in :numref:`Section %s <BuildExecutables>` uses a GNU compiler), users must also set ``COMPILER: "gnu"`` in the ``workflow:`` section of the ``config.yaml`` file.

@@ -1340,6 +1340,10 @@ where ``/path/to/experiment/directory`` is changed to correspond to the user's `
* On NOAA Cloud instances, ``*/1 * * * *`` (or ``CRON_RELAUNCH_INTVL_MNTS: 1``) is the preferred option for cron jobs because compute nodes will shut down if they remain idle too long. If the compute node shuts down, it can take 15-20 minutes to start up a new one.
* On other NOAA HPC systems, administrators discourage using ``*/1 * * * *`` due to load problems. ``*/3 * * * *`` (or ``CRON_RELAUNCH_INTVL_MNTS: 3``) is the preferred option for cron jobs on other Level 1 systems.

+.. attention::
+
+   Cron is not available on Derecho or Gaea. On Orion/Hercules, users must be logged into the [hercules/orion]-login-1 node to use cron.

To check the experiment progress:

.. code-block:: console
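As a sketch of what the cron guidance in this file produces: with ``USE_CRON_TO_RELAUNCH: true`` and ``CRON_RELAUNCH_INTVL_MNTS: 3``, the workflow generator installs a crontab line along the following lines. The experiment path is a placeholder, and the exact launch-script arguments may differ between SRW App versions:

```shell
# Hypothetical crontab entry for a 3-minute relaunch interval; the path is a
# placeholder, and the launch-script name follows SRW App conventions.
entry='*/3 * * * * cd /path/to/experiment/directory && ./launch_FV3LAM_wflow.sh'
echo "$entry"
# To inspect what cron actually installed on your system:  crontab -l
```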
4 changes: 2 additions & 2 deletions doc/UsersGuide/CustomizingTheWorkflow/ConfigWorkflow.rst
@@ -30,7 +30,7 @@ If non-default parameters are selected for the variables in this section, they s
Setting ``RUN_ENVIR`` to "community" is recommended in most cases for users who are not running in NCO's production environment. Valid values: ``"nco"`` | ``"community"``

``MACHINE``: (Default: "BIG_COMPUTER")
-   The machine (a.k.a. platform or system) on which the workflow will run. Currently supported platforms are listed on the :srw-wiki:`SRW App Wiki page <Supported-Platforms-and-Compilers>`. When running the SRW App on any ParallelWorks/NOAA Cloud system, use "NOAACLOUD" regardless of the underlying system (AWS, GCP, or Azure). Valid values: ``"HERA"`` | ``"ORION"`` | ``"HERCULES"`` | ``"JET"`` | ``"CHEYENNE"`` | ``"DERECHO"`` | ``"GAEA"`` | ``"GAEA-C6"`` | ``"NOAACLOUD"`` | ``"STAMPEDE"`` | ``"ODIN"`` | ``"MACOS"`` | ``"LINUX"`` | ``"SINGULARITY"`` | ``"WCOSS2"`` (Check ``ufs-srweather-app/ush/valid_param_vals.yaml`` for the most up-to-date list of supported platforms.)
+   The machine (a.k.a. platform or system) on which the workflow will run. Currently supported platforms are listed on the :srw-wiki:`SRW App Wiki page <Supported-Platforms-and-Compilers>`. When running the SRW App on any ParallelWorks/NOAA Cloud system, use "NOAACLOUD" regardless of the underlying system (AWS, GCP, or Azure). Valid values: ``"HERA"`` | ``"ORION"`` | ``"HERCULES"`` | ``"JET"`` | ``"DERECHO"`` | ``"GAEA"`` | ``"GAEA-C6"`` | ``"NOAACLOUD"`` | ``"MACOS"`` | ``"LINUX"`` | ``"SINGULARITY"`` | ``"WCOSS2"`` (Check ``ufs-srweather-app/ush/valid_param_vals.yaml`` for the most up-to-date list of supported platforms.)

.. hint::
Users who are NOT on a named, supported Level 1 or 2 platform will need to set the ``MACHINE`` variable to ``LINUX`` or ``MACOS``. To combine use of a Linux or MacOS platform with the Rocoto workflow manager, users will also need to set ``WORKFLOW_MANAGER: "rocoto"`` in the ``platform:`` section of ``config.yaml``. This combination will assume a Slurm batch manager when generating the XML.
@@ -2073,7 +2073,7 @@ Non-default parameters for coupled Air Quality Modeling (AQM) tasks are set in t


Smoke and Dust Configuration Parameters
-=====================================
+=========================================

Non-default parameters for Smoke and Dust tasks are set in the ``smoke_dust_parm:`` section of the ``config.yaml`` file.

39 changes: 39 additions & 0 deletions doc/UsersGuide/Reference/FAQ.rst
@@ -208,6 +208,45 @@ See `SRW Discussion #990 <https://github.com/ufs-community/ufs-srweather-app/dis

.. _AddPhys:

+How do I change the number of vertical levels in the SRW App?
+==============================================================
+
+The predefined grids included with the SRW App are configured to run with 65 levels by default. However, advanced users may wish to vary the number of vertical levels in the grids they are using, and documentation has recently been added explaining how to do this. Users can check out the `Limited Area Model Grid <https://ufs-srweather-app.readthedocs.io/en/develop/UsersGuide/CustomizingTheWorkflow/LAMGrids.html#changing-the-number-of-vertical-levels>`__ chapter for instructions.
+
+After modifying one of the SRW App executables, how can I rebuild the executable?
+==================================================================================
+
+When users make changes to one of the SRW App executables, they can rerun the ``devbuild.sh`` script using the command ``./devbuild.sh --platform=<machine_name>``. This will eventually bring up three options: ``[R]emove, [C]ontinue, or [Q]uit``.
+
+The Continue option will recompile the modified routines and rebuild only the affected executables. The Remove option provides a clean build; it completely removes the existing build directory and rebuilds all executables from scratch instead of reusing the existing build where possible. The build logs for the CMake and Make steps will appear in ``ufs-srweather-app/build/log.cmake`` and ``ufs-srweather-app/build/log.make``; any errors encountered should be detailed in those files.
+
+Users should note that the Continue option may not work as expected for changes to CCPP because the ``ccpp_prebuild.py`` script will not be rerun. It is typically best to recompile the model entirely in this case by selecting the Remove option for a clean build.
+
+A convenience script, ``devclean.sh``, is also available. This script can be used to remove build artifacts in cases where something goes wrong with the build or where changes have been made to the source code and the executables need to be rebuilt. Users can run this script by entering ``./devclean.sh -a``. Following this step, they can rerun the ``devbuild.sh`` script to rebuild the SRW App. Running ``./devclean.sh -h`` will list additional options.
+
+See `SRW Discussion #1007 <https://github.com/ufs-community/ufs-srweather-app/discussions/1007>`__ for the question that inspired this FAQ.

+If I log out of my system after building the SRW App and want to return to work later, where do I start?
+=========================================================================================================
+
+If you logged out before building the SRW App, you can `resume from the step you had reached <https://ufs-srweather-app.readthedocs.io/en/develop/UsersGuide/BuildingRunningTesting/Quickstart.html#building-and-running-the-ufs-srw-application>`__ before logging out.
+
+If you already built the SRW App, you can simply reload the conda environment and then pick up where you left off:
+
+.. code-block:: console
+
+   source /path/to/ufs-srweather-app/etc/lmod-setup.sh <platform>
+   module use /path/to/ufs-srweather-app/modulefiles
+   module load wflow_<platform>
+   conda activate srw_app
+
+From here, you can, for example, `configure a new experiment <https://ufs-srweather-app.readthedocs.io/en/develop/UsersGuide/BuildingRunningTesting/RunSRW.html#general-instructions-for-all-systems>`__ (in ``config.yaml``) or check on `progress from an old experiment <https://ufs-srweather-app.readthedocs.io/en/develop/UsersGuide/BuildingRunningTesting/RunSRW.html#run-the-workflow-using-rocoto>`__ (e.g., using ``rocotostat`` or the ``tail`` command from within the experiment directory).
+
+I ran one experiment; how can I configure a new experiment?
+============================================================
+
+To configure a new experiment, users can simply modify variables in their existing ``ush/config.yaml`` file. When a new experiment is generated by running ``./generate_FV3LAM_wflow.py``, the contents of ``ush/config.yaml`` are copied into the new experiment directory and used throughout that experiment. Although modifying ``ush/config.yaml`` overwrites the file, everything needed for a given experiment is copied or linked to its experiment directory at generation time. Provided that the user either changes the ``EXPT_SUBDIR`` name or keeps the default ``PREEXISTING_DIR_METHOD: rename`` setting, the details of any previous configuration can be recovered from the ``config.yaml`` copy in that experiment's directory.

:bolditalic:`Advanced:` How can I add a physics scheme (e.g., YSU PBL) to the UFS SRW App?
===============================================================================================

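The rebuild FAQ added in this file can be condensed into a small helper. This is a sketch under the assumption that ``devclean.sh`` and ``devbuild.sh`` sit in the current (``ufs-srweather-app``) directory; the platform name is supplied by the user:

```shell
# Sketch of the clean-rebuild cycle from the FAQ: wipe build artifacts first,
# then rebuild from scratch (this sidesteps the [R]emove/[C]ontinue prompt).
rebuild_srw() {
  platform=${1:?usage: rebuild_srw <platform>}
  ./devclean.sh -a                      # remove all build artifacts
  ./devbuild.sh --platform="$platform"  # fresh build of all executables
}
```

For CCPP changes in particular, this full clean-and-rebuild path matches the FAQ's recommendation, since an incremental Continue build will not rerun ``ccpp_prebuild.py``.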
6 changes: 3 additions & 3 deletions doc/UsersGuide/Reference/Glossary.rst
@@ -100,10 +100,10 @@ Glossary

FV3
The Finite-Volume Cubed-Sphere :term:`dynamical core` (dycore). Developed at NOAA's `Geophysical
-      Fluid Dynamics Laboratory <https://www.gfdl.noaa.gov/>`__ (GFDL), it is a scalable and flexible dycore capable of both hydrostatic and non-hydrostatic atmospheric simulations. It is the dycore used in the UFS Weather Model.
+      Fluid Dynamics Laboratory <https://www.gfdl.noaa.gov/>`_ (GFDL), it is a scalable and flexible dycore capable of both hydrostatic and non-hydrostatic atmospheric simulations. It is the dycore used in the UFS Weather Model.

FVCOM
-      `Finite Volume Community Ocean Model <https://www.fvcom.org/>`__. FVCOM is used in modeling work for the `Great Lakes Coastal Forecasting System (next-gen FVCOM) <https://www.glerl.noaa.gov/res/glcfs/>`__ conducted by the `Great Lakes Environmental Research Laboratory <https://www.glerl.noaa.gov/>`__.
+      `Finite Volume Community Ocean Model <http://www.fvcom.org/>`_. FVCOM is used in modeling work for the `Great Lakes Coastal Forecasting System (next-gen FVCOM) <https://www.glerl.noaa.gov/res/glcfs/>`_ conducted by the `Great Lakes Environmental Research Laboratory <https://www.glerl.noaa.gov/>`_.

GFS
`Global Forecast System <https://www.ncei.noaa.gov/products/weather-climate-models/global-forecast>`_. The GFS is a National Centers for Environmental Prediction (:term:`NCEP`) weather forecast model that generates data for dozens of atmospheric and land-soil variables, including temperatures, winds, precipitation, soil moisture, and atmospheric ozone concentration. The system couples four separate models (atmosphere, ocean, land/soil, and sea ice) that work together to accurately depict weather conditions.
@@ -165,7 +165,7 @@ Glossary
MPI stands for Message Passing Interface. An MPI is a standardized communication system used in parallel programming. It establishes portable and efficient syntax for the exchange of messages and data between multiple processors that are used by a single computer program. An MPI is required for high-performance computing (HPC) systems.

MRMS
-      Multi-Radar/Multi-Sensor (MRMS) System Analysis data. This data is required for METplus composite reflectivity or :term:`echo top` verification tasks within the SRW App. A two-day archive of precipitation, radar, and aviation and severe weather fields is publicly available and can be accessed `here <https://mrms.ncep.noaa.gov/data/>`__.
+      Multi-Radar/Multi-Sensor (MRMS) System Analysis data. This data is required for METplus composite reflectivity or :term:`echo top` verification tasks within the SRW App. A two-day archive of precipitation, radar, and aviation and severe weather fields is publicly available and can be accessed `here <https://mrms.ncep.noaa.gov/>`_.

NAM
`North American Mesoscale Forecast System <https://www.ncei.noaa.gov/products/weather-climate-models/north-american-mesoscale>`_. NAM generates multiple grids (or domains) of weather forecasts over the North American continent at various horizontal resolutions. Each grid contains data for dozens of weather parameters, including temperature, precipitation, lightning, and turbulent kinetic energy. NAM uses additional numerical weather models to generate high-resolution forecasts over fixed regions, and occasionally to follow significant weather events like hurricanes.
30 changes: 25 additions & 5 deletions doc/conf.py
@@ -36,9 +36,18 @@

numfig = True

-nitpick_ignore = [('py:class', 'obj'),('py:class',
-    'yaml.dumper.Dumper'),('py:class',
-    'xml.etree.ElementTree'),('py:class', 'Basemap'),]
+nitpick_ignore = [('py:class', 'obj'),
+                  ('py:class', 'yaml.dumper.Dumper'),
+                  ('py:class', 'xml.etree.ElementTree'),
+                  ('py:class', 'Basemap'),
+                  ('py:class', 'pandas.DataFrame'),
+                  ('py:class', 'numpy.ndarray'),
+                  ('py:class', 'pandas.Index'),
+                  ('py:class', 'xarray.DataArray'),
+                  ('py:class', 'netCDF4.Dataset'),
+                  ('py:class', 'ESMF.Field'),
+                  ('py:class', 'ESMF.Grid'),
+                  ]

# -- General configuration ---------------------------------------------------

@@ -255,8 +264,19 @@ def setup(app):

# -- Options for autodoc extension ---------------------------------------

-autodoc_mock_imports = ["f90nml","cartopy","mpl_toolkits.basemap","fill_jinja_template",
-                        "matplotlib","numpy","uwtools","mpl_toolkits","metplus",
+autodoc_mock_imports = ["f90nml",
+                        "cartopy",
+                        "mpl_toolkits.basemap",
+                        "fill_jinja_template",
+                        "matplotlib",
+                        "numpy",
+                        "uwtools",
+                        "mpl_toolkits",
+                        "metplus",
+                        "netCDF4",
+                        "pandas",
+                        "xarray",
+                        "ESMF",
]

logger = logging.getLogger(__name__)
2 changes: 1 addition & 1 deletion doc/requirements.in
@@ -1,3 +1,3 @@
-sphinx>=7.4.0
+sphinx==7.4.7
sphinx_rtd_theme
sphinxcontrib-bibtex