
[develop] Pull in relevant doc updates from release v2.1.0 #498

Merged · 35 commits · Nov 29, 2022
Commits (35):
b972c55: update build process img (gspetro, Nov 16, 2022)
2bcd36a: update img (gspetro, Nov 16, 2022)
7c9c0ec: Merge branch 'ufs-community:develop' into develop (gspetro-NOAA, Nov 16, 2022)
d25b5c2: Merge branch 'ufs-community:develop' into develop (gspetro-NOAA, Nov 18, 2022)
76d561c: Revert changes (gspetro, Nov 18, 2022)
0390cde: Merge branch 'develop' of github.com:gspetro-NOAA/ufs-srweather-app i… (gspetro, Nov 18, 2022)
45920d1: Merge branch 'ufs-community:develop' into develop (gspetro-NOAA, Nov 21, 2022)
e79cec2: BuildSRW updates - esp mac/linux (gspetro, Nov 22, 2022)
add19ee: remove Contributor's Guide (gspetro, Nov 22, 2022)
7b7d6d9: update Glossary and RunSRW (gspetro, Nov 22, 2022)
2722d15: update build/run process images (gspetro, Nov 22, 2022)
1726c06: remove outdated images; add new (gspetro, Nov 22, 2022)
2216de9: Components & CCPP updates, misc (gspetro, Nov 22, 2022)
59ed2ff: ConfigWflow updates (gspetro, Nov 22, 2022)
913dd01: Container updates (gspetro, Nov 22, 2022)
94f3230: update FAQ (gspetro, Nov 22, 2022)
d3b7f03: graphics updates (gspetro, Nov 22, 2022)
959b35d: I/O updates (gspetro, Nov 22, 2022)
2536549: Intro updates (gspetro, Nov 22, 2022)
08e5700: Grid ch updates (gspetro, Nov 22, 2022)
03fb3df: Grid ch updates (gspetro, Nov 22, 2022)
0cd62b6: Quickstart Updates (gspetro, Nov 22, 2022)
e8102d6: Rocoto edits (gspetro, Nov 22, 2022)
2ff4c4e: WE2E ch (gspetro, Nov 22, 2022)
870e256: update supported tests (gspetro, Nov 22, 2022)
67cd01b: update fix file list (gspetro, Nov 22, 2022)
ddd6db4: minor update (gspetro, Nov 23, 2022)
fa123a5: minor edits (gspetro, Nov 23, 2022)
b1eb9d7: minor edits (gspetro, Nov 23, 2022)
dd343d6: misc minor fixes (gspetro, Nov 23, 2022)
29e80c9: change /home to /Users/gillianpetro (gspetro, Nov 24, 2022)
07a730a: fix geopotential height (gspetro, Nov 28, 2022)
bff0240: update hpc-stack doc link (gspetro, Nov 28, 2022)
30f1a14: nperlin changes (gspetro, Nov 28, 2022)
83c6037: nperlin changes (gspetro, Nov 28, 2022)
2 changes: 1 addition & 1 deletion README.md
@@ -1,6 +1,6 @@
# UFS Short-Range Weather Application

- The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. It is designed to be the source system for NOAA’s operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader weather enterprise. For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/.
+ The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. It is designed to be the source system for NOAA’s operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader Weather Enterprise. For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/.

The UFS includes multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/) that support different forecast durations and spatial domains. This documentation describes the development branch of the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The development branch of the application is continually evolving as the system undergoes open development. The latest SRW App release (v2.1.0) represents a snapshot of this continuously evolving system.

94 changes: 39 additions & 55 deletions docs/UsersGuide/source/BuildSRW.rst

Large diffs are not rendered by default.

37 changes: 17 additions & 20 deletions docs/UsersGuide/source/Components.rst

Large diffs are not rendered by default.

18 changes: 8 additions & 10 deletions docs/UsersGuide/source/ConfigWorkflow.rst
@@ -488,8 +488,7 @@ Verification Tasks
``RUN_TASK_VX_ENSPOINT``: (Default: false)
Flag that determines whether to run the ensemble point verification task. If this flag is set, both ensemble-stat point verification and point verification of ensemble-stat output is computed. The :ref:`MET Ensemble-Stat tool <ensemble-stat>` provides verification statistics for ensemble forecasts and can be used in conjunction with the :ref:`MET Point-Stat tool <point-stat>`. See :numref:`Section %s <VX-enspoint>` for additional parameters related to this task. Valid values: ``True`` | ``False``

- ..
-    COMMENT: Might be worth defining "ensemble-stat verification for gridded data," "ensemble point verification," "ensemble-stat point verification," and "point verification of ensemble-stat output"
+ .. COMMENT: COMMENT: Define "ensemble-stat verification for gridded data," "ensemble point verification," "ensemble-stat point verification," and "point verification of ensemble-stat output"?
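
For illustration, a minimal sketch of enabling the ensemble point verification task in ``config.yaml`` (assuming the ``RUN_TASK_*`` flags sit under a ``workflow_switches:`` section, as in other SRW App v2.1.0 configuration examples):

.. code-block:: console

   workflow_switches:
     RUN_TASK_VX_ENSPOINT: true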

.. _make-grid:

@@ -917,10 +916,12 @@ Non-default parameters for the ``make_lbcs`` task are set in the ``task_make_lbc
``OMP_STACKSIZE_MAKE_LBCS``: (Default: "1024m")
Controls the size of the stack for threads created by the OpenMP implementation.

+ .. _FcstConfigParams:

FORECAST Configuration Parameters
=====================================

- Non-default parameters for the ``run_fcst`` task are set in the ``task_run_fcst`` section of the ``config.yaml`` file.
+ Non-default parameters for the ``run_fcst`` task are set in the ``task_run_fcst:`` section of the ``config.yaml`` file.

Basic Task Parameters
---------------------------------
@@ -961,13 +962,15 @@ For each workflow task, certain parameter values must be passed to the job sched
``OMP_STACKSIZE_RUN_FCST``: (Default: "1024m")
Controls the size of the stack for threads created by the OpenMP implementation.

+ .. _ModelConfigParams:

Model Configuration Parameters
----------------------------------

These parameters set values in the Weather Model's ``model_configure`` file.

``DT_ATMOS``: (Default: "")
- Time step for the outermost atmospheric model loop in seconds. This corresponds to the frequency at which the physics routines and the top level dynamics routine are called. (Note that one call to the top-level dynamics routine results in multiple calls to the horizontal dynamics, :term:`tracer` transport, and vertical dynamics routines; see the `FV3 dycore scientific documentation <https://repository.library.noaa.gov/view/noaa/30725>`__ for details.) Must be set. Takes an integer value. In the SRW App, a default value for ``DT_ATMOS`` appears in the ``set_predef_grid_params.yaml`` script, but a different value can be set in ``config.yaml``.
+ Time step for the outermost atmospheric model loop in seconds. This corresponds to the frequency at which the physics routines and the top level dynamics routine are called. (Note that one call to the top-level dynamics routine results in multiple calls to the horizontal dynamics, :term:`tracer` transport, and vertical dynamics routines; see the `FV3 dycore scientific documentation <https://repository.library.noaa.gov/view/noaa/30725>`__ for details.) Must be set. Takes an integer value. In the SRW App, a default value for ``DT_ATMOS`` appears in the ``set_predef_grid_params.yaml`` script, but a different value can be set in ``config.yaml``. In general, the smaller the grid cell size is, the smaller this value needs to be in order to avoid numerical instabilities during the forecast.

``RESTART_INTERVAL``: (Default: 0)
Frequency of the output restart files in hours. Using the default interval (0), restart files are produced at the end of a forecast run. When ``RESTART_INTERVAL: 1``, restart files are produced every hour with the prefix "YYYYMMDD.HHmmSS." in the ``RESTART`` directory.
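
For illustration, a hedged sketch of how ``DT_ATMOS`` and ``RESTART_INTERVAL`` might be set in the ``task_run_fcst:`` section of ``config.yaml`` (the values shown are examples only, not recommendations):

.. code-block:: console

   task_run_fcst:
     DT_ATMOS: 40         # integer seconds; smaller grid cells generally require a smaller step
     RESTART_INTERVAL: 6  # write restart files every 6 hours instead of only at the end of the run
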
@@ -1012,7 +1015,7 @@ Write-Component (Quilting) Parameters
Flag that determines whether to output extra (debugging) information from :term:`ESMF` routines. Note that the write component uses ESMF library routines to interpolate from the native forecast model grid to the user-specified output grid (which is defined in the model configuration file ``model_configure`` in the forecast run directory). Valid values: ``True`` | ``False``

``WRTCMP_write_groups``: (Default: 1)
- The number of write groups (i.e., groups of :term:`MPI` tasks) to use in the write component.
+ The number of write groups (i.e., groups of :term:`MPI` tasks) to use in the write component. Each write group will write to one set of output files (a ``dynf${fhr}.nc`` and a ``phyf${fhr}.nc`` file, where ``${fhr}`` is the forecast hour). Each write group contains ``WRTCMP_write_tasks_per_group`` tasks. Usually, one write group is sufficient. This may need to be increased if the forecast is proceeding so quickly that a single write group cannot complete writing to its set of files before there is a need/request to start writing the next set of files at the next output time.

``WRTCMP_write_tasks_per_group``: (Default: 20)
The number of MPI tasks to allocate for each write group.
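
As an illustrative sketch, assuming the write-component variables are set in the same ``task_run_fcst:`` section as the other forecast parameters, a single write group with the default number of tasks might look like:

.. code-block:: console

   task_run_fcst:
     QUILTING: true
     WRTCMP_write_groups: 1            # one set of dynf/phyf output files at a time
     WRTCMP_write_tasks_per_group: 20  # MPI tasks allocated to that group
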
@@ -1912,11 +1915,6 @@ The parameters below turn on SPP in Noah or RUC LSM (support for Noah MP is in p
``LSM_SPP_MAG_LIST``: (Default: [ 0.017, 0.001, 0.001, 0.001, 0.001, 0.001, 0.2 ] )
Sets the maximum random pattern amplitude for each of the LSM perturbations.

- .. COMMENT: This variable no longer appears and was going to be removed. See if anything has replaced it.
- ``LSM_SPP_EACH_STEP``: (Default: "true")
-    When set to "TRUE", it sets ``lndp_each_step=.true.`` and perturbs each time step.

.. _HaloBlend:

Halo Blend Parameter
18 changes: 9 additions & 9 deletions docs/UsersGuide/source/ContainerQuickstart.rst
@@ -4,7 +4,7 @@
Container-Based Quick Start Guide
====================================

- This Container-Based Quick Start Guide will help users build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a `Singularity <https://sylabs.io/guides/3.5/user-guide/introduction.html>`__ container. The :term:`container` approach provides a uniform enviroment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`'s, and package versions available. Installation via Singularity container reduces this variability and allows for a smoother SRW App build experience. Normally, containers can only run on a single compute node and are not compatible with the `Rocoto workflow manager <https://github.com/christopherwharrop/rocoto/wiki/Documentation>`__, so users must run each task in the workflow manually. However, the Singularity container described in this chapter has been adapted such that it is able to run across multiple nodes using Rocoto. This makes it an excellent starting point for beginners. The :ref:`non-container build approach <BuildSRW>` may still be more appropriate for users who desire additional customizability, particularly if they already have experience running the SRW App.
+ This Container-Based Quick Start Guide will help users build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a `Singularity <https://sylabs.io/guides/3.5/user-guide/introduction.html>`__ container. The :term:`container` approach provides a uniform enviroment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPIs <MPI>`, and package versions available. Installation via Singularity container reduces this variability and allows for a smoother SRW App build experience. Normally, containers can only run on a single compute node and are not compatible with the `Rocoto workflow manager <https://github.com/christopherwharrop/rocoto/wiki/Documentation>`__, so users must run each task in the workflow manually. However, the Singularity container described in this chapter has been adapted such that it is able to run across multiple nodes using Rocoto. This makes it an excellent starting point for beginners. The :ref:`non-container build approach <BuildSRW>` may still be more appropriate for users who desire additional customizability, particularly if they already have experience running the SRW App.

The "out-of-the-box" SRW App case described in this User's Guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization.

@@ -28,7 +28,7 @@ Users must have an **Intel** compiler and :term:`MPI` (available for free `here
Install Singularity
^^^^^^^^^^^^^^^^^^^^^^^

- To build and run the SRW App using a Singularity container, first install the Singularity package according to the `Singularity Installation Guide <https://sylabs.io/guides/3.2/user-guide/installation.html#>`__. This will include the installation of dependencies and the installation of the Go programming language. SingularityCE Version 3.7 or above is recommended.
+ To build and run the SRW App using a Singularity container, first install the Singularity package according to the `Singularity Installation Guide <https://docs.sylabs.io/guides/3.2/user-guide/installation.html>`__. This will include the installation of dependencies and the installation of the Go programming language. SingularityCE Version 3.7 or above is recommended.
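
As a quick sanity check (not part of the installation guide itself), users can verify afterwards that the installation succeeded and meets the version recommendation:

.. code-block:: console

   singularity --version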

.. warning::
Docker containers can only be run with root privileges, and users cannot have root privileges on :term:`HPCs <HPC>`. Therefore, it is not possible to build the SRW App, which uses the HPC-Stack, inside a Docker container on an HPC system. However, a Singularity image may be built directly from a Docker image for use on the system.
@@ -124,11 +124,11 @@ On non-Level 1 systems, users should build the container in a writable sandbox:
Some users may prefer to issue the command without the ``sudo`` prefix. Whether ``sudo`` is required is system-dependent.

.. note::
- Users can choose to build a release version of the container (SRW App v2.0.0) using a similar command:
+ Users can choose to build a release version of the container (SRW App v2.1.0) using a similar command:

.. code-block:: console

- sudo singularity build --sandbox ubuntu20.04-intel-srwapp docker://noaaepic/ubuntu20.04-intel22-ufs-srwapp:release-public-v2
+ sudo singularity build --sandbox ubuntu20.04-intel-srwapp docker://noaaepic/ubuntu20.04-intel-srwapp:release-public-v2.1.0


.. _WorkOnHPC:
@@ -210,7 +210,7 @@ Users can run ``exit`` to exit the shell.
Download and Stage the Data
============================

- The SRW App requires input files to run. These include static datasets, initial and boundary condition files, and model configuration files. On Level 1 systems, the data required to run SRW App tests are already available as long as the bind argument (starting with ``-B``) in :numref:`Step %s <BuildC>` included the directory with the input model data. For Level 2-4 systems, the data must be added manually by the user. Detailed instructions on how to add the data can be found in :numref:`Section %s <DownloadingStagingInput>`. Sections :numref:`%s <Input>` and :numref:`%s <OutputFiles>` contain useful background information on the input and output files used in the SRW App.
+ The SRW App requires input files to run. These include static datasets, initial and boundary condition files, and model configuration files. On Level 1 systems, the data required to run SRW App tests are already available as long as the bind argument (starting with ``-B``) in :numref:`Step %s <BuildC>` included the directory with the input model data. See :numref:`Table %s <DataLocations>` for Level 1 data locations. For Level 2-4 systems, the data must be added manually by the user. Detailed instructions on how to add the data can be found in :numref:`Section %s <DownloadingStagingInput>`. Sections :numref:`%s <Input>` and :numref:`%s <OutputFiles>` contain useful background information on the input and output files used in the SRW App.
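
For illustration only, a hypothetical bind-mount invocation is sketched below; the data path is a placeholder that must match the user's own staged data location, and the sandbox name follows the build example earlier in this chapter:

.. code-block:: console

   singularity shell -B /path/to/input_model_data:/path/to/input_model_data ./ubuntu20.04-intel-srwapp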

.. _GenerateForecastC:

@@ -246,7 +246,7 @@ To activate the regional workflow, run the following commands:

where:

- * ``<path/to/modulefiles>`` is replaced with the actual path to the modulefiles on the user's system (often ``$PWD/modulefiles``), and
+ * ``<path/to/modulefiles>`` is replaced with the actual path to the modulefiles on the user's local system (often ``$PWD/modulefiles``), and
* ``<platform>`` is a valid, lowercased machine/platform name (see the ``MACHINE`` variable in :numref:`Section %s <user>`).
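
Based on the placeholders defined in the list above, the activation commands follow this general pattern (a sketch, not the verbatim commands from the guide):

.. code-block:: console

   module use <path/to/modulefiles>
   module load wflow_<platform>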

The ``wflow_<platform>`` modulefile will then output instructions to activate the regional workflow. The user should run the commands specified in the modulefile output. For example, if the output says:
@@ -310,7 +310,7 @@ From here, users can follow the steps below to configure the out-of-the-box SRW
.. code-block:: console

USE_CRON_TO_RELAUNCH: TRUE
- CRON_RELAUNCH_INTVL_MNTS: 02
+ CRON_RELAUNCH_INTVL_MNTS: 3

There are instructions for running the experiment via additional methods in :numref:`Section %s <Run>`. However, this technique (automation via :term:`crontab`) is the simplest option.

@@ -351,7 +351,7 @@ Run the following command to generate the workflow:

.. code-block:: console

- python generate_FV3LAM_wflow.py
+ ./generate_FV3LAM_wflow.py

This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The last line of output from this script should start with ``*/3 * * * *`` (or similar).
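
Assuming the crontab-based automation described earlier is in use, one way to confirm that the relaunch entry was added (and to see that ``*/3 * * * *`` line) is to list the current user's cron jobs:

.. code-block:: console

   crontab -l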

@@ -387,7 +387,7 @@ where:
New Experiment
===============

- To run a new experiment in the container at a later time, users will need to rerun the commands in :numref:`Section %s <SetUpPythonEnvC>` to reactivate the regional workflow. Then, users can configure a new experiment by updating the environment variables in ``config.yaml`` to reflect the desired experiment configuration. Basic instructions appear in :numref:`Section %s <SetUpConfigFileC>` above, and detailed instructions can be viewed in :numref:`Section %s <UserSpecificConfig>`. After adjusting the configuration file, regenerate the experiment by running ``python generate_FV3LAM_wflow.py``.
+ To run a new experiment in the container at a later time, users will need to rerun the commands in :numref:`Section %s <SetUpPythonEnvC>` to reactivate the regional workflow. Then, users can configure a new experiment by updating the environment variables in ``config.yaml`` to reflect the desired experiment configuration. Basic instructions appear in :numref:`Section %s <SetUpConfigFileC>` above, and detailed instructions can be viewed in :numref:`Section %s <UserSpecificConfig>`. After adjusting the configuration file, regenerate the experiment by running ``./generate_FV3LAM_wflow.py``.
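
In short, the reconfigure-and-regenerate cycle is sketched below (the editor is a placeholder, and the script is assumed to be run from the directory where it resides, typically ``ush``):

.. code-block:: console

   vi config.yaml               # adjust experiment variables as desired
   ./generate_FV3LAM_wflow.py   # regenerate the experiment directory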

Plot the Output
===============