From 370834b0d28931ddb52fefcf4f473400b4917298 Mon Sep 17 00:00:00 2001
From: Michael Lueken <63728921+MichaelLueken@users.noreply.github.com>
Date: Mon, 21 Nov 2022 12:13:41 -0500
Subject: [PATCH 001/124] [develop] Updated README.md and Introduction.rst for
the SRW v2.1.0 release (#493)
---
README.md | 6 +++---
docs/UsersGuide/source/Introduction.rst | 6 +++---
2 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/README.md b/README.md
index 6956aa7a40..f746ae5c78 100644
--- a/README.md
+++ b/README.md
@@ -2,14 +2,14 @@
The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. It is designed to be the source system for NOAA’s operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader weather enterprise. For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/.
-The UFS includes multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/) that support different forecast durations and spatial domains. This documentation describes the development branch of the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The development branch of the application is continually evolving as the system undergoes open development. The latest SRW App release (v2.0.0) represents a snapshot of this continuously evolving system.
+The UFS includes multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/) that support different forecast durations and spatial domains. This documentation describes the development branch of the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The development branch of the application is continually evolving as the system undergoes open development. The latest SRW App release (v2.1.0) represents a snapshot of this continuously evolving system.
-The UFS SRW App User's Guide associated with the development branch is at: https://ufs-srweather-app.readthedocs.io/en/develop/, while the guide specific to the SRW App v2.0.0 release can be found at: https://ufs-srweather-app.readthedocs.io/en/release-public-v2/. The repository is at: https://github.com/ufs-community/ufs-srweather-app.
+The UFS SRW App User's Guide associated with the development branch is at: https://ufs-srweather-app.readthedocs.io/en/develop/, while the guide specific to the SRW App v2.1.0 release can be found at: https://ufs-srweather-app.readthedocs.io/en/release-public-v2.1.0/. The repository is at: https://github.com/ufs-community/ufs-srweather-app.
For instructions on how to clone the repository, build the code, and run the workflow, see:
https://github.com/ufs-community/ufs-srweather-app/wiki/Getting-Started
-UFS Development Team. (2022, June 23). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.0.0). Zenodo. https://doi.org/10.5281/zenodo.6505854
+UFS Development Team. (2022, Nov. 17). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.1.0). Zenodo. https://doi.org/10.5281/zenodo.7277602
[](https://github.com/ufs-community/ufs-srweather-app/actions/workflows/python_unittests.yaml)
[](https://github.com/ufs-community/ufs-srweather-app/actions/workflows/python_func_tests.yaml)
diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst
index 7e39ba7769..476f013838 100644
--- a/docs/UsersGuide/source/Introduction.rst
+++ b/docs/UsersGuide/source/Introduction.rst
@@ -6,13 +6,13 @@ Introduction
The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. NOAA's operational model suite for numerical weather prediction (:term:`NWP`) is quickly transitioning to the UFS from a number of different modeling systems. The UFS enables research, development, and contribution opportunities within the broader :term:`Weather Enterprise` (including government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__.
-The UFS includes `multiple applications `__ that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The SRW Application v2.0.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `__. New and improved capabilities for this release include the addition of a verification package (METplus) for both deterministic and ensemble simulations and support for four stochastically perturbed physics schemes. Future work will expand the capabilities of the application to include data assimilation (DA) and a forecast restart/cycling capability.
+The UFS includes `multiple applications `__ that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The SRW Application v2.1.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `__. New and improved capabilities for this release include the addition of a verification package (METplus) for both deterministic and ensemble simulations and support for four stochastically perturbed physics schemes. Future work will expand the capabilities of the application to include data assimilation (DA) and a forecast restart/cycling capability.
This documentation provides a :ref:`Quick Start Guide ` designed for use on `Level 1 systems `__ or as an overview of the workflow. It also provides a :ref:`Container-Based Quick Start Guide ` for running the SRW Application in a container and detailed chapters on :ref:`building ` and :ref:`running ` the SRW App on any supported platform. Additionally, this User's Guide provides an overview of the :ref:`release components ` and details on how to customize or modify different portions of the workflow.
-The SRW App v2.0.0 citation is as follows and should be used when presenting results based on research conducted with the App:
+The SRW App v2.1.0 citation is as follows and should be used when presenting results based on research conducted with the App:
-UFS Development Team. (2022, June 23). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.0.0). Zenodo. https://doi.org/10.5281/zenodo.6505854
+UFS Development Team. (2022, Nov. 17). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.1.0). Zenodo. https://doi.org/10.5281/zenodo.7277602
How to Use This Document
From f0963f62f77516f4e7c1203e08b40a8d80012372 Mon Sep 17 00:00:00 2001
From: Mark Potts <33099090+mark-a-potts@users.noreply.github.com>
Date: Mon, 28 Nov 2022 09:30:59 -0500
Subject: [PATCH 002/124] [develop] Python-based crontab replacement (#466)
* new script to run and monitor e2e tests without cron
* fixed find command and directed system output to /dev/null
* Modify run_WE2E_tests.sh
- When on Cheyenne, set use_cron_to_relaunch=false
- When use_cron_to_relaunch=false, output a message at the end of the script
describing how to run the new run_srw_tests.py script to manage the tests
- Regardless of other variables, print a message at the end of the script
showing the user where the test directory is
* Updated .cicd/scripts/srw_test.sh to work with ush/run_srw_tests.py (example invocation below)
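A typical invocation of the new monitoring script is sketched below. The
experiment path is illustrative; it should point to the expt_basedir that
run_WE2E_tests.sh used when staging the experiments:

    cd ush
    ./run_srw_tests.py -e=/path/to/expt_dirs

The script launches launch_FV3LAM_wflow.sh in each experiment directory found
under -e/--exptdir and keeps relaunching it until every workflow reports a
final status.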
Co-authored-by: Michael Kavulich, Jr
Co-authored-by: michael.lueken
---
.cicd/scripts/srw_test.sh | 22 ++++++++++---
tests/WE2E/run_WE2E_tests.sh | 42 ++++++++++++++++++++++++
ush/run_srw_tests.py | 62 ++++++++++++++++++++++++++++++++++++
3 files changed, 122 insertions(+), 4 deletions(-)
create mode 100755 ush/run_srw_tests.py
diff --git a/.cicd/scripts/srw_test.sh b/.cicd/scripts/srw_test.sh
index 4c4cbc95af..8df2ff5c2f 100755
--- a/.cicd/scripts/srw_test.sh
+++ b/.cicd/scripts/srw_test.sh
@@ -42,13 +42,22 @@ cd ${we2e_test_dir}
expt_basedir=${we2e_experiment_base_dir} \
opsroot=${nco_dir}
-# Allow the tests to start before checking for status.
-# TODO: Create a parameter that sets the initial start delay.
-sleep 300
+# Run the new run_srw_tests script if the machine is Cheyenne.
+if [[ "${platform}" = "cheyenne" ]]; then
+ cd ${workspace}/ush
+ ./run_srw_tests.py -e=${we2e_experiment_base_dir}
+ cd ${we2e_test_dir}
+fi
# Progress file
progress_file="${workspace}/we2e_test_results-${platform}-${SRW_COMPILER}.txt"
+# Allow the tests to start before checking for status.
+# TODO: Create a parameter that sets the initial start delay.
+if [[ "${platform}" != "cheyenne" ]]; then
+ sleep 300
+fi
+
# Wait for all tests to complete.
while true; do
@@ -74,10 +83,15 @@ done
# TODO: Create parameter that sets the interval for the we2e cron jobs; this
# value should be some factor of that interval to ensure the cron jobs execute
# before the workspace is cleaned up.
-sleep 600
+if [[ "${platform}" != "cheyenne" ]]; then
+ sleep 600
+fi
# Set exit code to number of failures
set +e
failures=$(grep "Workflow status: FAILURE" ${progress_file} | wc -l)
+if [[ $failures -ne 0 ]]; then
+ failures=1
+fi
set -e
exit ${failures}
diff --git a/tests/WE2E/run_WE2E_tests.sh b/tests/WE2E/run_WE2E_tests.sh
index 461041c08d..3887483ecd 100755
--- a/tests/WE2E/run_WE2E_tests.sh
+++ b/tests/WE2E/run_WE2E_tests.sh
@@ -395,6 +395,16 @@ if [ -z "${machine}" ]; then
The argument \"machine\" specifying the machine or platform on which to
run the WE2E tests was not specified in the call to this script. \
${help_msg}"
+fi
+# Cheyenne-specific test limitation
+
+if [ "${machine,,}" = "cheyenne" ]; then
+ use_cron_to_relaunch=FALSE
+ echo "
+Due to system limitations, the 'use_cron_to_relaunch' option cannot be used on
+the '${machine}' machine. Setting this variable to false.
+
+"
fi
if [ -z "${account}" ]; then
@@ -1310,6 +1320,38 @@ Could not generate an experiment for the test specified by test_name:
test_name = \"${test_name}\""
done
+
+# Print notes about monitoring/running jobs if use_cron_to_relaunch = FALSE
+topdir=${scrfunc_dir%/*/*/*}
+expt_dirs_fullpath="${topdir}/expt_dirs"
+
+echo "
+ ========================================================================
+ ========================================================================
+
+ All experiments have been generated in the directory
+ ${expt_dirs_fullpath}
+
+ ========================================================================
+ ========================================================================
+"
+
+if [ "${use_cron_to_relaunch,,}" = "false" ]; then
+ echo "
+
+The variable 'use_cron_to_relaunch' has been set to FALSE. Jobs will not be automatically run via crontab.
+
+You can run each task manually in the experiment directory:
+(${expt_dirs_fullpath})
+
+Or you can use the 'run_srw_tests.py' script in the ush/ directory:
+
+ cd $USHdir
+ ./run_srw_tests.py -e=${expt_dirs_fullpath}
+
+"
+fi
+
#
#-----------------------------------------------------------------------
#
diff --git a/ush/run_srw_tests.py b/ush/run_srw_tests.py
new file mode 100755
index 0000000000..9e77be14b8
--- /dev/null
+++ b/ush/run_srw_tests.py
@@ -0,0 +1,62 @@
+#!/usr/bin/env python3
+
+import os
+import subprocess
+import time
+import argparse
+
+# Python class to handle the launching of a set of SRW tests
+# The expectation is to have a "clean" experiment directory with only new experiments
+# that are ready to run (e.g., no _old* experiments left around from previous tests).
+# This script takes only one parameter, "-e" or "--exptdir", which points to the
+# expt_basedir specified when run_WE2E_tests.sh is run to set up the tests.
+# The script will work sequentially through each of the test directories and
+# launch the workflow for each with a call to launch_FV3LAM_wflow.sh
+# After the initial launch, the checkTests method is called to monitor the
+# status of each test and call the launch_FV3LAM_wflow.sh script repeatedly
+# in each uncompleted workflow until all workflows are done.
+class SRWTest:
+ def __init__(self, exptdir):
+ self.exptdir=exptdir
+ # Get a list of test directories
+ cmdstring="find {} -maxdepth 1 -type d | tail -n+2".format(self.exptdir)
+ status= subprocess.check_output(cmdstring,shell=True).strip().decode('utf-8')
+ # Turn the stdout from the shell command into a list
+ self.testDirectories = status.split("\n")
+ self.launchcmd = "./launch_FV3LAM_wflow.sh >& /dev/null"
+ # Loop through each of the test directories and launch the initial jobs in the workflow
+ for testD in self.testDirectories:
+ print("starting {} workflow".format(testD))
+ os.chdir(testD)
+ os.system(self.launchcmd)
+ os.chdir(self.exptdir)
+ # Now start monitoring the workflows
+ self.checkTests()
+
+ def checkTests(self):
+ while(len(self.testDirectories) > 0):
+ # Only continue running launch command for workflows that aren't complete
+ # so check for any that have failed or completed and cull them from the list
+ cmdstring="grep -L 'wflow_status =' */log.launch_FV3LAM_wflow | xargs dirname"
+ try:
+ status= subprocess.check_output(cmdstring,shell=True).strip().decode('utf-8')
+ except subprocess.CalledProcessError:  # pipeline fails once no incomplete workflows remain
+ print("Tests have all completed")
+ return
+ self.testDirectories = status.split("\n")
+ # continue looping through directories
+ for testD in self.testDirectories:
+ os.chdir(testD)
+ os.system(self.launchcmd)
+ os.chdir(self.exptdir)
+ print("calling launch_FV3LAM_wflow.sh from {}".format(testD))
+ time.sleep(5.0)
+ time.sleep(30.0)
+
+if __name__ == "__main__":
+ parser = argparse.ArgumentParser(description='Run through a set of SRW WE2E tests until they are complete')
+ parser.add_argument('-e','--exptdir', help='directory where experiments have been staged', required=False,default=os.getcwd())
+ args = vars(parser.parse_args())
+
+ test = SRWTest(args['exptdir'])
+
From 98edca0bc25875432936944e8ef87acdc5e34781 Mon Sep 17 00:00:00 2001
From: Gillian Petro <96886803+gspetro-NOAA@users.noreply.github.com>
Date: Tue, 29 Nov 2022 09:38:14 -0500
Subject: [PATCH 003/124] [develop] Pull in relevant doc updates from release
v2.1.0 (#498)
* update build process img
* update img
* Revert changes
This reverts commit b839658fec9240b8e20d8ace2a70162219ed7906.
* BuildSRW updates - esp mac/linux
* remove Contributor's Guide
* update Glossary and RunSRW
* update build/run process images
* remove outdated images; add new
* Components & CCPP updates, misc
* ConfigWflow updates
* Container updates
* update FAQ
* graphics updates
* I/O updates
* Intro updates
* Grid ch updates
* Grid ch updates
* Quickstart Updates
* Rocoto edits
* WE2E ch
* update supported tests
* update fix file list
* minor update
* minor edits
* minor edits
* misc minor fixes
* change /home to /Users/gillianpetro
* fix geopotential height
* update hpc-stack doc link
* nperlin changes
* nperlin changes
Co-authored-by: gspetro
---
README.md | 2 +-
docs/UsersGuide/source/BuildSRW.rst | 94 ++--
docs/UsersGuide/source/Components.rst | 37 +-
docs/UsersGuide/source/ConfigWorkflow.rst | 18 +-
.../UsersGuide/source/ContainerQuickstart.rst | 18 +-
docs/UsersGuide/source/ContributorsGuide.rst | 463 ------------------
docs/UsersGuide/source/FAQ.rst | 173 +++++--
docs/UsersGuide/source/Glossary.rst | 31 +-
docs/UsersGuide/source/Graphics.rst | 114 ++---
docs/UsersGuide/source/InputOutputFiles.rst | 121 +++--
docs/UsersGuide/source/Introduction.rst | 149 +++---
docs/UsersGuide/source/LAMGrids.rst | 199 ++------
docs/UsersGuide/source/Quickstart.rst | 67 +--
docs/UsersGuide/source/RocotoInfo.rst | 14 +-
docs/UsersGuide/source/RunSRW.rst | 293 +++++------
docs/UsersGuide/source/WE2Etests.rst | 49 +-
.../source/_static/FV3LAM_wflow_flowchart.png | Bin 56074 -> 0 bytes
.../_static/FV3LAM_wflow_flowchart_v2.png | Bin 561715 -> 0 bytes
.../_static/FV3LAM_wflow_input_path.png | Bin 64507 -> 0 bytes
.../source/_static/FV3LAM_wflow_overall.png | Bin 52544 -> 0 bytes
.../_static/FV3regional_workflow_gen.png | Bin 81430 -> 0 bytes
.../_static/FV3regional_workflow_gen_v2.png | Bin 550227 -> 0 bytes
.../source/_static/SRW_build_process.png | Bin 59669 -> 60664 bytes
.../_static/SRW_overall_workflow_run.png | Bin 84193 -> 0 bytes
.../source/_static/SRW_run_process.png | Bin 0 -> 81455 bytes
.../source/_static/SRW_wflow_input_path.png | Bin 0 -> 141217 bytes
docs/UsersGuide/source/conf.py | 2 +-
docs/UsersGuide/source/index.rst | 1 -
docs/UsersGuide/source/tables/.DS_Store | Bin 6148 -> 6148 bytes
docs/UsersGuide/source/tables/CCPPUpdates.rst | 58 +++
docs/UsersGuide/source/tables/Tests.csv | 64 +--
docs/UsersGuide/source/tables/Tests.rst | 6 +-
.../source/tables/fix_file_list.rst | 26 +-
33 files changed, 757 insertions(+), 1242 deletions(-)
delete mode 100644 docs/UsersGuide/source/ContributorsGuide.rst
delete mode 100644 docs/UsersGuide/source/_static/FV3LAM_wflow_flowchart.png
delete mode 100644 docs/UsersGuide/source/_static/FV3LAM_wflow_flowchart_v2.png
delete mode 100644 docs/UsersGuide/source/_static/FV3LAM_wflow_input_path.png
delete mode 100644 docs/UsersGuide/source/_static/FV3LAM_wflow_overall.png
delete mode 100644 docs/UsersGuide/source/_static/FV3regional_workflow_gen.png
delete mode 100644 docs/UsersGuide/source/_static/FV3regional_workflow_gen_v2.png
delete mode 100644 docs/UsersGuide/source/_static/SRW_overall_workflow_run.png
create mode 100644 docs/UsersGuide/source/_static/SRW_run_process.png
create mode 100644 docs/UsersGuide/source/_static/SRW_wflow_input_path.png
create mode 100644 docs/UsersGuide/source/tables/CCPPUpdates.rst
diff --git a/README.md b/README.md
index f746ae5c78..bcb75fe601 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
# UFS Short-Range Weather Application
-The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. It is designed to be the source system for NOAA’s operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader weather enterprise. For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/.
+The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. It is designed to be the source system for NOAA’s operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader Weather Enterprise. For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/.
The UFS includes multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/) that support different forecast durations and spatial domains. This documentation describes the development branch of the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The development branch of the application is continually evolving as the system undergoes open development. The latest SRW App release (v2.1.0) represents a snapshot of this continuously evolving system.
diff --git a/docs/UsersGuide/source/BuildSRW.rst b/docs/UsersGuide/source/BuildSRW.rst
index 85790b482c..4cf6ff71ae 100644
--- a/docs/UsersGuide/source/BuildSRW.rst
+++ b/docs/UsersGuide/source/BuildSRW.rst
@@ -30,18 +30,18 @@ To build the SRW App, users will complete the following steps:
.. _HPCstackInfo:
-Install the HPC-Stack
-========================
+Install the Prerequisite Software Stack
+==========================================
-.. Attention::
- Skip the HPC-Stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion, NOAA Cloud).
+Currently, installation of the prerequisite software stack is supported via HPC-Stack. :term:`HPC-Stack` is a repository that provides a unified, shell script-based system to build the software stack required for `UFS `__ applications such as the SRW App.
-**Definition:** :term:`HPC-Stack` is a repository that provides a unified, shell script-based build system to build the software stack required for `UFS `_ applications such as the SRW App.
+.. Attention::
+ Skip the HPC-Stack installation if working on a `Level 1 system `__ (e.g., Cheyenne, Hera, Orion, NOAA Cloud), and :ref:`continue to the next section `.
Background
----------------
-The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g., NCEPLIBS, FMS) to libraries developed by NOAA's partners (e.g., PIO, ESMF) to truly third party libraries (e.g., NETCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW App.
+The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g., NCEPLIBS, FMS) to libraries developed by NOAA's partners (e.g., PIO, ESMF) to truly third party libraries (e.g., netCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW App.
Instructions
-------------------------
@@ -57,6 +57,9 @@ Users working on systems that fall under `Support Levels 2-4 `.
+.. attention::
+ Although HPC-Stack is the fully-supported option as of the v2.1.0 release, UFS applications are gradually shifting to :term:`spack-stack`, which is a :term:`Spack`-based method for installing UFS prerequisite software libraries. The spack-stack is currently used on NOAA Cloud platforms and in containers, while HPC-Stack is still used on other Level 1 systems and is the software stack validated by the UFS Weather Model as of the v2.1.0 release. Users are encouraged to check out `spack-stack `__ to prepare for the upcoming shift in support from HPC-Stack to spack-stack.
+
After completing installation, continue to the next section (:numref:`Section %s: Download the UFS SRW Application Code `).
.. _DownloadSRWApp:
@@ -114,7 +117,7 @@ The cloned repository contains the configuration files and sub-directories shown
| | configuration files such as ``model_configure``, |
| | ``diag_table``, and ``field_table``. |
+--------------------------------+-----------------------------------------------------------+
- | README.md | Getting Started Guide |
+ | README.md | Contains SRW App introductory information |
+--------------------------------+-----------------------------------------------------------+
| rename_model.sh | Used to rename the model before it becomes NCO |
| | operational. The SRW App is a generic app that is the base|
@@ -160,7 +163,7 @@ Run the executable that pulls in SRW App components from external repositories:
.. code-block:: console
- cd $SRW
+ cd
./manage_externals/checkout_externals
The script should output dialogue indicating that it is retrieving different code repositories. It may take several minutes to download these repositories.
@@ -169,7 +172,7 @@ To see more options for the ``checkout_externals`` script, users can run ``./man
* ``-S``: Outputs the status of the repositories managed by ``checkout_externals``. By default only summary information is provided. Use with the ``-v`` (verbose) option to see details.
* ``-x [EXCLUDE [EXCLUDE ...]]``: allows users to exclude components when checking out externals.
- * ``-o``: By default only the required externals are checked out. This flag will also checkout the optional externals.
+ * ``-o``: By default only the required externals are checked out. This flag will also check out the optional externals.
Generally, users will not need to use the options and can simply run the script, but the options are available for those who are curious.
@@ -189,10 +192,10 @@ On Level 1 systems for which a modulefile is provided under the ``modulefiles``
./devbuild.sh --platform=
-where ```` is replaced with the name of the platform the user is working on. Valid values are: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``linux`` | ``macos`` | ``noaacloud`` | ``odin`` | ``orion`` | ``singularity`` | ``wcoss2``
+where ```` is replaced with the name of the platform the user is working on. Valid values include: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``linux`` | ``macos`` | ``noaacloud`` | ``orion``
.. note::
- Although build modulefiles exist for generic Linux and MacOS machines, users will need to alter these according to the instructions in Sections :numref:`%s ` & :numref:`%s `. Users on these systems may have more success building the SRW App with the :ref:`CMake Approach ` instead.
+ Although build modulefiles exist for generic Linux and MacOS machines, users will need to alter these according to the instructions in Sections :numref:`%s ` & :numref:`%s `. Users on these systems may have more success building the SRW App with the :ref:`CMake Approach ` instead.
If compiler auto-detection fails for some reason, specify it using the ``--compiler`` argument. For example:
@@ -375,7 +378,7 @@ Set Up the Build Environment
.. attention::
* If users successfully built the executables in :numref:`Table %s `, they should skip to step :numref:`Chapter %s `.
- * Users who want to build the SRW App on a generic MacOS should skip to :numref:`Section %s ` and follow the approach there.
+ * Users who want to build the SRW App on MacOS or generic Linux systems should skip to :numref:`Section %s ` and follow the approach there.
If the ``devbuild.sh`` approach failed, users need to set up their environment to run a workflow on their specific platform. First, users should make sure ``Lmod`` is the app used for loading modulefiles. This is the case on most Level 1 systems; however, on systems such as Gaea/Odin, the default modulefile loader is from Cray and must be switched to Lmod. For example, on Gaea, users can run one of the following two commands depending on whether they have a bash or csh shell, respectively:
@@ -384,9 +387,11 @@ If the ``devbuild.sh`` approach failed, users need to set up their environment t
source etc/lmod-setup.sh gaea
source etc/lmod-setup.csh gaea
-If users execute one of the above commands on systems that don't need it, it will not cause any problems (it will simply do a ``module purge``).
+.. note::
-From here on, ``Lmod`` is ready to load the modulefiles needed by the SRW App. These modulefiles are located in the ``modulefiles`` directory. To load the necessary modulefile for a specific ```` using a given ````, run:
+ If users execute one of the above commands on systems that don't need it, it will not cause any problems (it will simply do a ``module purge``).
+
+From here, ``Lmod`` is ready to load the modulefiles needed by the SRW App. These modulefiles are located in the ``modulefiles`` directory. To load the necessary modulefile for a specific ```` using a given ````, run:
.. code-block:: console
@@ -423,7 +428,7 @@ From the build directory, run the following commands to build the pre-processing
cmake .. -DCMAKE_INSTALL_PREFIX=.. -DCMAKE_INSTALL_BINDIR=exec ..
make -j 4 >& build.out &
-``-DCMAKE_INSTALL_PREFIX`` specifies the location in which the ``exec``, ``include``, ``lib``, and ``share`` directories will be created. These directories will contain various components of the SRW App. Its recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` argument ``-j 4`` indicates that the build will run in parallel with 4 threads. Although users can specify a larger or smaller number of threads (e.g., ``-j 8``, ``-j 2``), it is highly recommended to use at least 4 parallel threads to prevent overly long installation times.
+``-DCMAKE_INSTALL_PREFIX`` specifies the location where the ``exec``, ``include``, ``lib``, and ``share`` directories will be created. These directories will contain various components of the SRW App. Its recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` argument ``-j 4`` indicates that the build will run in parallel with 4 threads. Although users can specify a larger or smaller number of threads (e.g., ``-j 8``, ``-j 2``), it is highly recommended to use at least 4 parallel threads to prevent overly long installation times.
The build will take a few minutes to complete. When it starts, a random number is printed to the console, and when it is done, a ``[1]+ Done`` message is printed to the console. ``[1]+ Exit`` indicates an error. Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, users should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/exec`` directory. These executables are described in :numref:`Table %s `.
@@ -431,62 +436,41 @@ The build will take a few minutes to complete. When it starts, a random number i
If you see the ``build.out`` file, but there is no ``ufs-srweather-app/exec`` directory, wait a few more minutes for the build to complete.
-.. _MacDetails:
+.. _MacLinuxDetails:
-Additional Details for Building on MacOS
-------------------------------------------
+Additional Details for Building on MacOS or Generic Linux
+------------------------------------------------------------
.. note::
- Users who are **not** building the SRW App on a MacOS machine may skip to the :numref:`Section %s ` to finish building the SRW App or continue to :numref:`Chapter %s ` to configure and run an experiment.
+ Users who are **not** building the SRW App on MacOS or generic Linux platforms may skip to :numref:`Section %s ` to finish building the SRW App or continue to :numref:`Chapter %s ` to configure and run an experiment.
-The SRW App can be built on MacOS machines, presuming HPC-Stack has already been installed successfully. The following two options have been tested:
+The SRW App can be built on MacOS and generic Linux machines after the prerequisite software has been installed on these systems (via :term:`HPC-Stack` or :term:`spack-stack`). The installation for MacOS is architecture-independent and has been tested using both x86_64 and M1 chips (running natively). The following configurations for MacOS have been tested:
-* **Option 1:** MacBookAir 2020, M1 chip (arm64, running natively), 4+4 cores, Big Sur 11.6.4, GNU compiler suite v.11.2.0_3 (gcc, gfortran, g++); no MPI pre-installed
+ #. MacBookPro 2019, 2.4 GHz 8-core Intel Core i9 (x86_64), Monterey 12.1, GNU compiler suite v.11.3.0 (gcc, gfortran, g++); mpich 3.3.2 or openmpi/4.1.2
+ #. MacBookAir 2020, M1 chip (arm64, running natively), 4+4 cores, Big Sur 11.6.4, GNU compiler suite v.11.3.0 (gcc, gfortran, g++); mpich 3.3.2 or openmpi/4.1.2
+ #. MacBook Pro 2015, 2.8 GHz Quad-Core Intel Core i7 (x86_64), Catalina OS X 10.15.7, GNU compiler suite v.11.2.0_3 (gcc, gfortran, g++); mpich 3.3.2 or openmpi/4.1.2
-* **Option 2:** MacBook Pro 2015, 2.8 GHz Quad-Core Intel Core i7 (x86_64), Catalina OS X 10.15.7, GNU compiler suite v.11.2.0_3 (gcc, gfortran, g++); no MPI pre-installed
+Several Linux builds have been tested on systems with x86_64 architectures.
-The ``build_macos_gnu`` modulefile initializes the module environment, lists the location of HPC-Stack modules, loads the meta-modules and modules, and sets compilers, additional flags, and environment variables needed for building the SRW App. The modulefile must be modified to include the absolute path to the user's HPC-Stack installation and ``ufs-srweather-app`` directories. In particular, the following section must be modified:
+The ``./modulefiles/build__gnu.lua`` modulefile (where ```` is ``macos`` or ``linux``) is written as a Lmod module in the Lua language, and it can be loaded once the Lmod module environment has been initialized (which should have happened even prior to :ref:`installing HPC-Stack `). This module lists the location of the HPC-Stack modules, loads the meta-modules and modules, sets serial and parallel compilers, additional flags, and any environment variables needed for building the SRW App. The modulefile must be modified to include the absolute path to the user's HPC-Stack installation:
.. code-block:: console
- # This path should point to your HPCstack installation directory
- setenv HPCstack "/Users/username/hpc-stack/install"
-
- # This path should point to your SRW Application directory
- setenv SRW "/Users/username/ufs-srweather-app"
+ -- This path should point to your HPCstack installation directory
+ local HPCstack="/Users/username/hpc-stack/install"
-An excerpt of the ``build_macos_gnu`` contents appears below for Option 1. To use Option 2, the user will need to comment out the lines specific to Option 1 (using a double hyphen) and uncomment the lines specific to Option 2 in the ``build_macos_gnu`` modulefile. Additionally, users need to verify that all file paths reflect their system's configuration and that the correct version numbers for software libraries appear in the modulefile.
-
-.. code-block:: console
+Linux users need to configure the ``ufs-srweather-app/etc/lmod-setup.sh`` file for the ``linux`` case and set the ``BASH_ENV`` variable to point to the Lmod initialization script. There is no need to modify this script for the ``macos`` case presuming that Lmod followed a standard installation procedure using the Homebrew package manager for MacOS.
- -- Option 1 compiler paths:
- setenv("CC", "/opt/homebrew/bin/gcc")
- setenv("FC", "/opt/homebrew/bin/gfortran")
- setenv("CXX", "/opt/homebrew/bin/g++")
-
- -- Option 2 compiler paths:
- --[[
- setenv("CC", "/usr/local/bin/gcc")
- setenv("FC", "/usr/local/bin/gfortran")
- setenv("CXX", "/usr/local/bin/g++")
- --]]
-
-Then, users must source the Lmod setup file, just as they would on other systems, and load the modulefiles needed for building and running the SRW App:
+Next, users must source the Lmod setup file, just as they would on other systems, and load the modulefiles needed for building and running the SRW App:
.. code-block:: console
-
- source etc/lmod-setup.sh macos
+
+ source etc/lmod-setup.sh
module use
- module load build_macos_gnu
- export LDFLAGS="-L${MPI_ROOT}/lib"
-
-In a csh/tcsh shell, users would run ``source etc/lmod-setup.csh macos`` in place of the first line in the code block above.
-
-Additionally, for Option 1 systems, set the variable ``ENABLE_QUAD_PRECISION`` to ``OFF`` in the ``$SRW/src/ufs-weather-model/FV3/atmos_cubed_sphere/CMakeLists.txt`` file. This change is optional if using Option 2 to build the SRW App. To make this change using a streamline editor (`sed`), run:
-
-.. code-block:: console
+ module load build__gnu
+ export LDFLAGS+=" -L${MPI_ROOT}/lib "
- sed -i .bak 's/QUAD_PRECISION\" ON)/QUAD_PRECISION\" OFF)/' $SRW/src/ufs-weather-model/FV3/atmos_cubed_sphere/CMakeLists.txt
+In a csh/tcsh shell, users would run ``source etc/lmod-setup.csh `` in place of the first line in the code block above. The last line is primarily needed for the MacOS platforms.
Proceed to building the executables using the process outlined in :numref:`Step %s `.
diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst
index b187db6680..2b6ff1ddd8 100644
--- a/docs/UsersGuide/source/Components.rst
+++ b/docs/UsersGuide/source/Components.rst
@@ -8,11 +8,11 @@ The SRW Application assembles a variety of components, including:
* Pre-processor Utilities & Initial Conditions
* UFS Weather Forecast Model
-* Unified Post-Processor
+* Unified Post Processor
* Visualization Examples
* Build System and Workflow
-These components are documented within this User's Guide and supported through a `community forum `_.
+These components are documented within this User's Guide and supported through the `GitHub Discussions `__ forum.
.. _Utils:
@@ -30,25 +30,23 @@ The SRW Application can be initialized from a range of operational initial condi
Forecast Model
==============
-The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere
-(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2021`. The :term:`dynamical core` is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here `__.
+The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere (:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2021`. The :term:`dynamical core` is the computational part of a model that solves the equations of fluid motion. A User's Guide for the UFS Weather Model can be accessed `here `__.
Supported model resolutions in this release include 3-, 13-, and 25-km predefined contiguous U.S. (:term:`CONUS`) domains, each with 127 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found in the `scientific documentation `__, the `technical documentation `__, and on the `NOAA Geophysical Fluid Dynamics Laboratory website `__.
-Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (CCPP), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There will be four physics suites supported for the SRW App v2.0.0 release. The first is the FV3_RRFS_v1beta physics suite, which is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. Additionally, FV3_WoFS_v0 and FV3_HRRR will be supported. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `__, and CCPP technical aspects are described in the `CCPP Technical Documentation `__. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. Additional information on Stochastic Physics options is available `here `__.
+Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (CCPP), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There are four physics suites supported as of the SRW App v2.1.0 release. The first is the FV3_RRFS_v1beta physics suite, which is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (:term:`RRFS`) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. Additionally, FV3_WoFS_v0 and FV3_HRRR are supported. A detailed list of CCPP updates since the SRW App v2.0.0 release is available :ref:`here `. A full scientific description of CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `__, and CCPP technical aspects are described in the `CCPP Technical Documentation `__. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. Additional information on Stochastic Physics options is available `here `__.
.. note::
SPP is currently only available for specific physics schemes used in the RAP/HRRR physics suite. Users need to be aware of which physics suite definition file (:term:`SDF`) is chosen when turning this option on. Among the supported physics suites, the full set of parameterizations can only be used with the ``FV3_HRRR`` option for ``CCPP_PHYS_SUITE``.
The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal direction and model levels in the vertical direction.
-Post-processor
+Post Processor
==============
The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User's Guide `__.
-Output from UPP can be used with visualization, plotting, and verification packages or in
-further downstream post-processing (e.g., statistical post-processing techniques).
+Output from UPP can be used with visualization, plotting, and verification packages or in further downstream post-processing (e.g., statistical post-processing techniques).
.. _MetplusComponent:
@@ -57,39 +55,38 @@ METplus Verification Suite
The enhanced Model Evaluation Tools (`METplus `__) verification system has been integrated into the SRW App to facilitate forecast evaluation. METplus is a verification framework that spans a wide range of temporal scales (warn-on-forecast to climate) and spatial scales (storm to global). It is supported by the `Developmental Testbed Center (DTC) `__.
-METplus *installation* is not included as part of the build process for the most recent release of the SRW App. However, METplus is preinstalled on many `Level 1 & 2 `__ systems; existing builds can be viewed `here `__. METplus can be installed on other systems individually or as part of :term:`HPC-Stack` installation. Users on systems without a previous installation of METplus can follow the `MET Installation Guide `__ and `METplus Installation Guide `__ for individual installation. Currently, METplus *installation* is not a supported feature for this release of the SRW App. However, METplus *use* is supported on systems with a functioning METplus installation.
+METplus *installation* is not included as part of the build process for the most recent release of the SRW App. However, METplus is preinstalled on many `Level 1 & 2 `__ systems; existing builds can be viewed `here `__.
-The core components of the METplus framework include the statistical driver, MET, the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use-cases. MET is a set of verification tools developed for use by the :term:`NWP` community. It matches up grids with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the `METplus User’s Guide `__ and `MET User’s Guide `__. Documentation for all other components of the framework can be found at the Documentation link for each component on the METplus `downloads `__ page.
+METplus can be installed on other systems individually or as part of :term:`HPC-Stack` installation. Users on systems without a previous installation of METplus can follow the `MET Installation Guide `__ and `METplus Installation Guide `__ for individual installation. Currently, METplus *installation* is not a supported feature for this release of the SRW App. However, METplus *use* is supported on systems with a functioning METplus installation.
-Among other techniques, MET provides the capability to compute standard verification scores for comparing deterministic gridded model data to point-based and gridded observations. It also provides ensemble and probabilistic verification methods for comparing gridded model data to point-based or gridded observations. Verification tasks to accomplish these comparisons are defined in the SRW App in :numref:`Table %s `. Currently, the SRW App supports the use of :term:`NDAS` observation files in `prepBUFR format `__ (which include conventional point-based surface and upper-air data) for point-based verification. It also supports gridded Climatology-Calibrated Precipitation Analysis (:term:`CCPA`) data for accumulated precipitation evaluation and Multi-Radar/Multi-Sensor (:term:`MRMS`) gridded analysis data for composite reflectivity and :term:`echo top` verification.
+The core components of the METplus framework include the statistical driver, MET, the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use cases. MET is a set of verification tools developed for use by the :term:`NWP` community. It matches up grids with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the `METplus User's Guide `__ and `MET User's Guide `__. Documentation for all other components of the framework can be found at the Documentation link for each component on the METplus `downloads `__ page.
+
+Among other techniques, MET provides the capability to compute standard verification scores for comparing deterministic gridded model data to point-based and gridded observations. It also provides ensemble and probabilistic verification methods for comparing gridded model data to point-based or gridded observations. Verification tasks to accomplish these comparisons are defined in the SRW App in :numref:`Table %s `. Currently, the SRW App supports the use of :term:`NDAS` observation files (which include conventional point-based surface and upper-air data) in `prepBUFR format `__ for point-based verification. It also supports gridded Climatology-Calibrated Precipitation Analysis (:term:`CCPA`) data for accumulated precipitation evaluation and Multi-Radar/Multi-Sensor (:term:`MRMS`) gridded analysis data for composite reflectivity and :term:`echo top` verification.
METplus is being actively developed by :term:`NCAR`/Research Applications Laboratory (RAL), NOAA/Earth Systems Research Laboratories (ESRL), and NOAA/Environmental Modeling Center (:term:`EMC`), and it is open to community contributions.
Visualization Example
=====================
-A Python script is provided to create basic visualizations of the model output. The script
-is designed to output graphics in PNG format for 14 standard meteorological variables
+A Python script is provided to create basic visualizations of the model output. The script is designed to output graphics in PNG format for several standard meteorological variables
when using the pre-defined :term:`CONUS` domain. A difference plotting script is also included to visually compare two runs for the same domain and resolution. These scripts are provided only as an example for users familiar with Python. They may be used to perform a visual check to verify that the application is producing reasonable results.
-After running ``manage_externals/checkout_externals``, the visualization scripts will be available in the ``ufs-srweather-app/ush/Python`` directory. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script.
+After running ``manage_externals/checkout_externals``, the visualization scripts will be available in the ``ufs-srweather-app/ush/Python`` directory. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of each script.
Build System and Workflow
=========================
The SRW Application has a portable build system and a user-friendly, modular, and expandable workflow framework.
-An umbrella CMake-based build system is used for building the components necessary for running the end-to-end SRW Application, including the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. On other systems, they can be installed via the HPC-Stack (see :doc:`HPC-Stack Documentation `). There is a small set of system libraries and utilities that are assumed to be present on the target computer: the CMake build software, a Fortran, C, and C++ compiler, and an :term:`MPI` library.
+An umbrella CMake-based build system is used for building the components necessary for running the end-to-end SRW Application, including the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. On other systems, they can be installed via the HPC-Stack (see :doc:`HPC-Stack Documentation `). There is a small set of system libraries and utilities that are assumed to be present on the target computer: the CMake build software; a Fortran, C, and C++ compiler; and an :term:`MPI` library.
Once built, the provided experiment generator script can be used to create a Rocoto-based
-workflow file that will run each task in the system in the proper sequence (see :numref:`Chapter %s ` or the `Rocoto documentation `_ for more information on Rocoto). If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command line fashion with provided run scripts. The generated namelist for the atmospheric model can be modified in order to vary settings such as forecast starting and ending dates, forecast length hours, the :term:`CCPP` physics suite, integration time step, history file output frequency, and more. It also allows for configuration of other elements of the workflow; for example, users can choose whether to run some or all of the pre-processing, forecast model, and post-processing steps.
+workflow file that will run each task in the system in the proper sequence (see :numref:`Chapter %s ` or the `Rocoto documentation `__ for more information on Rocoto). If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command line fashion with provided run scripts. The generated namelist for the atmospheric model can be modified in order to vary settings such as forecast starting and ending dates, forecast length hours, the :term:`CCPP` physics suite, integration time step, history file output frequency, and more. It also allows for configuration of other elements of the workflow; for example, users can choose whether to run some or all of the pre-processing, forecast model, and post-processing steps.
-The latest SRW Application release has been tested on a variety of platforms widely used by
-researchers, such as the NOAA Research and Development High-Performance Computing Systems
-(RDHPCS), including Hera, Orion, and Jet; the National Center for Atmospheric Research (:term:`NCAR`) Cheyenne system; the National Severe Storms Laboratory (NSSL) HPC machine, Odin; the National Science Foundation Stampede2 system; and generic Linux and MacOS systems using Intel and GNU compilers. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build only platforms (Level 4). Each level is further described below.
+The latest SRW Application release has been tested on a variety of platforms widely used by researchers, such as the NOAA Research and Development High-Performance Computing Systems (RDHPCS), including Hera, Orion, and Jet; the National Center for Atmospheric Research (:term:`NCAR`) Cheyenne system; and generic Linux and MacOS systems using Intel and GNU compilers. Four `levels of support `__ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited-test (Level 3), and build-only (Level 4) platforms. Each level is further described below.
On pre-configured (Level 1) computational platforms, all the required libraries for building the SRW Application are available in a central place. That means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both been built. The SRW Application is expected to build and run out-of-the-box on these pre-configured platforms.
A few additional computational platforms are considered configurable for the SRW Application release. Configurable platforms (Level 2) are platforms where all of the required libraries for building the SRW Application are expected to install successfully but are not available in a central location. Applications and models are expected to build and run once the required bundled libraries (e.g., NCEPLIBS) and third-party libraries (e.g., NCEPLIBS-external) are built.
-Limited-Test (Level 3) and Build-Only (Level 4) computational platforms are those in which the developers have built the code but little or no pre-release testing has been conducted, respectively. A complete description of the levels of support, along with a list of preconfigured and configurable platforms can be found in the `SRW Application Wiki `_.
+Limited-Test (Level 3) and Build-Only (Level 4) computational platforms are those in which the developers have built the code but little or no pre-release testing has been conducted, respectively. A complete description of the levels of support, along with a list of preconfigured and configurable platforms can be found in the `SRW Application Wiki `__.
diff --git a/docs/UsersGuide/source/ConfigWorkflow.rst b/docs/UsersGuide/source/ConfigWorkflow.rst
index eb0407e954..ecde1be5d3 100644
--- a/docs/UsersGuide/source/ConfigWorkflow.rst
+++ b/docs/UsersGuide/source/ConfigWorkflow.rst
@@ -488,8 +488,7 @@ Verification Tasks
``RUN_TASK_VX_ENSPOINT``: (Default: false)
Flag that determines whether to run the ensemble point verification task. If this flag is set, both ensemble-stat point verification and point verification of ensemble-stat output is computed. The :ref:`MET Ensemble-Stat tool ` provides verification statistics for ensemble forecasts and can be used in conjunction with the :ref:`MET Point-Stat tool `. See :numref:`Section %s ` for additional parameters related to this task. Valid values: ``True`` | ``False``
-..
- COMMENT: Might be worth defining "ensemble-stat verification for gridded data," "ensemble point verification," "ensemble-stat point verification," and "point verification of ensemble-stat output"
+.. COMMENT: Define "ensemble-stat verification for gridded data," "ensemble point verification," "ensemble-stat point verification," and "point verification of ensemble-stat output"?
.. _make-grid:
@@ -917,10 +916,12 @@ Non-default parameters for the ``make_lbcs`` task are set in the ``task_make_lbc
``OMP_STACKSIZE_MAKE_LBCS``: (Default: "1024m")
Controls the size of the stack for threads created by the OpenMP implementation.
+.. _FcstConfigParams:
+
FORECAST Configuration Parameters
=====================================
-Non-default parameters for the ``run_fcst`` task are set in the ``task_run_fcst`` section of the ``config.yaml`` file.
+Non-default parameters for the ``run_fcst`` task are set in the ``task_run_fcst:`` section of the ``config.yaml`` file.
Basic Task Parameters
---------------------------------
@@ -961,13 +962,15 @@ For each workflow task, certain parameter values must be passed to the job sched
``OMP_STACKSIZE_RUN_FCST``: (Default: "1024m")
Controls the size of the stack for threads created by the OpenMP implementation.
+.. _ModelConfigParams:
+
Model Configuration Parameters
----------------------------------
These parameters set values in the Weather Model's ``model_configure`` file.
``DT_ATMOS``: (Default: "")
- Time step for the outermost atmospheric model loop in seconds. This corresponds to the frequency at which the physics routines and the top level dynamics routine are called. (Note that one call to the top-level dynamics routine results in multiple calls to the horizontal dynamics, :term:`tracer` transport, and vertical dynamics routines; see the `FV3 dycore scientific documentation `__ for details.) Must be set. Takes an integer value. In the SRW App, a default value for ``DT_ATMOS`` appears in the ``set_predef_grid_params.yaml`` script, but a different value can be set in ``config.yaml``.
+ Time step for the outermost atmospheric model loop in seconds. This corresponds to the frequency at which the physics routines and the top level dynamics routine are called. (Note that one call to the top-level dynamics routine results in multiple calls to the horizontal dynamics, :term:`tracer` transport, and vertical dynamics routines; see the `FV3 dycore scientific documentation `__ for details.) Must be set. Takes an integer value. In the SRW App, a default value for ``DT_ATMOS`` appears in the ``set_predef_grid_params.yaml`` script, but a different value can be set in ``config.yaml``. In general, the smaller the grid cell size is, the smaller this value needs to be in order to avoid numerical instabilities during the forecast.
``RESTART_INTERVAL``: (Default: 0)
Frequency of the output restart files in hours. Using the default interval (0), restart files are produced at the end of a forecast run. When ``RESTART_INTERVAL: 1``, restart files are produced every hour with the prefix "YYYYMMDD.HHmmSS." in the ``RESTART`` directory.
@@ -1012,7 +1015,7 @@ Write-Component (Quilting) Parameters
Flag that determines whether to output extra (debugging) information from :term:`ESMF` routines. Note that the write component uses ESMF library routines to interpolate from the native forecast model grid to the user-specified output grid (which is defined in the model configuration file ``model_configure`` in the forecast run directory). Valid values: ``True`` | ``False``
``WRTCMP_write_groups``: (Default: 1)
- The number of write groups (i.e., groups of :term:`MPI` tasks) to use in the write component.
+   The number of write groups (i.e., groups of :term:`MPI` tasks) to use in the write component. Each write group will write to one set of output files (a ``dynf${fhr}.nc`` and a ``phyf${fhr}.nc`` file, where ``${fhr}`` is the forecast hour). Each write group contains ``WRTCMP_write_tasks_per_group`` tasks. Usually, one write group is sufficient. More groups may be needed if the forecast proceeds so quickly that a single write group cannot finish writing its set of files before the next output time arrives and the next set of files must be started.
``WRTCMP_write_tasks_per_group``: (Default: 20)
The number of MPI tasks to allocate for each write group.
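+
+As a rough sketch (the values are illustrative), the total number of MPI tasks devoted to output is the product of these two settings:
+
+.. code-block:: console
+
+   task_run_fcst:
+     WRTCMP_write_groups: 2
+     WRTCMP_write_tasks_per_group: 20    # 2 groups x 20 tasks = 40 MPI tasks for output
+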
@@ -1912,11 +1915,6 @@ The parameters below turn on SPP in Noah or RUC LSM (support for Noah MP is in p
``LSM_SPP_MAG_LIST``: (Default: [ 0.017, 0.001, 0.001, 0.001, 0.001, 0.001, 0.2 ] )
Sets the maximum random pattern amplitude for each of the LSM perturbations.
-.. COMMENT: This variable no longer appears and was going to be removed. See if anything has replaced it.
- ``LSM_SPP_EACH_STEP``: (Default: "true")
- When set to "TRUE", it sets ``lndp_each_step=.true.`` and perturbs each time step.
-
-
.. _HaloBlend:
Halo Blend Parameter
diff --git a/docs/UsersGuide/source/ContainerQuickstart.rst b/docs/UsersGuide/source/ContainerQuickstart.rst
index 8b72c079e5..025893a840 100644
--- a/docs/UsersGuide/source/ContainerQuickstart.rst
+++ b/docs/UsersGuide/source/ContainerQuickstart.rst
@@ -4,7 +4,7 @@
Container-Based Quick Start Guide
====================================
-This Container-Based Quick Start Guide will help users build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a `Singularity `__ container. The :term:`container` approach provides a uniform enviroment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`'s, and package versions available. Installation via Singularity container reduces this variability and allows for a smoother SRW App build experience. Normally, containers can only run on a single compute node and are not compatible with the `Rocoto workflow manager `__, so users must run each task in the workflow manually. However, the Singularity container described in this chapter has been adapted such that it is able to run across multiple nodes using Rocoto. This makes it an excellent starting point for beginners. The :ref:`non-container build approach ` may still be more appropriate for users who desire additional customizability, particularly if they already have experience running the SRW App.
+This Container-Based Quick Start Guide will help users build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a `Singularity `__ container. The :term:`container` approach provides a uniform environment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPIs `, and package versions available. Installation via Singularity container reduces this variability and allows for a smoother SRW App build experience. Ordinarily, containers can only run on a single compute node and are not compatible with the `Rocoto workflow manager `__, so users must run each task in the workflow manually. However, the Singularity container described in this chapter has been adapted such that it is able to run across multiple nodes using Rocoto. This makes it an excellent starting point for beginners. The :ref:`non-container build approach ` may still be more appropriate for users who desire additional customizability, particularly if they already have experience running the SRW App.
The "out-of-the-box" SRW App case described in this User's Guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports. Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization.
@@ -28,7 +28,7 @@ Users must have an **Intel** compiler and :term:`MPI` (available for free `here
Install Singularity
^^^^^^^^^^^^^^^^^^^^^^^
-To build and run the SRW App using a Singularity container, first install the Singularity package according to the `Singularity Installation Guide `__. This will include the installation of dependencies and the installation of the Go programming language. SingularityCE Version 3.7 or above is recommended.
+To build and run the SRW App using a Singularity container, first install the Singularity package according to the `Singularity Installation Guide `__. This includes installing dependencies and the Go programming language. SingularityCE Version 3.7 or above is recommended.
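+
+Once Singularity is installed, users can confirm that the installed version meets this recommendation by running:
+
+.. code-block:: console
+
+   singularity --version
+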
.. warning::
Docker containers can only be run with root privileges, and users cannot have root privileges on :term:`HPCs `. Therefore, it is not possible to build the SRW App, which uses the HPC-Stack, inside a Docker container on an HPC system. However, a Singularity image may be built directly from a Docker image for use on the system.
@@ -124,11 +124,11 @@ On non-Level 1 systems, users should build the container in a writable sandbox:
Some users may prefer to issue the command without the ``sudo`` prefix. Whether ``sudo`` is required is system-dependent.
.. note::
- Users can choose to build a release version of the container (SRW App v2.0.0) using a similar command:
+ Users can choose to build a release version of the container (SRW App v2.1.0) using a similar command:
.. code-block:: console
- sudo singularity build --sandbox ubuntu20.04-intel-srwapp docker://noaaepic/ubuntu20.04-intel22-ufs-srwapp:release-public-v2
+ sudo singularity build --sandbox ubuntu20.04-intel-srwapp docker://noaaepic/ubuntu20.04-intel-srwapp:release-public-v2.1.0
.. _WorkOnHPC:
@@ -210,7 +210,7 @@ Users can run ``exit`` to exit the shell.
Download and Stage the Data
============================
-The SRW App requires input files to run. These include static datasets, initial and boundary condition files, and model configuration files. On Level 1 systems, the data required to run SRW App tests are already available as long as the bind argument (starting with ``-B``) in :numref:`Step %s ` included the directory with the input model data. For Level 2-4 systems, the data must be added manually by the user. Detailed instructions on how to add the data can be found in :numref:`Section %s `. Sections :numref:`%s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App.
+The SRW App requires input files to run. These include static datasets, initial and boundary condition files, and model configuration files. On Level 1 systems, the data required to run SRW App tests are already available as long as the bind argument (starting with ``-B``) in :numref:`Step %s ` included the directory with the input model data. See :numref:`Table %s ` for Level 1 data locations. For Level 2-4 systems, the data must be added manually by the user. Detailed instructions on how to add the data can be found in :numref:`Section %s `. Sections :numref:`%s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App.
.. _GenerateForecastC:
@@ -246,7 +246,7 @@ To activate the regional workflow, run the following commands:
where:
- * ```` is replaced with the actual path to the modulefiles on the user's system (often ``$PWD/modulefiles``), and
+ * ```` is replaced with the actual path to the modulefiles on the user's local system (often ``$PWD/modulefiles``), and
* ```` is a valid, lowercased machine/platform name (see the ``MACHINE`` variable in :numref:`Section %s `).
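+
+For example, on the Level 1 machine Hera, the filled-in commands might look like the following sketch (with the path adjusted to the user's clone location):
+
+.. code-block:: console
+
+   module use /path/to/ufs-srweather-app/modulefiles
+   module load wflow_hera
+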
The ``wflow_`` modulefile will then output instructions to activate the regional workflow. The user should run the commands specified in the modulefile output. For example, if the output says:
@@ -310,7 +310,7 @@ From here, users can follow the steps below to configure the out-of-the-box SRW
.. code-block:: console
USE_CRON_TO_RELAUNCH: TRUE
- CRON_RELAUNCH_INTVL_MNTS: 02
+ CRON_RELAUNCH_INTVL_MNTS: 3
There are instructions for running the experiment via additional methods in :numref:`Section %s `. However, this technique (automation via :term:`crontab`) is the simplest option.
@@ -351,7 +351,7 @@ Run the following command to generate the workflow:
.. code-block:: console
- python generate_FV3LAM_wflow.py
+ ./generate_FV3LAM_wflow.py
This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The last line of output from this script should start with ``*/3 * * * *`` (or similar).
@@ -387,7 +387,7 @@ where:
New Experiment
===============
-To run a new experiment in the container at a later time, users will need to rerun the commands in :numref:`Section %s ` to reactivate the regional workflow. Then, users can configure a new experiment by updating the environment variables in ``config.yaml`` to reflect the desired experiment configuration. Basic instructions appear in :numref:`Section %s ` above, and detailed instructions can be viewed in :numref:`Section %s `. After adjusting the configuration file, regenerate the experiment by running ``python generate_FV3LAM_wflow.py``.
+To run a new experiment in the container at a later time, users will need to rerun the commands in :numref:`Section %s ` to reactivate the regional workflow. Then, users can configure a new experiment by updating the environment variables in ``config.yaml`` to reflect the desired experiment configuration. Basic instructions appear in :numref:`Section %s ` above, and detailed instructions can be viewed in :numref:`Section %s `. After adjusting the configuration file, regenerate the experiment by running ``./generate_FV3LAM_wflow.py``.
Plot the Output
===============
diff --git a/docs/UsersGuide/source/ContributorsGuide.rst b/docs/UsersGuide/source/ContributorsGuide.rst
deleted file mode 100644
index efb90ca4cc..0000000000
--- a/docs/UsersGuide/source/ContributorsGuide.rst
+++ /dev/null
@@ -1,463 +0,0 @@
-
-.. _ContributorsGuide:
-
-==============================
-SRW App Contributor's Guide
-==============================
-
-.. _Background:
-
-Background
-===========
-
-Authoritative branch
------------------------
-
-The ``ufs-srweather-app`` repository maintains a main branch for development called ``develop``. The HEAD of ``develop`` reflects the latest development changes. It points to regularly updated hashes for individual sub-components. Pull requests (PRs) will be merged to ``develop``.
-
-The ``develop`` branch is protected by the code management team:
- #. Pull requests for this branch require approval by at least two code reviewers.
- #. A code manager should perform at least one of the reviews and the merge, but other contributors are welcome to provide comments/suggestions.
-
-
-Code Management Team
---------------------------
-
-Scientists from across multiple labs and organizations have volunteered to review pull requests for the ``develop`` branch:
-
-.. table::
-
- +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+
- | **Organization** | **Reviewers** | **Areas of Expertise** |
- +==================+================================================+=============================================================================================+
- | EMC | Chan-Hoo Jeon (@chan-hoo) | Workflow, Operational platform testing (WCOSS/NCO), and Air quality modeling (Online-CMAQ) |
- | +------------------------------------------------+---------------------------------------------------------------------------------------------+
- | | Ben Blake (@BenjaminBlake-NOAA) | Output visualization, Rocoto |
- | +------------------------------------------------+---------------------------------------------------------------------------------------------+
- | | Ratko Vasic (@RatkoVasic-NOAA) | Workflow, NCO requirements, and operational platform testing |
- +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+
- | EPIC | Mark Potts (@mark-a-potts) | HPC systems |
- | +------------------------------------------------+---------------------------------------------------------------------------------------------+
- | | Jong Kim (@jkbk2004) | UFS Weather Model configuration, forecast sensitivity analysis, data assimilation |
- | +------------------------------------------------+---------------------------------------------------------------------------------------------+
- | | Michael Lueken (@MichaelLueken) | SRW App code management |
- | +------------------------------------------------+---------------------------------------------------------------------------------------------+
- | | Natalie Perlin (@natalie-perlin) | Generic Linux/Mac installations, hpc-stack/spack-stack |
- | +------------------------------------------------+---------------------------------------------------------------------------------------------+
- | | Gillian Petro (@gspetro-NOAA) | Documentation |
- | +------------------------------------------------+---------------------------------------------------------------------------------------------+
- | | Edward Snyder (@EdwardSnyder-NOAA) | WE2E testing, input data |
- +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+
- | GLERL | David Wright (@dmwright526) | FVCOM integration, output visualization, preprocessing tasks |
- +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+
- | GSL | Jeff Beck (@JeffBeck-NOAA) | SRW App configuration/workflow, code management, meteorological evaluation |
- | +------------------------------------------------+---------------------------------------------------------------------------------------------+
- | | Gerard Ketefian (@gsketefian) | regional workflow scripts, jinja templates, and verification tasks |
- | +------------------------------------------------+---------------------------------------------------------------------------------------------+
- | | Linlin Pan (@panll) | Workflow, CCPP/physics, verification |
- | +------------------------------------------------+---------------------------------------------------------------------------------------------+
- | | Christina Holt (@christinaholtNOAA) | Workflow, conda environment support, testing, and code management |
- | +------------------------------------------------+---------------------------------------------------------------------------------------------+
- | | Christopher Harrop (@christopherwharrop-noaa) | Rocoto, code management, and testing |
- | +------------------------------------------------+---------------------------------------------------------------------------------------------+
- | | Daniel Abdi (@danielabdi-noaa) | Workflow generation, testing RRFS on the cloud, environment modules |
- +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+
- | NCAR | Mike Kavulich (@mkavulich) | CCPP/physics |
- | +------------------------------------------------+---------------------------------------------------------------------------------------------+
- | | Will Mayfield (@willmayfield) | Verification/METplus tasks, regional workflow (esp. on Cheyenne) |
- +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+
- | NSSL | Yunheng Wang (@ywangwof) | HPC systems, code management and regional workflow especially on Stampede, Jet |
- | | | and NSSL computers |
- +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+
-
-.. _ContribProcess:
-
-Contribution Process
-========================
-
-The steps below should be followed in order to make changes to the ``develop`` branch of the ``ufs-srweather-app`` repository. Communication with code managers and the code management team throughout the process is encouraged.
-
- #. **Issue** - Open an issue to document changes. Click `here `__ to open a new ``ufs-srweather-app`` issue or see :numref:`Step %s ` for detailed instructions.
- #. **GitFlow** - Follow `GitFlow `__ procedures for development.
- #. **Fork the repository** - Read more `here `__ about forking in GitHub.
- #. **Create a branch** - Create a branch in your fork of the authoritative repository. Follow `GitFlow `__ conventions when creating the branch. All development should take place on a branch, *not* on ``develop``. Branches should be named as follows, where [name] is a one-word description of the branch:
-
- * **bugfix/[name]:** Fixes a demonstrably incorrect portion of code
- * **feature/[name]:** Adds a new feature to the code or improves an existing portion of the code
- * **text/[name]:** Changes elements of the repository that do not impact program output or log files (e.g., changes to README, documentation, comments, changing quoted Registry elements, white space alignment). Any change that does not impact the compiled code in any way should fall under this category.
-
- #. **Development** - Perform and test changes in the branch (not on ``develop``!). Document work in the issue and mention the issue number in commit messages to link your work to the issue (e.g., ``commit -m "Issue #23 - "``). Test code modifications on as many platforms as possible, and request help with further testing from the code management team when unable to test on all Level 1 platforms. Document changes to the workflow and capabilities in the ``.rst`` files so that the SRW App documentation stays up-to-date.
- #. **Pull request** - When ready to merge changes back to the ``develop`` branch, the code developer should initiate a pull request (PR) of the feature branch into the ``develop`` branch. Read `here `__ about pull requests in GitHub. When a PR is initiated, the :ref:`PR Template ` autofills. Developers should use the template to provide information about the PR in the proper fields. See the guidelines in :numref:`Section %s ` for more details on making a good pull request. Developers should also tag all relevant reviewers from the code management team on the PR.
- #. **Merge** - When review and testing are complete, a code manager will merge the PR into the target branch (usually ``develop``). PRs that are not ready for merging should have a "Work in Progress" label on them. Users who lack the permissions required to add the label can request in their PR that a code manager do so.
- #. **Cleanup** - After the PR is merged, the code developer should delete the branch on their fork and close the issue.
-
-.. note::
- Feature branches are intended to be short-lived, concentrated on code with one sole purpose, and applicable to a single PR. These branches should be deleted once merged, and a new feature branch should be created when subsequent code development continues.
-
-.. _Issue:
-
-Opening an Issue
--------------------
-
-All changes should be associated with a GitHub Issue. If developers are working on a change, they should search the existing issues in the ``ufs-srweather-app`` repository. If an issue does not exist for the work they are doing, they should create one prior to opening a new pull request.
-
-To open an issue, click on `New Issue `__ within the ``ufs-srweather-app`` GitHub repository.
-
-Choose from three options:
- #. `Bug Report `__: Report specific problems ("bugs") in the code using the following template:
-
- .. code-block:: console
-
-
-
- Your bug may already be reported!
- Please search on the [Issue tracker](https://github.com/ufs-community/ufs-srweather-app/issues) before creating
- a new issue.
- If an issue already exists, please use that issue to add any additional information.
-
- ## Expected behavior
-
-
- ## Current behavior
-
-
- ## Machines affected
-
-
-
- ## Steps To Reproduce
-
-
- ## Detailed Description of Fix (optional)
-
-
- ## Additional Information (optional)
-
-
- ## Possible Implementation (optional)
-
-
- ## Output (optional)
-
-
- #. `Feature Request `__: New features and feature enhancements fall under this category. Propose features and enhancements using the following template. Optional sections may be deleted.
-
- .. code-block:: console
-
-
-
- Your issue may already be reported!
- Please search on the [Issue tracker](https://github.com/ufs-community/ufs-srweather-app/issues) before creating
- a new issue.
- If an issue already exists, please use that issue to add any additional information.
-
-
- ## Description
-
-
-
-
- ## Solution
-
-
- ## Requirements**
-
-
- ## Acceptance Criteria (Definition of Done)
-
-
- ## Dependencies (optional)
-
-
-
- ## Alternative Solutions (optional)
-
-
- #. `Text-Only Changes `__: Propose text-only changes using the "Text-only request" template. Optional sections may be deleted.
-
- .. code-block:: console
-
- ## Description
-
-
- ## Solution
-
-
- ## Alternatives (optional)
-
-
- ## Related to (optional)
-
-
- #. `Other `__: Open a blank issue, and use the "Feature Request" template above as a starting point to describe the issue.
-
-For all issue reports, indicate whether this is an issue that you plan to work on and eventually submit a PR for or whether you are merely making a suggestion. After filling out the issue report, click on "Submit new issue."
-
-.. _GoodPR:
-
-Making a Pull Request
----------------------------
-
-All changes to the SRW App ``develop`` branch should be handled via GitHub's “Pull Request” (PR) functionality from a branch in the developer's fork. Developers must follow the template PR instructions (see :numref:`Step %s ` below) and provide links to the relevant GitHub issue(s). They must also indicate which tests were run on which machines.
-
-Pull requests will be reviewed and approved by at least two code reviewers, at least one of whom must be a code manager. When a PR has met the contribution and testing requirements and has been approved by two code reviewers, a code manager will merge the PR.
-
-
-.. note::
-
- * If a developer wants to make use of automated testing, any ``ufs-srweather-app`` dependencies must be opened in PRs from dependent branches of forks belonging to the same user.
- * The ``Externals.cfg`` file should point to any dependent branches in component repositories while those branches are under review. Once the PR has been merged in the dependent repository, the developer should update the references in their ``Externals.cfg`` file to reflect the appropriate hashes in the authoritative repositories.
- * Developers should mention in their ``ufs-srweather-app`` PR description that they are temporarily pointing to a branch/hash in their fork of a component repository and that it will be updated once the corresponding PR is merged.
-
-
-.. _Template:
-
-PR Template
-^^^^^^^^^^^^^^^^
-
-Here is the template that is provided when developers click "Create pull request:"
-
-.. code-block:: console
-
- - Update develop to head at ufs-community
- - Use this template to give a detailed message describing the change you want to make to the code.
- - You may delete any sections labeled "optional" and any instructions within .
- - If you are unclear on what should be written here, see https://github.com/wrf-model/WRF/wiki/Making-a-good-pull-request-message
- for some guidance and review the Code Contributor's Guide
- (https://github.com/ufs-community/ufs-srweather-app/wiki/Contributor's-Guide).
- - Code reviewers will assess the PR based on the criteria laid out in the Code Reviewer's Guide
- (https://github.com/ufs-community/ufs-srweather-app/wiki/Code-Manager's-Guide).
- - The title of this pull request should be a brief summary (ideally less than 100 characters) of the changes included
- in this PR. Please also include the branch to which this PR is being issued (e.g., "[develop]: Updated UFS_UTILS hash").
- - Use the "Preview" tab to see what your PR will look like when you hit "Create pull request"
-
- # --- Delete this line and those above before hitting "Create pull request" ---
-
- ## DESCRIPTION OF CHANGES:
-
-
- ### Type of change
-
- - [ ] Bug fix (non-breaking change which fixes an issue)
- - [ ] New feature (non-breaking change which adds functionality)
- - [ ] Breaking change (fix or feature that would cause existing functionality
- to not work as expected)
- - [ ] This change requires a documentation update
-
- ## TESTS CONDUCTED:
-
-
-
- - [ ] hera.intel
- - [ ] orion.intel
- - [ ] cheyenne.intel
- - [ ] cheyenne.gnu
- - [ ] gaea.intel
- - [ ] jet.intel
- - [ ] wcoss2.intel
- - [ ] NOAA Cloud (indicate which platform)
- - [ ] Jenkins
- - [ ] fundamental test suite
- - [ ] comprehensive tests (specify *which* if a subset was used)
-
- ## DEPENDENCIES:
-
-
- ## DOCUMENTATION:
-
-
- ## ISSUE:
-
-
- ## CHECKLIST
-
- - [ ] My code follows the style guidelines in the Contributor's Guide
- - [ ] I have performed a self-review of my own code using the Code Reviewer's Guide
- - [ ] I have commented my code, particularly in hard-to-understand areas
- - [ ] My changes need updates to the documentation. I have made corresponding changes
- to the documentation
- - [ ] My changes do not require updates to the documentation (explain).
- - [ ] My changes generate no new warnings
- - [ ] New and existing tests pass with my changes
- - [ ] Any dependent changes have been merged and published
-
- ## LABELS (optional):
-
- A Code Manager needs to add the following labels to this PR:
- - [ ] Work In Progress
- - [ ] bug
- - [ ] enhancement
- - [ ] documentation
- - [ ] release
- - [ ] high priority
- - [ ] run_ci
- - [ ] run_we2e_fundamental_tests
- - [ ] run_we2e_comprehensive_tests
- - [ ] Needs Cheyenne test
- - [ ] Needs Jet test
- - [ ] Needs Hera test
- - [ ] Needs Orion test
- - [ ] help wanted
-
- ## CONTRIBUTORS (optional):
-
-
-Additional Guidance
-^^^^^^^^^^^^^^^^^^^^^^^^
-
-**TITLE:** Titles should start with the target branch name in brackets and should give code reviewers a clear idea of what the change will do in approximately 5-10 words. Some good examples:
-
- * [develop] Make thompson_mynn_lam3km ccpp suite available
- * [release/public-v2] Add a build_linux_compiler modulefile
- * [develop] Fix module loads on Hera
- * [develop] Add support for Rocoto with generic LINUX platform
-
-All of the above examples concisely describe the changes contained in the pull request. The title will not get cut off in emails and web pages. In contrast, here are some made-up (but plausible) examples of BAD pull request titles:
-
- * Bug fixes (Bug fixes on what part of the code?)
- * Changes to surface scheme (What kind of changes? Which surface scheme?)
-
-**DESCRIPTION OF CHANGES:** The first line of the description should be a single-line "purpose" for this change. Note the type of change (i.e., bug fix, feature/enhancement, text-only). Summarize the problem, proposed solution, and required changes. If this is an enhancement or new feature, describe why the change is important.
-
-**DOCUMENTATION:** Developers should include documentation on new capabilities and enhancements by updating the appropriate ``.rst`` documentation files in their fork prior to the PR. These documentation updates should be noted in the "Documentation" section of the PR message. If necessary, contributors may submit the ``.rst`` documentation in a subsequent PR. In these cases, the developers should include any existing documentation in the "Documentation" section of the initial PR message or as a file attachment to the PR. Then, the contributor should open an issue (see :numref:`Step %s `) reflecting the need for official ``.rst`` documentation updates and include the issue number and explanation in the "Documentation" section of the initial PR template.
-
-
-Tips, Best Practices, and Protocols to Follow When Submitting a PR
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-* **Label PR status appropriately.** If the PR is not completely ready to be merged, please add a "Work in Progress" label. Urgent PRs should be marked "high priority." All PRs should have a type label (e.g., "bug," "enhancement"). Labels can be added on the right-hand side of a submitted PR request by clicking on the gear icon beside "Labels" (below the list of reviewers). If users do not have the permissions to add a label to their PR, they should request in their PR description that a code manager add the appropriate labels.
-* **Indicate urgency.** If a PR is particularly urgent, this information should be provided in the PR "Description" section, and multiple code management team members should be tagged to draw attention to this PR. After submitting the PR, a "high priority" label should be added to it (see below).
-* **Indicate the scope of the PR.** If the PR is extremely minor (e.g., change to the README file), indicate this in the PR message. If it is an extensive PR, the developer should test it on as many platforms as possible and stress the necessity that it be tested on systems for which they do not have access.
-* **Clarify in the PR message where the code has been tested.** At a minimum, code should be tested on the platform where code modification has taken place. It should also be tested on machines where code modifications will impact results. If the developer does not have access to these platforms, this should be noted in the PR.
-* **Follow separation of concerns.** For example, module loads are only handled in the appropriate modulefiles, Rocoto always sets the work directory, j-jobs make the work directory, and ex-scripts require the work directory to exist.
-* **Target subject matter experts (SMEs) among the code management team.** When possible, tag team members who are familiar with the modifications made in the PR so that the code management team can provide effective and streamlined PR reviews and approvals. Developers can tag SMEs by selecting the gear icon next to "Assignees" (under the Reviewers list) and adding the appropriate names.
-* **Schedule a live code review** if the PR is exceptionally complex in order to brief members of the code management team on the PR either in-person or through a teleconference. Developers should indicate in the PR message that they are interested in a live code review if they believe that it would be beneficial.
-
-
-.. _ContribStandards:
-
-Code and Configuration Standards
-===================================
-
-General
------------
-
-* Platform-specific settings should be handled only through configuration and modulefiles, not in code or scripts.
-* For changes to the ``scripts``, ``ush``, or ``jobs`` directories, developers should follow the `NCO Guidelines `__ for what is incorporated into each layer.
-* Developers should ensure that their contributions work with the most recent version of the ``ufs-srweather-app``, including all the specific up-to-date hashes of each subcomponent.
-* Modifications should not break any existing supported capabilities on any supported platforms.
-* Update the ``.rst`` documentation files where appropriate as part of the PR. The documentation files reside in ``ufs-srweather-app/docs``. If necessary, contributors may update the documentation in a subsequent PR. In these cases, the contributor should open an issue reflecting the need for documentation (see :numref:`Step %s `) and include the issue number and explanation in the Documentation section of their initial PR.
-
-SRW Application
-------------------
-
-**Externals.cfg**
- * All externals live in a single ``Externals.cfg`` file.
- * Only a single hash will be maintained for any given external code base. All externals should point to this static hash (not to the top of a branch).
- * All new entries in ``Externals.cfg`` must point only to authoritative repositories. In other words, entries must point to either a `UFS Community GitHub organization `__ repository or another NOAA project organization repository.
-
- * Temporary exceptions are made for a PR into the ``develop`` branch of ``ufs-srweather-app`` that is dependent on another PR (e.g., a ``ufs-weather-model`` PR from the same contributor). When the component PR is merged, the contributor must update the corresponding ``ufs-srweather-app`` PR with the hash of the component's authoritative repository.
-
-
-**Build System**
- * Each component must build with CMake
- * Each component must build with Intel compilers on official `Level 1 `__ platforms and with GNU or Intel compilers on other platforms.
- * Each component must have a mechanism for platform independence (i.e., no hard-coded machine-specific settings outside of established environment, configuration, and modulefiles).
- * Each component must build using the standard supported NCEPLIBS environment (currently `HPC-Stack `__).
-
-**Modulefiles**
- * Each component must build using the common modules located in the ``modulefiles/srw_common`` file.
-
-Regional Workflow
---------------------
-
-**General Coding Standards:**
- * The ``regional_workflow`` repository must not contain source code for compiled programs. Only scripts and configuration files should reside in this repository.
- * All bash scripts must explicitly be ``#!/bin/bash`` scripts. They should *not* be login-enabled (i.e., scripts should *not* use the ``-l`` flag).
- * MacOS does not have all Linux utilities by default. Developers should ensure that they do not break any MacOS capabilities with their contribution.
- * All code must be indented appropriately and conform to the style of existing scripts (e.g., local variables should be lowercase, global variables should be uppercase).
-
-**Python Coding Standards:**
- * All Python code contributions should come with an appropriate ``environment.yaml`` file for the feature.
- * Keep the use of external Python packages to a minimum for necessary workflow tasks. Currently, the required external Python packages are: ``f90nml``, ``pyyaml``, and ``Jinja2``.
-
-**Workflow Design:** Follow the `NCO Guidelines `__ for what is incorporated in each layer of the workflow. This is particularly important in the ``scripts`` directory.
-
-**Modulefiles:** All official platforms should have a modulefile that can be sourced to provide the appropriate python packages and other settings for the platform.
-
-**Management of the Configuration File:** New configurable options must be consistent with existing configurable options and be documented in ``ufs-srweather-app/docs/UsersGuide/source/ConfigWorkflow.rst``. Add necessary checks on acceptable options where applicable. Add appropriate default values in ``config_defaults.sh``.
-
-**Management of Template Files:** If a new configurable option is required in an existing template, it must be handled similarly to its counterparts in the scripts that fill in the template. For example, if a new type of namelist is introduced for a new application component, it should make use of the existing ``jinja`` framework for populating namelist settings.
-
-**Namelist Management:** Namelists in ``ufs-srweather-app`` are generated using a Python tool and managed by setting YAML configuration parameters. This allows for the management of multiple configuration settings with maximum flexibility and minimum duplication of information.
-
-.. _Testing:
-
-Testing
-===============
-
-The ``ufs-srweather-app`` repository uses the established workflow end-to-end (WE2E) testing framework (see :numref:`Chapter %s `) to implement two tiers of testing: fundamental and comprehensive. **Fundamental testing** consists of a lightweight set of tests that can be automated and run regularly on each `Level 1 `__ platform. These tests verify that there are no major, obvious faults in the underlying code when running common combinations of grids, input data, and physics suites. **Comprehensive testing** includes the entire set of WE2E tests and covers a broader range of capabilities, configurations, and components. Eventually, new tests will be added, including regression tests and unit tests.
-
-Before opening a PR, a minimum set of tests should be run:
- * Developers should run the fundamental test suite manually on at least one supported platform and report on the outcome in the PR template (see :numref:`Section %s `).
-
- * Developers will not be required to run tests on *all* supported platforms, but if a failure is pointed out by another reviewer (or by automated testing), then the developer should work with reviewers and code managers to ensure that the problem is resolved prior to merging.
-
- * If the PR impacts functionality contained within comprehensive WE2E tests not included in the fundamental test suite, the developer must run those tests on the PR.
- * Any new functionality must be tested explicitly, and any new tests should be described in detail in the PR message. Depending on the impact of this functionality, new tests should be added to the suite of comprehensive WE2E tests, followed by a discussion with code managers on whether they should also be included as fundamental tests.
-
- * In some cases, it may be possible to modify a current test instead of creating a completely new test. Code developers introducing new capabilities should work with code managers to provide the proper configuration files, data, and other information necessary to create new tests for these capabilities.
-
- * **Coming Soon:** When the above tests are complete, a code manager will add the ``run_we2e_comprehensive_tests`` to initiate fundamental testing on all Level 1 platforms via Jenkins (see :ref:`below `).
-
-
-.. _jenkins:
-
-Automated Testing on Jenkins
--------------------------------
-
-`Jenkins `__ is an "open source automation server" that allows users to automate code testing. In the SRW App, developers with write, maintain, or admin `roles `__ on the SRW App repository can add labels to their PR that automatically initiate particular test suites.
-
-The following automated testing labels are available (or will be soon) for the SRW App:
- * ``run_ci``
- * *Coming Soon:* ``run_we2e_comprehensive_tests``
- * *Coming Soon:* ``run_we2e_fundamental_tests``
-
-Once a testing label is added to the PR and the tests are run, the results of the tests can be viewed by anyone on GitHub. Users must scroll down to the bottom of the PR, where PR approvals, checks, and conflicts are listed. Under checks, GitHub will list which checks have passed and which have failed. Users can click on "Details" beside each check to see the Jenkins log files (see :numref:`Figure %s `). This will take users to the Jenkins page with information on their PR's tests.
-
-.. _JenkinsCheck:
-
-.. figure:: _static/JenkinsCICD.png
- :alt: Screenshot of the bottom of the PR where information on Jenkins tests is located.
-
- *Sample of Jenkins Test Results*
-
-Once on the `Jenkins `__ page specific to the PR check in question, users can view all of the testing output, including "artifacts" from the build. To do this, users must click on the arrow icon in the top right corner of the page. Then, in the left navigation menu, they can click on *S3 Artifacts* and download any files listed there for in-depth review.
-
-
-
-
-
diff --git a/docs/UsersGuide/source/FAQ.rst b/docs/UsersGuide/source/FAQ.rst
index 69e171d69c..05059b023e 100644
--- a/docs/UsersGuide/source/FAQ.rst
+++ b/docs/UsersGuide/source/FAQ.rst
@@ -4,11 +4,58 @@
FAQ
****
-* :ref:`How do I turn on/off the cycle-independent workflow tasks? `
* :ref:`How do I define an experiment name? `
* :ref:`How do I change the Physics Suite Definition File (SDF)? `
-* :ref:`How do I restart a DEAD task? `
* :ref:`How do I change the grid? `
+* :ref:`How do I turn on/off the cycle-independent workflow tasks? `
+* :ref:`How do I know if I correctly modified config.yaml? `
+* :ref:`How do I restart a DEAD task? `
+* :ref:`How can I clean up the SRW App code if something went wrong? `
+* :ref:`How can I run a new experiment? `
+
+.. _DefineExptName:
+
+====================================
+How do I define an experiment name?
+====================================
+
+The name of the experiment is set in the ``workflow:`` section of the ``config.yaml`` file using the variable ``EXPT_SUBDIR``.
+See :numref:`Section %s ` and/or :numref:`Section %s ` for more details.
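+
+For example (the experiment name shown is just a placeholder):
+
+.. code-block:: console
+
+   workflow:
+     EXPT_SUBDIR: my_expt_name
+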
+
+.. _ChangePhysics:
+
+=========================================================
+How do I change the Physics Suite Definition File (SDF)?
+=========================================================
+
+The SDF is set in the ``workflow:`` section of the ``config.yaml`` file using the variable ``CCPP_PHYS_SUITE``. The four supported physics suites for the SRW Application as of the v2.1.0 release are:
+
+.. code-block:: console
+
+ FV3_GFS_v16
+ FV3_RRFS_v1beta
+ FV3_HRRR
+ FV3_WoFS_v0
+
+When users run the ``generate_FV3LAM_wflow.py`` script, the SDF file is copied from its location in the forecast
+model directory to the experiment directory ``$EXPTDIR``. For more information on the :term:`CCPP` physics suite parameters, see :numref:`Section %s `.
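+
+For example, to select the ``FV3_HRRR`` suite, users can set the following in ``config.yaml``:
+
+.. code-block:: console
+
+   workflow:
+     CCPP_PHYS_SUITE: FV3_HRRR
+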
+
+.. _ChangeGrid:
+
+===========================
+How do I change the grid?
+===========================
+
+To change the predefined grid, modify the ``PREDEF_GRID_NAME`` variable in the ``task_run_fcst:`` section of the ``config.yaml`` file (see :numref:`Section %s ` for details on creating and modifying the ``config.yaml`` file). The four supported predefined grids as of the SRW Application v2.1.0 release are:
+
+.. code-block:: console
+
+ RRFS_CONUS_3km
+ RRFS_CONUS_13km
+ RRFS_CONUS_25km
+ SUBCONUS_Ind_3km
+
+However, users can choose from a variety of predefined grids listed in :numref:`Section %s `. An option also exists to create a user-defined grid, with information available in :numref:`Chapter %s `. Note that the user-defined grid option is not fully supported as of the v2.1.0 release and is provided for informational purposes only.
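+
+For example, to switch to the ``SUBCONUS_Ind_3km`` grid, users can set:
+
+.. code-block:: console
+
+   task_run_fcst:
+     PREDEF_GRID_NAME: SUBCONUS_Ind_3km
+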
.. _CycleInd:
@@ -21,44 +68,45 @@ are :term:`cycle-independent`, meaning that they only need to be run once per ex
grid, orography, and surface climatology files that these tasks generate are already
available (e.g., from a previous experiment that used the same grid as the current experiment), then
these tasks can be skipped, and the workflow can use those pre-generated files. This
-can be done by adding the following lines to the ``config.sh`` script before running
-the ``generate_FV3LAM_wflow.py`` script:
+can be done by adding the following parameters to the appropriate sections of the ``config.yaml`` file before running ``generate_FV3LAM_wflow.py``:
.. code-block:: console
- RUN_TASK_MAKE_GRID="FALSE"
- GRID_DIR="/path/to/directory/containing/grid/files"
- RUN_TASK_MAKE_OROG="FALSE"
- OROG_DIR="/path/to/directory/containing/orography/files"
- RUN_TASK_MAKE_SFC_CLIMO="FALSE"
- SFC_CLIMO_DIR="/path/to/directory/containing/surface/climatology/files"
-
-The ``RUN_TASK_MAKE_GRID``, ``RUN_TASK_MAKE_OROG``, and ``RUN_TASK_MAKE_SFC_CLIMO`` flags
-disable their respective tasks. ``GRID_DIR``, ``OROG_DIR``, and ``SFC_CLIMO_DIR``
+ workflow_switches:
+ RUN_TASK_MAKE_GRID: false
+ RUN_TASK_MAKE_OROG: false
+ RUN_TASK_MAKE_SFC_CLIMO: false
+ task_make_grid:
+ GRID_DIR: /path/to/directory/containing/grid/files
+ task_make_orog:
+ OROG_DIR: /path/to/directory/containing/orography/files
+ task_make_sfc_climo:
+ SFC_CLIMO_DIR: /path/to/directory/containing/surface/climatology/files
+
+The ``RUN_TASK_MAKE_GRID``, ``RUN_TASK_MAKE_OROG``, and ``RUN_TASK_MAKE_SFC_CLIMO`` flags disable their respective tasks. ``GRID_DIR``, ``OROG_DIR``, and ``SFC_CLIMO_DIR``
specify the directories where pre-generated grid, orography, and surface climatology files are located (all
three sets of files *may* be placed in the same directory location). By default, the ``RUN_TASK_MAKE_*``
-flags are set to ``TRUE`` in ``config_defaults.sh``. This means that the workflow will
+flags are set to true in ``config_defaults.yaml``. This means that the workflow will
run the ``make_grid``, ``make_orog``, and ``make_sfc_climo`` tasks by default.
-.. _DefineExptName:
+.. _CorrectConfig:
-===================================
-How do I define an experiment name?
-===================================
+=========================================================
+How do I know if I correctly modified ``config.yaml``?
+=========================================================
-The name of the experiment is set in the ``config.sh`` file using the variable ``EXPT_SUBDIR``.
-See :numref:`Section %s ` and/or :numref:`Section %s ` for more details.
+To determine whether ``config.yaml`` file adjustments are valid, users can run the following script from the ``ush`` directory after loading the regional workflow:
+.. code-block:: console
-.. _ChangePhysics:
+ ./config_utils.py -c $PWD/config.yaml -v $PWD/config_defaults.yaml
-=========================================================
-How do I change the Physics Suite Definition File (SDF)?
-=========================================================
+A correct ``config.yaml`` file will output a ``SUCCESS`` message. A ``config.yaml`` file with problems will output a ``FAILURE`` message describing the problem. For example:
-The SDF is set in the ``config.sh`` file using the variable ``CCPP_PHYS_SUITE``. When users run the
-``generate_FV3LAM_wflow.py`` script, the SDF file is copied from its location in the forecast
-model directory to the experiment directory ``EXPTDIR``. For more information on the :term:`CCPP` physics suite parameters, see :numref:`Section %s `
+.. code-block:: console
+
+ INVALID ENTRY: EXTRN_MDL_FILES_ICS=[]
+ FAILURE
.. _RestartTask:
@@ -66,44 +114,79 @@ model directory to the experiment directory ``EXPTDIR``. For more information on
How do I restart a DEAD task?
=============================
-On platforms that utilize Rocoto workflow software (such as NCAR’s Cheyenne machine), if
-something goes wrong with the workflow, a task may end up in the DEAD state:
+On platforms that utilize Rocoto workflow software (such as NCAR's Cheyenne machine), if something goes wrong with the workflow, a task may end up in the DEAD state:
.. code-block:: console
rocotostat -w FV3SAR_wflow.xml -d FV3SAR_wflow.db -v 10
CYCLE TASK JOBID STATE EXIT STATUS TRIES DURATION
=================================================================================
- 201905200000 make_grid 9443237 QUEUED - 0 0.0
- 201905200000 make_orog - - - - -
- 201905200000 make_sfc_climo - - - - -
- 201905200000 get_extrn_ics 9443293 DEAD 256 3 5.0
+ 201906151800 make_grid 9443237 QUEUED - 0 0.0
+ 201906151800 make_orog - - - - -
+ 201906151800 make_sfc_climo - - - - -
+ 201906151800 get_extrn_ics 9443293 DEAD 256 3 5.0
This means that the dead task has not completed successfully, so the workflow has stopped. Once the issue
-has been identified and fixed (by referencing the log files), users can re-run the failed task using the ``rocotorewind`` command:
+has been identified and fixed (by referencing the log files in ``$EXPTDIR/log``), users can re-run the failed task using the ``rocotorewind`` command:
.. code-block:: console
- rocotorewind -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -c 201905200000 -t get_extrn_ics
+ rocotorewind -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 -c 201906151800 -t get_extrn_ics
where ``-c`` specifies the cycle date (first column of rocotostat output) and ``-t`` represents the task name
(second column of rocotostat output). After using ``rocotorewind``, the next time ``rocotorun`` is used to
advance the workflow, the job will be resubmitted.
-.. _ChangeGrid:
+.. _CleanUp:
-===========================
-How do I change the grid?
-===========================
+===============================================================
+How can I clean up the SRW App code if something went wrong?
+===============================================================
-To change the predefined grid, modify the ``PREDEF_GRID_NAME`` variable in the ``config.sh`` script (see :numref:`Section %s ` for details on creating and modifying the ``config.sh`` file). The four supported predefined grids for the SRW Application v2.0.0 release were:
+The ``ufs-srweather-app`` repository contains a ``devclean.sh`` convenience script, which can be used to clean up code if something goes wrong when checking out externals or building the application. To view usage instructions, run the script with the ``-h`` flag:
.. code-block:: console
+
+ ./devclean.sh -h
- RRFS_CONUS_3km
- RRFS_CONUS_13km
- RRFS_CONUS_25km
- SUBCONUS_Ind_3km
+To remove the ``build`` directory, run:
+
+.. code-block:: console
+
+ ./devclean.sh --remove
+
+To remove all build artifacts (including ``build``, ``exec``, ``lib``, and ``share``), run:
+
+.. code-block:: console
+
+ ./devclean.sh --clean
+ OR
+ ./devclean.sh -a
+
+To remove external submodules, run:
+
+.. code-block:: console
+
+ ./devclean.sh --sub-modules
+
+Users will need to check out the external submodules again before building the application.
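+
+Assuming the standard ``manage_externals`` utility bundled with the repository is used, the externals can typically be checked out again from the top-level ``ufs-srweather-app`` directory with:
+
+.. code-block:: console
+
+   ./manage_externals/checkout_externals
+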
+
+In addition to the options above, many standard terminal commands can be run to remove unwanted files and directories (e.g., ``rm -rf expt_dirs``). A complete explanation of these options is beyond the scope of this User's Guide.
+
+.. _NewExpt:
+
+==================================
+How can I run a new experiment?
+==================================
+
+To run a new experiment at a later time, users need to rerun the commands in :numref:`Section %s ` that reactivate the regional workflow python environment:
+
+.. code-block:: console
+
+ source
+ module use
+ module load wflow_
-However, users can choose from a variety of predefined grids listed in :numref:`Section %s `. An option also exists to create a user-defined grid, with information available in :numref:`Chapter %s `. However, the user-defined grid option is not fully-supported for this release and is provided for informational purposes only.
+Follow any instructions output by the console.
+Then, users can configure a new experiment by updating the environment variables in ``config.yaml`` to reflect the desired experiment configuration. Detailed instructions can be viewed in :numref:`Section %s `. Parameters and valid values are listed in :numref:`Chapter %s `. After adjusting the configuration file, generate the new experiment by running ``./generate_FV3LAM_wflow.py``. Check progress by navigating to the ``$EXPTDIR`` and running ``rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10``.
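+
+In short, the sequence of commands for generating and monitoring a new experiment might look like the following sketch (run from the ``ush`` directory after the workflow environment is active):
+
+.. code-block:: console
+
+   ./generate_FV3LAM_wflow.py
+   cd $EXPTDIR
+   rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
+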
diff --git a/docs/UsersGuide/source/Glossary.rst b/docs/UsersGuide/source/Glossary.rst
index f7de1fe3f9..2386d83177 100644
--- a/docs/UsersGuide/source/Glossary.rst
+++ b/docs/UsersGuide/source/Glossary.rst
@@ -26,21 +26,21 @@ Glossary
chgres_cube
The preprocessing software used to create initial and boundary condition files to
- “coldstart” the forecast model.
+ “cold start” the forecast model. It is part of :term:`UFS_UTILS`.
CIN
Convective Inhibition.
CMAQ
- The `Community Multiscale Air Quality Model `__ (CMAQ, pronounced "cee-mak") is a numerical air quality model that predicts the concentration of airborne gases and particles and the deposition of these pollutants back to Earth's surface. The purpose of CMAQ is to provide fast, technically sound estimates of ozone, particulates, toxics, and acid deposition. CMAQ is an active open-source development project of the U.S. Environmental Protection Agency (EPA). Code is publicly availably at https://github.com/USEPA/CMAQ.
+ The `Community Multiscale Air Quality Model `__ (CMAQ, pronounced "cee-mak") is a numerical air quality model that predicts the concentration of airborne gases and particles and the deposition of these pollutants back to Earth's surface. The purpose of CMAQ is to provide fast, technically sound estimates of ozone, particulates, toxics, and acid deposition. CMAQ is an active open-source development project of the U.S. Environmental Protection Agency (EPA). Code is publicly available at https://github.com/USEPA/CMAQ.
cron
crontab
cron table
- Cron is a job scheduler accessed through the command-line on UNIX-like operating systems. It is useful for automating tasks such as the ``rocotorun`` command, which launches each workflow task in the SRW App. Cron periodically checks a cron table (aka crontab) to see if any tasks are are ready to execute. If so, it runs them.
+   Cron is a job scheduler accessed through the command line on UNIX-like operating systems. It is useful for automating tasks such as the ``rocotorun`` command, which launches each workflow task in the SRW App (see :numref:`Chapter %s ` for details). Cron periodically checks a cron table (aka crontab) to see if any tasks are ready to execute. If so, it runs them.
CRTM
- `Community Radiative Transfer Model `__. CRTM is a fast and accurate radiative transfer model developed at the `Joint Center for Satellite Data Assimilation `__ (JCSDA) in the United States. It is a sensor-based radiative transfer model and supports more than 100 sensors, including sensors on most meteorological satellites and some from other remote sensing satellites.
+ The `Community Radiative Transfer Model `__ (CRTM) is a fast and accurate radiative transfer model developed at the `Joint Center for Satellite Data Assimilation `__ (JCSDA) in the United States. It is a sensor-based radiative transfer model and supports more than 100 sensors, including sensors on most meteorological satellites and some from other remote sensing satellites.
Component
A software element that has a clear function and interface. In Earth system models, components are often single portions of the Earth system (e.g. atmosphere, ocean, or land surface) that are assembled to form a whole.
@@ -94,7 +94,7 @@ Glossary
`Finite Volume Community Ocean Model `__. FVCOM is used in modeling work for the `Great Lakes Coastal Forecasting System (next-gen FVCOM) `__ conducted by the `Great Lakes Environmental Research Laboratory `__.
GFS
- `Global Forecast System `_. The GFS is a National Centers for Environmental Prediction (NCEP) weather forecast model that generates data for dozens of atmospheric and land-soil variables, including temperatures, winds, precipitation, soil moisture, and atmospheric ozone concentration. The system couples four separate models (atmosphere, ocean, land/soil, and sea ice) that work together to accurately depict weather conditions.
+ `Global Forecast System `_. The GFS is a National Centers for Environmental Prediction (:term:`NCEP`) weather forecast model that generates data for dozens of atmospheric and land-soil variables, including temperatures, winds, precipitation, soil moisture, and atmospheric ozone concentration. The system couples four separate models (atmosphere, ocean, land/soil, and sea ice) that work together to accurately depict weather conditions.
GRIB2
The second version of the World Meterological Organization's (WMO) standard for distributing gridded data.
@@ -112,7 +112,7 @@ Glossary
High-Performance Computing.
HPC-Stack
- The `HPC-Stack `__ is a repository that provides a unified, shell script-based build system for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `__ and the `Joint Effort for Data assimilation Integration (JEDI) `__ framework.
+ The `HPC-Stack `__ is a repository that provides a unified, shell script-based build system for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `__ and the `Joint Effort for Data assimilation Integration (JEDI) `__ framework. View the HPC-Stack documentation :doc:`here `.
HPSS
High Performance Storage System (HPSS).
@@ -136,7 +136,7 @@ Glossary
The `Modern-Era Retrospective analysis for Research and Applications, Version 2 `__ provides satellite observation data back to 1980. According to NASA, "It was introduced to replace the original MERRA dataset because of the advances made in the assimilation system that enable assimilation of modern hyperspectral radiance and microwave observations, along with GPS-Radio Occultation datasets. It also uses NASA's ozone profile observations that began in late 2004. Additional advances in both the GEOS model and the GSI assimilation system are included in MERRA-2. Spatial resolution remains about the same (about 50 km in the latitudinal direction) as in MERRA."
MPI
- MPI stands for Message Passing Interface. An MPI is a standardized communication system used in parallel programming. It establishes portable and efficient syntax for the exchange of messages and data between multiple processors that are used by a single computer program. An MPI is required for high-performance computing (HPC).
+ MPI stands for Message Passing Interface. An MPI is a standardized communication system used in parallel programming. It establishes portable and efficient syntax for the exchange of messages and data between multiple processors that are used by a single computer program. An MPI is required for high-performance computing (HPC) systems.
MRMS
Multi-Radar/Multi-Sensor (MRMS) System Analysis data. This data is required for METplus composite reflectivity or :term:`echo top` verification tasks within the SRW App. A two-day archive of precipitation, radar, and aviation and severe weather fields is publicly available and can be accessed `here `__.
@@ -145,7 +145,7 @@ Glossary
`North American Mesoscale Forecast System `_. NAM generates multiple grids (or domains) of weather forecasts over the North American continent at various horizontal resolutions. Each grid contains data for dozens of weather parameters, including temperature, precipitation, lightning, and turbulent kinetic energy. NAM uses additional numerical weather models to generate high-resolution forecasts over fixed regions, and occasionally to follow significant weather events like hurricanes.
namelist
- A namelist defines a group of variables or arrays. Namelists are an I/O feature for format-free input and output of variables by key-value assignments in FORTRAN compilers. Fortran variables can be read from and written to plain-text files in a standardised format, usually with a ``.nml`` file ending.
+ A namelist defines a group of variables or arrays. Namelists are an I/O feature for format-free input and output of variables by key-value assignments in Fortran compilers. Fortran variables can be read from and written to plain-text files in a standardised format, usually with a ``.nml`` file ending.
NCAR
The `National Center for Atmospheric Research `__.
@@ -156,7 +156,7 @@ Glossary
NCEPLIBS
The software libraries created and maintained by :term:`NCEP` that are required for running
- :term:`chgres_cube`, the UFS Weather Model, and :term:`UPP`. They are included in the `HPC-Stack `__.
+ :term:`chgres_cube`, the UFS Weather Model, and the :term:`UPP`. They are included in the `HPC-Stack `__.
NCEPLIBS-external
A collection of third-party libraries required to build :term:`NCEPLIBS`, :term:`chgres_cube`,
@@ -195,19 +195,22 @@ Glossary
Simplified functions that approximate the effects of small-scale processes (e.g., microphysics, gravity wave drag) that cannot be explicitly resolved by a model grid’s representation of the earth.
RAP
- `Rapid Refresh `__. The continental-scale NOAA hourly-updated assimilation/modeling system operational at NCEP. RAP covers North America and is comprised primarily of a numerical forecast model and an analysis/assimilation system to initialize that model. RAP is complemented by the higher-resolution 3km High-Resolution Rapid Refresh (:term:`HRRR`) model.
+ `Rapid Refresh `__. The continental-scale NOAA hourly-updated assimilation/modeling system operational at :term:`NCEP`. RAP covers North America and is comprised primarily of a numerical forecast model and an analysis/assimilation system to initialize that model. RAP is complemented by the higher-resolution 3km High-Resolution Rapid Refresh (:term:`HRRR`) model.
Repository
A central location in which files (e.g., data, code, documentation) are stored and managed.
RRFS
- The `Rapid Refresh Forecast System `__ is NOAA's next-generation convection-allowing, rapidly-updated, ensemble-based data assimilation and forecasting system currently scheduled for operational implementation in 2024. It is designed to run forecasts on a 3-km :term:`CONUS` domain.
+ The `Rapid Refresh Forecast System `__ (RRFS) is NOAA's next-generation convection-allowing, rapidly-updated, ensemble-based data assimilation and forecasting system currently scheduled for operational implementation in 2024. It is designed to run forecasts on a 3-km :term:`CONUS` domain.
SDF
Suite Definition File. An external file containing information about the construction of a physics suite. It describes the schemes that are called, in which order they are called, whether they are subcycled, and whether they are assembled into groups to be called together.
+ Spack
+ `Spack `__ is a package management tool designed to support multiple versions and configurations of software on a wide variety of platforms and environments. It was designed for large supercomputing centers, where many users and application teams share common installations of software on clusters with exotic architectures.
+
spack-stack
- The `spack-stack `__ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). *spack-stack* is a repository that provides a Spack-based method for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `__ and the `Joint Effort for Data assimilation Integration (JEDI) `__ framework. spack-stack uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack.
+ The `spack-stack `__ is a collaborative effort between the NOAA Environmental Modeling Center (EMC), the UCAR Joint Center for Satellite Data Assimilation (JCSDA), and the Earth Prediction Innovation Center (EPIC). *spack-stack* is a repository that provides a :term:`Spack`-based method for building the software stack required for numerical weather prediction (NWP) tools such as the `Unified Forecast System (UFS) `__ and the `Joint Effort for Data assimilation Integration (JEDI) `__ framework. *spack-stack* uses the Spack package manager along with custom Spack configuration files and Python scripts to simplify installation of the libraries required to run various applications. The *spack-stack* can be installed on a range of platforms and comes pre-configured for many systems. Users can install the necessary packages for a particular application and later add the missing packages for another application without having to rebuild the entire stack.
tracer
According to the American Meteorological Society (AMS) `definition `__, a tracer is "Any substance in the atmosphere that can be used to track the history [i.e., movement] of an air mass." Tracers are carried around by the motion of the atmosphere (i.e., by :term:`advection`). These substances are usually gases (e.g., water vapor, CO2), but they can also be non-gaseous (e.g., rain drops in microphysics parameterizations). In weather models, temperature (or potential temperature), absolute humidity, and radioactivity are also usually treated as tracers. According to AMS, "The main requirement for a tracer is that its lifetime be substantially longer than the transport process under study."
@@ -221,7 +224,7 @@ Glossary
A collection of code used by multiple :term:`UFS` apps (e.g., the UFS Short-Range Weather App,
the UFS Medium-Range Weather App). The grid, orography, surface climatology, and initial
and boundary condition generation codes used by the UFS Short-Range Weather App are all
- part of this collection.
+ part of this collection. The code is `publicly available `__ on Github.
Umbrella repository
A repository that houses external code, or "externals," from additional repositories.
@@ -236,7 +239,7 @@ Glossary
Weather Model
A prognostic model that can be used for short- and medium-range research and
operational forecasts. It can be an atmosphere-only model or an atmospheric
- model coupled with one or more additional components, such as a wave or ocean model. The SRW App uses the fully-coupled `UFS Weather Model `__.
+ model coupled with one or more additional components, such as a wave or ocean model. The SRW App uses the `UFS Weather Model `__.
Workflow
The sequence of steps required to run an experiment from start to finish.
\ No newline at end of file
diff --git a/docs/UsersGuide/source/Graphics.rst b/docs/UsersGuide/source/Graphics.rst
index 945ee13569..0614dd4267 100644
--- a/docs/UsersGuide/source/Graphics.rst
+++ b/docs/UsersGuide/source/Graphics.rst
@@ -19,58 +19,23 @@ output over the :term:`CONUS` for a number of variables, including:
The Python scripts are located under ``ufs-srweather-app/ush/Python``.
The script ``plot_allvars.py`` plots the output from a single cycle within an experiment, while
-the script ``plot_allvars_diff.py`` plots the difference between the same cycle from two different
-experiments (e.g., the experiments may differ in some aspect such as the physics suite used). If
+the script ``plot_allvars_diff.py`` plots the difference between the same cycle from two different experiments. When
plotting the difference, the two experiments must be on the same domain and available for
-the same cycle starting date/time and forecast hours.
+the same cycle starting date/time and forecast hours. Other parameters may differ (e.g., the experiments may use different physics suites).
Loading the Environment
==========================
-To use the plotting scripts, the appropriate environment must be loaded. The scripts require Python 3 with the ``scipy``, ``matplotlib``, ``pygrib``, ``cartopy``, and ``pillow`` packages. This Python environment has already been set up on Level 1 platforms and can be activated as follows:
-
-On Cheyenne:
-
-.. code-block:: console
-
- module load ncarenv
- module load conda/latest
- conda activate /glade/p/ral/jntp/UFS_SRW_app/conda/python_graphics
-
-On Hera and Jet:
-
-.. code-block:: console
-
- module use -a /contrib/miniconda3/modulefiles
- module load miniconda3
- conda activate pygraf
-
-On Orion:
-
-.. code-block:: console
-
- module use -a /apps/contrib/miniconda3-noaa-gsl/modulefiles
- module load miniconda3
- conda activate pygraf
-
-On Gaea:
-
-.. code-block:: console
-
- module use /lustre/f2/pdata/esrl/gsd/contrib/modulefiles
- module load miniconda3/4.8.3-regional-workflow
-
-On NOAA Cloud:
+To use the plotting scripts, users must load the regional workflow environment, which includes the required ``scipy``, ``matplotlib``, ``pygrib``, ``cartopy``, and ``pillow`` packages. To activate the regional workflow, see :numref:`Section %s `, or use the following summary:
.. code-block:: console
+
+ cd
+ source ../../etc/lmod-setup.sh
+ module use ../../modulefiles
+ module load wflow_
- module use /contrib/GST/miniconda3/modulefiles
- module load miniconda3/4.10.3
- conda activate regional_workflow
-
-.. note::
-
- If using one of the batch submission scripts described :ref:`below `, the user does not need to manually load an environment because the scripts perform this task.
+where ```` refers to a valid machine name (see :numref:`Section %s `). Users should then follow the instructions printed to the console (e.g., ``conda activate regional_workflow``).
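+
+For illustration, on a hypothetical Level 1 machine such as Hera (the machine name and directory are assumptions; substitute values appropriate to the system in use), the sequence might look like:
+
+.. code-block:: console
+
+   cd ufs-srweather-app/ush/Python
+   source ../../etc/lmod-setup.sh hera
+   module use ../../modulefiles
+   module load wflow_hera
+   conda activate regional_workflow
+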
.. _Cartopy:
@@ -81,42 +46,25 @@ The Python plotting scripts also require a path to the directory where the Carto
The full set of Cartopy shapefiles can be downloaded `here `__. For convenience, the small subset of files required for these Python scripts can be obtained from the `SRW Data Bucket `__. They are also available on all `Level 1 `__ platforms in the following locations:
-On Cheyenne:
-
-.. code-block:: console
-
- /glade/p/ral/jntp/UFS_SRW_App/v2p0/NaturalEarth
-
-On Hera:
-
-.. code-block:: console
-
- /scratch2/BMC/det/UFS_SRW_App/v2p0/NaturalEarth
-
-On Jet:
-
-.. code-block:: console
-
- /mnt/lfs4/BMC/wrfruc/UFS_SRW_App/v2p0/NaturalEarth
-
-On Orion:
-
-.. code-block:: console
-
- /work/noaa/fv3-cam/UFS_SRW_App/v2p0/NaturalEarth
-
-On Gaea:
-
-.. code-block:: console
-
- /lustre/f2/pdata/ncep/UFS_SRW_App/v2p0/NaturalEarth
-
-On NOAA Cloud:
-
-.. code-block:: console
-
- /contrib/EPIC/UFS_SRW_App/v2p0/NaturalEarth
-
+.. _CartopyData:
+
+.. table:: Cartopy Shapefile Locations for Level 1 Systems
+
+ +--------------+-----------------------------------------------------------------+
+ | Machine | File location |
+ +==============+=================================================================+
+ | Cheyenne | /glade/p/ral/jntp/UFS_SRW_App/develop/NaturalEarth |
+ +--------------+-----------------------------------------------------------------+
+ | Gaea | /lustre/f2/pdata/ncep/UFS_SRW_App/develop/NaturalEarth |
+ +--------------+-----------------------------------------------------------------+
+ | Hera | /scratch2/BMC/det/UFS_SRW_App/develop/NaturalEarth |
+ +--------------+-----------------------------------------------------------------+
+ | Jet | /mnt/lfs4/BMC/wrfruc/UFS_SRW_App/develop/NaturalEarth |
+ +--------------+-----------------------------------------------------------------+
+ | NOAA Cloud | /contrib/EPIC/UFS_SRW_App/develop/NaturalEarth |
+ +--------------+-----------------------------------------------------------------+
+ | Orion | /work/noaa/fv3-cam/UFS_SRW_App/develop/NaturalEarth |
+ +--------------+-----------------------------------------------------------------+
Running the Plotting Scripts
======================================
@@ -139,7 +87,7 @@ following command line arguments:
#. Ending forecast hour
#. Forecast hour increment
#. The top level of the experiment directory ``$EXPTDIR`` containing the post-processed data. The script will look for the data files in the directory ``$EXPTDIR/CDATE/postprd``.
-#. The base directory ``CARTOPY_DIR`` of the cartopy shapefiles. The script will look for the shapefiles (``*.shp``) in the directory ``$CARTOPY_DIR/shapefiles/natural_earth/cultural``.
+#. The base directory ``CARTOPY_DIR`` of the cartopy shapefiles. The script will look for the shapefiles (``*.shp``) in the directory ``$CARTOPY_DIR/shapefiles/natural_earth/cultural``. See :numref:`Table %s ` for the correct ``$CARTOPY_DIR`` locations on Level 1 systems.
#. The name ``POST_OUTPUT_DOMAIN_NAME`` of the native grid used in the forecast
.. note::
@@ -151,7 +99,7 @@ is as follows:
.. code-block:: console
- python plot_allvars.py 2019061500 0 12 6 /path-to/expt_dirs/test_CONUS_25km_GFSv16 /path-to/NaturalEarth RRFS_CONUS_25km
+ python plot_allvars.py 2019061518 0 12 6 /path-to/expt_dirs/test_community /path-to/NaturalEarth RRFS_CONUS_25km
The output files (in ``.png`` format) will be located in the directory ``$EXPTDIR/CDATE/postprd``,
where in this case ``$EXPTDIR`` is ``/path-to/expt_dirs/test_CONUS_25km_GFSv16`` and ``$CDATE``
@@ -172,7 +120,7 @@ command line arguments:
#. The base directory ``CARTOPY_DIR`` of the cartopy shapefiles. The script will look for the shapefiles (``*.shp``) in the directory ``$CARTOPY_DIR/shapefiles/natural_earth/cultural``.
#. The name ``POST_OUTPUT_DOMAIN_NAME`` of the native grid used in the forecasts (this must be the same for the two forecasts)
-An example of plotting differences from two experiments for the same date and predefined domain where one uses the "FV3_GFS_v16" suite definition file (SDF) and one using the "FV3_RRFS_v1beta" SDF is as follows:
+An example of plotting differences from two experiments for the same date and predefined domain where one uses the ``FV3_GFS_v16`` suite definition file (SDF) and one uses the ``FV3_RRFS_v1beta`` SDF is as follows:
.. code-block:: console
diff --git a/docs/UsersGuide/source/InputOutputFiles.rst b/docs/UsersGuide/source/InputOutputFiles.rst
index 640a2a598a..8456049c93 100644
--- a/docs/UsersGuide/source/InputOutputFiles.rst
+++ b/docs/UsersGuide/source/InputOutputFiles.rst
@@ -4,14 +4,14 @@
Input and Output Files
=======================
This chapter provides an overview of the input and output files needed by the components
-of the UFS SRW Application. Links to more detailed documentation for each of the components (e.g., UFS_UTILS, the UFS Weather Model, and the UPP) are provided in the sections below. For SRW App users who want to jump straight to downloading and staging the files, see :numref:`Section %s `.
+of the UFS SRW Application. Links to more detailed documentation for each of the components (e.g., UFS_UTILS, the UFS Weather Model, and the UPP) are provided in the sections below. For SRW App users who want to jump straight to downloading and staging the required files, see :numref:`Section %s `.
.. _Input:
Input Files
===========
The SRW Application requires numerous input files to run: static datasets (fix files
-containing climatological information, terrain and land use data), initial and boundary
+containing climatological information, terrain, and land use data), initial and boundary
conditions files, and model configuration files (such as namelists).
Initial and Boundary Condition Files
@@ -27,11 +27,11 @@ The data format for these files can be :term:`GRIB2` or :term:`NEMSIO`. More inf
Pre-processing (UFS_UTILS)
---------------------------
-When a user generates the regional workflow, as described in :numref:`Section %s `, the workflow generation script links the input data for the pre-processing utilities to the experiment directory. The pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found in the UFS_UTILS `Technical Documentation `__ and `Scientific Documentation `__.
+When a user generates the regional workflow as described in :numref:`Section %s `, the workflow generation script links the input data for the pre-processing utilities to the experiment directory. The pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found in the UFS_UTILS `Technical Documentation `__ and `Scientific Documentation `__.
UFS Weather Model
-----------------
-The input files for the Weather Model include both static (fixed) files and grid- and date-specific files (terrain, initial conditions, boundary conditions, etc). The static fix files
+The input files for the UFS Weather Model include both static (fixed) files and grid- and date-specific files (terrain, initial conditions, boundary conditions, etc.). The static fix(ed) files
must be staged by the user unless the user is running on a `Level 1/pre-configured `__ platform, in which case users can link to the existing copy of the data on their machine. See :numref:`Section %s ` for instructions. The workflow scripts link the static, grid, and date-specific files in the experiment directory. An extensive description of the input files for the Weather Model can be found in the `UFS Weather Model User's Guide `__. The namelists and configuration files for the SRW Application are created from templates by the workflow generation script, as described in :numref:`Section %s `.
Unified Post Processor (UPP)
@@ -44,11 +44,11 @@ Documentation for the UPP input files can be found in the `UPP User's Guide
Workflow
---------
The SRW Application uses a series of template files, combined with user-selected settings,
-to create the required namelists and parameter files needed by the Application workflow. (See :numref:`Figure %s ` for a visual summary of the workflow generation process, including template use.) These templates can be reviewed to see which defaults are used and where configuration parameters from the ``config.yaml`` file are assigned.
+to create the required namelists and parameter files needed by the SRW Application workflow. (See :numref:`Figure %s ` for a visual summary of the workflow generation process, including template use.) These templates can be reviewed to see which defaults are used and where configuration parameters from the ``config.yaml`` file are assigned.
List of Template Files
^^^^^^^^^^^^^^^^^^^^^^^^
-The template files for the SRW Application are located in ``parm``
+The template files for the SRW Application are located in the ``parm`` directory
and are shown in :numref:`Table %s `.
.. _TemplateFiles:
@@ -62,11 +62,11 @@ and are shown in :numref:`Table %s `.
| | reads in at the start of each forecast. It is an empty file. |
| | No need to change. |
+-----------------------------+--------------------------------------------------------------+
- | diag_table_[CCPP] | File specifying the output fields of the forecast model. |
+ | diag_table.[CCPP] | File specifying the output fields of the forecast model. |
| | A different ``diag_table`` may be configured for different |
| | :term:`CCPP` suites. |
+-----------------------------+--------------------------------------------------------------+
- | field_table_[CCPP] | :term:`Cycle-independent` file that the forecast model |
+ | field_table.[CCPP] | :term:`Cycle-independent` file that the forecast model |
| | reads in at the start of each forecast. It specifies the |
| | :term:`tracers ` that the forecast model will |
| | :term:`advect`. A different ``field_table`` may be needed |
@@ -97,20 +97,18 @@ and are shown in :numref:`Table %s `.
| README.xml_templating.md | Instructions for Rocoto XML templating with Jinja. |
+-----------------------------+--------------------------------------------------------------+
-Additional information related to ``diag_table_[CCPP]``, ``field_table_[CCPP]``, ``input.nml.FV3``, ``model_conigure``, and ``nems.configure`` can be found in the `UFS Weather Model User's Guide `__, while information on ``regional_grid.nml`` options can be found in the `UFS_UTILS Technical Documentation `__.
+Additional information related to ``diag_table.[CCPP]``, ``field_table.[CCPP]``, ``input.nml.FV3``, ``model_configure``, and ``nems.configure`` can be found in the `UFS Weather Model User's Guide `__, while information on ``regional_grid.nml`` options can be found in the `UFS_UTILS Technical Documentation `__.
Migratory Route of the Input Files in the Workflow
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-:numref:`Figure %s ` shows how the input files in the template directory (``ufs-srweather-app/parm``) flow to the experiment directory. First, the CCPP physics suite is specified in the configuration file. The template input files corresponding to the selected physics suite, such as ``field_table_[CCPP]`` and ``nems.configure_[CCPP]``, are copied to the experiment directory (``$EXPTDIR``). Additionally, the namelist file of the Weather Model (``input.nml``) is created from the ``input.nml.FV3`` and ``FV3.input.yml`` files by running the workflow generation script. While running the ``RUN_FCST`` task in the regional workflow as shown in :numref:`Figure %s `, the ``field_table``, ``nems.configure``, and ``input.nml`` files, located in ``$EXPTDIR``, are linked to the cycle directory (``$CYCLE_DIR``). Additionally, ``diag_table`` and ``model_configure`` are copied from the ``parm`` directory. Finally, these files are updated with the variables specified in ``var_defn.sh``.
-
-.. COMMENT: Update image!
+:numref:`Figure %s ` shows how the input files in the template directory (``ufs-srweather-app/parm``) flow to the experiment directory. First, the CCPP physics suite is specified in the configuration file. The template input files corresponding to the selected physics suite, such as ``field_table.[CCPP]`` and ``nems.configure_[CCPP]``, are copied to the experiment directory (``$EXPTDIR``). Additionally, the namelist file of the Weather Model (``input.nml``) is created from the ``input.nml.FV3`` and ``FV3.input.yml`` files by running the workflow generation script. While running the ``RUN_FCST`` task in the regional workflow as shown in :numref:`Figure %s `, the ``field_table``, ``nems.configure``, and ``input.nml`` files, located in ``$EXPTDIR``, are linked to the cycle directory (``$CYCLE_DIR``). Additionally, ``diag_table`` and ``model_configure`` are copied from the ``parm`` directory. Finally, these files are updated with the variables specified in ``var_defn.sh``.
.. _MigratoryRoute:
-.. figure:: _static/FV3LAM_wflow_input_path.png
- :alt: Flowchart showing how information on the physics suite travels from the config shell file to the setup shell file to the workflow generation script to the run forecast ex-script. As this information is fed from one script to the next, file paths and variables required for workflow execution are set.
+.. figure:: _static/SRW_wflow_input_path.png
+ :alt: Flowchart showing how information from the physics suite travels from the configuration file to the setup file to the workflow generation script to the run forecast ex-script. As this information is fed from one file to the next, file paths and variables required for workflow execution are set.
- *Migratory route of input files*
+ *Migratory Route of Input Files*
.. _OutputFiles:
@@ -143,7 +141,7 @@ experiment directory (``$EXPTDIR/YYYYMMDDHH/INPUT``) and consist of the followin
* ``tmp_ICS``
* ``tmp_LBCS``
-These output files are used as inputs for the UFS Weather Model, and are described in the `UFS Weather Model User's Guide
+These output files are used as inputs for the UFS Weather Model and are described in the `UFS Weather Model User's Guide
`__. ``gfs_bndy.tile7.HHH.nc`` refers to a series of IC/LBC files where ``HHH`` is the 3-digit hour of the forecast.
UFS Weather Model
@@ -165,17 +163,19 @@ Unified Post Processor (UPP)
----------------------------
Documentation for the UPP output files can be found in the `UPP User's Guide `__.
-For the SRW Application, the weather model netCDF output files are written to ``$EXPTDIR/YYYYMMDDHH/postprd`` and have the naming convention (file->linked to):
+For the SRW Application, the Weather Model netCDF output files are written to ``$EXPTDIR/YYYYMMDDHH/postprd`` and have the naming convention (file->linked to):
* ``NATLEV_{YY}{JJJ}{hh}{mm}f{fhr}00 -> {domain}.t{cyc}z.natlevf{fhr}.tmXX.grib2``
* ``PRSLEV_{YY}{JJJ}{hh}{mm}f{fhr}00 -> {domain}.t{cyc}z.prslevf{fhr}.tmXX.grib2``
The default setting for the output file names uses ``rrfs`` for ``{domain}``. This may be overridden by the user in the ``config.yaml`` settings.
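+
+As a hypothetical illustration with the default ``rrfs`` domain name, the pressure-level output for forecast hour 12 of an 18z cycle would be linked to a file named something like the following (``tm00`` is assumed here for ``tmXX``):
+
+.. code-block:: console
+
+   rrfs.t18z.prslevf012.tm00.grib2
+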
+.. _ModifyUPPOutput:
+
Modifying the UPP Output
^^^^^^^^^^^^^^^^^^^^^^^^^^^
-If users wish to modify the fields or levels that are output from the UPP, they will need to make modifications to ``fv3lam.xml``, which resides in the UPP repository distributed with the UFS SRW Application. If the code was cloned into the directory ``ufs-srweather-app``, the file will be located in ``ufs-srweather-app/src/UPP/parm``.
+If users wish to modify the fields or levels that are output from the UPP, they will need to make modifications to ``fv3lam.xml``, which resides in the UPP repository distributed with the UFS SRW Application. If the code was cloned into the directory ``ufs-srweather-app``, the file will be located in ``ufs-srweather-app/sorc/UPP/parm``.
.. note::
This process requires advanced knowledge of which fields can be output for the UFS Weather Model.
@@ -190,10 +190,10 @@ After creating the new flat text file to reflect the changes, users will need to
.. code-block:: console
- USE_CUSTOM_POST_CONFIG_FILE="TRUE"
- CUSTOM_POST_CONFIG_PATH=""
+ USE_CUSTOM_POST_CONFIG_FILE: true
+ CUSTOM_POST_CONFIG_FP:
-which tells the workflow to use the custom file located in the user-defined path. The path should include the filename. If ``USE_CUSTOM_POST_CONFIG_FILE`` is set to "TRUE", but the file path is not found, then an error will occur when trying to generate the SRW Application workflow.
+which tells the workflow to use the custom file located in the user-defined path. The path should include the filename. If ``USE_CUSTOM_POST_CONFIG_FILE`` is set to true, but the file path is not found, then an error will occur when trying to generate the SRW Application workflow.
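+
+For example, a filled-in excerpt might look like the following (the path and file name are hypothetical; the path should point to the user's new flat text file):
+
+.. code-block:: console
+
+   USE_CUSTOM_POST_CONFIG_FILE: true
+   CUSTOM_POST_CONFIG_FP: /path-to/ufs-srweather-app/parm/postxconfig-NT-fv3lam_custom.txt
+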
Users may then start their experiment workflow as usual, and the UPP will use the new flat ``*.txt`` file.
@@ -202,24 +202,24 @@ Users may then start their experiment workflow as usual, and the UPP will use th
Outputting Satellite Products from UPP
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-Synthetic satellite products for several instruments and channels (e.g., GOES 16/17) may be output through the UPP using the Community Radiative Transfer Model (:term:`CRTM`). External CRTM coefficient files, available through the UPP stand-alone release, will need to be manually downloaded before running the workflow. These instructions assume that the UPP configuration file has already been set up to output satellite products.
+Synthetic satellite products for several instruments and channels (e.g., GOES 16/17) may be output through the UPP using the Community Radiative Transfer Model (:term:`CRTM`). External CRTM coefficient files, available through the UPP stand-alone release, will need to be manually downloaded before running the workflow. These instructions assume that the UPP configuration file (``postxconfig-NT-fv3lam.txt``) has already been set up to output satellite products using the process described above in :numref:`Section %s`.
Download and unpack the external files:
.. code-block:: console
mkdir crtm && cd crtm
- wget https://github.com/NOAA-EMC/EMC_post/releases/download/upp_v10.1.0/fix.tar.gz
+ wget https://github.com/NOAA-EMC/UPP/releases/download/upp_v11.0.0/fix.tar.gz
tar -xzf fix.tar.gz
Modify the ``config.yaml`` file to include the following lines:
.. code-block:: console
- USE_CRTM="TRUE"
- CRTM_DIR=""
+ USE_CRTM: true
+ CRTM_DIR:
-By setting ``USE_CRTM`` to "TRUE", the workflow will use the path defined in ``CRTM_DIR`` to link the necessary coefficient files to the working directory at runtime. Otherwise, it is assumed that no satellite fields are being requested in the UPP configuration. ``CRTM_DIR`` should point to the top CRTM directory where the fix files are located.
+By setting ``USE_CRTM`` to true, the workflow will use the path defined in ``CRTM_DIR`` to link the necessary coefficient files to the working directory at runtime. Otherwise, it is assumed that no satellite fields are being requested in the UPP configuration. ``CRTM_DIR`` should point to the top CRTM directory where the fix files are located.
.. note::
Dependencies for outputting synthetic satellite products may exist based on model configuration (e.g., model physics).
@@ -243,15 +243,15 @@ Static files are available in the `"fix" directory `__ of the SRW Data Bucket using the ``wget`` command for each required file. A list of ``wget`` commands with links is provided :ref:`here ` for the release v2.0.0 fix file data. Users will need to create an appropriate directory structure for the files when downloading them individually. The best solution is to download the files into directories that mirror the structure of the `Data Bucket `__.
+Alternatively, users can download the static files individually from the `"fix" directory `__ of the SRW Data Bucket using the ``wget`` command for each required file. A list of ``wget`` commands with links is provided :ref:`here ` for the release v2.1.0 fix file data. Users will need to create an appropriate directory structure for the files when downloading them individually. The best solution is to download the files into directories that mirror the structure of the `Data Bucket `__.
-The environment variables ``FIXgsm``, ``TOPO_DIR``, and ``SFC_CLIMO_INPUT_DIR`` indicate the path to the directories where the static files are located. After downloading the experiment data, users must set the paths to the files in ``config.yaml``. Add the following code to the ``config.yaml`` file, and alter the variable paths accordingly:
+The environment variables ``FIXgsm``, ``TOPO_DIR``, and ``SFC_CLIMO_INPUT_DIR`` indicate the path to the directories where the static files are located. After downloading the experiment data, users must set the paths to the files in ``config.yaml``. Add the following code to the ``task_run_fcst:`` section of the ``config.yaml`` file, and alter the variable paths accordingly:
.. code-block:: console
- FIXgsm=""
- TOPO_DIR=""
- SFC_CLIMO_INPUT_DIR=""
+ FIXgsm:
+ TOPO_DIR:
+ SFC_CLIMO_INPUT_DIR:
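+
+A hypothetical filled-in excerpt, nested under the ``task_run_fcst:`` section as described above, might look like the following (the paths are illustrative and mirror the directory structure of the SRW Data Bucket):
+
+.. code-block:: console
+
+   task_run_fcst:
+     FIXgsm: /path-to/fix/fix_am
+     TOPO_DIR: /path-to/fix/fix_orog
+     SFC_CLIMO_INPUT_DIR: /path-to/fix/fix_sfc_climo
+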
.. _InitialConditions:
@@ -268,18 +268,25 @@ To download the model input data for the 12-hour "out-of-the-box" experiment con
To download data for different dates, model types, and formats, users can explore the ``input_model_data`` section of the data bucket and replace the links above with ones that fetch their desired data.
+.. _ICS-LBCS:
+
Initial and Lateral Boundary Condition Organization
----------------------------------------------------
+----------------------------------------------------
-The paths to ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` must be set in the ``config.yaml`` file as follows:
+The paths to ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` must be set in the appropriate sections of the ``config.yaml`` file:
.. code-block:: console
- USE_USER_STAGED_EXTRN_FILES="TRUE"
- EXTRN_MDL_SOURCE_BASEDIR_ICS=""
- EXTRN_MDL_SOURCE_BASEDIR_LBCS=""
+ task_get_extrn_ics:
+ USE_USER_STAGED_EXTRN_FILES: true
+ EXTRN_MDL_SOURCE_BASEDIR_ICS:
+ EXTRN_MDL_DATA_STORES: disk
+ task_get_extrn_lbcs:
+ USE_USER_STAGED_EXTRN_FILES: true
+ EXTRN_MDL_SOURCE_BASEDIR_LBCS:
+ EXTRN_MDL_DATA_STORES: disk
-These last two variables describe where the :term:`IC ` and :term:`LBC ` file directories are located, respectively. For ease of reusing ``config.yaml`` across experiments, it is recommended that users set up the raw :term:`IC/LBC ` file paths to include the model name (e.g., FV3GFS, NAM, RAP, HRRR), data format (e.g., grib2, nemsio), and date (in ``YYYYMMDDHH`` format). For example: ``/path-to/input_model_data/FV3GFS/grib2/2019061518/``. While there is flexibility to modify these settings, this structure will provide the most reusability for multiple dates when using the SRW Application workflow.
+The two ``EXTRN_MDL_SOURCE_BASEDIR_*CS`` variables describe where the :term:`IC ` and :term:`LBC ` file directories are located, respectively. For ease of reusing ``config.yaml`` across experiments, it is recommended that users set up the raw :term:`IC/LBC ` file paths to include the model name (e.g., FV3GFS, NAM, RAP, HRRR), data format (e.g., grib2, nemsio), and date (in ``YYYYMMDDHH`` format). For example: ``/path-to/input_model_data/FV3GFS/grib2/2019061518/``. While there is flexibility to modify these settings, this structure will provide the most reusability for multiple dates when using the SRW Application workflow.
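+
+For instance, using the recommended directory structure for the default FV3GFS GRIB2 case (the paths are illustrative), the excerpt above could be filled in as:
+
+.. code-block:: console
+
+   task_get_extrn_ics:
+     USE_USER_STAGED_EXTRN_FILES: true
+     EXTRN_MDL_SOURCE_BASEDIR_ICS: /path-to/input_model_data/FV3GFS/grib2/2019061518
+     EXTRN_MDL_DATA_STORES: disk
+   task_get_extrn_lbcs:
+     USE_USER_STAGED_EXTRN_FILES: true
+     EXTRN_MDL_SOURCE_BASEDIR_LBCS: /path-to/input_model_data/FV3GFS/grib2/2019061518
+     EXTRN_MDL_DATA_STORES: disk
+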
When files are pulled from NOAA :term:`HPSS` (rather than downloaded from the data bucket), the naming convention looks something like:
@@ -297,20 +304,28 @@ where:
* ``{cycle}`` corresponds to the 2-digit hour of the day when the forecast cycle starts, and
* ``{fhr}`` corresponds to the 2- or 3-digit nth hour of the forecast (3-digits for FV3GFS data and 2 digits for RAP/HRRR data).
-For example, a forecast using FV3GFS GRIB2 data that starts at 18h00 UTC would have a {cycle} value of 18, which is the 000th forecast hour. The LBCS file for 21h00 UTC would be named ``gfs.t18z.pgrb2.0p25.f003``. An example ``config.yaml`` setting using HRRR and RAP data appears below:
+For example, a forecast using FV3GFS GRIB2 data that starts at 18h00 UTC would have a {cycle} value of 18, which is the 000th forecast hour. The LBCS file for 21h00 UTC would be named ``gfs.t18z.pgrb2.0p25.f003``.
+
+In some cases, it may be necessary to specify values for the ``EXTRN_MDL_FILES_*CS`` variables. This is often the case with HRRR and RAP data. An example ``config.yaml`` excerpt using HRRR and RAP data appears below:
.. code-block:: console
- USE_USER_STAGED_EXTRN_FILES="TRUE"
- EXTRN_MDL_SOURCE_BASEDIR_ICS="/path-to/input_model_data/HRRR/grib2/2020081012"
- EXTRN_MDL_FILES_ICS=( "hrrr.t12z.wrfprsf00.grib2" )
- EXTRN_MDL_SOURCE_BASEDIR_LBCS="/path-to/input_model_data/RAP/grib2/2020081012"
- EXTRN_MDL_FILES_LBCS=( "rap.t12z.wrfprsf03.grib2" "rap.t12z.wrfprsf06.grib2" )
+ task_get_extrn_ics:
+ EXTRN_MDL_NAME_ICS: HRRR
+ USE_USER_STAGED_EXTRN_FILES: true
+ EXTRN_MDL_FILES_ICS:
+ - '{yy}{jjj}{hh}00{fcst_hr:02d}00'
+ task_get_extrn_lbcs:
+ EXTRN_MDL_NAME_LBCS: RAP
+ LBC_SPEC_INTVL_HRS: 3
+ USE_USER_STAGED_EXTRN_FILES: true
+ EXTRN_MDL_FILES_LBCS:
+ - '{yy}{jjj}{hh}00{fcst_hr:02d}00'
Default Initial and Lateral Boundary Conditions
-----------------------------------------------
-The default initial and lateral boundary condition files are set to be a severe weather case
-from 20190615 at 18 UTC. FV3GFS GRIB2 files are the default model and file format. A tar file
+
+The default initial and lateral boundary condition files are set to be a severe weather case from June 15, 2019 (20190615) at 18 UTC. FV3GFS GRIB2 files are the default model and file format. A tar file
(``gst_data.tgz``) containing the model data for this case is available in the `UFS SRW App Data Bucket `__.
Running the App for Different Dates
@@ -324,7 +339,7 @@ Staging Initial Conditions Manually
-----------------------------------
If users want to run the SRW Application with raw model files for dates other than those that
are currently available on the preconfigured platforms, they need to stage the data manually.
-The data should be placed in ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` (which may be the same directory). The path to these variables can be set in the ``config.yaml`` file. Raw model files are available from a number of sources. A few examples are provided here for convenience.
+The data should be placed in ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` (which may be the same directory). The path to these variables can be set in the ``config.yaml`` file as shown :ref:`above `. Raw model files are available from a number of sources. A few examples are provided here for convenience.
NOMADS: https://nomads.ncep.noaa.gov/pub/data/nccf/com/{model}/prod, where model may be:
@@ -337,7 +352,7 @@ NOMADS: https://nomads.ncep.noaa.gov/pub/data/nccf/com/{model}/prod, where model
* HRRR - available for the last 2 days
https://nomads.ncep.noaa.gov/pub/data/nccf/com/hrrr/prod/
-AWS S3:
+AWS S3 Data Buckets:
* GFS: https://registry.opendata.aws/noaa-gfs-bdp-pds/
* HRRR: https://registry.opendata.aws/noaa-hrrr-pds/ (necessary fields for initializing available for dates 2015 and newer)
@@ -346,7 +361,7 @@ Google Cloud:
* HRRR: https://console.cloud.google.com/marketplace/product/noaa-public/hrrr
-FTP Data Repository: (data for SRW Release v1.0.0 & v1.0.1)
+FTP Data Repository (data for SRW Release v1.0.0 & v1.0.1):
* https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v1p0/fix/
* https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v1p0/simple_test_case/
@@ -364,15 +379,15 @@ It is recommended that users have a separate directory for each file format if t
.. code-block:: console
- /path-to/model_data/FV3GFS/grib2/YYYYMMDDHH
- /path-to/model_data/FV3GFS/nemsio/YYYYMMDDHH
+ /path-to/input_model_data/FV3GFS/grib2/YYYYMMDDHH
+ /path-to/input_model_data/FV3GFS/nemsio/YYYYMMDDHH
Additionally, users must set the following environment variables if they plan to use GRIB2-formatted files for FV3GFS:
.. code-block:: console
- FV3GFS_FILE_FMT_ICS="grib2"
- FV3GFS_FILE_FMT_LBCS="grib2"
+ FV3GFS_FILE_FMT_ICS: grib2
+ FV3GFS_FILE_FMT_LBCS: grib2
This is ONLY necessary when using FV3GFS GRIB2 files. These settings may be removed when initializing from the default NEMSIO format for FV3GFS files.
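+
+A sketch of where these settings might live is shown below; it assumes they belong in the same ``task_get_extrn_ics:`` and ``task_get_extrn_lbcs:`` sections shown earlier (users should verify the section names against ``config_defaults.yaml``):
+
+.. code-block:: console
+
+   task_get_extrn_ics:
+     FV3GFS_FILE_FMT_ICS: grib2
+   task_get_extrn_lbcs:
+     FV3GFS_FILE_FMT_LBCS: grib2
+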
@@ -384,4 +399,4 @@ that the users share the same ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_S
directories. That way, if raw model input files are already on disk for a given date, they do not
need to be replicated.
-The files in the subdirectories of the ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` directories should be write-protected. This prevents these files from being accidentally modified or deleted. The directories should generally be group writable so the directory can be shared among multiple users.
+The files in the subdirectories of the ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS`` directories should be write-protected. This prevents these files from being accidentally modified or deleted. The directories should generally be group-writable so the directory can be shared among multiple users.
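+
+One way to arrange these permissions is sketched below; the path is illustrative, and the exact commands may need adjusting for a given site's group and permission conventions:
+
+.. code-block:: console
+
+   # write-protect the staged raw model files
+   find /path-to/input_model_data -type f -exec chmod a-w {} +
+   # keep the directories group-writable so multiple users can stage new dates
+   find /path-to/input_model_data -type d -exec chmod g+w {} +
+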
diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst
index 476f013838..d2d2b93096 100644
--- a/docs/UsersGuide/source/Introduction.rst
+++ b/docs/UsersGuide/source/Introduction.rst
@@ -6,15 +6,27 @@ Introduction
The Unified Forecast System (:term:`UFS`) is a community-based, coupled, comprehensive Earth modeling system. NOAA's operational model suite for numerical weather prediction (:term:`NWP`) is quickly transitioning to the UFS from a number of different modeling systems. The UFS enables research, development, and contribution opportunities within the broader :term:`Weather Enterprise` (including government, industry, and academia). For more information about the UFS, visit the `UFS Portal `__.
-The UFS includes `multiple applications `__ that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The SRW Application v2.1.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `__. New and improved capabilities for this release include the addition of a verification package (METplus) for both deterministic and ensemble simulations and support for four stochastically perturbed physics schemes. Future work will expand the capabilities of the application to include data assimilation (DA) and a forecast restart/cycling capability.
+The UFS includes `multiple applications `__ that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The SRW Application v2.1.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through the `GitHub Discussions `__ forum. New and improved capabilities for the v2.0.0 release included the addition of a verification package (METplus) for both deterministic and ensemble simulations and support for four stochastically perturbed physics schemes. Additions for the v2.1.0 release included:
-This documentation provides a :ref:`Quick Start Guide ` designed for use on `Level 1 systems `__ or as an overview of the workflow. It also provides a :ref:`Container-Based Quick Start Guide ` for running the SRW Application in a container and detailed chapters on :ref:`building ` and :ref:`running ` the SRW App on any supported platform. Additionally, this User's Guide provides an overview of the :ref:`release components ` and details on how to customize or modify different portions of the workflow.
+ * Bug fixes since the v2.0.0 release
+ * Conversion to a Python workflow (from the former shell workflow)
+ * Improved container support, including the option to run across compute nodes using Rocoto (see :numref:`Chapter %s `)
+ * Updates to :term:`CCPP` that target the top of the ``main`` branch (which is ahead of CCPP v6.0.0). See :ref:`this page ` for a detailed summary of updates.
+ * Support for the :term:`UPP` inline post option (see :ref:`here `)
+ * Addition of a multi-purpose code clean-up script (``devclean.sh``) (see :numref:`Section %s `)
+ * Documentation updates to reflect the changes above
+
+This documentation provides:
+
+ * A :ref:`Quick Start Guide ` designed for use on `Level 1 systems `__ or as an overview of the workflow
+ * A :ref:`Container-Based Quick Start Guide ` for running the SRW Application in a container
+ * Detailed chapters on :ref:`building ` and :ref:`running ` the SRW App on any supported platform
+ * An overview of the :ref:`release components ` and details on how to customize or modify different portions of the workflow
The SRW App v2.1.0 citation is as follows and should be used when presenting results based on research conducted with the App:
UFS Development Team. (2022, Nov. 17). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.1.0). Zenodo. https://doi.org/10.5281/zenodo.7277602
-
How to Use This Document
========================
@@ -22,12 +34,11 @@ This guide instructs both novice and experienced users on downloading, building,
.. code-block:: console
- Throughout the guide, this presentation style indicates shell commands and options,
- code examples, etc.
+ Throughout the guide, this presentation style indicates shell commands and options, code examples, etc.
Variables presented as ``AaBbCc123`` in this User's Guide typically refer to variables in scripts, names of files, or directories.
-File paths or code that include angle brackets (e.g., ``build__``) indicate that users should insert options appropriate to their SRW App configuration (e.g., ``build_orion_intel``).
+File paths and code that include angle brackets (e.g., ``build__``) indicate that users should insert options appropriate to their SRW App configuration (e.g., ``build_orion_intel``).
.. hint::
* To get started with the SRW App, users have a few options:
@@ -58,7 +69,7 @@ The instructions in this documentation assume that users have certain background
Additional background knowledge in the following areas could be helpful:
-* High-Performance Computing (HPC) Systems for those running the SRW App on an HPC system
+* High-Performance Computing (HPC) Systems (for those running the SRW App on an HPC system)
* Programming (particularly Python) for those interested in contributing to the SRW App code
* Creating an SSH Tunnel to access HPC systems from the command line
* Containerization
@@ -67,7 +78,7 @@ Additional background knowledge in the following areas could be helpful:
Software/Operating System Requirements
-----------------------------------------
-The UFS SRW Application has been designed so that any sufficiently up-to-date machine with a UNIX-based operating system should be capable of running the application. NOAA `Level 1 & 2 systems `__ already have these prerequisites installed. However, users working on other systems must ensure that the following requirements are installed on their system:
+The UFS SRW Application has been designed so that any sufficiently up-to-date machine with a UNIX-based operating system should be capable of running the application. SRW App `Level 1 & 2 systems `__ already have these prerequisites installed. However, users working on other systems must ensure that the following requirements are installed on their system:
**Minimum Platform Requirements:**
@@ -76,7 +87,7 @@ The UFS SRW Application has been designed so that any sufficiently up-to-date ma
* >82 GB disk space
* 53 GB input data for a standard collection of global data, or "fix" file data (topography, climatology, observational data) for a short 12-hour test forecast on the :term:`CONUS` 25km domain. See data download instructions in :numref:`Section %s `.
- * 8 GB for :term:`HPC-Stack` full installation
+ * 8 GB for full :term:`HPC-Stack` installation
* 3 GB for ``ufs-srweather-app`` installation
* 1 GB for boundary conditions for a short 12-hour test forecast on the CONUS 25km domain. See data download instructions in :numref:`Section %s `.
* 17 GB for a 12-hour test forecast on the CONUS 25km domain, with model output saved hourly.
@@ -89,7 +100,7 @@ The UFS SRW Application has been designed so that any sufficiently up-to-date ma
* gcc v9+, ifort v18+, and clang v9+ (macOS, native Apple clang, LLVM clang, GNU) have been tested
-* Python v3.6+, including prerequisite packages ``jinja2``, ``pyyaml`` and ``f90nml``
+* Python v3.6+, including prerequisite packages ``jinja2``, ``pyyaml``, and ``f90nml``
* Python packages ``scipy``, ``matplotlib``, ``pygrib``, ``cartopy``, and ``pillow`` are required for users who would like to use the provided graphics scripts.
@@ -108,11 +119,11 @@ The following software is also required to run the SRW Application, but the :ter
* CMake v3.20+
-* MPI (MPICH, OpenMPI, or other implementation)
+* :term:`MPI` (MPICH, OpenMPI, or other implementation)
* Only **MPICH** or **OpenMPI** can be built with HPC-Stack. Other implementations must be installed separately by the user (if desired).
-For MacOS systems, some additional software packages are needed. When possible, it is recommended that users install and/or upgrade this software (along with software listed above) using the `Homebrew `__ package manager for MacOS. See :ref:`Chapter 3 ` and :numref:`Chapter %s ` for further guidance on installing these prerequisites on MacOS.
+For MacOS systems, some additional software packages are needed. When possible, it is recommended that users install and/or upgrade this software (along with software listed above) using the `Homebrew `__ package manager for MacOS. See :doc:`HPC-Stack Documentation: Chapter 3 ` and :numref:`Chapter %s ` for further guidance on installing these prerequisites on MacOS.
* bash v4.x
* GNU compiler suite v11 or higher with gfortran
@@ -133,10 +144,10 @@ Optional but recommended prerequisites for all systems:
SRW App Components Overview
==============================
-Pre-processor Utilities and Initial Conditions
+Pre-Processor Utilities and Initial Conditions
------------------------------------------------
-The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Tasks include generating a regional grid along with :term:`orography` and surface climatology files for that grid. One pre-processing utility converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, these files are used as input to the atmospheric model (FV3-LAM). Additional information about the pre-processor utilities can be found in :numref:`Chapter %s `, in the `UFS_UTILS Technical Documentation `__, and in the `UFS_UTILS Scientific Documentation `__.
+The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Tasks include generating a regional grid along with :term:`orography` and surface climatology files for that grid. Additional information about the pre-processor utilities can be found in :numref:`Chapter %s `, in the `UFS_UTILS Technical Documentation `__, and in the `UFS_UTILS Scientific Documentation `__.
Forecast Model
-----------------
@@ -145,12 +156,12 @@ Atmospheric Model
^^^^^^^^^^^^^^^^^^^^^^
The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere
-(:term:`FV3`) dynamical core configured with a Limited Area Model (LAM) capability (:cite:t:`BlackEtAl2021`). The :term:`dynamical core` is the computational part of a model that solves the equations of fluid motion. A User's Guide for the UFS :term:`Weather Model` can be found `here `__.
+(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability (:cite:t:`BlackEtAl2021`). The :term:`dynamical core` is the computational part of a model that solves the equations of fluid motion. A User's Guide for the UFS :term:`Weather Model` can be found `here `__.
Common Community Physics Package
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-The `Common Community Physics Package `__ (:term:`CCPP`) supports interoperable atmospheric physics and land surface model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The most recent release includes four supported physics suites.
+The `Common Community Physics Package `__ (:term:`CCPP`) supports interoperable atmospheric physics and land surface model options. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. The most recent SRW App release includes four supported physics suites.
Data Format
^^^^^^^^^^^^^^^^^^^^^^
@@ -161,7 +172,7 @@ The SRW App supports the use of external model data in :term:`GRIB2`, :term:`NEM
Unified Post-Processor (UPP)
--------------------------------
-The `Unified Post Processor `__ (:term:`UPP`) processes raw output from a variety of numerical weather prediction (:term:`NWP`) models. In the SRW App, it converts data output from netCDF format to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide `__.
+The `Unified Post Processor `__ (:term:`UPP`) processes raw output from a variety of numerical weather prediction (:term:`NWP`) models. In the SRW App, it converts data output from netCDF format to GRIB2 format. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User's Guide `__.
METplus Verification Suite
------------------------------
@@ -180,7 +191,7 @@ The SRW Application has a portable CMake-based build system that packages togeth
The SRW Application allows for configuration of various elements of the workflow. For example, users can modify the parameters of the atmospheric model, such as start and end dates, duration, time step, and the physics suite used for the simulation. More information on how to do this is available in :numref:`Section %s `.
-The SRW Application has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g., Hera, Orion), cloud environments, and generic Linux and MacOS systems. Four `levels of support `__ have been defined for the SRW Application. Preconfigured (Level 1) systems already have the required external libraries available in a central location (via :term:`HPC-Stack`). The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW App code ` without first installing prerequisites. On other platforms (Levels 2-4), the SRW App can be :ref:`run within a container ` that includes the HPC-Stack, or the required libraries will need to be installed as part of the :ref:`SRW Application build ` process. Once these prerequisite libraries are installed, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems.
+The SRW Application has been tested on a variety of platforms widely used by researchers, including NOAA High-Performance Computing (HPC) systems (e.g., Hera, Orion), cloud environments, and generic Linux and MacOS systems. Four `levels of support `__ have been defined for the SRW Application. Preconfigured (Level 1) systems already have the required external libraries available in a central location (via :term:`HPC-Stack`). The SRW Application is expected to build and run out-of-the-box on these systems, and users can :ref:`download the SRW App code ` without first installing prerequisites. On other platforms (Levels 2-4), the SRW App can be :ref:`run within a container ` that includes the prerequisite software; otherwise, the required libraries will need to be installed as part of the :ref:`SRW Application build ` process. Once these prerequisite libraries are installed, applications and models should build and run successfully. However, users may need to perform additional troubleshooting on Level 3 or 4 systems since little or no pre-release testing has been conducted on these systems.
@@ -245,25 +256,25 @@ The ``ufs-srweather-app`` :term:`umbrella repository` structure is determined by
├── (share)
├── scripts
├── sorc
- │ ├── CMakeLists.txt
- │ ├── (gsi)
- │ ├── (rrfs_utl)
- │ ├── (UPP)
- │ │ ├── parm
- │ │ └── sorc
- │ │ └── ncep_post.fd
- │ ├── (UFS_UTILS)
- │ │ ├── sorc
- │ │ │ ├── chgres_cube.fd
- │ │ │ ├── fre-nctools.fd
- │ │ │ ├── grid_tools.fd
- │ │ │ ├── orog_mask_tools.fd
- │ │ │ └── sfc_climo_gen.fd
- │ │ └── ush
- │ └── (ufs-weather-model)
- │ └── FV3
- │ ├── atmos_cubed_sphere
- │ └── ccpp
+ │ ├── CMakeLists.txt
+ │ ├── (gsi)
+ │ ├── (rrfs_utl)
+ │ ├── (UPP)
+ │ │ ├── parm
+ │ │ └── sorc
+ │ │ └── ncep_post.fd
+ │ ├── (UFS_UTILS)
+ │ │ ├── sorc
+ │ │ │ ├── chgres_cube.fd
+ │ │ │ ├── fre-nctools.fd
+ │ │ │ ├── grid_tools.fd
+ │ │ │ ├── orog_mask_tools.fd
+ │ │ │ └── sfc_climo_gen.fd
+ │ │ └── ush
+ │ └── (ufs-weather-model)
+ │ └── FV3
+ │ ├── atmos_cubed_sphere
+ │ └── ccpp
├── tests/WE2E
├── ush
│ ├── bash_utils
@@ -274,38 +285,38 @@ The ``ufs-srweather-app`` :term:`umbrella repository` structure is determined by
│ └── wrappers
└── versions
-SRW App Sub-Directories
+SRW App Subdirectories
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-:numref:`Table %s ` describes the contents of the most important sub-directories. :numref:`Table %s ` provides and in-depth explanation of the ``ufs-srweather-app`` directories.
+:numref:`Table %s ` describes the contents of the most important subdirectories. :numref:`Table %s ` provides an in-depth explanation of the ``ufs-srweather-app`` directories.
.. _Subdirectories:
-.. table:: Sub-directories of the regional workflow
-
- +-------------------------+---------------------------------------------------------+
- | **Directory Name** | **Description** |
- +=========================+=========================================================+
- | jobs | J-job scripts launched by Rocoto |
- +-------------------------+---------------------------------------------------------+
- | modulefiles | Files used to load modules needed for building and |
- | | running the workflow |
- +-------------------------+---------------------------------------------------------+
- | scripts | Run scripts launched by the J-jobs |
- +-------------------------+---------------------------------------------------------+
- | tests | Baseline experiment configuration |
- +-------------------------+---------------------------------------------------------+
- | ush | Utility scripts used by the workflow |
- +-------------------------+---------------------------------------------------------+
+.. table:: Subdirectories of the regional workflow
+
+ +-------------------------+----------------------------------------------------+
+ | **Directory Name** | **Description** |
+ +=========================+====================================================+
+ | jobs | J-job scripts launched by Rocoto |
+ +-------------------------+----------------------------------------------------+
+ | modulefiles | Files used to load modules needed for building and |
+ | | running the workflow |
+ +-------------------------+----------------------------------------------------+
+ | scripts | Scripts launched by the J-jobs |
+ +-------------------------+----------------------------------------------------+
+ | tests | Tests for baseline experiment configurations |
+ +-------------------------+----------------------------------------------------+
+ | ush | Utility scripts used by the workflow |
+ +-------------------------+----------------------------------------------------+
.. _ExperimentDirSection:
Experiment Directory Structure
--------------------------------
-When the user generates an experiment using the ``generate_FV3LAM_wflow.py`` script (:numref:`Step %s `), a user-defined experimental directory (``$EXPTDIR``) is created based on information specified in the ``config.yaml`` file. :numref:`Table %s ` shows the contents of the experiment directory before running the experiment workflow.
+When the user generates an experiment using the ``generate_FV3LAM_wflow.py`` script (:numref:`Step %s `), a user-defined experiment directory (``$EXPTDIR``) is created based on information specified in the ``config.yaml`` file. :numref:`Table %s ` shows the contents of the experiment directory before running the experiment workflow.
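+
+In practice, and assuming the regional workflow python environment has already been loaded, the sequence is roughly the following (a sketch; see the referenced sections for the full procedure and required edits to ``config.yaml``):
+
+.. code-block:: console
+
+   cd ufs-srweather-app/ush
+   cp config.community.yaml config.yaml   # then edit config.yaml for the desired experiment
+   ./generate_FV3LAM_wflow.py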
.. _ExptDirStructure:
-.. table:: Files and sub-directory initially created in the experimental directory
+.. table:: Files and subdirectory initially created in the experiment directory
:widths: 33 67
+---------------------------+--------------------------------------------------------------------------------------------------------------+
@@ -323,10 +334,9 @@ When the user generates an experiment using the ``generate_FV3LAM_wflow.py`` scr
| input.nml | :term:`Namelist` for the `UFS Weather Model |
| | `__ |
+---------------------------+--------------------------------------------------------------------------------------------------------------+
- | launch_FV3LAM_wflow.sh | Symlink to the shell script of |
- | | ``ufs-srweather-app/ush/launch_FV3LAM_wflow.sh``, |
+ | launch_FV3LAM_wflow.sh | Symlink to the ``ufs-srweather-app/ush/launch_FV3LAM_wflow.sh`` shell script, |
| | which can be used to (re)launch the Rocoto workflow. |
- | | Each time this script is called, it appends to a log |
+ | | Each time this script is called, it appends information to a log |
| | file named ``log.launch_FV3LAM_wflow``. |
+---------------------------+--------------------------------------------------------------------------------------------------------------+
| log.generate_FV3LAM_wflow | Log of the output from the experiment generation script |
@@ -335,7 +345,7 @@ When the user generates an experiment using the ``generate_FV3LAM_wflow.py`` scr
| nems.configure | See `NEMS configuration file |
| | `__ |
+---------------------------+--------------------------------------------------------------------------------------------------------------+
- | suite_{CCPP}.xml | :term:`CCPP` suite definition file used by the forecast model |
+ | suite_{CCPP}.xml | :term:`CCPP` suite definition file (:term:`SDF`) used by the forecast model |
+---------------------------+--------------------------------------------------------------------------------------------------------------+
| var_defns.sh | Shell script defining the experiment parameters. It contains all |
| | of the primary parameters specified in the default and |
@@ -358,7 +368,7 @@ In addition, running the SRW App in *community* mode creates the ``fix_am`` and
| **Directory Name** | **Description** |
+=========================+==========================================================+
| fix_am | Directory containing the global fix (time-independent) |
- | | data files. The experiment generation script copies |
+ | | data files. The experiment generation script symlinks |
| | these files from a machine-dependent system directory. |
+-------------------------+----------------------------------------------------------+
| fix_lam | Directory containing the regional fix (time-independent) |
@@ -367,7 +377,7 @@ In addition, running the SRW App in *community* mode creates the ``fix_am`` and
| | symlinks to pre-generated files. |
+-------------------------+----------------------------------------------------------+
-Once the Rocoto workflow is launched, several files and directories are generated. A log file named ``log.launch_FV3LAM_wflow`` will be created (unless it already exists) in ``$EXPTDIR``. The first several workflow tasks (i.e., ``make_grid``, ``make_orog``, ``make_sfc_climo``, ``get_extrn_ics``, and ``get_extrn_lbc``) are preprocessing tasks, and these tasks also result in the creation of new files and sub-directories, described in :numref:`Table %s `.
+Once the Rocoto workflow is launched, several files and directories are generated. A log file named ``log.launch_FV3LAM_wflow`` will be created (unless it already exists) in ``$EXPTDIR``. The first several workflow tasks (i.e., ``make_grid``, ``make_orog``, ``make_sfc_climo``, ``get_extrn_ics``, and ``get_extrn_lbcs``) are preprocessing tasks, and these tasks also result in the creation of new files and subdirectories, described in :numref:`Table %s `.
.. _CreatedByWorkflow:
@@ -390,8 +400,8 @@ Once the Rocoto workflow is launched, several files and directories are generate
| | for the experiment |
+---------------------------+--------------------------------------------------------------------+
| log | Contains log files generated by the overall workflow and by its |
- | | various tasks. Look in these files to determine why a task may |
- | | have failed. |
+ | | various tasks. View the files in this directory to determine why |
+ | | a task may have failed. |
+---------------------------+--------------------------------------------------------------------+
| orog | Directory generated by the ``make_orog`` task containing the |
| | orography files for the experiment |
@@ -400,10 +410,10 @@ Once the Rocoto workflow is launched, several files and directories are generate
| | surface climatology files for the experiment |
+---------------------------+--------------------------------------------------------------------+
| FV3LAM_wflow.db | Database files that are generated when Rocoto is called (by the |
- | FV3LAM_wflow_lock.db | launch script) to launch the workflow. |
+ | FV3LAM_wflow_lock.db | launch script) to launch the workflow |
+---------------------------+--------------------------------------------------------------------+
| log.launch_FV3LAM_wflow | The ``launch_FV3LAM_wflow.sh`` script appends its output to this |
- | | log file each time it is called. Take a look at the last 30–50 |
+ | | log file each time it is called. View the last several |
| | lines of this file to check the status of the workflow. |
+---------------------------+--------------------------------------------------------------------+
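+
+To monitor progress from ``$EXPTDIR``, the launch log and the Rocoto database can be queried directly. This is a sketch: the ``FV3LAM_wflow.xml`` file name is assumed to be the default produced by the experiment generation step, and ``rocotostat`` must be available in the environment.
+
+.. code-block:: console
+
+   # Check the most recent status summary appended by the launch script
+   tail -n 40 log.launch_FV3LAM_wflow
+
+   # Or query Rocoto directly for per-task status
+   rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db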
@@ -414,7 +424,7 @@ The workflow tasks are described in :numref:`Section %s `_ organized by topic provides a centralized location for UFS users and developers to post questions and exchange information.
+The SRW App's `GitHub Discussions `__ forum provides online support for UFS users and developers to post questions and exchange information.
A list of available documentation is shown in :numref:`Table %s `.
@@ -459,9 +469,8 @@ A list of available documentation is shown in :numref:`Table %s `. Contributions to the `ufs-srweather-app `__ repository should follow the guidelines contained in the :ref:`SRW App Contributor's Guide `.
+utilities, model code, and infrastructure. Users can post issues in the related GitHub repositories to report bugs or to announce upcoming contributions to the code base. For code to be accepted into the authoritative repositories, users must follow the code management rules of each UFS component repository. These rules are usually outlined in the User's Guide (see :numref:`Table %s `) or wiki for each respective repository (see :numref:`Table %s `). Contributions to the `ufs-srweather-app `__ repository should follow the guidelines contained in the `SRW App Contributor's Guide `__.
Future Direction
=================
diff --git a/docs/UsersGuide/source/LAMGrids.rst b/docs/UsersGuide/source/LAMGrids.rst
index 0706556234..2a8854f933 100644
--- a/docs/UsersGuide/source/LAMGrids.rst
+++ b/docs/UsersGuide/source/LAMGrids.rst
@@ -6,11 +6,11 @@ Limited Area Model (:term:`LAM`) Grids: Predefined and User-Generated Options
In order to set up the workflow and generate an experiment with the SRW Application, the user
must choose between various predefined :term:`FV3`-:term:`LAM` grids or generate a user-defined grid.
At this time, full support is only provided to those using one of the four predefined
-grids supported in the v2.0.0 release, but other predefined grids are available (see :numref:`Section %s ` for more detail). Preliminary information is also provided at the end of this chapter describing how users can leverage the SRW App workflow scripts to generate their own user-defined grid. Currently, this feature is not fully supported and is "use at your own risk."
+grids supported in the v2.1.0 release, but other predefined grids are available (see :numref:`Section %s ` for more detail). Preliminary information is also provided at the end of this chapter describing how users can leverage the SRW App workflow scripts to generate their own user-defined grid. Currently, this feature is not fully supported and is "use at your own risk."
Predefined Grids
=================
-The SRW App v2.0.0 release includes four predefined limited area model (:term:`LAM`) grids. To select a supported predefined grid, the ``PREDEF_GRID_NAME`` variable within the ``config.yaml`` script must be set to one of the following four options:
+The SRW App v2.1.0 release includes four predefined limited area model (:term:`LAM`) grids. To select a supported predefined grid, the ``PREDEF_GRID_NAME`` variable within the ``task_run_fcst:`` section of the ``config.yaml`` script must be set to one of the following four options:
* ``RRFS_CONUS_3km``
* ``RRFS_CONUS_13km``
@@ -45,12 +45,12 @@ These four options are provided for flexibility related to compute resources and
In theory, it is possible to run any of the supported physics suites with any of the predefined grids, but the results will be more accurate and meaningful with appropriate grid/physics pairings.
-The predefined :term:`CONUS` grids follow the naming convention (e.g., RRFS_CONUS_*km) of the prototype 3-km continental United States (CONUS) grid being tested for the Rapid Refresh Forecast System (RRFS). The RRFS will be a convection-allowing, hourly-cycled, :term:`FV3`-:term:`LAM`-based ensemble planned for operational implementation in 2024. All four supported grids were created to fit completely within the High Resolution Rapid Refresh (`HRRR `_) domain to allow for use of HRRR data to initialize the SRW App.
+The predefined :term:`CONUS` grids follow the naming convention (e.g., RRFS_CONUS_*km) of the prototype 3-km continental United States (CONUS) grid being tested for the Rapid Refresh Forecast System (:term:`RRFS`). The RRFS will be a convection-allowing, hourly-cycled, :term:`FV3`-:term:`LAM`-based ensemble planned for operational implementation in 2024. All four supported grids were created to fit completely within the High Resolution Rapid Refresh (`HRRR `_) domain to allow for use of HRRR data to initialize the SRW App.
Predefined 3-km CONUS Grid
-----------------------------
-The 3km CONUS domain is ideal for running the ``FV3_RRFS_v1beta`` physics suite, since this suite definition file (:term:`SDF`) was specifically created for convection-allowing scales and is the precursor to the operational physics suite that will be used in the RRFS. The 3-km domain can also be used with the ``FV3_HRRR`` and ``FV3_WoFS`` physics suites, which likewise do not include convective parameterization. In fact, the ``FV3_WoFS`` physics suite is configured to run at 3-km *or less* and could therefore run with even higher-resolution user-defined domains if desired. However, the ``FV3_GFS_v16`` suite generally should *not* be used with the 3-km domain because the cumulus physics used in that physics suite is not configured to run at the 3-km resolution.
+The 3-km CONUS domain is ideal for running the ``FV3_RRFS_v1beta`` physics suite, since this suite definition file (:term:`SDF`) was specifically created for convection-allowing scales and is the precursor to the operational physics suite that will be used in the RRFS. The 3-km domain can also be used with the ``FV3_HRRR`` and ``FV3_WoFS`` physics suites, which likewise do not include convective parameterizations. In fact, the ``FV3_WoFS`` physics suite is configured to run at 3-km *or less* and could therefore run with even higher-resolution user-defined domains if desired. However, the ``FV3_GFS_v16`` suite generally should *not* be used with the 3-km domain because the cumulus physics used in that physics suite is not configured to run at the 3-km resolution.
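+
+As an illustration, such a grid/physics pairing could be expressed in ``config.yaml`` roughly as follows. This is a sketch: ``PREDEF_GRID_NAME`` belongs in the ``task_run_fcst:`` section as noted above, while the section housing ``CCPP_PHYS_SUITE`` is an assumption here and should be checked against the configuration documentation.
+
+.. code-block:: console
+
+   workflow:
+     CCPP_PHYS_SUITE: "FV3_RRFS_v1beta"
+   task_run_fcst:
+     PREDEF_GRID_NAME: "RRFS_CONUS_3km"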
.. _RRFS_CONUS_3km:
@@ -59,7 +59,7 @@ The 3km CONUS domain is ideal for running the ``FV3_RRFS_v1beta`` physics suite,
*The boundary of the RRFS_CONUS_3km computational grid (red) and corresponding write-component grid (blue).*
-The boundary of the ``RRFS_CONUS_3km`` domain is shown in :numref:`Figure %s ` (in red), and the boundary of the :ref:`write-component grid ` sits just inside the computational domain (in blue). This extra grid is required because the post-processing utility (:term:`UPP`) is unable to process data on the native FV3 gnomonic grid (in red). Therefore, model data are interpolated to a Lambert conformal grid (the write component grid) in order for UPP to read in and correctly process the data.
+The boundary of the ``RRFS_CONUS_3km`` domain is shown in :numref:`Figure %s ` (in red), and the boundary of the :ref:`write-component grid ` sits just inside the computational domain (in blue). This extra grid is required because the post-processing utility (:term:`UPP`) is unable to process data on the native FV3 gnomonic grid (in red). Therefore, model data are interpolated to a Lambert conformal grid (the write component grid) in order for the UPP to read in and correctly process the data.
.. note::
While it is possible to initialize the FV3-LAM with coarser external model data when using the ``RRFS_CONUS_3km`` domain, it is generally advised to use external model data (such as HRRR or RAP data) that has a resolution similar to that of the native FV3-LAM (predefined) grid.
@@ -87,7 +87,7 @@ Predefined 13-km Grid
*The boundary of the RRFS_CONUS_13km computational grid (red) and corresponding write-component grid (blue).*
-The ``RRFS_CONUS_13km`` grid (:numref:`Fig. %s `) covers the full :term:`CONUS`. This grid is meant to be run with the ``FV3_GFS_v16`` physics suite. The ``FV3_GFS_v16`` physics suite uses convective :term:`parameterizations`, whereas the other supported suites do not. Convective parameterizations are necessary for low-resolution grids because convection occurs on scales smaller than 25km and 13km.
+The ``RRFS_CONUS_13km`` grid (:numref:`Fig. %s `) covers the full :term:`CONUS`. This grid is meant to be run with the ``FV3_GFS_v16`` physics suite. The ``FV3_GFS_v16`` physics suite uses convective :term:`parameterizations`, whereas the other supported suites do not. Convective parameterizations are necessary for low-resolution grids because convection occurs on scales smaller than 25-km and 13-km.
Predefined 25-km Grid
------------------------
@@ -110,38 +110,35 @@ Ultimately, the choice of grid is experiment-dependent and resource-dependent. F
Creating User-Generated Grids
===============================
While the four predefined grids available in this release are ideal for users just starting
-out with the SRW App, more advanced users may wish to create their own grid for testing over
+out with the SRW App, more advanced users may wish to create their own predefined grid for testing over
a different region and/or with a different resolution. Creating a user-defined grid requires
knowledge of how the SRW App workflow functions. In particular, it is important to understand the set of
scripts that handle the workflow and experiment generation (see :numref:`Figure %s ` and :numref:`Figure %s `). It is also important to note that user-defined grids are not a supported feature of the current release; however, information is being provided for the benefit of the FV3-LAM community.
-With those caveats in mind, this section provides instructions for adding a new grid to the FV3-LAM
-workflow that will be generated using the "ESGgrid" method (i.e., using the regional_esg_grid code
+With those caveats in mind, this section provides instructions for adding a new predefined grid to the FV3-LAM
+workflow that will be generated using the "ESGgrid" method (i.e., using the ``regional_esg_grid`` code
in the `UFS_UTILS `__ repository, where ESG stands for "Extended Schmidt Gnomonic"). We assume here that the grid to be generated covers a domain that (1) does not contain either of the poles and (2) does not cross the -180 deg --> +180 deg discontinuity in longitude near the international date line. Instructions for domains that do not have these restrictions will be provided in a future release.
The steps to add such a grid to the workflow are as follows:
#. Choose the name of the grid. For the purposes of this documentation, the grid will be called "NEW_GRID".
-
#. Add NEW_GRID to the array ``valid_vals_PREDEF_GRID_NAME`` in the ``ufs-srweather-app/ush/valid_param_vals.yaml`` file.
-#. In ``ufs-srweather-app/ush/set_predef_grid_params.py``, add a stanza to
- the case statement ``case ${PREDEF_GRID_NAME} in`` for NEW_GRID. An example of such a stanza
- is given :ref:`below ` along with comments describing the variables that need to be set.
+#. In ``ufs-srweather-app/ush/predef_grid_params.yaml``, add a stanza describing the parameters for NEW_GRID. An example of such a stanza is given :ref:`below `. For descriptions of the variables that need to be set, see Sections :numref:`%s ` and :numref:`%s `.
-To run a forecast experiment on NEW_GRID, start with a workflow configuration file for a successful experiment (e.g., ``config.yaml``, located in the ``ufs-srweather-app/ush`` subdirectory), and change the line for ``PREDEF_GRID_NAME`` to the following:
+To run a forecast experiment on NEW_GRID, start with a workflow configuration file for a successful experiment (e.g., ``config.community.yaml``, located in the ``ufs-srweather-app/ush`` subdirectory), and change the line for ``PREDEF_GRID_NAME`` in the ``task_run_fcst:`` section to ``NEW_GRID``:
.. code-block:: console
- PREDEF_GRID_NAME="NEW_GRID"
+ PREDEF_GRID_NAME: "NEW_GRID"
-Then, generate a new experiment/workflow using the ``generate_FV3LAM_wflow.py`` script in the :ref:`usual way `.
+Then, load the regional workflow python environment, specify the other experiment parameters in ``config.community.yaml``, and generate a new experiment/workflow using the ``generate_FV3LAM_wflow.py`` script (see :numref:`Chapter %s ` for details).
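+
+Assuming a Level 1 platform, that sequence might look roughly like this (a sketch; the modulefile path and the ``wflow_<platform>`` name are placeholders, and Level 2-4 users may first need to adapt an existing ``wflow_*`` modulefile to their system):
+
+.. code-block:: console
+
+   module use /path/to/ufs-srweather-app/modulefiles
+   module load wflow_<platform>
+   conda activate regional_workflow
+   cd /path/to/ufs-srweather-app/ush
+   ./generate_FV3LAM_wflow.py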
Code Example
---------------
-The following is an example of a code stanza for "NEW_GRID" to be added to ``set_predef_grid_params.py``:
+The following is an example of a code stanza for "NEW_GRID" to be added to ``predef_grid_params.yaml``:
.. _NewGridExample:
@@ -150,148 +147,54 @@ The following is an example of a code stanza for "NEW_GRID" to be added to ``set
#
#---------------------------------------------------------------------
#
- # Stanza for NEW_GRID. This grid covers [provide a description of the
- # domain that NEW_GRID covers, its grid cell size, etc].
+ # Stanza for NEW_GRID. This grid covers [description of the
+ # domain] with ~[size]-km cells.
#
#---------------------------------------------------------------------
- #
- "NEW_GRID")
- # The method used to generate the grid. This example is specifically
- # for the "ESGgrid" method.
-
- GRID_GEN_METHOD= "ESGgrid"
+ "NEW_GRID":
- # The longitude and latitude of the center of the grid, in degrees.
+ # The method used to generate the grid. This example is specifically for the "ESGgrid" method.
- ESGgrid_LON_CTR=-97.5
- ESGgrid_LAT_CTR=38.5
+ GRID_GEN_METHOD: "ESGgrid"
- # The grid cell sizes in the x and y directions, where x and y are the
- # native coordinates of any ESG grid. The units of x and y are in
- # meters. These should be set to the nominal resolution we want the
- # grid to have. The cells will have exactly these sizes in xy-space
- # (computational space) but will have varying size in physical space.
- # The advantage of the ESGgrid generation method over the GFDLgrid
- # method is that an ESGgrid will have a much smaller variation in grid
- # size in physical space than a GFDLgrid.
-
- ESGgrid_DELX="25000.0"
- ESGgrid_DELY="25000.0"
-
- # The number of cells along the x and y axes.
+ # ESGgrid parameters:
- ESGgrid_NX=200
- ESGgrid_NY=112
-
- # The width of the halo (in units of grid cells) that the temporary
- # wide-halo grid created during the grid generation task (make_grid)
- # will have. This wide-halo grid gets "shaved" down to obtain the
- # 4-cell-wide halo and 3-cell-wide halo grids that the forecast model
- # (as well as other codes) will actually use. Recall that the halo is
- # needed to provide lateral boundary conditions to the forecast model.
- # Usually, there is no need to modify this parameter.
-
- ESGgrid_WIDE_HALO_WIDTH=6
-
- # The default physics time step that the forecast model will use. This
- # is the (inverse) frequency with which (most of) the physics suite is
- # called. The smaller the grid cell size is, the smaller this value
- # needs to be in order to avoid numerical instabilities during the
- # forecast. The values specified below are used only if DT_ATMOS is
- # not explicitly set in the user-specified experiment configuration
- # file config.yaml. Note that this parameter may be suite dependent.
-
- if [ "${CCPP_PHYS_SUITE}" = "FV3_GFS_v16" ]; then
- DT_ATMOS=${DT_ATMOS:-"300"}
- elif [ "${CCPP_PHYS_SUITE}" = "FV3_RRFS_v1beta" ]; then
- DT_ATMOS=${DT_ATMOS:-"40"}
- else
- DT_ATMOS=${DT_ATMOS:-"40"}
- fi
-
- # Default MPI task layout (decomposition) along the x and y directions and
- # blocksize. The values specified below are used only if they are not explicitly
- # set in the user-specified experiment configuration file config.yaml.
+ ESGgrid_LON_CTR: -97.5
+ ESGgrid_LAT_CTR: 38.5
+ ESGgrid_DELX: 25000.0
+ ESGgrid_DELY: 25000.0
+ ESGgrid_NX: 200
+ ESGgrid_NY: 112
+ ESGgrid_PAZI: 0.0
+ ESGgrid_WIDE_HALO_WIDTH: 6
- LAYOUT_X=${LAYOUT_X:-"5"}
- LAYOUT_Y=${LAYOUT_Y:-"2"}
- BLOCKSIZE=${BLOCKSIZE:-"40"}
-
- # The parameters for the write-component (aka "quilting") grid. The
- # Unified Post Processor (called by the ``RUN_POST_TN`` task) cannot
- # process output on the native ESGgrid, so output fields are interpolated
- # to a **write-component grid** before writing them to an output file.
- # The output fields are not specified on the native grid
- # but are instead remapped to this write-component grid. The variable
- # "QUILTING", which specifies whether or not to use the
- # write-component grid, is by default set to "TRUE".
-
-
- if [ "$QUILTING" = "TRUE" ]; then
-
- # The number of "groups" of MPI tasks that may be running at any given
- # time to write out the output. Each write group will be writing to
- # one set of output files (a dynf${fhr}.nc and a phyf${fhr}.nc file,
- # where $fhr is the forecast hour). Each write group contains
- # WRTCMP_write_tasks_per_group tasks. Usually, it is sufficient to
- # have just one write group. This may need to be increased if the
- # forecast is proceeding so quickly that a single write group cannot
- # complete writing to its set of files before there is a need/request
- # to start writing the next set of files at the next output time (this
- # can happen, for instance, if the forecast model is trying to write
- # output at every time step).
-
- WRTCMP_write_groups="1"
-
- # The number of MPI tasks to allocate to each write group.
+ # Forecast configuration parameters:
- WRTCMP_write_tasks_per_group="2"
-
- # The coordinate system for the write-component grid
- # See the array valid_vals_WRTCMP_output_grid (defined in
- # the script valid_param_vals.yaml) for the values this can take on.
- # The following example is specifically for the Lambert conformal
- # coordinate system.
+ DT_ATMOS: 40
+ LAYOUT_X: 5
+ LAYOUT_Y: 2
+ BLOCKSIZE: 40
- WRTCMP_output_grid="lambert_conformal"
-
- # The longitude and latitude of the center of the write-component
- # grid.
+ # Parameters for the write-component (aka "quilting") grid.
- WRTCMP_cen_lon="${ESGgrid_LON_CTR}"
- WRTCMP_cen_lat="${ESGgrid_LAT_CTR}"
-
- # The first and second standard latitudes needed for the Lambert
- # conformal coordinate mapping.
+ QUILTING:
+ WRTCMP_write_groups: 1
+ WRTCMP_write_tasks_per_group: 2
+ WRTCMP_output_grid: "lambert_conformal"
+ WRTCMP_cen_lon: -97.5
+ WRTCMP_cen_lat: 38.5
+ WRTCMP_lon_lwr_left: -121.12455072
+ WRTCMP_lat_lwr_left: 23.89394570
- WRTCMP_stdlat1="${ESGgrid_LAT_CTR}"
- WRTCMP_stdlat2="${ESGgrid_LAT_CTR}"
-
- # The number of grid points in the x and y directions of the
- # write-component grid. Note that this xy coordinate system is that of
- # the write-component grid (which in this case is Lambert conformal).
- # Thus, it is in general different than the xy coordinate system of
- # the native ESG grid.
-
- WRTCMP_nx="197"
- WRTCMP_ny="107"
-
- # The longitude and latitude of the lower-left corner of the
- # write-component grid, in degrees.
-
- WRTCMP_lon_lwr_left="-121.12455072"
- WRTCMP_lat_lwr_left="23.89394570"
-
- # The grid cell sizes along the x and y directions of the
- # write-component grid. Units depend on the coordinate system used by
- # the grid (i.e., the value of WRTCMP_output_grid). For a Lambert
- # conformal write-component grid, the units are in meters.
-
- WRTCMP_dx="${ESGgrid_DELX}"
- WRTCMP_dy="${ESGgrid_DELY}"
-
- fi
- ;;
+ # Parameters required for the Lambert conformal grid mapping.
+ WRTCMP_stdlat1: 38.5
+ WRTCMP_stdlat2: 38.5
+ WRTCMP_nx: 197
+ WRTCMP_ny: 107
+ WRTCMP_dx: 25000.0
+ WRTCMP_dy: 25000.0
+.. note::
+ The process above explains how to create a new *predefined* grid, which can be used more than once. If a user prefers to create a custom grid for one-time use, the variables above can instead be specified in ``config.yaml``, and ``PREDEF_GRID_NAME`` can be set to a null string. In this case, it is not necessary to modify ``valid_param_vals.yaml`` or ``predef_grid_params.yaml``. Users can view an example configuration file for a custom grid `here `__.
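+
+A heavily hedged sketch of what such custom-grid entries in ``config.yaml`` could look like follows. The split of variables between the ``task_make_grid:`` and ``task_run_fcst:`` sections is an assumption here, and the values simply mirror the example stanza above; consult the linked example configuration file for the authoritative layout.
+
+.. code-block:: console
+
+   task_make_grid:
+     GRID_GEN_METHOD: "ESGgrid"
+     ESGgrid_LON_CTR: -97.5
+     ESGgrid_LAT_CTR: 38.5
+     ESGgrid_DELX: 25000.0
+     ESGgrid_DELY: 25000.0
+     ESGgrid_NX: 200
+     ESGgrid_NY: 112
+     ESGgrid_PAZI: 0.0
+     ESGgrid_WIDE_HALO_WIDTH: 6
+   task_run_fcst:
+     PREDEF_GRID_NAME: ""
+     DT_ATMOS: 40
+     LAYOUT_X: 5
+     LAYOUT_Y: 2
+     BLOCKSIZE: 40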
\ No newline at end of file
diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst
index 1046345d22..19e8aa91bf 100644
--- a/docs/UsersGuide/source/Quickstart.rst
+++ b/docs/UsersGuide/source/Quickstart.rst
@@ -4,12 +4,12 @@
Quick Start Guide
====================
-This chapter provides a brief summary of how to build and run the SRW Application. The steps will run most smoothly on `Level 1 `__ systems. Users should expect to reference other chapters of this User's Guide, particularly :numref:`Chapter %s ` and :numref:`Chapter %s `, for additional explanations regarding each step.
+This chapter provides a brief summary of how to build and run the SRW Application. The steps will run most smoothly on `Level 1 `__ systems. Users should expect to reference other chapters of this User's Guide, particularly :numref:`Chapter %s: Building the SRW App ` and :numref:`Chapter %s: Running the SRW App `, for additional explanations regarding each step.
Install the HPC-Stack
===========================
-SRW App users who are not working on a `Level 1 `__ platform will need to install the :term:`HPC-Stack` prior to building the SRW App on a new machine. Installation instructions appear in the :doc:`HPC-Stack documentation `. The steps will vary slightly depending on the user's platform. However, in all cases, the process involves cloning the `HPC-Stack repository `__, creating and entering a build directory, and invoking ``cmake`` and ``make`` commands to build the stack. This process will create a number of modulefiles and scripts that will be used for setting up the build environment for the SRW App.
+SRW App users who are not working on a `Level 1 `__ platform will need to install the prerequisite software stack via :term:`HPC-Stack` prior to building the SRW App on a new machine. Users can find installation instructions in the :doc:`HPC-Stack documentation `. The steps will vary slightly depending on the user's platform. However, in all cases, the process involves (1) cloning the `HPC-Stack repository `__, (2) reviewing/modifying the ``config/config_<platform>.sh`` and ``stack/stack_<platform>.yaml`` files, and (3) running the commands to build the stack. This process will create a number of modulefiles required for building the SRW App.
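+
+A condensed sketch of that process appears below. The flags and file names follow the HPC-Stack documentation as understood here and should be treated as illustrative; ``<prefix>`` and ``<platform>`` are placeholders to be replaced for the user's system.
+
+.. code-block:: console
+
+   git clone https://github.com/NOAA-EMC/hpc-stack.git
+   cd hpc-stack
+   # Review/modify the platform config and stack definition before building
+   ./build_stack.sh -p <prefix> -c config/config_<platform>.sh -y stack/stack_<platform>.yaml -m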
Once the HPC-Stack has been successfully installed, users can move on to building the SRW Application.
@@ -17,7 +17,7 @@ Once the HPC-Stack has been successfully installed, users can move on to buildin
Building and Running the UFS SRW Application
===============================================
-For a detailed explanation of how to build and run the SRW App on any supported system, see :numref:`Chapter %s: Building the SRW App ` and :numref:`Chapter %s: Running the SRW App `. The overall procedure for generating an experiment is shown in :numref:`Figure %s `, with the scripts to generate and run the workflow shown in red. An overview of the required steps appears below. However, users can expect to access other referenced sections of this User's Guide for more detail.
+For a detailed explanation of how to build and run the SRW App on any supported system, see :numref:`Chapter %s: Building the SRW App ` and :numref:`Chapter %s: Running the SRW App `. :numref:`Figure %s ` outlines the steps of the build process. The overall procedure for generating an experiment is shown in :numref:`Figure %s `, with the scripts to generate and run the workflow shown in red. An overview of the required steps appears below. However, users can expect to access other referenced sections of this User's Guide for more detail.
#. Clone the SRW App from GitHub:
@@ -32,43 +32,19 @@ For a detailed explanation of how to build and run the SRW App on any supported
cd ufs-srweather-app
./manage_externals/checkout_externals
- #. Set up the build environment and build the executables.
+ #. Set up the build environment and build the executables:
- * **Option 1:**
-
- .. code-block:: console
+ .. code-block:: console
- ./devbuild.sh --platform=
-
- where ```` is replaced with the name of the user's platform/system. Valid values are: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``macos`` | ``noaacloud`` | ``odin`` | ``orion`` | ``singularity`` | ``wcoss2``
-
- * **Option 2:**
-
- .. code-block:: console
-
- source etc/lmod-setup.sh
-
- where ```` refers to the user's platform (e.g., ``macos``, ``gaea``, ``odin``, ``singularity``).
-
- Users will also need to load the "build" modulefile appropriate to their system. On Level 3 & 4 systems, users can adapt an existing modulefile (such as ``build_macos_gnu``) to their system.
+ ./devbuild.sh --platform=<platform>
- .. code-block:: console
+ where ``<platform>`` is replaced with the name of the user's platform/system. Valid values include: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``linux`` | ``macos`` | ``noaacloud`` | ``orion`` | ``wcoss2``
- module use
- module load build__
+ For additional details, see :numref:`Section %s `, or view :numref:`Section %s ` to try the CMake build approach instead.
- From the top-level ``ufs-srweather-app`` directory, run:
+ #. Users on a `Level 2-4 `__ system must download and stage data (both the fix files and the :term:`IC/LBC ` files) according to the instructions in :numref:`Section %s `. Standard data locations for Level 1 systems appear in :numref:`Table %s `.
- .. code-block:: console
-
- mkdir build
- cd build
- cmake .. -DCMAKE_INSTALL_PREFIX=..
- make -j 4 >& build.out &
-
- #. Download and stage data (both the fix files and the :term:`IC/LBC ` files) according to the instructions in :numref:`Section %s ` (if on a Level 2-4 system).
-
- #. Load the python environment for the regional workflow. Users on Level 2-4 systems will need to use one of the existing ``wflow_`` modulefiles (e.g., ``wflow_macos``) and adapt it to their system.
+ #. Load the python environment for the regional workflow. Users on Level 2-4 systems will need to use one of the existing ``wflow_<platform>`` modulefiles (e.g., ``wflow_macos``) and adapt it to their system. Then, run:
.. code-block:: console
@@ -86,27 +62,24 @@ For a detailed explanation of how to build and run the SRW App on any supported
then the user should run ``conda activate regional_workflow`` to activate the regional workflow environment.
.. note::
- If users source the lmod-setup file on a system that doesn't need it, it will not cause any problems (it will simply do a ``module purge``).
+ If users source the *lmod-setup* file on a system that doesn't need it, it will not cause any problems (it will simply do a ``module purge``).
- #. Configure the experiment parameters.
+ #. Configure the experiment:
.. code-block:: console
cd ush
cp config.community.yaml config.yaml
- Users will need to adjust the experiment parameters in the ``config.yaml`` file to suit the needs of their experiment (e.g., date, time, grid, physics suite, etc.). More detailed guidance is available in :numref:`Section %s `. Parameters and valid values are listed in :numref:`Chapter %s