diff --git a/README.md b/README.md index f746ae5c78..bcb75fe601 100644 --- a/README.md +++ b/README.md @@ -1,6 +1,6 @@ # UFS Short-Range Weather Application -The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. It is designed to be the source system for NOAA’s operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader weather enterprise. For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/. +The Unified Forecast System (UFS) is a community-based, coupled, comprehensive Earth modeling system. It is designed to be the source system for NOAA’s operational numerical weather prediction applications while enabling research, development, and contribution opportunities for the broader Weather Enterprise. For more information about the UFS, visit the UFS Portal at https://ufscommunity.org/. The UFS includes multiple applications (see a complete list at https://ufscommunity.org/science/aboutapps/) that support different forecast durations and spatial domains. This documentation describes the development branch of the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The development branch of the application is continually evolving as the system undergoes open development. The latest SRW App release (v2.1.0) represents a snapshot of this continuously evolving system. diff --git a/docs/UsersGuide/source/BuildSRW.rst b/docs/UsersGuide/source/BuildSRW.rst index 85790b482c..4cf6ff71ae 100644 --- a/docs/UsersGuide/source/BuildSRW.rst +++ b/docs/UsersGuide/source/BuildSRW.rst @@ -30,18 +30,18 @@ To build the SRW App, users will complete the following steps: .. _HPCstackInfo: -Install the HPC-Stack -======================== +Install the Prerequisite Software Stack +========================================== -.. Attention:: - Skip the HPC-Stack installation if working on a `Level 1 system `_ (e.g., Cheyenne, Hera, Orion, NOAA Cloud). +Currently, installation of the prerequisite software stack is supported via HPC-Stack. :term:`HPC-Stack` is a repository that provides a unified, shell script-based system to build the software stack required for `UFS `__ applications such as the SRW App. -**Definition:** :term:`HPC-Stack` is a repository that provides a unified, shell script-based build system to build the software stack required for `UFS `_ applications such as the SRW App. +.. Attention:: + Skip the HPC-Stack installation if working on a `Level 1 system `__ (e.g., Cheyenne, Hera, Orion, NOAA Cloud), and :ref:`continue to the next section `. Background ---------------- -The UFS Weather Model draws on over 50 code libraries to run its applications. These libraries range from libraries developed in-house at NOAA (e.g., NCEPLIBS, FMS) to libraries developed by NOAA's partners (e.g., PIO, ESMF) to truly third party libraries (e.g., NETCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW App. +The UFS Weather Model draws on over 50 code libraries to run its applications. 
These libraries range from libraries developed in-house at NOAA (e.g., NCEPLIBS, FMS) to libraries developed by NOAA's partners (e.g., PIO, ESMF) to truly third party libraries (e.g., netCDF). Individual installation of these libraries is not practical, so the `HPC-Stack `__ was developed as a central installation system to ensure that the infrastructure environment across multiple platforms is as similar as possible. Installation of the HPC-Stack is required to run the SRW App. Instructions ------------------------- @@ -57,6 +57,9 @@ Users working on systems that fall under `Support Levels 2-4 `. +.. attention:: + Although HPC-Stack is the fully-supported option as of the v2.1.0 release, UFS applications are gradually shifting to :term:`spack-stack`, which is a :term:`Spack`-based method for installing UFS prerequisite software libraries. The spack-stack is currently used on NOAA Cloud platforms and in containers, while HPC-Stack is still used on other Level 1 systems and is the software stack validated by the UFS Weather Model as of the v2.1.0 release. Users are encouraged to check out `spack-stack `__ to prepare for the upcoming shift in support from HPC-Stack to spack-stack. + After completing installation, continue to the next section (:numref:`Section %s: Download the UFS SRW Application Code `). .. _DownloadSRWApp: @@ -114,7 +117,7 @@ The cloned repository contains the configuration files and sub-directories shown | | configuration files such as ``model_configure``, | | | ``diag_table``, and ``field_table``. | +--------------------------------+-----------------------------------------------------------+ - | README.md | Getting Started Guide | + | README.md | Contains SRW App introductory information | +--------------------------------+-----------------------------------------------------------+ | rename_model.sh | Used to rename the model before it becomes NCO | | | operational. The SRW App is a generic app that is the base| @@ -160,7 +163,7 @@ Run the executable that pulls in SRW App components from external repositories: .. code-block:: console - cd $SRW + cd ./manage_externals/checkout_externals The script should output dialogue indicating that it is retrieving different code repositories. It may take several minutes to download these repositories. @@ -169,7 +172,7 @@ To see more options for the ``checkout_externals`` script, users can run ``./man * ``-S``: Outputs the status of the repositories managed by ``checkout_externals``. By default only summary information is provided. Use with the ``-v`` (verbose) option to see details. * ``-x [EXCLUDE [EXCLUDE ...]]``: allows users to exclude components when checking out externals. - * ``-o``: By default only the required externals are checked out. This flag will also checkout the optional externals. + * ``-o``: By default only the required externals are checked out. This flag will also check out the optional externals. Generally, users will not need to use the options and can simply run the script, but the options are available for those who are curious. @@ -189,10 +192,10 @@ On Level 1 systems for which a modulefile is provided under the ``modulefiles`` ./devbuild.sh --platform= -where ```` is replaced with the name of the platform the user is working on. Valid values are: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``linux`` | ``macos`` | ``noaacloud`` | ``odin`` | ``orion`` | ``singularity`` | ``wcoss2`` +where ```` is replaced with the name of the platform the user is working on. 
Valid values include: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``linux`` | ``macos`` | ``noaacloud`` | ``orion`` .. note:: - Although build modulefiles exist for generic Linux and MacOS machines, users will need to alter these according to the instructions in Sections :numref:`%s ` & :numref:`%s `. Users on these systems may have more success building the SRW App with the :ref:`CMake Approach ` instead. + Although build modulefiles exist for generic Linux and MacOS machines, users will need to alter these according to the instructions in Sections :numref:`%s ` & :numref:`%s `. Users on these systems may have more success building the SRW App with the :ref:`CMake Approach ` instead. If compiler auto-detection fails for some reason, specify it using the ``--compiler`` argument. For example: @@ -375,7 +378,7 @@ Set Up the Build Environment .. attention:: * If users successfully built the executables in :numref:`Table %s `, they should skip to step :numref:`Chapter %s `. - * Users who want to build the SRW App on a generic MacOS should skip to :numref:`Section %s ` and follow the approach there. + * Users who want to build the SRW App on MacOS or generic Linux systems should skip to :numref:`Section %s ` and follow the approach there. If the ``devbuild.sh`` approach failed, users need to set up their environment to run a workflow on their specific platform. First, users should make sure ``Lmod`` is the app used for loading modulefiles. This is the case on most Level 1 systems; however, on systems such as Gaea/Odin, the default modulefile loader is from Cray and must be switched to Lmod. For example, on Gaea, users can run one of the following two commands depending on whether they have a bash or csh shell, respectively: @@ -384,9 +387,11 @@ If the ``devbuild.sh`` approach failed, users need to set up their environment t source etc/lmod-setup.sh gaea source etc/lmod-setup.csh gaea -If users execute one of the above commands on systems that don't need it, it will not cause any problems (it will simply do a ``module purge``). +.. note:: -From here on, ``Lmod`` is ready to load the modulefiles needed by the SRW App. These modulefiles are located in the ``modulefiles`` directory. To load the necessary modulefile for a specific ```` using a given ````, run: + If users execute one of the above commands on systems that don't need it, it will not cause any problems (it will simply do a ``module purge``). + +From here, ``Lmod`` is ready to load the modulefiles needed by the SRW App. These modulefiles are located in the ``modulefiles`` directory. To load the necessary modulefile for a specific ```` using a given ````, run: .. code-block:: console @@ -423,7 +428,7 @@ From the build directory, run the following commands to build the pre-processing cmake .. -DCMAKE_INSTALL_PREFIX=.. -DCMAKE_INSTALL_BINDIR=exec .. make -j 4 >& build.out & -``-DCMAKE_INSTALL_PREFIX`` specifies the location in which the ``exec``, ``include``, ``lib``, and ``share`` directories will be created. These directories will contain various components of the SRW App. Its recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` argument ``-j 4`` indicates that the build will run in parallel with 4 threads. Although users can specify a larger or smaller number of threads (e.g., ``-j 8``, ``-j 2``), it is highly recommended to use at least 4 parallel threads to prevent overly long installation times. 
+``-DCMAKE_INSTALL_PREFIX`` specifies the location where the ``exec``, ``include``, ``lib``, and ``share`` directories will be created. These directories will contain various components of the SRW App. Its recommended value ``..`` denotes one directory up from the build directory. In the next line, the ``make`` argument ``-j 4`` indicates that the build will run in parallel with 4 threads. Although users can specify a larger or smaller number of threads (e.g., ``-j 8``, ``-j 2``), it is highly recommended to use at least 4 parallel threads to prevent overly long installation times. The build will take a few minutes to complete. When it starts, a random number is printed to the console, and when it is done, a ``[1]+ Done`` message is printed to the console. ``[1]+ Exit`` indicates an error. Output from the build will be in the ``ufs-srweather-app/build/build.out`` file. When the build completes, users should see the forecast model executable ``ufs_model`` and several pre- and post-processing executables in the ``ufs-srweather-app/exec`` directory. These executables are described in :numref:`Table %s `. @@ -431,62 +436,41 @@ The build will take a few minutes to complete. When it starts, a random number i If you see the ``build.out`` file, but there is no ``ufs-srweather-app/exec`` directory, wait a few more minutes for the build to complete. -.. _MacDetails: +.. _MacLinuxDetails: -Additional Details for Building on MacOS ------------------------------------------- +Additional Details for Building on MacOS or Generic Linux +------------------------------------------------------------ .. note:: - Users who are **not** building the SRW App on a MacOS machine may skip to the :numref:`Section %s ` to finish building the SRW App or continue to :numref:`Chapter %s ` to configure and run an experiment. + Users who are **not** building the SRW App on MacOS or generic Linux platforms may skip to :numref:`Section %s ` to finish building the SRW App or continue to :numref:`Chapter %s ` to configure and run an experiment. -The SRW App can be built on MacOS machines, presuming HPC-Stack has already been installed successfully. The following two options have been tested: +The SRW App can be built on MacOS and generic Linux machines after the prerequisite software has been installed on these systems (via :term:`HPC-Stack` or :term:`spack-stack`). The installation for MacOS is architecture-independent and has been tested using both x86_64 and M1 chips (running natively). The following configurations for MacOS have been tested: -* **Option 1:** MacBookAir 2020, M1 chip (arm64, running natively), 4+4 cores, Big Sur 11.6.4, GNU compiler suite v.11.2.0_3 (gcc, gfortran, g++); no MPI pre-installed + #. MacBook Pro 2019, 2.4 GHz 8-core Intel Core i9 (x86_64), Monterey 12.1, GNU compiler suite v.11.3.0 (gcc, gfortran, g++); mpich 3.3.2 or openmpi/4.1.2 + #. MacBook Air 2020, M1 chip (arm64, running natively), 4+4 cores, Big Sur 11.6.4, GNU compiler suite v.11.3.0 (gcc, gfortran, g++); mpich 3.3.2 or openmpi/4.1.2 + #. MacBook Pro 2015, 2.8 GHz Quad-Core Intel Core i7 (x86_64), Catalina OS X 10.15.7, GNU compiler suite v.11.2.0_3 (gcc, gfortran, g++); mpich 3.3.2 or openmpi/4.1.2 -* **Option 2:** MacBook Pro 2015, 2.8 GHz Quad-Core Intel Core i7 (x86_64), Catalina OS X 10.15.7, GNU compiler suite v.11.2.0_3 (gcc, gfortran, g++); no MPI pre-installed +Several Linux builds have been tested on systems with x86_64 architectures.
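+Before building on one of these generic systems, it can be helpful to confirm that the GNU compilers and an MPI implementation (e.g., mpich or openmpi) are installed and available on the path. A quick, purely illustrative check is:
+
+.. code-block:: console
+
+   gcc --version
+   gfortran --version
+   mpirun --version
+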
-The ``build_macos_gnu`` modulefile initializes the module environment, lists the location of HPC-Stack modules, loads the meta-modules and modules, and sets compilers, additional flags, and environment variables needed for building the SRW App. The modulefile must be modified to include the absolute path to the user's HPC-Stack installation and ``ufs-srweather-app`` directories. In particular, the following section must be modified: +The ``./modulefiles/build__gnu.lua`` modulefile (where ```` is ``macos`` or ``linux``) is written as a Lmod module in the Lua language, and it can be loaded once the Lmod module environment has been initialized (which should have happened even prior to :ref:`installing HPC-Stack `). This module lists the location of the HPC-Stack modules, loads the meta-modules and modules, sets serial and parallel compilers, additional flags, and any environment variables needed for building the SRW App. The modulefile must be modified to include the absolute path to the user's HPC-Stack installation: .. code-block:: console - # This path should point to your HPCstack installation directory - setenv HPCstack "/Users/username/hpc-stack/install" - - # This path should point to your SRW Application directory - setenv SRW "/Users/username/ufs-srweather-app" + - This path should point to your HPCstack installation directory + local HPCstack="/Users/username/hpc-stack/install" -An excerpt of the ``build_macos_gnu`` contents appears below for Option 1. To use Option 2, the user will need to comment out the lines specific to Option 1 (using a double hyphen) and uncomment the lines specific to Option 2 in the ``build_macos_gnu`` modulefile. Additionally, users need to verify that all file paths reflect their system's configuration and that the correct version numbers for software libraries appear in the modulefile. - -.. code-block:: console +Linux users need to configure the ``ufs-srweather-app/etc/lmod-setup.sh`` file for the ``linux`` case and set the ``BASH_ENV`` variable to point to the Lmod initialization script. There is no need to modify this script for the ``macos`` case presuming that Lmod followed a standard installation procedure using the Homebrew package manager for MacOS. - -- Option 1 compiler paths: - setenv("CC", "/opt/homebrew/bin/gcc") - setenv("FC", "/opt/homebrew/bin/gfortran") - setenv("CXX", "/opt/homebrew/bin/g++") - - -- Option 2 compiler paths: - --[[ - setenv("CC", "/usr/local/bin/gcc") - setenv("FC", "/usr/local/bin/gfortran") - setenv("CXX", "/usr/local/bin/g++") - --]] - -Then, users must source the Lmod setup file, just as they would on other systems, and load the modulefiles needed for building and running the SRW App: +Next, users must source the Lmod setup file, just as they would on other systems, and load the modulefiles needed for building and running the SRW App: .. code-block:: console - - source etc/lmod-setup.sh macos + + source etc/lmod-setup.sh module use - module load build_macos_gnu - export LDFLAGS="-L${MPI_ROOT}/lib" - -In a csh/tcsh shell, users would run ``source etc/lmod-setup.csh macos`` in place of the first line in the code block above. - -Additionally, for Option 1 systems, set the variable ``ENABLE_QUAD_PRECISION`` to ``OFF`` in the ``$SRW/src/ufs-weather-model/FV3/atmos_cubed_sphere/CMakeLists.txt`` file. This change is optional if using Option 2 to build the SRW App. To make this change using a streamline editor (`sed`), run: - -.. 
code-block:: console + module load build__gnu + export LDFLAGS+=" -L${MPI_ROOT}/lib " - sed -i .bak 's/QUAD_PRECISION\" ON)/QUAD_PRECISION\" OFF)/' $SRW/src/ufs-weather-model/FV3/atmos_cubed_sphere/CMakeLists.txt +In a csh/tcsh shell, users would run ``source etc/lmod-setup.csh `` in place of the first line in the code block above. The last line is primarily needed for the MacOS platforms. Proceed to building the executables using the process outlined in :numref:`Step %s `. diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index b187db6680..2b6ff1ddd8 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -8,11 +8,11 @@ The SRW Application assembles a variety of components, including: * Pre-processor Utilities & Initial Conditions * UFS Weather Forecast Model -* Unified Post-Processor +* Unified Post Processor * Visualization Examples * Build System and Workflow -These components are documented within this User's Guide and supported through a `community forum `_. +These components are documented within this User's Guide and supported through the `GitHub Discussions `__ forum. .. _Utils: @@ -30,25 +30,23 @@ The SRW Application can be initialized from a range of operational initial condi Forecast Model ============== -The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere -(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2021`. The :term:`dynamical core` is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here `__. +The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere (:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2021`. The :term:`dynamical core` is the computational part of a model that solves the equations of fluid motion. A User's Guide for the UFS Weather Model can be accessed `here `__. Supported model resolutions in this release include 3-, 13-, and 25-km predefined contiguous U.S. (:term:`CONUS`) domains, each with 127 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found in the `scientific documentation `__, the `technical documentation `__, and on the `NOAA Geophysical Fluid Dynamics Laboratory website `__. -Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (CCPP), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There will be four physics suites supported for the SRW App v2.0.0 release. The first is the FV3_RRFS_v1beta physics suite, which is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. Additionally, FV3_WoFS_v0 and FV3_HRRR will be supported. 
A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `__, and CCPP technical aspects are described in the `CCPP Technical Documentation `__. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. Additional information on Stochastic Physics options is available `here `__. +Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (CCPP), described `here `__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There are four physics suites supported as of the SRW App v2.1.0 release. The first is the FV3_RRFS_v1beta physics suite, which is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (:term:`RRFS`) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. Additionally, FV3_WoFS_v0 and FV3_HRRR are supported. A detailed list of CCPP updates since the SRW App v2.0.0 release is available :ref:`here `. A full scientific description of CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation `__, and CCPP technical aspects are described in the `CCPP Technical Documentation `__. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each of the supported suites. Additional information on Stochastic Physics options is available `here `__. .. note:: SPP is currently only available for specific physics schemes used in the RAP/HRRR physics suite. Users need to be aware of which physics suite definition file (:term:`SDF`) is chosen when turning this option on. Among the supported physics suites, the full set of parameterizations can only be used with the ``FV3_HRRR`` option for ``CCPP_PHYS_SUITE``. The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal direction and model levels in the vertical direction. -Post-processor +Post Processor ============== The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User's Guide `__. -Output from UPP can be used with visualization, plotting, and verification packages or in -further downstream post-processing (e.g., statistical post-processing techniques). +Output from UPP can be used with visualization, plotting, and verification packages or in further downstream post-processing (e.g., statistical post-processing techniques). .. _MetplusComponent: @@ -57,39 +55,38 @@ METplus Verification Suite The enhanced Model Evaluation Tools (`METplus `__) verification system has been integrated into the SRW App to facilitate forecast evaluation. METplus is a verification framework that spans a wide range of temporal scales (warn-on-forecast to climate) and spatial scales (storm to global). 
It is supported by the `Developmental Testbed Center (DTC) `__. -METplus *installation* is not included as part of the build process for the most recent release of the SRW App. However, METplus is preinstalled on many `Level 1 & 2 `__ systems; existing builds can be viewed `here `__. METplus can be installed on other systems individually or as part of :term:`HPC-Stack` installation. Users on systems without a previous installation of METplus can follow the `MET Installation Guide `__ and `METplus Installation Guide `__ for individual installation. Currently, METplus *installation* is not a supported feature for this release of the SRW App. However, METplus *use* is supported on systems with a functioning METplus installation. +METplus *installation* is not included as part of the build process for the most recent release of the SRW App. However, METplus is preinstalled on many `Level 1 & 2 `__ systems; existing builds can be viewed `here `__. -The core components of the METplus framework include the statistical driver, MET, the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use-cases. MET is a set of verification tools developed for use by the :term:`NWP` community. It matches up grids with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. Extensive documentation is available in the `METplus User’s Guide `__ and `MET User’s Guide `__. Documentation for all other components of the framework can be found at the Documentation link for each component on the METplus `downloads `__ page. +METplus can be installed on other systems individually or as part of :term:`HPC-Stack` installation. Users on systems without a previous installation of METplus can follow the `MET Installation Guide `__ and `METplus Installation Guide `__ for individual installation. Currently, METplus *installation* is not a supported feature for this release of the SRW App. However, METplus *use* is supported on systems with a functioning METplus installation. -Among other techniques, MET provides the capability to compute standard verification scores for comparing deterministic gridded model data to point-based and gridded observations. It also provides ensemble and probabilistic verification methods for comparing gridded model data to point-based or gridded observations. Verification tasks to accomplish these comparisons are defined in the SRW App in :numref:`Table %s `. Currently, the SRW App supports the use of :term:`NDAS` observation files in `prepBUFR format `__ (which include conventional point-based surface and upper-air data) for point-based verification. It also supports gridded Climatology-Calibrated Precipitation Analysis (:term:`CCPA`) data for accumulated precipitation evaluation and Multi-Radar/Multi-Sensor (:term:`MRMS`) gridded analysis data for composite reflectivity and :term:`echo top` verification. +The core components of the METplus framework include the statistical driver, MET, the associated database and display systems known as METviewer and METexpress, and a suite of Python wrappers to provide low-level automation and examples, also called use cases. MET is a set of verification tools developed for use by the :term:`NWP` community. It matches up grids with either gridded analyses or point observations and applies configurable methods to compute statistics and diagnostics. 
Extensive documentation is available in the `METplus User's Guide `__ and `MET User's Guide `__. Documentation for all other components of the framework can be found at the Documentation link for each component on the METplus `downloads `__ page. + +Among other techniques, MET provides the capability to compute standard verification scores for comparing deterministic gridded model data to point-based and gridded observations. It also provides ensemble and probabilistic verification methods for comparing gridded model data to point-based or gridded observations. Verification tasks to accomplish these comparisons are defined in the SRW App in :numref:`Table %s `. Currently, the SRW App supports the use of :term:`NDAS` observation files (which include conventional point-based surface and upper-air data) in `prepBUFR format `__ for point-based verification. It also supports gridded Climatology-Calibrated Precipitation Analysis (:term:`CCPA`) data for accumulated precipitation evaluation and Multi-Radar/Multi-Sensor (:term:`MRMS`) gridded analysis data for composite reflectivity and :term:`echo top` verification. METplus is being actively developed by :term:`NCAR`/Research Applications Laboratory (RAL), NOAA/Earth Systems Research Laboratories (ESRL), and NOAA/Environmental Modeling Center (:term:`EMC`), and it is open to community contributions. Visualization Example ===================== -A Python script is provided to create basic visualizations of the model output. The script -is designed to output graphics in PNG format for 14 standard meteorological variables +A Python script is provided to create basic visualizations of the model output. The script is designed to output graphics in PNG format for several standard meteorological variables when using the pre-defined :term:`CONUS` domain. A difference plotting script is also included to visually compare two runs for the same domain and resolution. These scripts are provided only as an example for users familiar with Python. They may be used to perform a visual check to verify that the application is producing reasonable results. -After running ``manage_externals/checkout_externals``, the visualization scripts will be available in the ``ufs-srweather-app/ush/Python`` directory. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of the script. +After running ``manage_externals/checkout_externals``, the visualization scripts will be available in the ``ufs-srweather-app/ush/Python`` directory. Usage information and instructions are described in :numref:`Chapter %s ` and are also included at the top of each script. Build System and Workflow ========================= The SRW Application has a portable build system and a user-friendly, modular, and expandable workflow framework. -An umbrella CMake-based build system is used for building the components necessary for running the end-to-end SRW Application, including the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. On other systems, they can be installed via the HPC-Stack (see :doc:`HPC-Stack Documentation `). There is a small set of system libraries and utilities that are assumed to be present on the target computer: the CMake build software, a Fortran, C, and C++ compiler, and an :term:`MPI` library. 
+An umbrella CMake-based build system is used for building the components necessary for running the end-to-end SRW Application, including the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. On other systems, they can be installed via the HPC-Stack (see :doc:`HPC-Stack Documentation `). There is a small set of system libraries and utilities that are assumed to be present on the target computer: the CMake build software; a Fortran, C, and C++ compiler; and an :term:`MPI` library. Once built, the provided experiment generator script can be used to create a Rocoto-based -workflow file that will run each task in the system in the proper sequence (see :numref:`Chapter %s ` or the `Rocoto documentation `_ for more information on Rocoto). If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command line fashion with provided run scripts. The generated namelist for the atmospheric model can be modified in order to vary settings such as forecast starting and ending dates, forecast length hours, the :term:`CCPP` physics suite, integration time step, history file output frequency, and more. It also allows for configuration of other elements of the workflow; for example, users can choose whether to run some or all of the pre-processing, forecast model, and post-processing steps. +workflow file that will run each task in the system in the proper sequence (see :numref:`Chapter %s ` or the `Rocoto documentation `__ for more information on Rocoto). If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command line fashion with provided run scripts. The generated namelist for the atmospheric model can be modified in order to vary settings such as forecast starting and ending dates, forecast length hours, the :term:`CCPP` physics suite, integration time step, history file output frequency, and more. It also allows for configuration of other elements of the workflow; for example, users can choose whether to run some or all of the pre-processing, forecast model, and post-processing steps. -The latest SRW Application release has been tested on a variety of platforms widely used by -researchers, such as the NOAA Research and Development High-Performance Computing Systems -(RDHPCS), including Hera, Orion, and Jet; the National Center for Atmospheric Research (:term:`NCAR`) Cheyenne system; the National Severe Storms Laboratory (NSSL) HPC machine, Odin; the National Science Foundation Stampede2 system; and generic Linux and MacOS systems using Intel and GNU compilers. Four `levels of support `_ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited test platforms (Level 3), and build only platforms (Level 4). Each level is further described below. +The latest SRW Application release has been tested on a variety of platforms widely used by researchers, such as the NOAA Research and Development High-Performance Computing Systems (RDHPCS), including Hera, Orion, and Jet; the National Center for Atmospheric Research (:term:`NCAR`) Cheyenne system; and generic Linux and MacOS systems using Intel and GNU compilers. 
Four `levels of support `__ have been defined for the SRW Application, including pre-configured (Level 1), configurable (Level 2), limited-test (Level 3), and build-only (Level 4) platforms. Each level is further described below. On pre-configured (Level 1) computational platforms, all the required libraries for building the SRW Application are available in a central place. That means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both been built. The SRW Application is expected to build and run out-of-the-box on these pre-configured platforms. A few additional computational platforms are considered configurable for the SRW Application release. Configurable platforms (Level 2) are platforms where all of the required libraries for building the SRW Application are expected to install successfully but are not available in a central location. Applications and models are expected to build and run once the required bundled libraries (e.g., NCEPLIBS) and third-party libraries (e.g., NCEPLIBS-external) are built. -Limited-Test (Level 3) and Build-Only (Level 4) computational platforms are those in which the developers have built the code but little or no pre-release testing has been conducted, respectively. A complete description of the levels of support, along with a list of preconfigured and configurable platforms can be found in the `SRW Application Wiki `_. +Limited-Test (Level 3) and Build-Only (Level 4) computational platforms are those in which the developers have built the code but little or no pre-release testing has been conducted, respectively. A complete description of the levels of support, along with a list of preconfigured and configurable platforms can be found in the `SRW Application Wiki `__. diff --git a/docs/UsersGuide/source/ConfigWorkflow.rst b/docs/UsersGuide/source/ConfigWorkflow.rst index eb0407e954..ecde1be5d3 100644 --- a/docs/UsersGuide/source/ConfigWorkflow.rst +++ b/docs/UsersGuide/source/ConfigWorkflow.rst @@ -488,8 +488,7 @@ Verification Tasks ``RUN_TASK_VX_ENSPOINT``: (Default: false) Flag that determines whether to run the ensemble point verification task. If this flag is set, both ensemble-stat point verification and point verification of ensemble-stat output is computed. The :ref:`MET Ensemble-Stat tool ` provides verification statistics for ensemble forecasts and can be used in conjunction with the :ref:`MET Point-Stat tool `. See :numref:`Section %s ` for additional parameters related to this task. Valid values: ``True`` | ``False`` -.. - COMMENT: Might be worth defining "ensemble-stat verification for gridded data," "ensemble point verification," "ensemble-stat point verification," and "point verification of ensemble-stat output" +.. COMMENT: Define "ensemble-stat verification for gridded data," "ensemble point verification," "ensemble-stat point verification," and "point verification of ensemble-stat output"? .. _make-grid: @@ -917,10 +916,12 @@ Non-default parameters for the ``make_lbcs`` task are set in the ``task_make_lbc ``OMP_STACKSIZE_MAKE_LBCS``: (Default: "1024m") Controls the size of the stack for threads created by the OpenMP implementation. +.. _FcstConfigParams: + FORECAST Configuration Parameters ===================================== -Non-default parameters for the ``run_fcst`` task are set in the ``task_run_fcst`` section of the ``config.yaml`` file. +Non-default parameters for the ``run_fcst`` task are set in the ``task_run_fcst:`` section of the ``config.yaml`` file.
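+For example, a user could override the forecast time step and request hourly restart files by adding a block like the following to ``config.yaml`` (the values shown are illustrative only; see the parameter descriptions below for guidance):
+
+.. code-block:: console
+
+   task_run_fcst:
+     DT_ATMOS: 40
+     RESTART_INTERVAL: 1
+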
Basic Task Parameters --------------------------------- @@ -961,13 +962,15 @@ For each workflow task, certain parameter values must be passed to the job sched ``OMP_STACKSIZE_RUN_FCST``: (Default: "1024m") Controls the size of the stack for threads created by the OpenMP implementation. +.. _ModelConfigParams: + Model Configuration Parameters ---------------------------------- These parameters set values in the Weather Model's ``model_configure`` file. ``DT_ATMOS``: (Default: "") - Time step for the outermost atmospheric model loop in seconds. This corresponds to the frequency at which the physics routines and the top level dynamics routine are called. (Note that one call to the top-level dynamics routine results in multiple calls to the horizontal dynamics, :term:`tracer` transport, and vertical dynamics routines; see the `FV3 dycore scientific documentation `__ for details.) Must be set. Takes an integer value. In the SRW App, a default value for ``DT_ATMOS`` appears in the ``set_predef_grid_params.yaml`` script, but a different value can be set in ``config.yaml``. + Time step for the outermost atmospheric model loop in seconds. This corresponds to the frequency at which the physics routines and the top level dynamics routine are called. (Note that one call to the top-level dynamics routine results in multiple calls to the horizontal dynamics, :term:`tracer` transport, and vertical dynamics routines; see the `FV3 dycore scientific documentation `__ for details.) Must be set. Takes an integer value. In the SRW App, a default value for ``DT_ATMOS`` appears in the ``set_predef_grid_params.yaml`` script, but a different value can be set in ``config.yaml``. In general, the smaller the grid cell size is, the smaller this value needs to be in order to avoid numerical instabilities during the forecast. ``RESTART_INTERVAL``: (Default: 0) Frequency of the output restart files in hours. Using the default interval (0), restart files are produced at the end of a forecast run. When ``RESTART_INTERVAL: 1``, restart files are produced every hour with the prefix "YYYYMMDD.HHmmSS." in the ``RESTART`` directory. @@ -1012,7 +1015,7 @@ Write-Component (Quilting) Parameters Flag that determines whether to output extra (debugging) information from :term:`ESMF` routines. Note that the write component uses ESMF library routines to interpolate from the native forecast model grid to the user-specified output grid (which is defined in the model configuration file ``model_configure`` in the forecast run directory). Valid values: ``True`` | ``False`` ``WRTCMP_write_groups``: (Default: 1) - The number of write groups (i.e., groups of :term:`MPI` tasks) to use in the write component. + The number of write groups (i.e., groups of :term:`MPI` tasks) to use in the write component. Each write group will write to one set of output files (a ``dynf${fhr}.nc`` and a ``phyf${fhr}.nc`` file, where ``${fhr}`` is the forecast hour). Each write group contains ``WRTCMP_write_tasks_per_group`` tasks. Usually, one write group is sufficient. This may need to be increased if the forecast is proceeding so quickly that a single write group cannot complete writing to its set of files before there is a need/request to start writing the next set of files at the next output time. ``WRTCMP_write_tasks_per_group``: (Default: 20) The number of MPI tasks to allocate for each write group. 
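+As a sketch only (the values simply repeat the defaults described above), the write-component settings would likewise be placed in the ``task_run_fcst:`` section of ``config.yaml``:
+
+.. code-block:: console
+
+   task_run_fcst:
+     WRTCMP_write_groups: 1
+     WRTCMP_write_tasks_per_group: 20
+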
@@ -1912,11 +1915,6 @@ The parameters below turn on SPP in Noah or RUC LSM (support for Noah MP is in p ``LSM_SPP_MAG_LIST``: (Default: [ 0.017, 0.001, 0.001, 0.001, 0.001, 0.001, 0.2 ] ) Sets the maximum random pattern amplitude for each of the LSM perturbations. -.. COMMENT: This variable no longer appears and was going to be removed. See if anything has replaced it. - ``LSM_SPP_EACH_STEP``: (Default: "true") - When set to "TRUE", it sets ``lndp_each_step=.true.`` and perturbs each time step. - - .. _HaloBlend: Halo Blend Parameter diff --git a/docs/UsersGuide/source/ContainerQuickstart.rst b/docs/UsersGuide/source/ContainerQuickstart.rst index 8b72c079e5..025893a840 100644 --- a/docs/UsersGuide/source/ContainerQuickstart.rst +++ b/docs/UsersGuide/source/ContainerQuickstart.rst @@ -4,7 +4,7 @@ Container-Based Quick Start Guide ==================================== -This Container-Based Quick Start Guide will help users build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a `Singularity `__ container. The :term:`container` approach provides a uniform enviroment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPI`'s, and package versions available. Installation via Singularity container reduces this variability and allows for a smoother SRW App build experience. Normally, containers can only run on a single compute node and are not compatible with the `Rocoto workflow manager `__, so users must run each task in the workflow manually. However, the Singularity container described in this chapter has been adapted such that it is able to run across multiple nodes using Rocoto. This makes it an excellent starting point for beginners. The :ref:`non-container build approach ` may still be more appropriate for users who desire additional customizability, particularly if they already have experience running the SRW App. +This Container-Based Quick Start Guide will help users build and run the "out-of-the-box" case for the Unified Forecast System (:term:`UFS`) Short-Range Weather (SRW) Application using a `Singularity `__ container. The :term:`container` approach provides a uniform environment in which to build and run the SRW App. Normally, the details of building and running the SRW App vary from system to system due to the many possible combinations of operating systems, compilers, :term:`MPIs `, and package versions available. Installation via Singularity container reduces this variability and allows for a smoother SRW App build experience. Normally, containers can only run on a single compute node and are not compatible with the `Rocoto workflow manager `__, so users must run each task in the workflow manually. However, the Singularity container described in this chapter has been adapted such that it is able to run across multiple nodes using Rocoto. This makes it an excellent starting point for beginners. The :ref:`non-container build approach ` may still be more appropriate for users who desire additional customizability, particularly if they already have experience running the SRW App. The "out-of-the-box" SRW App case described in this User's Guide builds a weather forecast for June 15-16, 2019. Multiple convective weather events during these two days produced over 200 filtered storm reports.
Severe weather was clustered in two areas: the Upper Midwest through the Ohio Valley and the Southern Great Plains. This forecast uses a predefined 25-km Continental United States (:term:`CONUS`) grid (RRFS_CONUS_25km), the Global Forecast System (:term:`GFS`) version 16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. @@ -28,7 +28,7 @@ Users must have an **Intel** compiler and :term:`MPI` (available for free `here Install Singularity ^^^^^^^^^^^^^^^^^^^^^^^ -To build and run the SRW App using a Singularity container, first install the Singularity package according to the `Singularity Installation Guide `__. This will include the installation of dependencies and the installation of the Go programming language. SingularityCE Version 3.7 or above is recommended. +To build and run the SRW App using a Singularity container, first install the Singularity package according to the `Singularity Installation Guide `__. This will include the installation of dependencies and the installation of the Go programming language. SingularityCE Version 3.7 or above is recommended. .. warning:: Docker containers can only be run with root privileges, and users cannot have root privileges on :term:`HPCs `. Therefore, it is not possible to build the SRW App, which uses the HPC-Stack, inside a Docker container on an HPC system. However, a Singularity image may be built directly from a Docker image for use on the system. @@ -124,11 +124,11 @@ On non-Level 1 systems, users should build the container in a writable sandbox: Some users may prefer to issue the command without the ``sudo`` prefix. Whether ``sudo`` is required is system-dependent. .. note:: - Users can choose to build a release version of the container (SRW App v2.0.0) using a similar command: + Users can choose to build a release version of the container (SRW App v2.1.0) using a similar command: .. code-block:: console - sudo singularity build --sandbox ubuntu20.04-intel-srwapp docker://noaaepic/ubuntu20.04-intel22-ufs-srwapp:release-public-v2 + sudo singularity build --sandbox ubuntu20.04-intel-srwapp docker://noaaepic/ubuntu20.04-intel-srwapp:release-public-v2.1.0 .. _WorkOnHPC: @@ -210,7 +210,7 @@ Users can run ``exit`` to exit the shell. Download and Stage the Data ============================ -The SRW App requires input files to run. These include static datasets, initial and boundary condition files, and model configuration files. On Level 1 systems, the data required to run SRW App tests are already available as long as the bind argument (starting with ``-B``) in :numref:`Step %s ` included the directory with the input model data. For Level 2-4 systems, the data must be added manually by the user. Detailed instructions on how to add the data can be found in :numref:`Section %s `. Sections :numref:`%s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. +The SRW App requires input files to run. These include static datasets, initial and boundary condition files, and model configuration files. On Level 1 systems, the data required to run SRW App tests are already available as long as the bind argument (starting with ``-B``) in :numref:`Step %s ` included the directory with the input model data. See :numref:`Table %s ` for Level 1 data locations. For Level 2-4 systems, the data must be added manually by the user. Detailed instructions on how to add the data can be found in :numref:`Section %s `. 
Sections :numref:`%s ` and :numref:`%s ` contain useful background information on the input and output files used in the SRW App. .. _GenerateForecastC: @@ -246,7 +246,7 @@ To activate the regional workflow, run the following commands: where: - * ```` is replaced with the actual path to the modulefiles on the user's system (often ``$PWD/modulefiles``), and + * ```` is replaced with the actual path to the modulefiles on the user's local system (often ``$PWD/modulefiles``), and * ```` is a valid, lowercased machine/platform name (see the ``MACHINE`` variable in :numref:`Section %s `). The ``wflow_`` modulefile will then output instructions to activate the regional workflow. The user should run the commands specified in the modulefile output. For example, if the output says: @@ -310,7 +310,7 @@ From here, users can follow the steps below to configure the out-of-the-box SRW .. code-block:: console USE_CRON_TO_RELAUNCH: TRUE - CRON_RELAUNCH_INTVL_MNTS: 02 + CRON_RELAUNCH_INTVL_MNTS: 3 There are instructions for running the experiment via additional methods in :numref:`Section %s `. However, this technique (automation via :term:`crontab`) is the simplest option. @@ -351,7 +351,7 @@ Run the following command to generate the workflow: .. code-block:: console - python generate_FV3LAM_wflow.py + ./generate_FV3LAM_wflow.py This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The last line of output from this script should start with ``*/3 * * * *`` (or similar). @@ -387,7 +387,7 @@ where: New Experiment =============== -To run a new experiment in the container at a later time, users will need to rerun the commands in :numref:`Section %s ` to reactivate the regional workflow. Then, users can configure a new experiment by updating the environment variables in ``config.yaml`` to reflect the desired experiment configuration. Basic instructions appear in :numref:`Section %s ` above, and detailed instructions can be viewed in :numref:`Section %s `. After adjusting the configuration file, regenerate the experiment by running ``python generate_FV3LAM_wflow.py``. +To run a new experiment in the container at a later time, users will need to rerun the commands in :numref:`Section %s ` to reactivate the regional workflow. Then, users can configure a new experiment by updating the environment variables in ``config.yaml`` to reflect the desired experiment configuration. Basic instructions appear in :numref:`Section %s ` above, and detailed instructions can be viewed in :numref:`Section %s `. After adjusting the configuration file, regenerate the experiment by running ``./generate_FV3LAM_wflow.py``. Plot the Output =============== diff --git a/docs/UsersGuide/source/ContributorsGuide.rst b/docs/UsersGuide/source/ContributorsGuide.rst deleted file mode 100644 index efb90ca4cc..0000000000 --- a/docs/UsersGuide/source/ContributorsGuide.rst +++ /dev/null @@ -1,463 +0,0 @@ - -.. _ContributorsGuide: - -============================== -SRW App Contributor's Guide -============================== - -.. _Background: - -Background -=========== - -Authoritative branch ------------------------ - -The ``ufs-srweather-app`` repository maintains a main branch for development called ``develop``. The HEAD of ``develop`` reflects the latest development changes. It points to regularly updated hashes for individual sub-components. Pull requests (PRs) will be merged to ``develop``. - -The ``develop`` branch is protected by the code management team: - #. 
Pull requests for this branch require approval by at least two code reviewers. - #. A code manager should perform at least one of the reviews and the merge, but other contributors are welcome to provide comments/suggestions. - - -Code Management Team --------------------------- - -Scientists from across multiple labs and organizations have volunteered to review pull requests for the ``develop`` branch: - -.. table:: - - +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+ - | **Organization** | **Reviewers** | **Areas of Expertise** | - +==================+================================================+=============================================================================================+ - | EMC | Chan-Hoo Jeon (@chan-hoo) | Workflow, Operational platform testing (WCOSS/NCO), and Air quality modeling (Online-CMAQ) | - | +------------------------------------------------+---------------------------------------------------------------------------------------------+ - | | Ben Blake (@BenjaminBlake-NOAA) | Output visualization, Rocoto | - | +------------------------------------------------+---------------------------------------------------------------------------------------------+ - | | Ratko Vasic (@RatkoVasic-NOAA) | Workflow, NCO requirements, and operational platform testing | - +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+ - | EPIC | Mark Potts (@mark-a-potts) | HPC systems | - | +------------------------------------------------+---------------------------------------------------------------------------------------------+ - | | Jong Kim (@jkbk2004) | UFS Weather Model configuration, forecast sensitivity analysis, data assimilation | - | +------------------------------------------------+---------------------------------------------------------------------------------------------+ - | | Michael Lueken (@MichaelLueken) | SRW App code management | - | +------------------------------------------------+---------------------------------------------------------------------------------------------+ - | | Natalie Perlin (@natalie-perlin) | Generic Linux/Mac installations, hpc-stack/spack-stack | - | +------------------------------------------------+---------------------------------------------------------------------------------------------+ - | | Gillian Petro (@gspetro-NOAA) | Documentation | - | +------------------------------------------------+---------------------------------------------------------------------------------------------+ - | | Edward Snyder (@EdwardSnyder-NOAA) | WE2E testing, input data | - +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+ - | GLERL | David Wright (@dmwright526) | FVCOM integration, output visualization, preprocessing tasks | - +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+ - | GSL | Jeff Beck (@JeffBeck-NOAA) | SRW App configuration/workflow, code management, meteorological evaluation | - | +------------------------------------------------+---------------------------------------------------------------------------------------------+ - | | Gerard Ketefian (@gsketefian) | regional workflow scripts, 
jinja templates, and verification tasks | - | +------------------------------------------------+---------------------------------------------------------------------------------------------+ - | | Linlin Pan (@panll) | Workflow, CCPP/physics, verification | - | +------------------------------------------------+---------------------------------------------------------------------------------------------+ - | | Christina Holt (@christinaholtNOAA) | Workflow, conda environment support, testing, and code management | - | +------------------------------------------------+---------------------------------------------------------------------------------------------+ - | | Christopher Harrop (@christopherwharrop-noaa) | Rocoto, code management, and testing | - | +------------------------------------------------+---------------------------------------------------------------------------------------------+ - | | Daniel Abdi (@danielabdi-noaa) | Workflow generation, testing RRFS on the cloud, environment modules | - +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+ - | NCAR | Mike Kavulich (@mkavulich) | CCPP/physics | - | +------------------------------------------------+---------------------------------------------------------------------------------------------+ - | | Will Mayfield (@willmayfield) | Verification/METplus tasks, regional workflow (esp. on Cheyenne) | - +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+ - | NSSL | Yunheng Wang (@ywangwof) | HPC systems, code management and regional workflow especially on Stampede, Jet | - | | | and NSSL computers | - +------------------+------------------------------------------------+---------------------------------------------------------------------------------------------+ - -.. _ContribProcess: - -Contribution Process -======================== - -The steps below should be followed in order to make changes to the ``develop`` branch of the ``ufs-srweather-app`` repository. Communication with code managers and the code management team throughout the process is encouraged. - - #. **Issue** - Open an issue to document changes. Click `here `__ to open a new ``ufs-srweather-app`` issue or see :numref:`Step %s ` for detailed instructions. - #. **GitFlow** - Follow `GitFlow `__ procedures for development. - #. **Fork the repository** - Read more `here `__ about forking in GitHub. - #. **Create a branch** - Create a branch in your fork of the authoritative repository. Follow `GitFlow `__ conventions when creating the branch. All development should take place on a branch, *not* on ``develop``. Branches should be named as follows, where [name] is a one-word description of the branch: - - * **bugfix/[name]:** Fixes a demonstrably incorrect portion of code - * **feature/[name]:** Adds a new feature to the code or improves an existing portion of the code - * **text/[name]:** Changes elements of the repository that do not impact program output or log files (e.g., changes to README, documentation, comments, changing quoted Registry elements, white space alignment). Any change that does not impact the compiled code in any way should fall under this category. - - #. **Development** - Perform and test changes in the branch (not on ``develop``!). 
Document work in the issue and mention the issue number in commit messages to link your work to the issue (e.g., ``commit -m "Issue #23 - "``). Test code modifications on as many platforms as possible, and request help with further testing from the code management team when unable to test on all Level 1 platforms. Document changes to the workflow and capabilities in the ``.rst`` files so that the SRW App documentation stays up-to-date. - #. **Pull request** - When ready to merge changes back to the ``develop`` branch, the code developer should initiate a pull request (PR) of the feature branch into the ``develop`` branch. Read `here `__ about pull requests in GitHub. When a PR is initiated, the :ref:`PR Template