
ESMF requirement for external land component #1542

Closed
uturuncoglu opened this issue Dec 19, 2022 · 94 comments · Fixed by #1845
Labels
enhancement (New feature or request)

Comments

@uturuncoglu (Collaborator)

Description

We are transitioning from FMS to ESMF to handle multi-tile file access (read/write) in the new external land component (Noah-MP). ESMF tag v8.5.0b10 contains all the development needed for multi-tile file I/O through PIO.

Solution

Install v8.5.0b10 on supported platforms and update UFS to use this version.

Alternatives

N/A

Related to

N/A

@uturuncoglu added the enhancement label on Dec 19, 2022
@uturuncoglu (Collaborator Author)

@junwang-noaa I created this issue to track the installation of the new ESMF tag, which is required for the external land component and for the next PR related to it. I am not expecting that PR soon, but it would be nice to start thinking about this now since installing a new ESMF tag can take time. I think the installation will be handled by the EPIC team, but I am not sure. Let me know what you think.

@junwang-noaa (Collaborator)

@uturuncoglu Thanks for creating the issue. We are currently getting the ESMF 8.4.0 release installed and used in the UFS weather model, as the operational code freeze is coming soon and only release versions are accepted in operations. We can ask the EPIC team to install the test version ESMF v8.5.0b10 on an R&D platform for this external land component work.

@uturuncoglu (Collaborator Author)

@junwang-noaa Thanks. Once the operational code freeze has passed, the UFS model could start using beta snapshots again, right?

@junwang-noaa (Collaborator)

I think so.

@uturuncoglu (Collaborator Author)

@junwang-noaa Is there any update on this? What about the operational code freeze, is it done? Once the new tag is available, I am planning to replace the I/O layer in the land component.

@junwang-noaa (Collaborator)

No, not yet. We are waiting for HR1 testing before we create a tag.
@jkbk2004 Can your team install the ESMF v8.5.0b10 library on Hera? Thanks.

@uturuncoglu (Collaborator Author)

@junwang-noaa Thanks for the update. I think NCAR's Cheyenne would be better since I have no access to Hera.

@jkbk2004 (Collaborator)

@uturuncoglu We can coordinate through EPIC on Cheyenne.

@jkbk2004 (Collaborator)

I will try to install 8.5.0b10 on Cheyenne over the weekend.

@uturuncoglu (Collaborator Author)

@jkbk2004 Is there any progress on this? Thanks.

@jkbk2004 (Collaborator)

> @jkbk2004 Is there any progress on this? Thanks.

@uturuncoglu Give 8.5.0b10 a try; it is installed at /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/modulefiles/stack. Last week was a busy one for program increment planning. Let me know.

@uturuncoglu (Collaborator Author) commented Jan 24, 2023

@jkbk2004 Thanks for your help. I tried to compile the model with the new version of ESMF and I am getting the following error from the link step:

/usr/lib64/gcc/x86_64-suse-linux/4.8/../../../../x86_64-suse-linux/bin/ld: /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/esmf/8.5.0b10/lib/libpioc.a(pioc.c.o): in function `PIOc_iosystem_is_active':
/glade/work/jongkim/stacks/hash/hpc-stack-6eb6/pkg/v8.5.0b10/src/Infrastructure/IO/PIO/ParallelIO/src/clib/pioc.c:97: multiple definition of `PIOc_iosystem_is_active'; /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/pio/2.5.7/lib/libpioc.a(pioc.c.o):/glade/work/jongkim/stacks/hash/hpc-stack-6eb6/pkg/pio-2.5.7/src/clib/pioc.c:97: first defined here

I think ESMF 8.5.0b10 is using its own internal PIO library, and this conflicts with the external installation, perhaps because of the version difference. Is it possible to install ESMF pointing to the external PIO, so it would not cause a conflict? It seems that ESMF's internal copy is PIO 2.5.10. You could set the following variables for it:

export ESMF_PIO="external"
export ESMF_PIO_LIBPATH=$PIO_LIBDIR
export ESMF_PIO_INCLUDE=$PIO_INCDIR
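For reference, a minimal sketch of what such a build configuration might look like (the module set, source path, and compiler/MPI choices below are illustrative assumptions for the Cheyenne stack rather than the actual install recipe; only the three ESMF_PIO* variables come from this comment):

# Load the same stack the UFS model uses (illustrative module names)
module load intel/2022.1 mpt/2.25 netcdf pio

# Standard ESMF build variables (see the ESMF user's guide)
export ESMF_DIR=/path/to/esmf-8.5.0b10-src   # placeholder source tree
export ESMF_COMPILER=intel
export ESMF_COMM=mpt
export ESMF_NETCDF=nc-config                 # locate NetCDF via nc-config

# Use the already-installed external PIO instead of ESMF's bundled copy
export ESMF_PIO="external"
export ESMF_PIO_LIBPATH=$PIO_LIBDIR
export ESMF_PIO_INCLUDE=$PIO_INCDIR

cd "$ESMF_DIR" && make -j8 && make install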

I wonder if UFS has been tested with an ESMF version newer than 8.3.0b09 before.

@uturuncoglu (Collaborator Author)

@jkbk2004 Hi, I just wanted to check on the current status of this installation. Thanks.

@jkbk2004 (Collaborator) commented Feb 1, 2023

> @jkbk2004 Hi, I just wanted to check on the current status of this installation. Thanks.

@uturuncoglu I will take a look. I will get back to you tomorrow.

@uturuncoglu (Collaborator Author)

@jkbk2004 Thank you. It is not super urgent, but it would be nice to have it soon since I am planning to add the restructured I/O code that leverages ESMF multi-tile support to Noah-MP.

@jkbk2004 (Collaborator) commented Feb 2, 2023

> @jkbk2004 Thank you. It is not super urgent, but it would be nice to have it soon since I am planning to add the restructured I/O code that leverages ESMF multi-tile support to Noah-MP.

I tried, but it looks like there is an issue:

make chkdir_apps
make[5]: Entering directory '/glade/work/jongkim/stacks/hash/hpc-stack-6eb6/pkg/v8.5.0b10/src/apps/ESMF_PrintInfo'
make[5]: Leaving directory '/glade/work/jongkim/stacks/hash/hpc-stack-6eb6/pkg/v8.5.0b10/src/apps/ESMF_PrintInfo'
mpif90 -m64 -mcmodel=small -pthread -threads -cxxlib -Wl,--no-as-needed -qopenmp -L/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/hdf5/1.10.6/lib -L/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/zlib/1.2.11/lib -L/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/esmf/8.5.0b10/lib -L/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/lib -L/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/pio/2.5.7/lib -Wl,-rpath,/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/esmf/8.5.0b10/lib -Wl,-rpath,/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/netcdf/4.7.4/lib -Wl,-rpath,/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/pio/2.5.7/lib -o /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/esmf/8.5.0b10/bin/ESMF_PrintInfo /glade/work/jongkim/stacks/hash/hpc-stack-6eb6/pkg/v8.5.0b10/obj/objO/Linux.intel.64.mpt.default/src/apps/ESMF_PrintInfo/ESMF_PrintInfo.o -lesmf -lmpi++ -lrt -ldl -lnetcdff -lnetcdf -lhdf5_hl -lhdf5 -lz -ldl -lm -lpioc
/usr/lib64/gcc/x86_64-suse-linux/4.8/../../../../x86_64-suse-linux/bin/ld: /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/esmf/8.5.0b10/lib/libesmf.so: undefined reference to `PIOc_InitDecomp_ReadOnly'
/glade/work/jongkim/stacks/hash/hpc-stack-6eb6/pkg/v8.5.0b10/build/common.mk:2583: recipe for target '/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/esmf/8.5.0b10/bin/ESMF_PrintInfo' failed
make[4]: *** [/glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/esmf/8.5.0b10/bin/ESMF_PrintInfo] Error 1

@jkbk2004 (Collaborator) commented Feb 2, 2023

I was using the pio/2.5.7 already installed at /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/modulefiles/stack.

@jkbk2004 (Collaborator) commented Feb 2, 2023

Yeah, we need pio-2.5.8, which has PIOc_InitDecomp_ReadOnly.
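As a quick sanity check before rebuilding, one could verify which installed PIO actually provides that symbol; this is just a generic nm/grep sketch using the pio-2.5.7 path quoted earlier in the thread:

# Look for PIOc_InitDecomp_ReadOnly in the external PIO static library
nm /glade/work/epicufsrt/GMTB/tools/intel/2022.1/hpc-stack-v1.2.0_6eb6/intel-2022.1/mpt-2.25/pio/2.5.7/lib/libpioc.a | grep PIOc_InitDecomp_ReadOnly
# No match means the symbol is absent (the case with pio-2.5.7); pio-2.5.8 should list it as a defined symbol.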

@jkbk2004 (Collaborator) commented Feb 2, 2023

Let me try again with pio-2.5.8.

@jkbk2004 (Collaborator) commented Feb 2, 2023

@uturuncoglu It went through with pio-2.5.8. Give it another try with the module path in https://github.com/ufs-community/ufs-weather-model/blob/develop/modulefiles/ufs_cheyenne.intel.lua

@uturuncoglu (Collaborator Author)

@jkbk2004 Thanks for your help. I was able to compile the model with pio 2.5.8 and esmf 8.5.0b10. I'll try to update my fork with the new I/O layer that uses ESMF multi-tile support to see what happens. I'll update you soon.

@uturuncoglu (Collaborator Author)

@jkbk2004 I confirm that it is working without any issue. BTW, do we also have a GNU version on Cheyenne? It would be nice to test the new I/O implementation under GNU too to catch any possible issues.

@DeniseWorthen (Collaborator)

@uturuncoglu In CMEPS, there is a routine in med.F90 called med_grid_write, which is limited right now to tileCount=1. Will the new I/O features allow tileCount>1 in this routine?

@jkbk2004 (Collaborator) commented Feb 8, 2023

> @jkbk2004 I confirm that it is working without any issue. BTW, do we also have a GNU version on Cheyenne? It would be nice to test the new I/O implementation under GNU too to catch any possible issues.

@uturuncoglu Sure! I will install them for GNU as well. I will keep you posted; maybe sometime this afternoon.

@uturuncoglu (Collaborator Author)

@DeniseWorthen I think we could try to remove that restriction with the recent updates on the ESMF side. I am currently working on restructuring the I/O layer in the Noah-MP component model. Once I have finalized that, I could try to test it in CMEPS.

@jkbk2004 (Collaborator)

@uturuncoglu We migrated the Cheyenne hpc-stack locations yesterday. The old ones are still available, but I want to follow up again with the new locations. @natalie-perlin Can you install esmf-8.5.0b10 on Cheyenne? It needs pio-2.5.8 (see the conversation above). Please give it priority; the installation itself goes through quickly.

@uturuncoglu (Collaborator Author)

@jkbk2004 @natalie-perlin You mean the module locations have changed? BTW, the latest tag is v8.5.0b14, which also has a couple of I/O-related fixes, but I think it requires pio-2.5.10. Anyway, we could also stick with esmf-8.5.0b10 and pio-2.5.8 for both Intel and GNU.

@jkbk2004 (Collaborator)

> @jkbk2004 @natalie-perlin You mean the module locations have changed? BTW, the latest tag is v8.5.0b14, which also has a couple of I/O-related fixes, but I think it requires pio-2.5.10. Anyway, we could also stick with esmf-8.5.0b10 and pio-2.5.8 for both Intel and GNU.

@uturuncoglu Yeah, we made the location changes in the weather model develop branch yesterday, but you can stay with the old one. Let me install esmf-8.5.0b10 and pio-2.5.8 for GNU in the old location now. I will let you know in an hour or so.

@uturuncoglu (Collaborator Author)

@natalie-perlin Yes. The same environment that we used for the previous beta snapshot will work. Thanks for your help. I could also try to install it on my end, but since there are dependencies on ESMF such as MAPL, it would be hard for me to run the entire test suite.

@natalie-perlin (Collaborator)

@uturuncoglu @junwang-noaa - FYI:
esmf/8.5.0b028 and mapl/2.35.2-esmf-8.5.0b28 were installed for the intel/2022.1 and gnu/10.1.0 stacks on Cheyenne:

/glade/work/epicufsrt/contrib/hpc-stack/intel2022.1_ncdf492
/glade/work/epicufsrt/contrib/hpc-stack/gnu10.1.0_ncdf492

@uturuncoglu (Collaborator Author)

@natalie-perlin Thank you very much. When I check out the ufs-weather-model head of develop, it points to the /glade/work/epicufsrt/contrib/hpc-stack/intel2022.1/modulefiles/stack directory, which has different versions of netcdf, hdf5, etc. I could change the versions of those libraries too, but I am not sure whether that would lead to answer changes. Do you have any idea? At this point, I am planning to isolate the ESMF version change without changing any other library in the system.

@uturuncoglu (Collaborator Author)

@natalie-perlin I tested the following, but I am getting an error from the netcdf module:

Lmod has detected the following error: The following module(s) are unknown:
"netcdf/4.9.2"

Please check the spelling or version number. Also try "module spider ..."
It is also possible your cache file is out-of-date; it may help to try:
  $ module --ignore_cache load "netcdf/4.9.2"

Here is my ufs_common.lua:

whatis("Description: UFS build environment common libraries")

help([[Load UFS Model common libraries]])

local ufs_modules = {
  {["jasper"]      = "2.0.25"},
  {["zlib"]        = "1.2.11"},
  {["libpng"]      = "1.6.37"},
  --{["hdf5"]        = "1.10.6"},
  {["hdf5"]        = "1.14.0"},
  --{["netcdf"]      = "4.7.4"},
  {["netcdf"]      = "4.9.2"},
  --{["pio"]         = "2.5.7"},
  {["pio"]         = "2.5.10"},
  --{["esmf"]        = "8.3.0b09"},
  {["esmf"]        = "8.5.0b028"},
  {["fms"]         = "2022.04"},
  {["bacio"]       = "2.4.1"},
  {["crtm"]        = "2.4.0"},
  {["g2"]          = "3.4.5"},
  {["g2tmpl"]      = "1.10.2"},
  {["ip"]          = "3.3.3"},
  {["sp"]          = "2.3.3"},
  {["w3emc"]       = "2.9.2"},
  {["gftl-shared"] = "v1.5.0"},
  --{["mapl"]        = "2.22.0-esmf-8.3.0b09"},
  {["mapl"]        = "2.35.2-esmf-8.5.0b28"},
}

for i = 1, #ufs_modules do
  for name, default_version in pairs(ufs_modules[i]) do
    local env_version_name = string.gsub(name, "-", "_") .. "_ver"
    load(pathJoin(name, os.getenv(env_version_name) or default_version))
  end
end

I also modified MODULEPATH in ufs_cheyenne.intel.lua:

prepend_path("MODULEPATH", "/e2glade/work/epicufsrt/contrib/hpc-stack/intel2022.1_ncdf492/modulefiles/stack")

@uturuncoglu (Collaborator Author)

@natalie-perlin Sorry for the false alarm. I fixed the path problem and I am making some progress. The only remaining issue is that I don't have scotch/7.0.3.

@uturuncoglu (Collaborator Author)

@natalie-perlin Okay, I got scotch/7.0.3 from the original location. It seems that I can load all the required libraries and compile the model. Thanks for your help.

@uturuncoglu (Collaborator Author)

@natalie-perlin I am getting an error from FMS for datm_cdeps_control_cfsr_intel; maybe the model is not compatible with this new FMS version:

26:MPT:    from /glade/u/apps/ch/os/lib64/libpthread.so.0
26:MPT: Missing separate debuginfos, use: zypper install glibc-debuginfo-2.22-100.27.3.x86_64
26:MPT: (gdb) #0  0x00002b9f630f87da in waitpid ()
26:MPT:    from /glade/u/apps/ch/os/lib64/libpthread.so.0
26:MPT: #1  0x00002b9f6343dc66 in mpi_sgi_system (
26:MPT: #2  MPI_SGI_stacktraceback (
26:MPT:     header=header@entry=0x7ffd4ecf1490 "MPT ERROR: Rank 26(g:26) received signal SIGSEGV(11).\n\tProcess ID: 67703, Host: r10i5n14, Program: /glade/scratch/turuncu/FV3_RT/rt_8590/datm_cdeps_control_cfsr_intel/fv3.exe\n\tMPT Version: HPE MPT 2.2"...) at sig.c:340
26:MPT: #3  0x00002b9f6343de66 in first_arriver_handler (signo=signo@entry=11,
26:MPT:     stack_trace_sem=stack_trace_sem@entry=0x2b9f71040080) at sig.c:489
26:MPT: #4  0x00002b9f6343e0f3 in slave_sig_handler (signo=11,
26:MPT:     siginfo=<optimized out>, extra=<optimized out>) at sig.c:565
26:MPT: #5  <signal handler called>
26:MPT: #6  0x0000000002fd7d0a in netcdf_mp_nf90_inquire_variable_ ()
26:MPT: #7  0x000000000374b324 in netcdf_io_mod_mp_get_variable_num_dimensions_ ()
26:MPT: #8  0x0000000003749615 in netcdf_io_mod_mp_netcdf_read_data_0d_ ()
26:MPT: #9  0x00000000037655ac in netcdf_io_mod_mp_compressed_read_0d_ ()
26:MPT: #10 0x00000000037bbc10 in grid2_mod_mp_open_mosaic_file_ ()
26:MPT: #11 0x00000000037bcf16 in grid2_mod_mp_open_component_mosaics_ ()
26:MPT: #12 0x00000000037bd0cd in grid2_mod_mp_grid_init_ ()
26:MPT: #13 0x00000000036e6fca in fms_mod_mp_fms_init_ ()
26:MPT: #14 0x0000000001c77da3 in mom_cap_mod_mp_initializeadvertise_ ()
26:MPT: #15 0x00000000009fcc76 in ESMCI::FTable::callVFuncPtr(char const*, ESMCI::VM*, int*) ()
26:MPT: #16 0x0000000000a00b86 in ESMCI_FTableCallEntryPointVMHop ()
26:MPT: #17 0x00000000008fde3b in ESMCI::VMK::enter(ESMCI::VMKPlan*, void*, void*) ()
26:MPT: #18 0x00000000012dd4ba in ESMCI::VM::enter(ESMCI::VMPlan*, void*, void*) ()
26:MPT: #19 0x00000000009fe297 in c_esmc_ftablecallentrypointvm_ ()
26:MPT: #20 0x0000000000921880 in esmf_compmod_mp_esmf_compexecute_ ()
26:MPT: #21 0x0000000000c2cc56 in esmf_gridcompmod_mp_esmf_gridcompinitialize_ ()
26:MPT: #22 0x000000000089f6b0 in nuopc_driver_mp_loopmodelcompss_ ()
26:MPT: #23 0x00000000008c7135 in nuopc_driver_mp_initializeipdv02p1_ ()
26:MPT: #24 0x00000000008d09bb in nuopc_driver_mp_initializegeneric_ ()
26:MPT: #25 0x00000000009fcc76 in ESMCI::FTable::callVFuncPtr(char const*, ESMCI::VM*, int*) ()
26:MPT: #26 0x0000000000a00b86 in ESMCI_FTableCallEntryPointVMHop ()
26:MPT: #27 0x00000000008fdc4f in ESMCI::VMK::enter(ESMCI::VMKPlan*, void*, void*) ()
26:MPT: #28 0x00000000012dd4ba in ESMCI::VM::enter(ESMCI::VMPlan*, void*, void*) ()
26:MPT: #29 0x00000000009fe297 in c_esmc_ftablecallentrypointvm_ ()
26:MPT: #30 0x0000000000921880 in esmf_compmod_mp_esmf_compexecute_ ()
26:MPT: #31 0x0000000000c2cc56 in esmf_gridcompmod_mp_esmf_gridcompinitialize_ ()
26:MPT: #32 0x00000000004180c4 in MAIN__ ()
26:MPT: #33 0x0000000000417062 in main ()
26:MPT: #34 0x00002b9f6471fa35 in __libc_start_main ()
26:MPT:    from /glade/u/apps/ch/os/lib64/libc.so.6
26:MPT: #35 0x0000000000416f69 in _start ()

The UFS weather model was using FMS 2022.04, but the new modules are using 2023.01. Is it possible to install FMS 2022.04 in the new location too? I tried to use it from the old location, but FMS_ROOT is corrupted and the model could not find the FMS library.
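For what it's worth, a quick way to see whether the fms module from the old stack is usable here would be something like this (a hedged sketch; the old stack path is the one mentioned earlier in the thread, and FMS_ROOT is assumed to be the variable that module sets):

# Point Lmod at the old stack and load the FMS version the model expects
module use /glade/work/epicufsrt/contrib/hpc-stack/intel2022.1/modulefiles/stack
module load fms/2022.04
# If FMS_ROOT points at a missing or broken install, the build will not find the library
echo "FMS_ROOT=$FMS_ROOT" && ls "$FMS_ROOT/lib" 2>/dev/null || echo "FMS_ROOT looks broken"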

@natalie-perlin (Collaborator)

@uturuncoglu - got your notes, will update later today!

@uturuncoglu (Collaborator Author)

@natalie-perlin Thank you for your help. In any case, there is an issue with the baseline on Cheyenne too, and I am also waiting on that, so this is not too urgent at this point. Maybe we cannot test before the ESMF release, but at least I could test the latest snapshot before I open my upcoming land component PR. We might need to install ESMF 8.5 on all the supported platforms once it is released, since it is required for the land component model.

@uturuncoglu (Collaborator Author)

@natalie-perlin So maybe it is worth waiting for 8.5 and installing it after the release. It should appear soon. I'll keep you updated.

@uturuncoglu (Collaborator Author)

@natalie-perlin ESMF 8.5 has just been released. Here is the link: https://github.com/esmf-org/esmf/releases/tag/v8.5.0. If we could install it on Cheyenne and Orion (Cheyenne will be down next week) using the existing modules used by the UFS weather model, that would be great. Then I could test the land implementation, run the entire RT suite, and report any issues I find. Again, thanks for the great help and sorry about the extra work.

@natalie-perlin (Collaborator)

@uturuncoglu -
As to the new ESMF 8.5, does it need to be installed in the current stack locations that are based on hdf5/1.10.6 and netcdf/4.7.4, or in the other locations, the stacks with hdf5/1.14.0 and netcdf/4.9.2?

@uturuncoglu (Collaborator Author)

@natalie-perlin I am not familiar with the procedure for moving UFS to a new version of ESMF. I think it would be nice to install it in the current location and keep all the other dependencies the same (I think MAPL also needs to be built against this version of ESMF).

@natalie-perlin (Collaborator)

@uturuncoglu - thank you for the input.
@jkbk2004, @junwang-noaa - what is your suggestion for the test installation of ESMF 8.5?

Option 1: current stack locations (hdf5/1.10.6 and netcdf/4.7.4), adding esmf/8.5 and then mapl/2.22.0-esmf/8.5 or mapl/2.35.2-esmf/8.5?

Option 2: the stacks with hdf5/1.14.0 and netcdf/4.9.2, adding esmf/8.5 and mapl/2.35.2-esmf/8.5? (It is not a problem to also add fms/2022.04 as @uturuncoglu asked.)

@junwang-noaa (Collaborator)

@natalie-perlin I suggest going with option 2. Also, I saw that ESMF 8.5.0 was released; would you please install the release version so we can test the UFS weather model on Hera too?
@Hang-Lei-NOAA My understanding is that ESMF 8.4.2 is available on WCOSS2. Would you please install the new ESMF 8.5.0 release version on Acorn for testing purposes?
@uturuncoglu @theurich FYI.

@Hang-Lei-NOAA commented Aug 2, 2023 via email

@natalie-perlin (Collaborator) commented Aug 2, 2023

@uturuncoglu @Hang-Lei-NOAA @junwang-noaa -

Orion is down for maintenance today, but the esmf/8.5.0 installations were done earlier this week, as shown below (both option 1 and option 2):

ESMF 8.5.0 has been installed on Orion in the updated stack location (following a mandatory migration to a new role-epic project and disk space), as well as mapl/2.22.0-esmf-8.5.0.
Please feel free to test it!
The Orion stack in a new role-epic account location:
/work/noaa/epic/role-epic/contrib/orion/hpc-stack/intel-2022.1.2

This stack has been successfully tested for UFS-WM regression tests, SRW fundamental and met verification tests, and GSI regression tests.
PR to the UFS-WM repo: #1846
PR to a SRW repo: ufs-community/ufs-srweather-app#826
PR to a GSI repo: NOAA-EMC/GSI#571, merged: hu5970/GSI#15

Similar packages (esmf/8.5.0 and mapl/2.35.2-esmf-8.5.0) are installed in a new location for the hdf5/1.14.0- and netcdf/4.9.2-based stack (still hpc-stack, for dev tasks):
/work/noaa/epic/role-epic/contrib/orion/hpc-stack/intel-2022.1.2_ncdf492

@uturuncoglu (Collaborator Author)

@natalie-perlin Thanks for your help. I could not access Orion today; I think it is down. I'll try tomorrow.

@natalie-perlin (Collaborator) commented Aug 3, 2023

@uturuncoglu @junwang-noaa -
Installed esmf/8.5.0 and mapl (2.22.0 or 2.35.2) built with esmf/8.5.0 on Hera:
Intel compiler:

/scratch1/NCEPDEV/nems/role.epic/hpc-stack/libs/intel-2022.1.2
/scratch1/NCEPDEV/nems/role.epic/hpc-stack/libs/intel-2022.1.2_ncdf492

Gnu:

/scratch1/NCEPDEV/nems/role.epic/hpc-stack/libs/gnu-9.2
/scratch1/NCEPDEV/nems/role.epic/hpc-stack/libs/gnu-9.2_ncdf492

@uturuncoglu (Collaborator Author)

@natalie-perlin Thanks for your help. I have no access to Hera; I can only use Cheyenne and Orion. As far as I know you already installed on Orion, but it is still down due to some disk issue. I think Cheyenne is back. If you could install on Cheyenne, I could try there; otherwise, I need to wait for Orion. Anyway, thanks again for your kind and great help.

@uturuncoglu (Collaborator Author)

@natalie-perlin As far as I know, PIO needs to be updated to 2.5.10 to be consistent with ESMF 8.5.0. I can only see the old version (2.5.7) under /work/noaa/epic/role-epic/contrib/orion/hpc-stack/intel-2022.1.2/modulefiles/mpi/intel/2022.1.2/impi/2022.1.2/pio/ on Orion. I am not sure which version was used to build ESMF, but its internal PIO version is 2.5.10, and when that is used alongside the 2.5.7 external PIO package it creates issues.
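To double-check what is actually installed there, a generic inspection along these lines could help (the pio path is the one above; the esmf install path is a placeholder):

# List the PIO versions that have modulefiles under this stack
ls /work/noaa/epic/role-epic/contrib/orion/hpc-stack/intel-2022.1.2/modulefiles/mpi/intel/2022.1.2/impi/2022.1.2/pio/
# If ESMF was built with its internal PIO, a bundled libpioc sits next to libesmf
# (the same situation that caused the multiple-definition link error earlier in this thread)
ls /path/to/esmf/8.5.0/lib | grep -i pio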

@BrianCurtis-NOAA (Collaborator) commented Aug 4, 2023

There are two upcoming PRs that adjust PIO: first, one that updates PIO to 2.5.10 in hpc-stack, and immediately following that, a move to spack-stack, which has PIO 2.5.10.

@BrianCurtis-NOAA (Collaborator)

The first upcoming PR updates the UFS WM to use PIO 2.5.10; it is already available in hpc-stack as far as I know.

@natalie-perlin (Collaborator) commented Aug 4, 2023

@uturuncoglu -
Please note that the stacks named *_ncdf492 all have pio/2.5.10, in addition to hdf5/1.14.0 and netcdf/4.9.2.
These would be:
Orion:
/work/noaa/epic/role-epic/contrib/orion/hpc-stack/intel-2022.1.2_ncdf492/
Hera intel:
/scratch1/NCEPDEV/nems/role.epic/hpc-stack/libs/intel-2022.1.2_ncdf492/
Hera gnu:
/scratch1/NCEPDEV/nems/role.epic/hpc-stack/libs/gnu-9.2_ncdf492

UPD:
Cheyenne intel-2022.1:
/glade/work/epicufsrt/contrib/hpc-stack/src-intel2022.1_ncdf492
Cheyenne gnu-10.1.0:
/glade/work/epicufsrt/contrib/hpc-stack/gnu10.1.0_ncdf492

@uturuncoglu (Collaborator Author)

@natalie-perlin Okay. Thanks. I'll test that one.

@natalie-perlin (Collaborator) commented Aug 7, 2023

@uturuncoglu -
Added esmf/8.5.0 and mapl/2.35.2-esmf-8.5.0 to Cheyenne gnu 10.1.0:
/glade/work/epicufsrt/contrib/hpc-stack/gnu10.1.0_ncdf492
Cheyenne intel-2022.1:
/glade/work/epicufsrt/contrib/hpc-stack/src-intel2022.1_ncdf492

@uturuncoglu (Collaborator Author)

@natalie-perlin @junwang-noaa @BrianCurtis-NOAA I think we can close this ticket since the model has now been updated to ESMF 8.5.0. Let me know if you want to keep it open.

@BrianCurtis-NOAA (Collaborator)

Sounds good to me. Closing
