Add WW3 updates to CESM NUOPC driver #717
Conversation
@@ -913,6 +943,26 @@ subroutine ice_export( exportState, rc )
! ice fraction
ailohi(i,j,iblk) = min(aice(i,j,iblk), c1)

! ice thickness (m)
This comment refers to the preceding line (933), which contains the OMP directive. Do additional variables need to be made private for the added floediam calculation?
Good catch. Looks like k and floe_rad_c need to be added to the OMP PRIVATE statement. Anything else you noticed?
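For illustration, a minimal sketch of the kind of change being discussed (the surrounding loop and the rest of the PRIVATE list are placeholders, not copied from the file):

! hedged sketch: every scalar assigned inside the parallel region, including the
! new ones used by the floediam calculation, must appear in the PRIVATE clause;
! only k and floe_rad_c come from the discussion above, the rest is placeholder
!$OMP PARALLEL DO PRIVATE(iblk, i, j, k, floe_rad_c)
do iblk = 1, nblocks
   ! ... existing ice_export loops, including the added floe-diameter code ...
enddo
!$OMP END PARALLEL DO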
I had the wave-ice coupling working in a branch, but I need to make sure it works if we don't have wave coupling activated.
I also noticed the change at line 802 in ice_import_export; is this related to the C-grid development?
Yes! The C-grid has been merged onto main.
Also, I just found out that this code for the FSD needs to be in an if block. The set_stateexport calls and so on probably need to check whether the fields are actually being requested as well.
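As a rough sketch of the guard being described (the Icepack tracer query and the State_FldChk field check are assumptions about how the cap is wired, and the field name is illustrative):

! hedged sketch: only compute and export the floe diameter when the FSD tracer
! is active and the coupler has actually requested the field
call icepack_query_tracer_flags(tr_fsd_out=tr_fsd)    ! assumed query; tr_fsd is a local logical
if (tr_fsd .and. State_FldChk(exportState, 'Si_floediam')) then   ! field name is illustrative
   ! ... floe-diameter calculation and the corresponding set_stateexport call ...
endif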
!call grid_average_X2Y('F',work,'T',ss_tltx,'U')
!work = ss_tlty
!call grid_average_X2Y('F',work,'T',ss_tlty,'U')
call t2ugrid_vector(uocn)
I'm getting a compile error for t2ugrid_vector; it has been removed from cicecore/cicedynB/infrastructure/ice_grid.F90 and replaced by the grid_average_X2Y interface, it seems.
I'm not sure I understand the @apcraig comment; maybe this entire block should be removed.
uocn, vocn, ss_tltx, ss_tlty are now assumed to be on the U grid and the averaging to the T grid is being done in CICE. It no longer needs to be done here. We are moving to a more general system in CICE where the locations of the coupling/forcing variables are more clearly defined within CICE. This should be carefully reviewed in the context of CESM coupling to make sure it's working properly.
Good catch. I hadn't actually tested this cap with the latest C-grid changes. I will fix this.
This actually makes life easier, so we don't have to worry about the grid internally in CICE. However, this is double averaging ... I think there is always going to be averaging whatever we do. Eventually, the fields should come through the coupler on the C-grid and we should not need to interpolate at all.
You should not have the X2Y calls in the cap anymore. As you say, you will double average and the results will be incorrect. The way the implementation works now is that you set grid_ocn to A, B, or C, and this tells CICE what grid the T and U variables are on. CICE then will interpolate forcing/coupling variables to whatever grid is needed internally. So for instance, if uocn is on the T grid, set grid_ocn="A" and do nothing in the cap. If uocn is on the U grid, set grid_ocn="B" and do nothing in the cap. If uocn is on the East face, set grid_ocn="C" and do nothing in the cap. Whether CICE is running on the B or C grid, it will interpolate grid_ocn to grid_ice correctly internally. Some of this hasn't been fully tested yet (i.e. coupling variables on the B or C grid), so we'll have to validate as we migrate to that capability, but it means there should be no more grid averaging in the caps. The coupling/forcing field locations just have to be consistent with A, B, or C staggers for everything to work.
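For concreteness, a hedged sketch of the namelist side of this; the group and variable names follow my reading of the C-grid version of ice_in and should be checked against the actual namelist:

&grid_nml
  grid_ice = 'B'   ! stagger CICE uses internally (B or C)
  grid_ocn = 'A'   ! stagger the incoming ocean fields are on:
                   ! 'A' = uocn/vocn at T points, 'B' = at U points, 'C' = at E/N faces
/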
So if the MOM6 cap was modified to export uocn, vocn on the C-grid (vs what it does now, which is to move everything to the A-grid), CICE would now accommodate those velocities on the native C-grid. That's great. There is also a rotation (and back) to true directions (vs I-J directions), and as long as MOM6 and CICE6 were on the same grid, that also would not be needed.
@DeniseWorthen, that's the idea. We have to validate when we start doing it. You do have to be careful about the rotations. In CESM, the standard is E/N. If you don't rotate to E/N, then you risk running into problems if the ice model is not on the same grid. You also almost certainly run into problems if the ocean currents are used in another component, say the atm model. There are many options, including sending out both rotated and non-rotated versions of the fields for use in different places. Or maybe we need to set some rotation information either via namelist or coupling flags as part of controlling this stuff. TBD.
We will likely need some flags here to account for rotation. Actually, if the ANGLE array is all zeroes, say, then we would effectively have no rotation. We would be doing an extra calculation for nothing.
@dabail10, this all looks fine to me. When you have finished your testing and mods, could you make a note of that here, and I'll take a final look and approve. Thanks.
I needed a couple more changes to get it to compile.
Great. Thanks. I will hopefully get to test this out with the head of Consortium main soon.
I'm testing now w/ the wave coupling off and all baselines are changing. I expected them to be B4B, so I need to check this out some more.
Curious. Cheyenne is back up, but I won't be able to look at this until Monday. The last version of the code I had was from December. I think there were some answer changes on Consortium main. The threading in particular ... Tony can confirm or deny this.
I'm pretty sure the threading didn't change results, but maybe something else did. It could also be the changes from the C-grid. We tried to track how the C-grid changes would affect coupled models, but we didn't do any testing. @dabail10, if you tell me what "December" version of CICE is currently in CESM, I can review changes since then and see if anything jumps out.
I was up to date with the Dec 17th hash, dfc6a11. Dave
There were a couple of bug fixes in the OMP stuff and variables that were not private. Maybe these did not impact the answers.
Thanks, I updated NOAA-EMC just before the C-grid merge, which brought me up to 8d7314f. At that point all baselines were B4B w/ our previous baselines.
Thanks @DeniseWorthen, it sounds like you and @dabail10 are not synced up beyond maybe the cap. Whatever answer changes you're seeing must be coming from C-grid and/or cap issues. It might be worthwhile for @dabail10 to update CICE to just before the C-grid merge and verify results; the cap should not have to change. Then update to the C-grid merge and update the cap. CICE standalone should be bit-for-bit with the C-grid merge, but the coupled system may not be. Some of the averaging was moved out of the forcing/coupling code and into the dynamics. This was done in a bit-for-bit way for standalone, but if the averaging was being done slightly differently in the cap than in the standalone forcing, this could result in answer changes. I think this testing needs to be done carefully. Let me know if you want me to help. I'm willing to jump into CESM a bit and contribute.
Thanks for the clarification, @apcraig. I did the update just prior to the Cgrid merge just so I'd be able to test in the way you suggest. What I have done is created an addCgrid branch and merged the latest main into it (I had one minor compile fix, for save_init). Then I ran our current develop branch and the Cgrid branch and compared the difference in the mediator history files for the ice at the end of the first ice timestep. I'm seeing some anomalies along the seam which may be of concern (these have a factor of 1e14 applied). The largest of these are the differences in taux,tauy exported by ice.
@DeniseWorthen, thanks for doing this. Does 1e14 imply the differences are in the 14th digit? I have a couple of ideas to see if we can figure out where the difference comes in. The first test would be to change the 'S' in these lines in cicecore/cicedynB/dynamics/ice_dyn_evp.F90, at about line 321, to an 'F'.
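Purely as an illustration of the suggested experiment (these lines are not copied from ice_dyn_evp.F90; the variable and grid arguments are guesses), the change amounts to switching the mapping type in the grid_average_X2Y calls that bring the ocean forcing onto the U grid:

! hedged sketch: switch the averaging type from 'S' to 'F' for the ocean coupling fields
call grid_average_X2Y('F', uocn   , grid_ocn_dynu, uocnU   , 'U')   ! was 'S'
call grid_average_X2Y('F', vocn   , grid_ocn_dynv, vocnU   , 'U')   ! was 'S'
call grid_average_X2Y('F', ss_tltx, grid_ocn_dynu, ss_tltxU, 'U')   ! was 'S'
call grid_average_X2Y('F', ss_tlty, grid_ocn_dynv, ss_tltyU, 'U')   ! was 'S'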
Does that bring the solutions closer together or make them the same?
Yes, the differences are O(10^-14) along the few rows at the seam; outside of that they drop by an order of magnitude. There was no significant difference using 'F' instead of 'S' when comparing against the current develop branch. I need to find some time to dig into this too. I'd like to either look at the internal CICE fields w/in the cap, or else the CICE history fields written at each timestep.
That's too bad it didn't change answers much. Were "S" and "F" bit-for-bit? I assume not. Just confirming, are you using EVP dynamics? You could try the following. In cicecore/cicedynB/dynamics/ice_dyn_evp.F90 at about line 321, change
to
and in cicecore/drivers/nuopc/cmeps/ice_import_export.F90, uncomment at about line 808
If you can compare that against your three prior runs (pre-cgrid, cgrid, cgrid+mod1), that would be great. I think this should be identical to cgrid+mod1. If it isn't, that would be interesting to know. And if it isn't bit-for-bit with pre-cgrid, then we'll have to dig more.
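On the cap side, my reading is that "mod2" re-enables the T-to-U averaging that was commented out near line 808 of ice_import_export.F90, i.e. the block quoted earlier in this thread; a hedged sketch of the uncommented form (the first assignment is inferred by symmetry with the ss_tlty lines):

work = ss_tltx
call grid_average_X2Y('F',work,'T',ss_tltx,'U')
work = ss_tlty
call grid_average_X2Y('F',work,'T',ss_tlty,'U')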
OK, great, I'll try those cases. The mod1 ("F") was not B4B with "S". Our EVP settings are:
This is great debugging. Thanks for this, Denise. I will try to get up and running with CESM2 and the latest CICE today.
Do we test the PIO interfaces? Anyhow, the function query_field in io_pio2/ice_restart.F90 has a USE_NETCDF CPP. I don't think this should be there.
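For context, a schematic of the pattern being flagged (illustrative only, not the actual routine): if the whole body of query_field sits behind a USE_NETCDF CPP guard, the io_pio2 restart path loses the field check whenever CICE is built without that CPP, which seems to be the concern here.

! illustrative only: the kind of guard being questioned in io_pio2/ice_restart.F90
logical function query_field(nu, vname)
   integer         , intent(in) :: nu      ! restart file handle (placeholder type)
   character(len=*), intent(in) :: vname   ! name of the field to look for
   query_field = .false.
#ifdef USE_NETCDF
   ! ... check whether vname is present in the restart file ...
#endif
end function query_field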
Ok. I added the changes to get the CMEPS/NUOPC cap to compile. I will try a run now.
The DEBUG test is just hanging on Cheyenne. I am trying to turn off threading.
@apcraig I'll call your second set of changes "mod2". They test B4B with mod1, as you expected, but they are not B4B with the base cgrid case or the pre-cgrid. The same pattern appears as we got initially. Just FYI, this test case for us starts with the ocean at rest, so uocn,vocn=0. However, the surface tilts are non-zero. I've set up on Cheyenne now, and I'll verify my previous results.
Thanks @DeniseWorthen, results as expected. That's sort of good. I think the next step is more detailed debugging. I think the right approach is to update to just before the C-grid merge, then be able to do side-by-side testing between that and a version that has updated to the C-grid merge, as @DeniseWorthen has done. We have three models we could do this with (UFS, CESM, and RASM), which include two different caps. I will probably have to do the same testing with RASM. Between the three of us, what's the best way to proceed? Could we have a brief call today to chat about strategy and timeline? I'll send an email to organize.
The changes to io_pio2/ice_restart.F90 were committed in #726 and create a conflict. Sorry about that.
@dabail10 I'd like to get the wave-ice coupling changes merged. Have you had a chance to work on this? I have a branch I've been using, so I'd be happy to make the PR if that takes the load off you.
I have not. We have been waiting on feedback from Lettie and some additional WW-related cap issues. I am in Madison at the AMSPolar right now, but could perhaps take a look at this next week.
What is the status of this PR? Are there still issues that need to be debugged?
We were finding other C-grid-related issues that we were dealing with first. I think @DeniseWorthen had something with the waves here?
I apologize for missing this conversation for an entire month. I did merge the changes for wave-ice coupling to the NUOPC cap in emc/develop. At the time, we were ahead of ESCOMP since I had also merged the C-grid changes to the cap. I haven't merged in the most recent changes from the Consortium though (since Aug, PR #752). @dabail10, what is the best course here? Should I make a PR to ESCOMP or to Consortium/main?
I think it would be great if you could make the changes in the Consortium. I have a version in ESCOMP that has a number of new changes to the cap for CESM. I'd like to make sure to get your changes in.
Ok, thanks. I'll prepare a PR.
This has been replaced by #782.
For detailed information about submitting Pull Requests (PRs) to the CICE-Consortium,
please refer to: https://github.com/CICE-Consortium/About-Us/wiki/Resource-Index#information-for-developers
PR checklist
Add updates for CESM NUOPC driver. Needed for coupling to WW3 and single column CAM.
dabail10 (D. Bailey)
Ran aux_cice testsuite within CESM.
This is the initial set of changes from CESM. They are only in the NUOPC driver and do not change CICE standalone.