Switch to mosaic for all plotting functionality #256

Merged
12 commits merged into E3SM-Project:main from mosaic_viz, Feb 6, 2025

Conversation

andrewdnolan
Collaborator

@andrewdnolan andrewdnolan commented Dec 24, 2024

Partially addresses #255 and fixes the plotting issues from #242.

Checklist

  • Developer's Guide has been updated
  • API documentation in the Developer's Guide (api.md) lists any new or modified classes, methods, and/or functions
  • Documentation has been built locally and changes look as expected
  • Testing comment in the PR documents testing used to verify the changes

@xylar
Collaborator

xylar commented Dec 25, 2024

@andrewdnolan, could you change "Fixes #255" to "Partially addresses #255" since we don't want to add python 3.13 testing in this PR. If you leave things as they are, merging this PR will automatically close #255.

@andrewdnolan
Collaborator Author

Testing

Setup for testing:
You'll want to apply the diff below to both the main and the development branches so that the barotropic gyre long run is only 3 months instead of 3 years.

diff --git a/polaris/ocean/tasks/barotropic_gyre/forward.py b/polaris/ocean/tasks/barotropic_gyre/forward.py
index 2424bdc5d..4ab854740 100644
--- a/polaris/ocean/tasks/barotropic_gyre/forward.py
+++ b/polaris/ocean/tasks/barotropic_gyre/forward.py
@@ -139,7 +139,7 @@ class Forward(OceanModelStep):
             output_interval_str = time.strftime('0000_%H:%M:%S',
                                                 time.gmtime(run_duration))
         else:
-            stop_time_str = time.strftime('0004-01-01_00:00:00')
+            stop_time_str = time.strftime('0001-04-01_00:00:00')
             output_interval_str = time.strftime('0000-01-00_00:00:00')

         replacements = dict(

You'll also need a config file (below named custom.cfg) which contains:

[barotropic_gyre_default]
steps_to_run = init short_forward long_forward analysis

Setting up the baseline:

# with the head of E3SM's main branch checked out
source <POLARIS_DIR>/load_dev_polaris_0.5.0-alpha.1_pm-cpu_gnu_mpich.sh

# 9:   ocean/planar/barotropic_gyre/default
# 11:  ocean/planar/ice_shelf_2d/5km/z-star/default/with_viz
# 65: ocean/spherical/icos/cosine_bell/convergence_space/with_viz
polaris setup -n 9 11 65 -w $PSCRATCH/polaris_PR#256/main/ -f custom.cfg

pushd $PSCRATCH/polaris_PR#256/main/

salloc --nodes 3 --qos interactive --time 01:00:00 --constraint cpu --account m4259
bash job_script.custom.sh
# kill job once it's done 

Testing this PR's branch:

cd <POLARIS_DIR>
git checkout mosaic_viz

source <POLARIS_DIR>/load_dev_polaris_0.5.0-alpha.2_pm-cpu_gnu_mpich.sh

polaris setup -n 9 11 65  -w $PSCRATCH/polaris_PR#256/devel/ -b $PSCRATCH/polaris_PR#256/main/  -f custom.cfg

pushd $PSCRATCH/polaris_PR#256/devel/

salloc --nodes 3 --qos interactive --time 01:00:00 --constraint cpu --account m4259
bash job_script.custom.sh
# kill job once it's done 

Then, to pull all of the figures generated by the test suite to a local machine, I ran:

ssh perlmutter.nersc.gov 'find /pscratch/sd/a/anolan/polaris_PR#256 -name "*.png"' | rsync -vP --files-from=- perlmutter.nersc.gov:/ ~/Desktop/

@andrewdnolan
Collaborator Author

andrewdnolan commented Jan 23, 2025

Testing: Spherical Results

File to compare: /ocean/spherical/icos/cosine_bell/viz/480km_180s/final.png

main:
[figure: final]

develop:
[figure: final]

@andrewdnolan
Collaborator Author

Testing: Planar Results

File to compare: /ocean/planar/ice_shelf_2d/5km/z-star/default/with_viz/viz/v_surf_horiz_t-1.png

main:
[figure: v_surf_horiz_t-1]

develop:
[figure: v_surf_horiz_t-1]

@andrewdnolan
Collaborator Author

Testing: Planar Results (at vertices)

File to compare: ocean/planar/barotropic_gyre/default/analysis/comparison.png

Note: the error is really large because I only ran this for 3 months, instead of 3 years.

main:
[figure: comparison]

develop:
[figure: comparison]

@andrewdnolan andrewdnolan marked this pull request as ready for review January 23, 2025 00:52
@xylar xylar requested review from cbegeman and xylar January 23, 2025 10:51
@xylar xylar self-assigned this Jan 23, 2025
@xylar xylar added the labels framework (Changes relating to the polaris framework as opposed to individual tests or analysis) and ocean (Related to ocean tests or analysis) Jan 23, 2025
Collaborator

@xylar xylar left a comment


@andrewdnolan, this looks great!

I do have some suggested changes based just on a code review so far.

Please add mosaic to the intersphinx part of the docs/conf.py file:
https://github.com/andrewdnolan/polaris/blob/mosaic_viz/docs/conf.py#L74
This will allow links to mosaic classes from the documentation (notably the autogenerated API docs).
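
For reference, the new entry would look something like this minimal sketch (the mosaic docs URL below is a placeholder and should point to wherever the mosaic documentation is actually published):

# in docs/conf.py
intersphinx_mapping = {
    # ... existing entries ...
    'mosaic': ('https://docs.e3sm.org/mosaic/', None),
}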

@cbegeman, can you comment on whether you're comfortable with switching the barotropic gyre plots from contourf to continuous for now? mosaic doesn't (yet?) support contourf.

@xylar
Collaborator

xylar commented Jan 23, 2025

One more thing, we will need to drop python 3.9 support since mosaic has already done so. My suggestion would be that we merge this without that fix and let @altheaden take care of both dropping python 3.9 and adding python 3.13 (#255) in one PR.

@xylar
Collaborator

xylar commented Jan 23, 2025

@andrewdnolan, a couple of my suggested renaming changes need other changes to go with them that I didn't try to make. Please add those as well, as things are broken right now.

@andrewdnolan
Collaborator Author

andrewdnolan commented Jan 23, 2025

@andrewdnolan, a couple of my suggested renaming changes need other changes to go with them that I didn't try to make. Please add those as well, as things are broken right now.

@xylar I've got those changes locally, going to remove the polaris_step_complete.log from any steps with viz and rerun just to make sure everything still works fine and then I'll push.

@xylar
Collaborator

xylar commented Jan 23, 2025

@andrewdnolan, a quick rebase seems to be needed to fix conflicting updates in docs/conf.py.

After that, I'm happy to run a few tests and then I think I'll be able to approve.

@andrewdnolan
Collaborator Author

andrewdnolan commented Jan 23, 2025

@xylar I still need to update the docs/developers_guide/framework/visualization.md page to reflect the mosaic changes. I'm working on that now. It'd also be good to actually run all tasks affected by these changes (not just the three representative ones above). I'm identifying the affected tasks by running:

> git diff --name-only origin/main | grep polaris/ocean/tasks | xargs dirname | sort -u

polaris/ocean/tasks/baroclinic_channel
polaris/ocean/tasks/baroclinic_channel/rpe
polaris/ocean/tasks/barotropic_gyre
polaris/ocean/tasks/cosine_bell
polaris/ocean/tasks/geostrophic
polaris/ocean/tasks/ice_shelf_2d
polaris/ocean/tasks/inertial_gravity_wave
polaris/ocean/tasks/manufactured_solution
polaris/ocean/tasks/sphere_transport

As a final step I'll run a suite with all of those. If you're OK with it, I'm not going to do a baseline comparison like above. I'm just going to run with my branch and do a visual inspection to confirm everything looks OK.

masked_cells = ds_mesh.nCells.where(~cell_mask, drop=True).astype(int)

# use inverse so True/False convention matches input cell_mask
edge_mask = ~cells_on_edge.isin(masked_cells).any("TWO")
Collaborator


Does this mean that there have to be unmasked cells on either side of an edge for it to be valid?

Collaborator Author


Yes, any cell on either side of an edge does have to be unmasked for the edge to be valid, but cells don't have to exist on both sides (as is the case for boundary edges).

With the cells_on_edge array being zero based, boundary edges will have one entry that's >= 0 and one -1 entry. The int array masked_cells is zero based as well, so a value of -1 can never be in it. So, the isin condition will never be true for the missing cell along boundary edges. Therefore, the edge_mask will only ever be true for a boundary edge if the one valid cell on the edge is True in the cell_mask that's passed to the function.
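
Putting that together, a minimal sketch of the conversion (not necessarily the exact polaris implementation; it assumes the zero-based cells_on_edge comes from the mesh's one-based cellsOnEdge minus one) looks like:

def cell_mask_to_edge_mask(ds_mesh, cell_mask):
    # cells that are masked out (i.e. False in the input cell_mask)
    masked_cells = ds_mesh.nCells.where(~cell_mask, drop=True).astype(int)

    # zero-based cell indices on each edge; boundary edges have one -1 entry,
    # which can never appear in masked_cells
    cells_on_edge = ds_mesh.cellsOnEdge - 1

    # use inverse so True/False convention matches input cell_mask: an edge is
    # valid only if none of its existing neighboring cells are masked
    edge_mask = ~cells_on_edge.isin(masked_cells).any("TWO")
    return edge_mask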

@cbegeman
Collaborator

Testing: Planar Results (at vertices)

File to compare: ocean/planar/barotropic_gyre/default/analysis/comparison.png

@andrewdnolan Can we switch back to the discrete colorbar here? I chose that intentionally so that the streamlines would be clearer.

@xylar
Collaborator

xylar commented Jan 23, 2025

@andrewdnolan, to get what @cbegeman is asking for, I think it should be possible to just coarsely sample the colormap to get the same effect as contourf, right?

https://matplotlib.org/stable/users/explain/colors/colorbar_only.html#discrete-and-extended-colorbar-with-continuous-colorscale
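
Something like this sketch (the bounds and the 21 levels are placeholders, and the commented-out mosaic call at the end is just illustrative):

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import BoundaryNorm

vmin, vmax = -1.0, 1.0                       # placeholder data range
levels = np.linspace(vmin, vmax, 21 + 1)     # 21 discrete levels

cmap = plt.get_cmap('viridis')               # any continuous colormap
norm = BoundaryNorm(levels, ncolors=cmap.N)  # bins values into the 21 levels

# pass cmap and norm to the patch-based plotting call instead of tricontourf,
# e.g. something along the lines of:
# pc = mosaic.polypcolor(ax, descriptor, field, cmap=cmap, norm=norm)
# fig.colorbar(pc, ax=ax)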

@andrewdnolan
Collaborator Author

@xylar yes, I can do something like what's described here in the matplotlib docs.

@cbegeman I can absolutely change to a discrete colormap. Would you prefer to keep it as a tricontourf call, or is plotting vertex patches but using a discrete colorbar alright?

@cbegeman
Collaborator

cbegeman commented Jan 23, 2025

@cbegeman I can absolutely change to a discrete colormap. Would you prefer to keep it as a tricontourf call, or is plotting vertex patches but using a discrete colorbar alright?

Discrete colormap is sufficient. Thanks!

@andrewdnolan
Collaborator Author

There was some offline discussion, but as of 0759678, the barotropic stream function plotting routine uses a discrete colorbar with 21 levels for the first two panels, producing:
[figure: comparison]

@andrewdnolan
Collaborator Author

@xylar I've updated the docs/developers_guide/framework/visualization.md page. You can find a copy of the documentation here: https://portal.nersc.gov/cfs/e3sm/anolan/html/html/

@andrewdnolan
Collaborator Author

Testing: full suite of tasks altered by this PR

Setup for testing:
You'll want to apply the same diff as above:

diff --git a/polaris/ocean/tasks/barotropic_gyre/forward.py b/polaris/ocean/tasks/barotropic_gyre/forward.py
index 2424bdc5d..4ab854740 100644
--- a/polaris/ocean/tasks/barotropic_gyre/forward.py
+++ b/polaris/ocean/tasks/barotropic_gyre/forward.py
@@ -139,7 +139,7 @@ class Forward(OceanModelStep):
             output_interval_str = time.strftime('0000_%H:%M:%S',
                                                 time.gmtime(run_duration))
         else:
-            stop_time_str = time.strftime('0004-01-01_00:00:00')
+            stop_time_str = time.strftime('0001-04-01_00:00:00')
             output_interval_str = time.strftime('0000-01-00_00:00:00')

         replacements = dict(

You'll also need a config file (below named custom.cfg) which contains:

[barotropic_gyre_default]
steps_to_run = init short_forward long_forward analysis

[inertial_gravity_wave_convergence_space]
steps_to_run = init_200km forward_200km_300s init_100km forward_100km_300s init_50km forward_50km_300s init_25km forward_25km_300s analysis viz

[manufactured_solution_convergence_space]
steps_to_run = init_200km forward_200km_150s init_100km forward_100km_150s init_50km forward_50km_150s init_25km forward_25km_150s analysis viz

Testing this PR's branch:

# 0: ocean/planar/baroclinic_channel/10km/default
# 4: ocean/planar/baroclinic_channel/10km/rpe
# 9: ocean/planar/barotropic_gyre/default
# 11: ocean/planar/ice_shelf_2d/5km/z-star/default/with_viz
# 28: ocean/planar/inertial_gravity_wave/convergence_space
# 59: ocean/planar/manufactured_solution/convergence_space
# 65: ocean/spherical/icos/cosine_bell/convergence_space/with_viz
# 77: ocean/spherical/icos/geostrophic/convergence_space/with_viz
# 113: ocean/spherical/icos/rotation_2d/with_viz

polaris setup -n 0 4 9 11 28 59 65 77 113 \
-w $PSCRATCH/polaris_PR#256/devel_full_suite/ \
-f  custom.cfg

pushd $PSCRATCH/polaris_PR#256/devel_full_suite/

salloc --nodes 3 --qos interactive --time 01:30:00 --constraint cpu --account m4259
bash job_script.custom.sh

@andrewdnolan
Collaborator Author

andrewdnolan commented Jan 24, 2025

Results : full suite of tasks altered by this PR

A bunch of tasks fail, but I don't think any of them fail for visualization-related reasons...

Results:

Task Runtimes:
0:00:00 PASS ocean/planar/baroclinic_channel/10km/default
0:00:00 FAIL ocean/planar/baroclinic_channel/10km/rpe
0:00:00 PASS ocean/planar/barotropic_gyre/default
0:00:00 PASS ocean/planar/ice_shelf_2d/5km/z-star/default/with_viz
0:00:00 FAIL ocean/planar/inertial_gravity_wave/convergence_space
0:00:03 FAIL ocean/planar/manufactured_solution/convergence_space
0:00:30 PASS ocean/spherical/icos/cosine_bell/convergence_space/with_viz
0:02:08 PASS ocean/spherical/icos/geostrophic/convergence_space/with_viz
0:00:00 PASS ocean/spherical/icos/rotation_2d/with_viz
Total runtime: 0:02:56
FAIL: 3 tasks failed, see above.

baroclinic_channel/10km/rpe:

# less case_outputs/ocean_planar_baroclinic_channel_10km_rpe.log
polaris calling: polaris.run.serial._run_task()
  in /global/u2/a/anolan/polaris/polaris/run/serial.py

Running steps: init, nu_1, nu_5, nu_10, nu_20, nu_200, analysis
  * step: init
          already completed
  * step: nu_1
          execution:        ERROR
Exception raised while running the steps of the task
Traceback (most recent call last):
  File "/global/u2/a/anolan/polaris/polaris/run/serial.py", line 324, in _log_and_run_task
    baselines_passed = _run_task(task, available_resources)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/global/u2/a/anolan/polaris/polaris/run/serial.py", line 403, in _run_task
    _run_step(task, step, task.new_step_log_file,
  File "/global/u2/a/anolan/polaris/polaris/run/serial.py", line 449, in _run_step
    raise OSError(
OSError: input file(s) missing in step nu_1 in ocean/planar/baroclinic_channel/10km/rpe/nu_1: ['/pscratch/sd/a/anolan/polaris_PR#256/devel_full_suite/graph.info']

Seems to be an issue with running this as an interactive job, maybe? The task is not using the full path to read in the graph.info file.

inertial_gravity_wave/convergence_space:

# tail case_outputs/ocean_planar_inertial_gravity_wave_convergence_space.log
...
  * step: viz
          execution:        ERROR
Exception raised while running the steps of the task
Traceback (most recent call last):
  File "/global/u2/a/anolan/polaris/polaris/run/serial.py", line 324, in _log_and_run_task
    baselines_passed = _run_task(task, available_resources)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/global/u2/a/anolan/polaris/polaris/run/serial.py", line 403, in _run_task
    _run_step(task, step, task.new_step_log_file,
  File "/global/u2/a/anolan/polaris/polaris/run/serial.py", line 449, in _run_step
    raise OSError(
OSError: input file(s) missing in step viz in ocean/planar/inertial_gravity_wave/convergence_space/viz: ['/pscratch/sd/a/anolan/polaris_PR#256/devel_full_suite/ocean/planar/inertial_gravity_wave/convergence_space/forward/200km/output.nc', '/pscratch/sd/a/anolan/polaris_PR#256/devel_full_suite/ocean/planar/inertial_gravity_wave/convergence_space/forward/100km/output.nc', '/pscratch/sd/a/anolan/polaris_PR#256/devel_full_suite/ocean/planar/inertial_gravity_wave/convergence_space/forward/50km/output.nc', '/pscratch/sd/a/anolan/polaris_PR#256/devel_full_suite/ocean/planar/inertial_gravity_wave/convergence_space/forward/25km/output.nc']

Again, another issue with finding the files, not with plotting. It looks like the ocean/planar/inertial_gravity_wave/convergence_space/forward directories have the run length (i.e. 300s) appended to the directory names, so it should be ocean/planar/inertial_gravity_wave/convergence_space/forward/200km_300s instead of ocean/planar/inertial_gravity_wave/convergence_space/forward/200km.

manufactured_solution/convergence_space:

# tail case_outputs/ocean_planar_manufactured_solution_convergence_space.log

polaris calling: polaris.ocean.tasks.manufactured_solution.analysis.Analysis.run()
  inherited from: polaris.ocean.convergence.analysis.ConvergenceAnalysis.run()
  in /global/u2/a/anolan/polaris/polaris/ocean/convergence/analysis.py

Order of convergence for SSH: 0.023
Error: order of convergence for SSH
  0.023 < min tolerance 1.8
          execution:        ERROR
Exception raised while running the steps of the task
Traceback (most recent call last):
  File "/global/u2/a/anolan/polaris/polaris/run/serial.py", line 324, in _log_and_run_task
    baselines_passed = _run_task(task, available_resources)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/global/u2/a/anolan/polaris/polaris/run/serial.py", line 403, in _run_task
    _run_step(task, step, task.new_step_log_file,
  File "/global/u2/a/anolan/polaris/polaris/run/serial.py", line 502, in _run_step
    step.run()
  File "/global/u2/a/anolan/polaris/polaris/ocean/convergence/analysis.py", line 154, in run
    self.plot_convergence(
  File "/global/u2/a/anolan/polaris/polaris/ocean/convergence/analysis.py", line 285, in plot_convergence
    raise ValueError('Convergence rate below minimum tolerance.')
ValueError: Convergence rate below minimum tolerance.
polaris calling: polaris.run.serial._run_task()
  in /global/u2/a/anolan/polaris/polaris/run/serial.py

The convergence rate is way out of whack here; do I need to be using a specific branch of MPAS-Ocean to get this to pass? I'm not totally sure why the viz step doesn't run after the analysis step fails, but maybe that's the expected behavior...

@xylar
Collaborator

xylar commented Jan 24, 2025

@andrewdnolan, I'll look into the first two failures. Those are bugs, I'm pretty sure.

The convergence rate is way out of wack here, do I need to be using a specific branch of mpas-ocean to get this to pass?

I think that's expected. We usually test convergence_both but even there, some tests aren't passing, indicating convergence rates that aren't as high as they are expected to be in MPAS-Ocean. Those failures aren't anything you need to worry about.

Not totally sure why the viz task doesn't run after the analysis task fails, but maybe that's the expected behavior...

Yes, that's by design. The viz step doesn't run by default as part of the test because we usually use it for regression testing:
https://github.com/E3SM-Project/polaris/blob/main/polaris/ocean/tasks/manufactured_solution/__init__.py#L160
You have to go into the viz directory and run it manually after running the main test or you have to edit the config file for the test in the work directory to add viz to steps_to_run. Update: I see that you did the latter above so that's not it. I now do see that, while a suite continues if a test fails, a test will stop if a step fails. So, yes, that's expected. You could move viz before analysis in steps_to_run to "fix" this.

@xylar
Collaborator

xylar commented Jan 24, 2025

I think it should be min(xEdge) to min(xEdge)+xPeriod.
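
In code, that would be roughly the following sketch (illustrative names; it assumes the planar mesh stores the period as an x_period attribute):

x_min = float(ds_mesh.xEdge.min())
x_period = float(ds_mesh.attrs['x_period'])  # assumption: period stored as a mesh attribute
ax.set_xlim(x_min, x_min + x_period)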

@andrewdnolan
Collaborator Author

That makes sense to me. I'll open a PR in mosaic to account for this.

andrewdnolan and others added 11 commits February 5, 2025 07:51
- The interface to `plot_horiz_field` has been simplified slightly.

- The mask array now must be the same shape as the input field array.

- Added a framework function to convert a cell mask to an edge mask.
Also simplified the config options for the colorbar, which makes the spherical and lat/lon plots have the same options
Dependencies for `xarray` based visualization have been removed
Co-authored-by: Xylar Asay-Davis <[email protected]>
- Renamed all instances of cell_mask_to_edge_mask and added to API
- Fixed typo with adding cartopy features to spherical plots
- Added mosaic to intersphinx mapping
@andrewdnolan
Collaborator Author

@xylar I just rebased this branch onto main, which required bumping the alpha version from 2 to 3.

I've also constrained the mosaic version so that it uses the most recent release, which properly sets the axis limits for periodic axes. I reran the baroclinic_channel viz step and this is what is now produced:
[figure: final_normalVelocity]

@cbegeman
Collaborator

cbegeman commented Feb 5, 2025

@andrewdnolan I think something might be going wrong with kite selection at edges. The left and right sides don't fit together like puzzle pieces to my eye (if you try to wrap the domain hot dog-wise)

@xylar
Collaborator

xylar commented Feb 5, 2025

@cbegeman, I think this is working correctly in that the cropping bounds are exactly what we want them to be. There are some edge kites that are entirely outside of this range, but that is the fault of periodic_hex and not @andrewdnolan's code. He prefers to fix that when he makes periodic copies of the polygons (which he isn't yet ready to code up) rather than have a special fix now, when things are still ragged, and another one when he makes periodic copies of the polygons. In E3SM-Project/mosaic#27, I said that was okay with me. Do you think the missing edge polygons in the figure above are important enough to hold up the move to mosaic for? If so, we need to just wait for a proper fix to the periodic copies of polygons.

@andrewdnolan
Collaborator Author

andrewdnolan commented Feb 5, 2025

@xylar and @cbegeman Yes, that is correct. A limiting assumption that we make in mosaic is that the center of a patch is not periodic; only the nodes of that patch can cross the periodic boundary. The edges (i.e. the centers of the missing kites) lie outside of the current periodic extent, so we have no way of correcting them in mosaic currently.

I've done some initial prototyping of mirroring the patches across periodic boundaries, and these "incorrectly" placed edge patches would implicitly be corrected by that mirroring. I haven't had a chance to really finish that work, since it requires a decent amount of code additions (the array length changes with mirrored patches, which has to be handled under the hood). Because this (i.e. the "incorrectly" placed edge patches) will be fixed by the mirroring, Xylar and I opted for one all-encompassing solution rather than a special condition for these edge patches. But if it seems like a necessity for this PR, I can look into a short-term solution to tide us over.

@andrewdnolan
Collaborator Author

For what it's worth, this is the version of the figure that's produced using the current HEAD of main:

[figure: final_normalVelocity]

So the dangling kites on the right-hand side also appear to be a problem in the current version of the polaris viz utilities. With this PR, we are now at least setting the periodic axis limits correctly. There's still some work to do on the mosaic front to handle the patch mirroring, but for the purposes of this PR we aren't reverting any of the current plotting functionality.

@cbegeman
Collaborator

cbegeman commented Feb 5, 2025

@andrewdnolan and @xylar Thanks for catching me up on what you've been thinking. The main thing I'm concerned with is that all the edges get plotted, in time for us to catch any periodic boundary issues that might emerge with Omega.

Can you clarify what you mean by

for the purposes of this PR we aren't reverting any of the current plotting functionality

Do you mean that there is a way to plot the figure you showed above with the current main even after this PR goes in? If not, and we are losing the ability to plot some edge kites, what would the timeline be for patch mirroring?

@andrewdnolan
Collaborator Author

Oh, sorry, I guess I misinterpreted what was requested above (circa 2 weeks ago).

@cbegeman Until we can mirror patches across the periodic boundary, is it preferable to keep the whole kites visible (even if they extend beyond the periodic axis limit)? This contrasts with the "tight" axis limits implemented in 394fa42, which set the periodic limits based on the cell patch extent and cut off any patches that cross the periodic boundary. With the "tight" axis limits, these "incorrectly" placed edge patches appear to be missing, because they lie entirely beyond the periodic axis limit.

To clarify what I mean above: I was just trying to highlight that the current polaris viz functions also have this problem of "incorrectly" placed edge patches. With 394fa42, we were "tightly" setting the axis limits based on the periodic extent and, in turn, entirely cutting off the "incorrectly" placed patches. We can undo the "tight" axis setting, so that the axis limits match what's produced using the HEAD of main.

@cbegeman
Collaborator

cbegeman commented Feb 5, 2025

@andrewdnolan I'm sorry for the confusion and for not following this PR more closely.

My preference is for "incorrectly" placed edges with no information loss over the tight axis. So if you're ok with it, I'd prefer to have the tight axis off by default.

I do think your patch mirroring solution will be great. I didn't realize that main had incorrectly placed edges; not sure what changed from the first merge of this capability (e.g., #33 (comment)) but that's water under the bridge.

@andrewdnolan
Collaborator Author

@cbegeman No worries! That's an easy enough fix on my end.

I think in the figure linked, these "incorrectly" placed edges are masked out. In the current version, if you focus on the bottom-right corner, on the patch with its leftmost node at exactly 150 km, you'll notice two edge patches to the right of that edge (and one of those is an "incorrectly" placed edge). Both of those seem to be missing in the figure you linked above. Also, the left x-axis limit is greater in the linked version of the plot than in the version here. Looks like the current version of the edge plotting was done in #150.

Anyway, I think once I loosen the axis limits this PR should be just about complete. I'm planning to talk through some of the lingering details with regard to mirrored patches, and we should have that implemented soon (hopefully within the next two weeks?).

Collaborator

@cbegeman cbegeman left a comment


@andrewdnolan Great work! Approving after agreeing on a way forward with those edge kites. I'm excited about this switch!

@andrewdnolan
Collaborator Author

Tight axis limits have been undone as of 9684bea. I just commented out the code that handles the tight axis limits (instead of deleting it), since we'll hopefully have the mirroring in mosaic done soon.

Here's what the barotropic gyre figure now looks like, using this branch:
[figure: final_normalVelocity]

Collaborator

@xylar xylar left a comment


@andrewdnolan, thanks so much for working so hard on implementing and testing this. I'm happy to approve based on your latest testing.

@xylar
Collaborator

xylar commented Feb 6, 2025

Again, merging despite the expected failures in python 3.9.

@xylar xylar merged commit c189fd3 into E3SM-Project:main Feb 6, 2025
4 of 5 checks passed