Viewing Solutions With ParaView in Serial and Parallel
After running a simulation in Proteus (see ref), the next step is to post-process the solution. An important part of this is visualizing the results. There are a number of different scientific visualization packages available in the open-source community, including VisIt and ParaView. The Proteus developers typically use ParaView, so you will probably find it the easiest to use.
The following steps will get you started viewing a solution in serial:

1. From ParaView, open the `*.xmf` file.
2. Select the option `Xdmf Reader`.
3. Press the green `Apply` button below the `Properties` tab.
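If you prefer the command line, you can also point ParaView at the file when launching it. A minimal sketch, assuming a hypothetical output file named `my_run.xmf` (ParaView's `--data` flag loads a file at startup; you may still be prompted to pick the `Xdmf Reader`):

```
# Launch ParaView and load a Proteus XDMF file at startup.
# "my_run.xmf" is a placeholder for your own output file.
paraview --data=my_run.xmf
```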
In order to view large simulations, we can make use of a ParaView server and perform the rendering on HPC machines. For ERDC HPC users specifically, there is a very useful configuration file (i.e. `default_servers.pvsc`) that sets the server up remotely from your local machine. You can obtain this configuration file from one of the core developers, or you can search for one on the HPC systems:
```
$ find /apps/DAAC/paraview/ -name "*.pvsc"
/apps/DAAC/paraview/5.2.0/local/default_servers.pvsc
/apps/DAAC/paraview/5.2.0/lib/paraview-5.2/default_servers.pvsc
/apps/DAAC/paraview/5.3.0/local/default_servers.pvsc
/apps/DAAC/paraview/5.3.0/lib/paraview-5.3/default_servers.pvsc
/apps/DAAC/paraview/5.4.1/local/default_servers.pvsc
/apps/DAAC/paraview/5.4.1/lib/paraview-5.4/default_servers.pvsc
/apps/DAAC/paraview/5.0.1/local/default_servers.pvsc
/apps/DAAC/paraview/5.0.1/lib/paraview-5.0/default_servers.pvsc
/apps/DAAC/paraview/5.4.1_gl2_cuda/local/default_servers.pvsc
/apps/DAAC/paraview/5.4.1_gl2_cuda/lib/paraview-5.4/default_servers.pvsc
/apps/DAAC/paraview/5.1.2/local/default_servers.pvsc
/apps/DAAC/paraview/5.1.2/lib/paraview-5.1/default_servers.pvsc
```
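Once you've located a copy, one way to fetch it is with `scp` from a kerberized shell. A sketch, assuming a placeholder HPC account name and picking one of the versions listed above:

```
# Copy the server configuration file to your home directory.
# "username" is a placeholder for your HPC account name.
scp username@topaz.erdc.hpc.mil:/apps/DAAC/paraview/5.4.1/local/default_servers.pvsc ~/
```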
- From your local machine and a kerberized shell, start an instance of `paraview`.
- Go to `File->Connect`. This prompts you to load the `.pvsc` file.
- A list of ERDC servers will be shown. Choose the machine that you were working on, such as `topaz`.
- You will now be given a window to specify a list of options (options whose defaults are already correct are omitted here):
  - `Local SSH Command`: specify the path to your kerberized `ssh`
  - `Username`: your HPC account username
  - `Project number`: your HPC account project number
  - `Queue name`: the desired queue, e.g. `debug`, `standard`, etc.
  - `Number of nodes`: the desired number of compute nodes for the job
  - `Number of Processors/Node`: the number of processors to use on each node
  - `Wall time (min)`: the desired amount of time for post-processing
- Once you've set these parameters, you can hit `OK`, at which point ParaView attempts to connect to the chosen machine, start a job, and load the same ParaView version as your local instance. Note that this means the HPC machine has to have the same ParaView version as your local machine. If you're using downloaded binaries, you might wind up with a ParaView version like `5.3.0-RC2` while the HPC machines have `5.3.0`. In this case, you need to modify the `.pvsc` file to ignore the additional `-RC2` string in your version. Within the `.pvsc` file, you will have references to `$PV_VERSION_FULL$`. You can, for example, substitute this variable with `$PV_VERSION$.$PV_VERSION_PATCH$` to eliminate the `-RC2` string (see the sketch after this list). For more info on the version variables, you can look at the ParaView Wiki. Note: the ParaView versions available on the HPC can be found in `/apps/DAAC/paraview`.
- A job will be queued on the HPC machine, and once it starts, you will be able to open files within the ParaView GUI.
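As a sketch of the version substitution described above, a single `sed` command over your local copy of the configuration file is enough. This assumes GNU `sed` and that the file sits in your home directory:

```
# Replace $PV_VERSION_FULL$ with $PV_VERSION$.$PV_VERSION_PATCH$ so that
# a local -RC2 suffix is ignored when matching the server-side version.
sed -i 's/\$PV_VERSION_FULL\$/$PV_VERSION$.$PV_VERSION_PATCH$/g' ~/default_servers.pvsc
```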
You only need to log in once to an HPC machine via HPCMP in a single shell (for YubiKey users) if you modify your `.ssh/config` file with the following lines:
```
Host topaz.erdc.hpc.mil
    ControlPath ~/.ssh/%r@%h:%p
    ControlMaster auto
    ControlPersist 10m
```
You can add additional hosts like `excalibur` or `onyx` to `ssh` directly into those machines in a second shell after connecting once, as in the sketch below.
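A sketch of such an addition; the hostnames are assumptions (the DSRC domain varies by machine), so confirm them before use:

```
# Hostnames below are assumptions; check the DSRC domain for each machine.
Host excalibur.arl.hpc.mil onyx.erdc.hpc.mil
    ControlPath ~/.ssh/%r@%h:%p
    ControlMaster auto
    ControlPersist 10m
```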