Pfem compatibility update
Created by: bjbobach
Two things have changed in PFEM that affect CUPyDO:
1. Solution vectors
This concerns changes in the interface/Pfem.py file. The solution vectors `u`, `v`, `p`, and `MatID` have become members of PFEM's `TimeIntegration` class and therefore no longer need to be passed as arguments to many of that class's functions, including:

- `init()`
- `setNextTimeStep()`
- `runOneTimeStep()`
- `vizu()`
- several `update_xxx()` functions
Whenever CUPyDO needs to access these vectors, it can read and write them directly as `self.scheme.u` etc., where `scheme` is the local name given to the instance of the `TimeIntegration` class. This way, no duplication of these vectors is necessary, and the local vectors `self.u` etc. become obsolete.
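For illustration, here is a minimal sketch of this access pattern. Only `self.scheme.u`/`v`/`p`/`MatID`, the `TimeIntegration` constructor, and the methods listed above reflect the actual change; the surrounding class and method names are hypothetical:

```python
# Hypothetical sketch: only self.scheme.u / v / p / MatID, the
# TimeIntegration constructor, and the methods listed above come from
# the actual change; the surrounding class and method names are made up.
class Pfem:
    def __init__(self, w, msh, pbl, solScheme):
        # single handle on PFEM's TimeIntegration instance
        self.scheme = w.TimeIntegration(msh, pbl, solScheme)

    def stepForward(self):
        # the solution vectors are no longer passed as arguments
        self.scheme.setNextTimeStep()
        self.scheme.runOneTimeStep()

    def solutionVectors(self):
        # read (or write) directly through the scheme; no duplicated
        # self.u, self.v, self.p, self.MatID members are kept
        return self.scheme.u, self.scheme.v, self.scheme.p, self.scheme.MatID
```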
2. `SolutionScheme` vs. `NonLinearAlgorithm`
This concerns all PFEM input files (xxx_fluid.py). Due to a restructuring of how the PFEM classes `SolutionScheme` and `NonLinearAlgorithm` relate to each other, `TimeIntegration` is now initialized with a `SolutionScheme` object, where a `NonLinearAlgorithm` object was used before.
Before:

```python
solScheme = w.SchemeMonolithicPSPG(msh, pbl)
convCriterion = w.ForcesBalanceNormedBodyForceCriterion(msh, pbl, toll)
nonLinAlgo = w.PicardAlgorithm(solScheme, convCriterion, nItMax)
scheme = w.TimeIntegration(msh, pbl, nonLinAlgo)
```
After:

```python
solScheme = w.SchemeMonolithicPSPG(msh, pbl)
convCriterion = w.ForcesBalanceNormedBodyForceCriterion(msh, pbl, toll)
nonLinAlgo = w.PicardAlgorithm(solScheme, convCriterion, nItMax)
scheme = w.TimeIntegration(msh, pbl, solScheme)
```
3. Fix MPI issues
When running in parallel, CUPyDO tended to hang at the very beginning due to a missing MPI barrier in utilities.py. That is now solved.
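A minimal sketch of the kind of barrier that was missing, assuming mpi4py (the helper name and its placement are illustrative, not the actual utilities.py code):

```python
from mpi4py import MPI

def wait_for_all(comm=MPI.COMM_WORLD):
    # Block until every process reaches this point, so that no rank
    # starts the coupling before the others have finished initializing.
    if comm.Get_size() > 1:
        comm.Barrier()
```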
Furthermore, there were some issues when sending data to and receiving it from the master node in interpolator.py. The master's data is now received first, before the data is sent to the other processes. There is probably a better way to handle this, but for the moment it works.
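The new ordering can be sketched roughly as follows with mpi4py; the function and variable names are illustrative, not the actual interpolator.py code:

```python
from mpi4py import MPI

def scatter_from_master(chunks, comm=MPI.COMM_WORLD):
    """Distribute per-rank chunks of data from the master (rank 0).

    The master keeps ("receives") its own chunk first and only then
    sends the remaining chunks, instead of interleaving sends and
    receives, which avoids the hang seen before.
    """
    if comm.Get_rank() == 0:
        local = chunks[0]                      # handle the master's part first
        for dest in range(1, comm.Get_size()):
            comm.send(chunks[dest], dest=dest, tag=0)
    else:
        local = comm.recv(source=0, tag=0)
    return local
```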