Dear NGSolve Users,
The PETSc toolkit provides many powerful parallel solvers, and much more than that.
We now have two possibilities to interface PETSc from NGSolve:
- a pure pythonic interface
- a C++ interface
The first one uses the petsc4py Python bindings. The important small step was to make NGSolve compatible with mpi4py communicators. You can find a few elementary tutorials here; more will come soon:
https://ngsolve.org/docu/nightly/i-tutorials/unit-5a.3-petsc/petsc.html
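To give a first impression of this route, here is a minimal sequential sketch (no MPI), loosely following the linked tutorial: it assembles a small problem, hands the CSR data of the NGSolve matrix to a petsc4py Mat, and solves with a PETSc Krylov solver. The mesh size, pc_type, and tolerance are arbitrary illustration choices; the parallel case needs the local/global matrix handling shown in the tutorial.

```python
import numpy as np
from netgen.geom2d import unit_square
from ngsolve import Mesh, H1, BilinearForm, LinearForm, GridFunction, grad, dx
import petsc4py.PETSc as psc

mesh = Mesh(unit_square.GenerateMesh(maxh=0.1))
fes = H1(mesh, order=1)              # no Dirichlet conditions, all dofs free
u, v = fes.TnT()
a = BilinearForm(fes)
a += (grad(u)*grad(v) + u*v)*dx      # mass term keeps the matrix SPD
a.Assemble()
f = LinearForm(fes)
f += v*dx
f.Assemble()

# hand the CSR data of the NGSolve matrix to PETSc
val, col, ind = a.mat.CSR()
ind = np.array(ind, dtype='int32')   # PETSc expects 32-bit indices by default
col = np.array(col, dtype='int32')
Apsc = psc.Mat().createAIJ(size=(a.mat.height, a.mat.width),
                           csr=(ind, col, val))

# conjugate gradients with PETSc's algebraic multigrid as preconditioner
ksp = psc.KSP().create()
ksp.setOperators(Apsc)
ksp.setType('cg')
ksp.getPC().setType('gamg')
ksp.setTolerances(rtol=1e-8)

x, b = Apsc.createVecs()             # x: solution, b: right-hand side
b.array[:] = f.vec.FV().NumPy()
ksp.solve(b, x)

gfu = GridFunction(fes)
gfu.vec.FV().NumPy()[:] = x.array
```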
The second one uses the ngs-petsc C++ library, available from here:
It contains many examples of using it from C++, from its own Python bindings, and via petsc4py.
We are happy to hear your feedback and to see examples of how you use PETSc with NGSolve.
Both approaches need a very recent NGSolve nightly, or the upcoming NGSolve 20-08 release.
best, Joachim
Hello,
I have a question about using PETSc for preconditioning. To my knowledge, there are two ways to use PETSc solvers:
(1) ship the NGSolve matrix to PETSc, and use PETSc's internal preconditioners and solvers;
(2) use NGSolve's built-in solver (e.g. CGSolver), build (part of) the preconditioner using PETSc, and ship the PETSc preconditioner back to NGSolve (a sketch of this follows below).
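For route (2), here is a hedged, sequential Python sketch of what a PETSc2NGsPrecond-style wrapper can look like: a petsc4py PC wrapped as an NGSolve BaseMatrix, so that NGSolve's CGSolver can apply it. The class name PETScPC is made up for this post, and a, f, gfu, Apsc are assumed from the sketch above; the real ngs_petsc operator additionally handles the parallel dof mapping.

```python
from ngsolve import BaseMatrix
from ngsolve.krylovspace import CGSolver
import petsc4py.PETSc as psc

class PETScPC(BaseMatrix):
    """Wrap a PETSc preconditioner so NGSolve sees it as a BaseMatrix."""
    def __init__(self, ngmat, pscmat, pctype='gamg'):
        super().__init__()
        self.ngmat = ngmat
        self.pc = psc.PC().create()
        self.pc.setOperators(pscmat)
        self.pc.setType(pctype)
        self.pc.setUp()
        self.px, self.py = pscmat.createVecs()
    def Height(self):
        return self.ngmat.height
    def Width(self):
        return self.ngmat.width
    def Mult(self, x, y):
        # copy NGSolve vector -> PETSc vec, apply the PC, copy back
        self.px.array[:] = x.FV().NumPy()
        self.pc.apply(self.px, self.py)
        y.FV().NumPy()[:] = self.py.array

pre = PETScPC(a.mat, Apsc)
inv = CGSolver(a.mat, pre, printrates=True)
gfu.vec.data = inv * f.vec
```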
In my experience from a few months ago, I found that PETSc's built-in solver with its built-in preconditioner was slightly faster than NGSolve's solver. Is this to be expected?
My first question is: can we pass NGSolve (operator) preconditioners (acting on an NGSolve mat) directly to PETSc (acting on a PETSc mat)? Say, an auxiliary low-order-space preconditioner for a high-order operator, where we want to apply a PETSc preconditioner for the coarse-space solve. This seems not to be possible.
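That direction is at least expressible sequentially via petsc4py's 'python' PC type, where a user-supplied context object provides the apply callback. A hedged sketch, assuming ngpre is any NGSolve BaseMatrix preconditioner (e.g. the auxiliary low-order one) and Apsc, b, x come from the first sketch; NGSolveSidePC is a made-up name, and in parallel one additionally needs the dof mapping between the two sides (which is what ngs-petsc provides).

```python
import petsc4py.PETSc as psc

class NGSolveSidePC:
    """petsc4py PC context: apply() evaluates y = ngpre * x."""
    def __init__(self, ngpre):
        self.ngpre = ngpre
        self.tmpin = ngpre.CreateRowVector()
        self.tmpout = ngpre.CreateColVector()
    def apply(self, pc, x, y):
        self.tmpin.FV().NumPy()[:] = x.array    # PETSc -> NGSolve
        self.tmpout.data = self.ngpre * self.tmpin
        y.array[:] = self.tmpout.FV().NumPy()   # NGSolve -> PETSc

ksp = psc.KSP().create()
ksp.setOperators(Apsc)
ksp.setType('cg')
pc = ksp.getPC()
pc.setType(psc.PC.Type.PYTHON)
pc.setPythonContext(NGSolveSidePC(ngpre))
ksp.solve(b, x)
```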
I don't know how to do this, so my second question is: how can one implement the PETSc2NGsPrecond operator that is available in the ngs_petsc add-on, but not yet in the new ngspetsc interface?
The reason for asking the second question is that I noticed, after updating to the newest version of NGSolve, that the ngs_petsc add-on seems to be broken.
When I import ngs_petsc, it complains: generic_type: type “VecMap” is already registered!
Best,
Guosheng
Never mind, the ngs-petsc library is working now…
I mainly use PETSc without MPI. I guess ngs-petsc is the way to go for now. :>