Hi, I'm trying to use the MUMPS solver without mpiexec. From Python this works, but not from C++. I have compiled NGSolve myself with MUMPS, MKL, and MPI enabled.
From the Python tutorials I learned that the only MPI-related setup I need is:
import sys
import ngsolve
from ngsolve import Mesh
import netgen.meshing
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.rank
if comm.rank == 0:
    path = sys.argv[1]
    mesh = ngsolve.Mesh(path)
    ngmesh = mesh.ngmesh  # netgen.csg.unit_cube.GenerateMesh(maxh=0.05)
    ngmesh.Distribute(comm)
else:
    ngmesh = netgen.meshing.Mesh.Receive(comm)
    mesh = Mesh(ngmesh)
where the else branch is only relevant for mpiexec -n > 1, so I would expect the equivalent to also work from C++:
MPI_Init(&argc, &argv);
netgen::NgMPI_Comm comm(MPI_COMM_WORLD);
std::cout << argv[1] << std::endl;
netgen::Ngx_Mesh mesh(argv[1], comm);
std::cout << mesh.GetCommunicator().Size() << std::endl;
auto ma = std::make_shared<MeshAccess>(mesh.GetMesh());
std::shared_ptr<netgen::Mesh> ngmesh = ma->GetNetgenMesh();
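One thing I can check myself: whether the C++ executable and the NGSolve/MUMPS libraries all link the same MPI implementation. Mixing two implementations (e.g. Open MPI and MPICH) is a common cause of "invalid communicator" errors on MPI_COMM_WORLD, since their communicator handles are binary-incompatible. A sketch of the check (the binary name and library path are illustrative, not from my actual setup):

```shell
# List the MPI libraries each component is linked against; they should all
# come from the same MPI installation. Names/paths below are placeholders.
ldd ./my_solver | grep -i mpi
ldd /usr/local/lib/libngsolve.so | grep -i mpi
```

If the two commands show different MPI libraries (or different versions resolved from different prefixes), that alone would explain MPI_Comm_rank rejecting the communicator.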
but calling the Arnoldi solver with MUMPS as the inverse gives me:
[x201t-arch:00000] *** An error occurred in MPI_Comm_rank
[x201t-arch:00000] *** reported by process [3076849664,0]
[x201t-arch:00000] *** on communicator MPI_COMM_WORLD
[x201t-arch:00000] *** MPI_ERR_COMM: invalid communicator
[x201t-arch:00000] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[x201t-arch:00000] *** and MPI will try to terminate your MPI job as well)
The Python code runs without error and uses the same self-compiled NGSolve. What can go wrong here? Thanks for the help.