I’ve been experimenting with the mpi_cmagnet.py tutorial and have run into a problem. When I change the finite element space to complex, I get the following error:
collect data
[node6:16841] *** An error occurred in MPI_Send
[node6:16841] *** reported by process [3593732097,7]
[node6:16841] *** on communicator MPI_COMM_WORLD
[node6:16841] *** MPI_ERR_TYPE: invalid datatype
[node6:16841] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[node6:16841] *** and potentially your MPI job)
[node7:12248] 5 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
[node7:12248] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
This leads me to believe that NGSolve is having difficulty handling complex finite element spaces in parallel. Is this a bug or a known issue? I have attached my copy of mpi_cmagnet.py.
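For reference, the only change I made is to the space definition, roughly as sketched below (a sketch, not the exact line from the attached file; the order and boundary flags are placeholders, only the added complex=True flag is the point):

```python
# Before (as in the tutorial, flags abbreviated):
# fes = HCurl(mesh, order=..., dirichlet=...)
#
# After (complex-valued space -- this is what triggers the MPI_ERR_TYPE abort):
# fes = HCurl(mesh, order=..., dirichlet=..., complex=True)
```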
Thank you for your time
Attachment: mpi_cmagnet.py (https://ngsolve.org/media/kunena/attachments/1252/mpi_cmagnet.py)