mfem: PETSc tests fail on Ubuntu 18.04 LTS with default OpenMPI 2.1.1-8

PETSc examples 3, 4, and 5 fail with my current installation. The full output of make test is below.

I am using the latest MFEM master branch with PETSc 3.12.5 (I also tried 3.13.5, with similar results) and the default Ubuntu 18.04 OpenMPI (2.1.1-8).

The three failing runs all report peak memory around 7.5 GB and exit with Error 137 (SIGKILL), with stack traces pointing at PCBDDCSetUpCoarseSolver, so it looks like the processes may be getting killed for running out of memory during the BDDC coarse-solver setup.

Is there something wrong with my installation, or could this be a problem with MFEM or the examples? What should I try in order to debug these errors?
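For reference, here is how I reproduce the first failure by hand, outside of make test, using exactly the options reported in the log (run from the examples/petsc directory of the MFEM build; rc_ex3p_bddc is the PETSc options file shipped with the example):

```shell
# Reproduce the failing ex3p case directly, with the same options
# that the test harness reports. --nonoverlapping selects the PETSc
# BDDC path that appears in the stack traces above.
cd examples/petsc
mpirun -np 4 ./ex3p \
    --mesh ../../data/klein-bottle.mesh \
    --order 2 \
    --frequency 0.1 \
    --no-static-condensation \
    --no-visualization \
    --usepetsc \
    --petscopts rc_ex3p_bddc \
    --nonoverlapping
```

Running this by hand shows the same MPI_ABORT with errorcode 59 as in the log below.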

$ make test
Testing the MFEM library. This may take a while...
Building all examples, miniapps, and tests...
make[1]: Entering directory '/home/ben/projects/mfem/mfem'
make -C examples
make[2]: Entering directory '/home/ben/projects/mfem/mfem/examples'
make -C petsc all
make[3]: Entering directory '/home/ben/projects/mfem/mfem/examples/petsc'
make[3]: Nothing to be done for 'all'.
make[3]: Leaving directory '/home/ben/projects/mfem/mfem/examples/petsc'
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/examples'
make -C miniapps/common
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[2]: Nothing to be done for 'all'.
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/common'
make -C miniapps/electromagnetics
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/electromagnetics'
make -C ../../miniapps/common
make[3]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[3]: Nothing to be done for 'all'.
make[3]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/electromagnetics'
make -C miniapps/meshing
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/meshing'
make -C ../../miniapps/common
make[3]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[3]: Nothing to be done for 'all'.
make[3]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/meshing'
make -C miniapps/navier
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/navier'
make[2]: Nothing to be done for 'all'.
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/navier'
make -C miniapps/performance
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/performance'
make[2]: Nothing to be done for 'all'.
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/performance'
make -C miniapps/tools
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/tools'
make -C ../../miniapps/common
make[3]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[3]: Nothing to be done for 'all'.
make[3]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/tools'
make -C miniapps/toys
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/toys'
make -C ../../miniapps/common
make[3]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[3]: Nothing to be done for 'all'.
make[3]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/toys'
make -C miniapps/nurbs
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/nurbs'
make[2]: Nothing to be done for 'all'.
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/nurbs'
make -C miniapps/gslib
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/gslib'
make[2]: Nothing to be done for 'all'.
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/gslib'
make -C miniapps/adjoint
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/adjoint'
make[2]: Nothing to be done for 'all'.
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/adjoint'
make -C tests/unit
make[2]: Entering directory '/home/ben/projects/mfem/mfem/tests/unit'
make[2]: Nothing to be done for 'all'.
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/tests/unit'
make[1]: Leaving directory '/home/ben/projects/mfem/mfem'
Running tests in: [ tests/unit examples miniapps/electromagnetics miniapps/meshing miniapps/navier miniapps/performance miniapps/tools miniapps/toys miniapps/nurbs miniapps/gslib miniapps/adjoint ] ...
Running tests in tests/unit ...
make[1]: Entering directory '/home/ben/projects/mfem/mfem/tests/unit'
    Parallel unit tests [ mpirun -np 1 punit_tests ... ]: OK  (1.70s 41980kB)
    Parallel unit tests [ mpirun -np 4 punit_tests ... ]: OK  (9.96s 30632kB)
    Parallel unit tests [ mpirun -np 1 psedov_tests_cpu ... ]: OK  (1.00s 25092kB)
    Parallel unit tests [ mpirun -np 4 psedov_tests_cpu ... ]: OK  (0.57s 24636kB)
    Parallel unit tests [ mpirun -np 1 psedov_tests_debug ... ]: OK  (1.52s 25576kB)
    Parallel unit tests [ mpirun -np 4 psedov_tests_debug ... ]: OK  (1.81s 25244kB)
    Unit tests [ unit_tests ... ]: OK  (30.20s 85124kB)
    Unit tests [ sedov_tests_cpu ... ]: OK  (0.85s 16088kB)
    Unit tests [ sedov_tests_debug ... ]: OK  (1.45s 17204kB)
make[1]: Leaving directory '/home/ben/projects/mfem/mfem/tests/unit'
Running tests in examples ...
make[1]: Entering directory '/home/ben/projects/mfem/mfem/examples'
make -C petsc all
make[2]: Entering directory '/home/ben/projects/mfem/mfem/examples/petsc'
make[2]: Nothing to be done for 'all'.
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/examples/petsc'
    Parallel example [ mpirun -np 4 ex1p ... ]: OK  (0.49s 33972kB)
    Parallel example [ mpirun -np 4 ex2p ... ]: OK  (0.27s 24756kB)
    Parallel example [ mpirun -np 4 ex3p ... ]: OK  (1.48s 29820kB)
    Parallel example [ mpirun -np 4 ex4p ... ]: OK  (0.31s 24688kB)
    Parallel example [ mpirun -np 4 ex5p ... ]: OK  (3.20s 176032kB)
    Parallel example [ mpirun -np 4 ex6p ... ]: OK  (5.26s 75004kB)
    Parallel example [ mpirun -np 4 ex7p ... ]: OK  (0.26s 24956kB)
    Parallel example [ mpirun -np 4 ex8p ... ]: OK  (0.48s 37604kB)
    Parallel example [ mpirun -np 4 ex9p ... ]: OK  (0.80s 24756kB)
    Parallel example [ mpirun -np 4 ex10p ... ]: OK  (0.53s 25064kB)
    Parallel example [ mpirun -np 4 ex11p ... ]: OK  (0.25s 24680kB)
    Parallel example [ mpirun -np 4 ex12p ... ]: OK  (0.28s 25124kB)
    Parallel example [ mpirun -np 4 ex13p ... ]: OK  (1.86s 32624kB)
    Parallel example [ mpirun -np 4 ex14p ... ]: OK  (1.47s 94100kB)
    Parallel example [ mpirun -np 4 ex15p ... ]: OK  (0.56s 24920kB)
    Parallel example [ mpirun -np 4 ex16p ... ]: OK  (0.79s 25008kB)
    Parallel example [ mpirun -np 4 ex17p ... ]: OK  (0.88s 39820kB)
    Parallel example [ mpirun -np 4 ex18p ... ]: OK  (0.51s 25076kB)
    Parallel example [ mpirun -np 4 ex19p ... ]: OK  (0.81s 25120kB)
    Parallel example [ mpirun -np 4 ex20p ... ]: OK  (0.23s 25176kB)
    Parallel example [ mpirun -np 4 ex21p ... ]: OK  (1.39s 25452kB)
    Parallel example [ mpirun -np 4 ex22p ... ]: OK  (0.24s 25076kB)
    Parallel example [ mpirun -np 4 ex24p ... ]: OK  (0.44s 26384kB)
    Parallel example [ mpirun -np 4 ex25p ... ]: OK  (0.32s 25276kB)
    Parallel example [ mpirun -np 4 ex26p ... ]: OK  (0.40s 34196kB)
    Parallel example [ mpirun -np 4 ex27p ... ]: OK  (0.25s 24612kB)
    Serial example [ ex1 ... ]: OK  (0.29s 23404kB)
    Serial example [ ex2 ... ]: OK  (0.14s 15104kB)
    Serial example [ ex2-bz1 ... ]: OK  (0.02s 13100kB)
    Serial example [ ex2-stress ... ]: OK  (0.06s 13072kB)
    Serial example [ ex3 ... ]: OK  (2.14s 39684kB)
    Serial example [ ex4 ... ]: OK  (1.20s 28184kB)
    Serial example [ ex5 ... ]: OK  (1.38s 38752kB)
    Serial example [ ex6 ... ]: OK  (2.32s 72076kB)
    Serial example [ ex7 ... ]: OK  (0.01s 12412kB)
    Serial example [ ex8 ... ]: OK  (1.00s 19524kB)
    Serial example [ ex9 ... ]: OK  (2.12s 20788kB)
    Serial example [ ex10 ... ]: OK  (0.82s 14920kB)
    Serial example [ ex14 ... ]: OK  (2.13s 76008kB)
    Serial example [ ex15 ... ]: OK  (0.71s 14428kB)
    Serial example [ ex16 ... ]: OK  (0.31s 13264kB)
    Serial example [ ex17 ... ]: OK  (1.29s 28272kB)
    Serial example [ ex18 ... ]: OK  (0.68s 12580kB)
    Serial example [ ex19 ... ]: OK  (1.06s 13180kB)
    Serial example [ ex20 ... ]: OK  (0.00s 10760kB)
    Serial example [ ex21 ... ]: OK  (3.13s 23276kB)
    Serial example [ ex22 ... ]: OK  (0.00s 12476kB)
    Serial example [ ex23 ... ]: OK  (0.14s 13508kB)
    Serial example [ ex24 ... ]: OK  (5.36s 146800kB)
    Serial example [ ex25 ... ]: OK  (0.20s 18980kB)
    Serial example [ ex26 ... ]: OK  (0.17s 20004kB)
    Serial example [ ex27 ... ]: OK  (0.03s 13176kB)
make -C petsc test
make[2]: Entering directory '/home/ben/projects/mfem/mfem/examples/petsc'
    Parallel PETSc example [ mpirun -np 4 ex1p ... ]: OK  (4.06s 55668kB)
    Parallel PETSc example [ mpirun -np 4 ex1p ... ]: OK  (6.15s 100312kB)
    Parallel PETSc example [ mpirun -np 4 ex2p ... ]: OK  (0.50s 40684kB)
    Parallel PETSc example [ mpirun -np 4 ex3p ... ]: FAILED  (6.45s 7519996kB)
Options used:
   --mesh ../../data/klein-bottle.mesh
   --order 2
   --frequency 0.1
   --no-static-condensation
   --no-visualization
   --usepetsc
   --petscopts rc_ex3p_bddc
   --nonoverlapping
Number of finite element unknowns: 65536
Size of linear system: 65536
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] PetscTrMallocDefault line 157 /home/ben/projects/mfem/petsc/src/sys/memory/mtr.c
[0]PETSC ERROR: [0] PetscMallocA line 401 /home/ben/projects/mfem/petsc/src/sys/memory/mal.c
[0]PETSC ERROR: [0] PCBDDCMatISSubassemble line 7596 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[0]PETSC ERROR: [0] PCBDDCSetUpCoarseSolver line 8133 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[0]PETSC ERROR: [0] PCBDDCSetUpSolvers line 3719 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[0]PETSC ERROR: [0] PCSetUp_BDDC line 1600 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddc.c
[0]PETSC ERROR: [0] PCSetUp line 856 /home/ben/projects/mfem/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSPSetUp line 289 /home/ben/projects/mfem/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] KSPSolve line 656 /home/ben/projects/mfem/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.12.5, unknown 
[0]PETSC ERROR: Unknown Name on a arch-linux2-c-debug named p1 by ben Mon Sep 21 17:20:53 2020
[0]PETSC ERROR: Configure options --download-fblaslapack=yes --download-scalapack=yes --download-mumps=yes --download-suitesparse=yes --with-hypre-dir=../hypre/src/hypre --with-shared-libraries=0 --with-debugging=1
[0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
[3]PETSC ERROR: ------------------------------------------------------------------------
[3]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[3]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[3]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[3]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[3]PETSC ERROR: likely location of problem given in stack below
[3]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[3]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[3]PETSC ERROR:       INSTEAD the line number of the start of the function
[3]PETSC ERROR:       is given.
[3]PETSC ERROR: [3] PetscTrMallocDefault line 157 /home/ben/projects/mfem/petsc/src/sys/memory/mtr.c
[3]PETSC ERROR: [3] PetscMallocA line 401 /home/ben/projects/mfem/petsc/src/sys/memory/mal.c
[3]PETSC ERROR: [3] PCBDDCMatISSubassemble line 7596 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[3]PETSC ERROR: [3] PCBDDCSetUpCoarseSolver line 8133 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[3]PETSC ERROR: [3] PCBDDCSetUpSolvers line 3719 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[3]PETSC ERROR: [3] PCSetUp_BDDC line 1600 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddc.c
[3]PETSC ERROR: [3] PCSetUp line 856 /home/ben/projects/mfem/petsc/src/ksp/pc/interface/precon.c
[3]PETSC ERROR: [3] KSPSetUp line 289 /home/ben/projects/mfem/petsc/src/ksp/ksp/interface/itfunc.c
[3]PETSC ERROR: [3] KSPSolve line 656 /home/ben/projects/mfem/petsc/src/ksp/ksp/interface/itfunc.c
[3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[3]PETSC ERROR: Signal received
[3]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[3]PETSC ERROR: Petsc Release Version 3.12.5, unknown 
[3]PETSC ERROR: Unknown Name on a arch-linux2-c-debug named p1 by ben Mon Sep 21 17:20:53 2020
[3]PETSC ERROR: Configure options --download-fblaslapack=yes --download-scalapack=yes --download-mumps=yes --download-suitesparse=yes --with-hypre-dir=../hypre/src/hypre --with-shared-libraries=0 --with-debugging=1
[3]PETSC ERROR: #1 User provided function() line 0 in  unknown file
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 0 on node p1 exited on signal 9 (Killed).
--------------------------------------------------------------------------
[p1:11006] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[p1:11006] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
makefile:103: recipe for target 'ex3p-test-par' failed
make[2]: *** [ex3p-test-par] Error 137
    Parallel PETSc example [ mpirun -np 4 ex4p ... ]: FAILED  (5.95s 7737064kB)
Options used:
   --mesh ../../data/klein-bottle.mesh
   --order 2
   --impose-bc
   --frequency 1
   --no-static-condensation
   --no-hybridization
   --no-visualization
   --usepetsc
   --petscopts rc_ex4p_bddc
   --nonoverlapping
Number of finite element unknowns: 65536
Size of linear system: 65536
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[1]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[1]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[1]PETSC ERROR: likely location of problem given in stack below
[1]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[1]PETSC ERROR:       INSTEAD the line number of the start of the function
[1]PETSC ERROR:       is given.
[1]PETSC ERROR: [1] PetscTrMallocDefault line 157 /home/ben/projects/mfem/petsc/src/sys/memory/mtr.c
[1]PETSC ERROR: [1] PetscMallocA line 401 /home/ben/projects/mfem/petsc/src/sys/memory/mal.c
[1]PETSC ERROR: [1] PCBDDCMatISSubassemble line 7596 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[1]PETSC ERROR: [1] PCBDDCSetUpCoarseSolver line 8133 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[1]PETSC ERROR: [1] PCBDDCSetUpSolvers line 3719 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[1]PETSC ERROR: [1] PCSetUp_BDDC line 1600 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddc.c
[1]PETSC ERROR: [1] PCSetUp line 856 /home/ben/projects/mfem/petsc/src/ksp/pc/interface/precon.c
[1]PETSC ERROR: [1] PCApply line 426 /home/ben/projects/mfem/petsc/src/ksp/pc/interface/precon.c
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Signal received
[1]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.12.5, unknown 
[1]PETSC ERROR: Unknown Name on a arch-linux2-c-debug named p1 by ben Mon Sep 21 17:20:59 2020
[1]PETSC ERROR: Configure options --download-fblaslapack=yes --download-scalapack=yes --download-mumps=yes --download-suitesparse=yes --with-hypre-dir=../hypre/src/hypre --with-shared-libraries=0 --with-debugging=1
[1]PETSC ERROR: #1 User provided function() line 0 in  unknown file
[2]PETSC ERROR: ------------------------------------------------------------------------
[2]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[2]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[2]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[2]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[2]PETSC ERROR: likely location of problem given in stack below
[2]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[2]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[2]PETSC ERROR:       INSTEAD the line number of the start of the function
[2]PETSC ERROR:       is given.
[2]PETSC ERROR: [2] PetscTrMallocDefault line 157 /home/ben/projects/mfem/petsc/src/sys/memory/mtr.c
[2]PETSC ERROR: [2] PetscMallocA line 401 /home/ben/projects/mfem/petsc/src/sys/memory/mal.c
[2]PETSC ERROR: [2] PCBDDCMatISSubassemble line 7596 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[2]PETSC ERROR: [2] PCBDDCSetUpCoarseSolver line 8133 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[2]PETSC ERROR: [2] PCBDDCSetUpSolvers line 3719 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[2]PETSC ERROR: [2] PCSetUp_BDDC line 1600 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddc.c
[2]PETSC ERROR: [2] PCSetUp line 856 /home/ben/projects/mfem/petsc/src/ksp/pc/interface/precon.c
[2]PETSC ERROR: [2] PCApply line 426 /home/ben/projects/mfem/petsc/src/ksp/pc/interface/precon.c
[2]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[2]PETSC ERROR: Signal received
[2]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[2]PETSC ERROR: Petsc Release Version 3.12.5, unknown 
[2]PETSC ERROR: Unknown Name on a arch-linux2-c-debug named p1 by ben Mon Sep 21 17:20:59 2020
[2]PETSC ERROR: Configure options --download-fblaslapack=yes --download-scalapack=yes --download-mumps=yes --download-suitesparse=yes --with-hypre-dir=../hypre/src/hypre --with-shared-libraries=0 --with-debugging=1
[2]PETSC ERROR: #1 User provided function() line 0 in  unknown file
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 0 on node p1 exited on signal 9 (Killed).
--------------------------------------------------------------------------
[p1:11072] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[p1:11072] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
makefile:105: recipe for target 'ex4p-test-par' failed
make[2]: *** [ex4p-test-par] Error 137
    Parallel PETSc example [ mpirun -np 4 ex5p ... ]: FAILED  (12.04s 7513312kB)
Options used:
   --mesh ../../data/star.mesh
   --order 0
   --serial-format
   --no-visualization
   --usepetsc
   --petscopts rc_ex5p_bddc
   --nonoverlapping
   --local-bdr
***********************************************************
dim(R) = 164480
dim(W) = 81920
dim(R+W) = 246400
***********************************************************
** PETSc DEPRECATION WARNING ** : the option -pc_factor_mat_solver_package is deprecated as of version 3.9 and will be removed in a future release. Please use the option -pc_factor_mat_solver_type instead. (Silence this warning with -options_suppress_deprecated_warnings)
** PETSc DEPRECATION WARNING ** : the option -pc_factor_mat_solver_package is deprecated as of version 3.9 and will be removed in a future release. Please use the option -pc_factor_mat_solver_type instead. (Silence this warning with -options_suppress_deprecated_warnings)
** PETSc DEPRECATION WARNING ** : the option -pc_factor_mat_solver_package is deprecated as of version 3.9 and will be removed in a future release. Please use the option -pc_factor_mat_solver_type instead. (Silence this warning with -options_suppress_deprecated_warnings)
** PETSc DEPRECATION WARNING ** : the option -pc_factor_mat_solver_package is deprecated as of version 3.9 and will be removed in a future release. Please use the option -pc_factor_mat_solver_type instead. (Silence this warning with -options_suppress_deprecated_warnings)
** PETSc DEPRECATION WARNING ** : the option -pc_factor_mat_solver_package is deprecated as of version 3.9 and will be removed in a future release. Please use the option -pc_factor_mat_solver_type instead. (Silence this warning with -options_suppress_deprecated_warnings)
** PETSc DEPRECATION WARNING ** : the option -pc_factor_mat_solver_package is deprecated as of version 3.9 and will be removed in a future release. Please use the option -pc_factor_mat_solver_type instead. (Silence this warning with -options_suppress_deprecated_warnings)
** PETSc DEPRECATION WARNING ** : the option -pc_factor_mat_solver_package is deprecated as of version 3.9 and will be removed in a future release. Please use the option -pc_factor_mat_solver_type instead. (Silence this warning with -options_suppress_deprecated_warnings)
** PETSc DEPRECATION WARNING ** : the option -pc_factor_mat_solver_package is deprecated as of version 3.9 and will be removed in a future release. Please use the option -pc_factor_mat_solver_type instead. (Silence this warning with -options_suppress_deprecated_warnings)
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] PetscTrMallocDefault line 157 /home/ben/projects/mfem/petsc/src/sys/memory/mtr.c
[0]PETSC ERROR: [0] PetscMallocA line 401 /home/ben/projects/mfem/petsc/src/sys/memory/mal.c
[0]PETSC ERROR: [0] PCBDDCMatISSubassemble line 7596 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[0]PETSC ERROR: [0] PCBDDCSetUpCoarseSolver line 8133 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[0]PETSC ERROR: [0] PCBDDCSetUpSolvers line 3719 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[0]PETSC ERROR: [0] PCSetUp_BDDC line 1600 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddc.c
[0]PETSC ERROR: [0] PCSetUp line 856 /home/ben/projects/mfem/petsc/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSPSetUp line 289 /home/ben/projects/mfem/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: [0] KSPSolve line 656 /home/ben/projects/mfem/petsc/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.12.5, unknown 
[0]PETSC ERROR: Unknown Name on a arch-linux2-c-debug named p1 by ben Mon Sep 21 17:21:05 2020
[0]PETSC ERROR: Configure options --download-fblaslapack=yes --download-scalapack=yes --download-mumps=yes --download-suitesparse=yes --with-hypre-dir=../hypre/src/hypre --with-shared-libraries=0 --with-debugging=1
[0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
[3]PETSC ERROR: ------------------------------------------------------------------------
[3]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[3]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[3]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[3]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[3]PETSC ERROR: likely location of problem given in stack below
[3]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[3]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[3]PETSC ERROR:       INSTEAD the line number of the start of the function
[3]PETSC ERROR:       is given.
[3]PETSC ERROR: [3] PetscTrMallocDefault line 157 /home/ben/projects/mfem/petsc/src/sys/memory/mtr.c
[3]PETSC ERROR: [3] PetscMallocA line 401 /home/ben/projects/mfem/petsc/src/sys/memory/mal.c
[3]PETSC ERROR: [3] PCBDDCMatISSubassemble line 7596 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[3]PETSC ERROR: [3] PCBDDCSetUpCoarseSolver line 8133 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[3]PETSC ERROR: [3] PCBDDCSetUpSolvers line 3719 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddcprivate.c
[3]PETSC ERROR: [3] PCSetUp_BDDC line 1600 /home/ben/projects/mfem/petsc/src/ksp/pc/impls/bddc/bddc.c
[3]PETSC ERROR: [3] PCSetUp line 856 /home/ben/projects/mfem/petsc/src/ksp/pc/interface/precon.c
[3]PETSC ERROR: [3] KSPSetUp line 289 /home/ben/projects/mfem/petsc/src/ksp/ksp/interface/itfunc.c
[3]PETSC ERROR: [3] KSPSolve line 656 /home/ben/projects/mfem/petsc/src/ksp/ksp/interface/itfunc.c
[3]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[3]PETSC ERROR: Signal received
[3]PETSC ERROR: See https://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[3]PETSC ERROR: Petsc Release Version 3.12.5, unknown 
[3]PETSC ERROR: Unknown Name on a arch-linux2-c-debug named p1 by ben Mon Sep 21 17:21:05 2020
[3]PETSC ERROR: Configure options --download-fblaslapack=yes --download-scalapack=yes --download-mumps=yes --download-suitesparse=yes --with-hypre-dir=../hypre/src/hypre --with-shared-libraries=0 --with-debugging=1
[3]PETSC ERROR: #1 User provided function() line 0 in  unknown file
--------------------------------------------------------------------------
mpirun noticed that process rank 2 with PID 0 on node p1 exited on signal 9 (Killed).
--------------------------------------------------------------------------
[p1:11138] 1 more process has sent help message help-mpi-api.txt / mpi-abort
[p1:11138] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
makefile:108: recipe for target 'ex5p-test-par' failed
make[2]: *** [ex5p-test-par] Error 137
    Parallel PETSc example [ mpirun -np 4 ex6p ... ]: OK  (8.49s 108392kB)
    Parallel PETSc example [ mpirun -np 4 ex6p ... ]: OK  (8.69s 109468kB)
    Parallel PETSc example [ mpirun -np 4 ex9p ... ]: OK  (1.76s 34212kB)
    Parallel PETSc example [ mpirun -np 4 ex9p ... ]: OK  (0.80s 34564kB)
    Parallel PETSc example [ mpirun -np 4 ex9p ... ]: OK  (0.76s 39628kB)
    Parallel PETSc example [ mpirun -np 4 ex10p ... ]: OK  (2.81s 34004kB)
    Parallel PETSc example [ mpirun -np 4 ex10p ... ]: OK  (0.47s 30536kB)
    Parallel PETSc example [ mpirun -np 4 ex10p ... ]: OK  (0.39s 32844kB)
    Parallel PETSc example [ mpirun -np 4 ex10p ... ]: OK  (0.41s 33144kB)
make[2]: Target 'test' not remade because of errors.
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/examples/petsc'
makefile:72: recipe for target 'petsc/test' failed
make[1]: *** [petsc/test] Error 2
make[1]: Target 'test' not remade because of errors.
make[1]: Leaving directory '/home/ben/projects/mfem/mfem/examples'
Running tests in miniapps/electromagnetics ...
make[1]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/electromagnetics'
make -C ../../miniapps/common
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[2]: Nothing to be done for 'all'.
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/common'
    Electromagnetic miniapp [ mpirun -np 4 volta ... ]: OK  (0.30s 24920kB)
    Electromagnetic miniapp [ mpirun -np 4 volta ... ]: OK  (0.29s 24648kB)
    Electromagnetic miniapp [ mpirun -np 4 tesla ... ]: OK  (0.42s 25136kB)
    Electromagnetic miniapp [ mpirun -np 4 maxwell ... ]: OK  (0.29s 24928kB)
    Electromagnetic miniapp [ mpirun -np 4 joule ... ]: OK  (1.88s 50404kB)
make[1]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/electromagnetics'
Running tests in miniapps/meshing ...
make[1]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/meshing'
make -C ../../miniapps/common
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[2]: Nothing to be done for 'all'.
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/common'
    Parallel meshing miniapp [ mpirun -np 4 pmesh-optimizer ... ]: OK  (0.23s 25112kB)
    Parallel meshing miniapp [ mpirun -np 4 pminimal-surface ... ]: OK  (0.44s 25224kB)
    Meshing miniapp [ mobius-strip ... ]: OK  (0.01s 11820kB)
    Meshing miniapp [ klein-bottle ... ]: OK  (0.02s 11848kB)
    Meshing miniapp [ toroid ... ]: OK  (0.01s 11556kB)
    Meshing miniapp [ trimmer ... ]: OK  (0.01s 10940kB)
    Meshing miniapp [ twist ... ]: OK  (0.01s 11876kB)
    Meshing miniapp [ extruder ... ]: OK  (0.01s 10956kB)
    Meshing miniapp [ mesh-optimizer ... ]: OK  (0.13s 12936kB)
    Meshing miniapp [ minimal-surface ... ]: OK  (0.74s 14316kB)
    Meshing miniapp [ polar-nc ... ]: OK  (0.01s 11952kB)
make[1]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/meshing'
Running tests in miniapps/navier ...
make[1]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/navier'
    Navier [ mpirun -np 4 navier_mms ... ]: OK  (0.95s 24772kB)
    Navier [ mpirun -np 4 navier_kovasznay ... ]: OK  (1.44s 24988kB)
    Navier [ mpirun -np 4 navier_tgv ... ]: OK  (1.03s 28208kB)
make[1]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/navier'
Running tests in miniapps/performance ...
make[1]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/performance'
    Performance miniapp [ mpirun -np 4 ex1p ... ]: OK  (2.18s 66812kB)
    Performance miniapp [ ex1 ... ]: OK  (0.20s 15968kB)
make[1]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/performance'
Running tests in miniapps/tools ...
make[1]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/tools'
make -C ../../miniapps/common
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[2]: Nothing to be done for 'all'.
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[1]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/tools'
Running tests in miniapps/toys ...
make[1]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/toys'
make -C ../../miniapps/common
make[2]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/common'
make[2]: Nothing to be done for 'all'.
make[2]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/common'
    Toys miniapp [ automata ... ]: OK  (0.01s 11116kB)
    Toys miniapp [ life ... ]: OK  (0.01s 11144kB)
    Toys miniapp [ mandel ... ]: OK  (0.76s 23160kB)
    Toys miniapp [ rubik ... ]: OK  (0.01s 11596kB)
    Toys miniapp [ snake ... ]: OK  (0.01s 12044kB)
    Toys miniapp [ lissajous ... ]: OK  (0.01s 11660kB)
    Toys miniapp [ mondrian ... ]: OK  (0.01s 11772kB)
make[1]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/toys'
Running tests in miniapps/nurbs ...
make[1]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/nurbs'
    NURBS miniapp [ mpirun -np 4 nurbs_ex1p ... ]: OK  (0.31s 25208kB)
    NURBS miniapp [ mpirun -np 4 nurbs_ex1p ... ]: OK  (0.33s 25024kB)
    NURBS miniapp [ mpirun -np 4 nurbs_ex1p ... ]: OK  (0.32s 25212kB)
    NURBS miniapp [ mpirun -np 4 nurbs_ex1p ... ]: OK  (0.22s 25108kB)
    NURBS miniapp [ mpirun -np 4 nurbs_ex1p ... ]: OK  (0.26s 25004kB)
    NURBS miniapp [ mpirun -np 4 nurbs_ex11p ... ]: OK  (0.32s 25092kB)
    NURBS miniapp [ nurbs_ex1 ... ]: OK  (0.04s 13256kB)
    NURBS miniapp [ nurbs_ex1 ... ]: OK  (0.01s 13180kB)
    NURBS miniapp [ nurbs_ex1 ... ]: OK  (0.01s 12396kB)
    NURBS miniapp [ nurbs_ex1 ... ]: OK  (0.25s 20084kB)
    NURBS miniapp [ nurbs_ex1 ... ]: OK  (0.00s 12536kB)
    NURBS miniapp [ nurbs_ex1 ... ]: OK  (0.00s 12616kB)
    NURBS miniapp [ nurbs_ex1 ... ]: OK  (0.00s 12488kB)
    NURBS miniapp [ nurbs_ex1 ... ]: OK  (0.00s 12644kB)
    NURBS miniapp [ nurbs_ex1 ... ]: OK  (0.17s 16848kB)
    NURBS miniapp [ nurbs_ex1 ... ]: OK  (0.00s 12652kB)
    NURBS miniapp [ nurbs_ex1 ... ]: OK  (0.00s 12548kB)
make[1]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/nurbs'
Running tests in miniapps/gslib ...
make[1]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/gslib'
make[1]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/gslib'
Running tests in miniapps/adjoint ...
make[1]: Entering directory '/home/ben/projects/mfem/mfem/miniapps/adjoint'
make[1]: Leaving directory '/home/ben/projects/mfem/mfem/miniapps/adjoint'
Some tests failed.
makefile:497: recipe for target 'test' failed
make: *** [test] Error 1

About this issue

  • State: closed
  • Created 4 years ago
  • Comments: 15 (15 by maintainers)

Most upvoted comments

Ubuntu is a terrible distro. With that (politically correct) said: PETSc relies heavily on BLAS/LAPACK for CPU performance. It is always preferable to configure against whatever the distro provides rather than downloading a reference implementation like fblaslapack. If you have MKL, just configure with --with-blaslapack-dir=$MKLROOT. Otherwise, if BLAS/LAPACK are installed system-wide, PETSc will find them automatically.

A note aside: HYPRE wants BLAS/LAPACK too, and it defaults to its own implementation if none is provided. I would make sure PETSc and HYPRE use the same library for these kernels.
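For reference, the configuration advice above might look like the sketch below. The package names and the HYPRE flag spellings are assumptions based on typical setups; check each project's `./configure --help` for the exact options on your version.

```shell
# Sketch: build PETSc against the system BLAS/LAPACK instead of fblaslapack.

# With Intel MKL (MKLROOT set by MKL's environment script):
./configure --with-blaslapack-dir=$MKLROOT

# With a distro-provided BLAS/LAPACK (Debian/Ubuntu package names shown),
# no extra flag is needed; PETSc's configure detects them:
sudo apt-get install libblas-dev liblapack-dev
./configure

# When building HYPRE yourself, point it at the same libraries so PETSc
# and HYPRE share one BLAS/LAPACK implementation (flag names may differ
# by HYPRE version; see its configure --help):
./configure --with-blas-lib="-lblas" --with-lapack-lib="-llapack"
```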