c - MPI Segmentation fault (signal 11)


I have been trying for more than 2 days to see what mistakes I have made, but I couldn't find anything. I keep getting the following error:

=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   EXIT CODE: 139
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Segmentation fault (signal 11)
This typically refers to a problem with your application.
Please see the FAQ page for debugging suggestions
make: *** [run] Error 139

I am sure the problem is in MPI_BCAST, and in another function I have MPI_GATHER. Can anyone help me figure out what's wrong? To compile the code I type the following:

/usr/bin/mpicc -I/usr/include -L/usr/lib z.main.c z.mainmr.c z.mainwr.c -o 1dcode -g -lm

And to run it:

/usr/bin/mpirun -np 2 ./1dcode dat.txt o.out.txt

For example, the code includes this function:

#include <stdio.h>
#include <stdlib.h>
#include <ctype.h>
#include <math.h>
#include <string.h>
#include "functions.h"
#include <mpi.h>

/*................... z.mainmr master function ............. */
void master(int argc, char *argv[], int nproc, int nwrs, int mster)
{
    /*... define the variables used in the z.mainmr function ..*/
    double tend, dtfactor, dtout, d, b, dx, dtexpl, dt, time;
    int mm, m, maxsteps, nsteps;
    FILE *datp, *outp;

    /*..... read the data file "dat" and save the data in o.out .....*/
    datp = fopen(argv[1], "r");       // open the data file in read mode
    outp = fopen(argv[argc-1], "w");  // open the output file in write mode
    if (datp != NULL)  // if the data file was opened, continue
    {
        fscanf(datp, "%d %lf %lf %lf %lf %lf", &mm, &tend, &dtfactor, &dtout, &d, &b);  // read the data
        fprintf(outp, "data>>>\nmm=%d\ntend=%lf\ndtfactor=%lf\ndtout=%lf\nd=%lf\nb=%lf\n",
                mm, tend, dtfactor, dtout, d, b);
        fclose(datp);  // close the data file
        fclose(outp);  // close the output file
    }
    else  // if the file could not be opened, print an error message
    {
        printf("Something went wrong. Maybe the file is empty.\n");
    }

    /*.... find dx, m, dtexpl, dt and maxsteps ........*/
    dx = 1.0 / (double) mm;
    m = b * mm;
    dtexpl = (dx * dx) / (2.0 * d);
    dt = dtfactor * dtexpl;
    maxsteps = (int)(tend / dt) + 1;

    /*... pack the integers in the iparms array and the reals in the parms array ...*/
    int iparms[2] = {mm, m};
    double parms[4] = {dx, dt, d, b};

    MPI_BCAST(iparms, 2, MPI_INT, 0, MPI_COMM_WORLD);
    MPI_BCAST(parms, 4, MPI_DOUBLE, 0, MPI_COMM_WORLD);
}

The runtime error is due to an unfortunate combination of a specific trait of MPICH and a feature of the C language.

MPICH provides both the C and the Fortran interface code within a single library file:

000000000007c7a0 W MPI_BCAST
00000000000cd180 W MPI_Bcast
000000000007c7a0 W PMPI_BCAST
00000000000cd180 T PMPI_Bcast
000000000007c7a0 W mpi_bcast
000000000007c7a0 W mpi_bcast_
000000000007c7a0 W mpi_bcast__
000000000007c7a0 W pmpi_bcast
000000000007c7a0 T pmpi_bcast_
000000000007c7a0 W pmpi_bcast__
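(For reference, a symbol listing like the one above can be produced with nm; the library path here is an assumption and varies between installations:

nm -D /usr/lib/libmpich.so | grep -i mpi_bcast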

The Fortran calls are exported under a variety of aliases in order to support as many different Fortran compilers as possible at the same time, including the all-uppercase MPI_BCAST. MPI_BCAST is not declared in mpi.h, but ANSI C allows calling functions without preceding prototype declarations, so the compiler accepts the call anyway. Enabling C99 by passing -std=c99 to the compiler would have resulted in a warning about the implicit declaration of the MPI_BCAST function; compiling with -Wall would have produced the same warning. The code would also fail to link with Open MPI, which provides its Fortran interface in a separate library that mpicc does not link against.
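To see the trap in isolation, here is a minimal sketch (a hypothetical file, not taken from the question's code) that compiles under the default C89 rules but ends up bound to the Fortran symbol:

#include <mpi.h>

int main(int argc, char *argv[])
{
    int iparms[2] = {0, 0};

    MPI_Init(&argc, &argv);

    /* MPI_BCAST (all uppercase) is not declared in mpi.h; under C89 rules
       the compiler silently assumes "int MPI_BCAST();" and the linker then
       resolves the name to MPICH's Fortran alias, which crashes at run time. */
    MPI_BCAST(iparms, 2, MPI_INT, 0, MPI_COMM_WORLD);

    MPI_Finalize();
    return 0;
}

Compiling this same file with mpicc -std=c99 -Wall reports "implicit declaration of function 'MPI_BCAST'", which is exactly the early warning those flags exist to give.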

Even when the code compiles and links properly, the Fortran functions expect their arguments to be passed by reference. Besides that, Fortran MPI calls take an additional output argument in which the error code is returned. Hence the segmentation fault.
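The fix is to call the C binding MPI_Bcast (mixed case), which is declared in mpi.h and takes its arguments exactly as written. A sketch of the corrected calls from the master function above:

/* correct C binding, declared in mpi.h */
MPI_Bcast(iparms, 2, MPI_INT, 0, MPI_COMM_WORLD);
MPI_Bcast(parms, 4, MPI_DOUBLE, 0, MPI_COMM_WORLD);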

To prevent such errors in the future, compile with -Wall -Werror, which should catch as many similar problems as possible.
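Applied to the compile command from the question, that would look like this (only the flags change; the file names stay as given):

/usr/bin/mpicc -std=c99 -Wall -Werror -I/usr/include -L/usr/lib z.main.c z.mainmr.c z.mainwr.c -o 1dcode -g -lm

With -Werror the implicit declaration of MPI_BCAST becomes a hard compile error instead of a silent link to the wrong symbol.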

