Segmentation Fault - core dumped. Do I have the latest version?

The problem is difficult to trace with debug print statements and
often disappears completely when print statements (which resolve
variables) are added. The problem also stops happening when we run in
debugging mode. It is very frustrating.

We do use the 'use blah' statement, and some recent posts have
suggested that these can cause the problem.

Any advice would be greatly appreciated. (My current workaround is to
run the program in production with the -d switch ;-)
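Since adding prints or running under -d perturbs the bug (a classic heisenbug, often pointing at memory corruption in the interpreter or an XS module), a stack trace taken from the core file itself is usually more reliable than instrumenting the script. A minimal sketch, assuming the script name `myscript.pl` (hypothetical) and that a core file lands in the working directory; `pstack` ships with Solaris, `gdb` if installed:

```shell
# Allow core files to be written (often disabled by default)
ulimit -c unlimited

# Run the script so a crash leaves a core file behind
perl ./myscript.pl    # hypothetical script name

# Print the native stack trace from the core (Solaris /usr/bin/pstack)
pstack core

# Or, with gdb, inspect the frame where the fault happened:
# gdb /usr/bin/perl core
```

The top frames of that trace will at least say whether the fault is inside perl itself or inside a loaded extension.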

With the intention of providing specifics to those contemplating
this problem, the source code is included below. It's largish, and is
certainly not the best code I have ever seen, but it may be useful.
I hope no-one is grossly opposed to me posting >1500 lines of code.
Of course I don't expect anyone to read it all, but someone might know
exactly where to look. Please feel free to <snip> the crap out of it
in any future replies. Thanks again.

# Extract the Inst keys out from the Hash ordered alphabetically by
# Institute name. CR1187. This will make the control file have the
# resultant Institute columns ordered alphabetically.
foreach $Inst (sort keys %WU600::InstDBaseList) {

# This array matches the Filelist below. It holds the count of files for
# each filetype: there is a maximum of 2000 records for each file, and
# then the file is split into a new file and given an incremented suffix.
@fcl = (1,1,1,1,1,1,1,1,1,1,1,1);
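The roll-to-a-new-file-every-2000-records scheme the comment describes is the same behaviour as the standard `split -l` utility, which may help when sanity-checking the output. A quick illustration of the splitting arithmetic (filenames here are made up):

```shell
# 5000 records at 2000 records per file should yield 3 output files
seq 5000 > records.dat
split -l 2000 records.dat part_
ls part_* | wc -l    # 3 files: two full, one with the 1000-record remainder
```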

#########################################################################
# This routine is used after a file has reached its maximum record count.
# There is an array of file descriptors and a file list which must be
# manipulated when a file reaches its maximum size.
sub ChangeFileDescriptors {

# ----------------------- LocalWriteOutRecord -------------------------
#
# This routine writes records (sourced from the main SQL) to any one of the
# 12 data files that can be open for each Institute. It decides which file
# to write to based on the data selected.
#
sub LocalWriteOutRecord ($)
{
my ($Inst);
my ($InstCode);
my ($InstCountRef) = @ARG;

&CheckForErrors(); # This sets the CONSFileDescriptor to error or data

# $x will be the name of the file the record has to be written to. Which
# file the data record is written to depends on the content of the data row.
$x="INV_BAL_".$DebtType."_".$InstCode."_".$Balance."_".$WU600::RunDate."_";

# Need to add the file count at this point, normally 0001, but it can change;
# make the filecount match any number, using a regexp, then add a "."
$x .= sprintf ("%04d.", ReturnFileCount($x."....\.".$FileType));
$x .= $FileType; # add either ".dat" or ".err" to the filename for output


On 7 Oct 2003 19:08:15 -0700 (Glen Hendry) wrote:
> I am getting repeated seg faults and dumped cores on Solaris
> (version details below).
>
> The problem is difficult to trace with debug print statements and
> often disappears completely when print statements (which resolve
> variables) are added. The problem also stops happening when we run
> in debugging mode. It is very frustrating.
>
> We do use the 'use blah' statement and some have said in recent
> posts that these can cause the problem.
>
> Any advice would be greatly appreciated. (My current solution is to
> run the program in production with -d switch ;-)
<snip>
> config_args='-Dcc=gcc -B/usr/ccs/bin/'

Are you sure this is where the gcc compiler is? /usr/ccs/bin looks more
like the path to the toolchain that comes with Solaris. It may not be
the source of the issue, but it is worth looking into.
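One way to check which compiler actually built the perl binary, and where gcc really lives, is sketched below (assuming a stock Solaris layout, where /usr/ccs/bin holds the native assembler and linker rather than gcc itself):

```shell
# Show the configure arguments the running perl was built with
perl -V:config_args    # requires perl on PATH

# Locate gcc and confirm its version
command -v gcc
gcc --version
```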
