Hi, I have recently installed hadoop-1.0.4 on a linux machine. Whilst working through the post-install instructions contained in the “Quick Start” guide, I incurred the following catastrophic Java runtime error (See below). I have attached the error report file “hs_err_pid24928.log”. I have submitted a Java bug report, but perhaps it is a known hadoop-1.0.4 version problem.

I am a first time user of Hadoop and would welcome guidance on this problem,

Registered in Ireland CRO No.507541 This email and any attachments to it is, unless otherwise stated, confidential, may contain copyright material and is for the use of the intended recipient only. If you have received this email in error, please notify the sender by return and deleting all copies. Any views expressed in this email are those of the sender and do not form part of any contract between Ostia Software Solutions Limited and any other party.

2013/1/17, Sean Hudson <[EMAIL PROTECTED]>:
> shu@meath-nua:~/hadoop-1.0.4> bin/hadoop jar hadoop-examples-1.0.4.jar grep input output 'dfs[a-z.]+'
> #
> # A fatal error has been detected by the Java Runtime Environment:
> #
> # SIGFPE (0x8) at pc=0xb7f2b1fb, pid=24928, tid=3074923424
> #
> # JRE version: 6.0_25-b06
> # Java VM: Java HotSpot(TM) Client VM (20.0-b11 mixed mode, sharing linux-x86)
> # Problematic frame:
> # C  [ld-linux.so.2+0x91fb]  double+0xab
> #
> # An error report file with more information is saved as:
> # /home/shu/hadoop-1.0.4/hs_err_pid24928.log
> #
> # If you would like to submit a bug report, please visit:
> #   http://java.sun.com/webapps/bugreport/crash.jsp
> # The crash happened outside the Java Virtual Machine in native code.
> # See problematic frame for where to report the bug.
> #
> Aborted

Do you know what causes 1.7 to fail? I am running 1.7 and so far have not done whatever it takes to make it fail.

On 1/17/2013 1:46 PM, Leo Leung wrote:
> Use Sun/Oracle 1.6.0_32+  Build should be 20.7-b02+
>
> 1.7 causes failure and AFAIK, not supported, but you are free to try the latest version and report back.
>
> -----Original Message-----
> From: Sean Hudson [mailto:[EMAIL PROTECTED]]
> Sent: Thursday, January 17, 2013 6:57 AM
> To: [EMAIL PROTECTED]
> Subject: Re: Problems
>
> Hi,
>      My Java version is
>
> java version "1.6.0_25"
> Java(TM) SE Runtime Environment (build 1.6.0_25-b06)
> Java HotSpot(TM) Client VM (build 20.0-b11, mixed mode, sharing)
>
> Would you advise obtaining a later Java version?
>
> Sean
>
> -----Original Message-----
> From: Jean-Marc Spaggiari
> Sent: Thursday, January 17, 2013 2:52 PM
> To: [EMAIL PROTECTED]
> Subject: Re: Problems
>
> Hi Sean,
>
> This is an issue with your JVM. Not related to hadoop.
>
> Which JVM are you using, and can you try with the last from Sun?
>
> JM



Some of the unit tests fail with 1.7. HDFS and MR mostly work OK, but if you run into problems with 1.7 the first question will be "does it work in 1.6?".

-andy
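Andy's "does it work in 1.6?" check can be scripted by overriding JAVA_HOME per run. A sketch only; the JRE paths below are illustrative placeholders, not taken from the thread:

```shell
# Sketch: rerun the failing example under two JVMs by switching
# JAVA_HOME. The jre paths are placeholders; point them at real
# 1.6 and 1.7 installs, and run from the hadoop-1.0.4 directory.
for jre in "$HOME/jre1.6.0_32" "$HOME/jre1.7.0"; do
    echo "== testing with $jre =="
    # JAVA_HOME="$jre" bin/hadoop jar hadoop-examples-1.0.4.jar \
    #     grep input output 'dfs[a-z.]+'
done
```

The actual hadoop invocation is left commented out since it needs a working install; the loop structure is the point.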


Thanks -- that would explain why I have not got into trouble yet, as I am only using MR and HDFS.

On 1/17/2013 5:40 PM, Andy Isaacson wrote:
> Some of the unit tests fail with 1.7. HDFS and MR mostly work OK, but
> if you run into problems with 1.7 the first question will be "does it
> work in 1.6?".
>
> -andy

Leo, I downloaded the suggested 1.6.0_32 Java version to my home directory, but I am still experiencing the same problem (see error below). The only thing that I have set in my hadoop-env.sh file is the JAVA_HOME environment variable. I have also tried it with the Java directory added to PATH.
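For reference, the two settings Sean describes can be sketched in hadoop-env.sh like this. The jre1.6.0_32 path is the one from this thread; note that, conventionally, PATH should get the JRE's bin subdirectory rather than the JRE root for `java` to be found:

```shell
# conf/hadoop-env.sh -- point Hadoop at the JRE under test.
# Path taken from this thread; substitute your own install directory.
export JAVA_HOME=/home/shu/jre1.6.0_32
# PATH entries should name the JRE's bin directory (not the JRE root),
# otherwise the 'java' binary is not actually on the search path.
export PATH=$JAVA_HOME/bin:$PATH
```

A quick check afterwards is `$JAVA_HOME/bin/java -version`, which should print the build you expect.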



It's strange, you should not be facing that. I faced the same kind of issues on a desktop with memory errors. Can you install memtest86 and fully test your memory (one pass is enough) to make sure you don't have issues on that side?
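memtest86 boots from its own media, which is awkward inside a VM; a userspace tester such as memtester can exercise RAM from within the running system instead. memtester is not mentioned in the thread and must be installed separately, so this sketch only sizes and prints the command rather than assuming the tool exists:

```shell
# Sketch: size a userspace memory test from inside the running system.
# memtester (a separate package, not part of this thread) locks and
# exercises a region of RAM; testing roughly half of physical memory
# leaves room for the OS. One pass is usually enough to surface errors.
mem_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
test_mb=$(( mem_kb / 1024 / 2 ))
echo "memtester ${test_mb}M 1"   # run the printed command as root
```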

> Hi Ke,
>       We are still looking at possible complications of the VM
> environment. I will post whatever we discover.
>
> Thanks for your interest,
>
> Sean
>
> *From:* ke yuan <[EMAIL PROTECTED]>
> *Sent:* Friday, January 25, 2013 2:45 AM
> *To:* [EMAIL PROTECTED]
> *Subject:* Re: Problems
>
> is there anything done with hardware? i used thinkpad t430, this problem
> occurs, but i used about 100 machines and there is nothing to do with
> this; all the machines are redhat 6.0, and the jdk is jdk1.5 to jdk1.6,
> so i think there is something to do with the hardware. any idea?
>
> 2013/1/22 Jean-Marc Spaggiari <[EMAIL PROTECTED]>
>> Hi Sean,
>>
>> Will you be able to run the memtest86 on this VM? Maybe it's an issue
>> with the way the VM is managing the memory?
>>
>> I ran HBase+Hadoop on a desktop with only 1.5G. So you should not have
>> any issue with 6GB.
>>
>> I don't think the issue you are facing is related to hadoop. Can you
>> try to run a simple Java application in your JVM? Something which will
>> use a lot of memory. And see if it works?
>>
>> JM
>>
>> 2013/1/22, Sean Hudson <[EMAIL PROTECTED]>:
>>> Hi Jean-Marc,
>>>         The Linux machine on which I am attempting to get Hadoop
>>> running is actually Linux running in a VM partition. This VM partition
>>> had 2 Gigs of RAM when I first encountered the problem. This RAM
>>> allocation has been bumped up to 6 Gigs, but the problem still
>>> persists, i.e.
>>>
>>> bin/hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z.]+'
>>> still crashes out as before.
>>>
>>> Is there a minimum RAM size requirement?
>>> Will Hadoop run correctly on Linux in a VM partition?
>>>
>>> I had attempted to run Hadoop in Pseudo-Distributed Operation mode,
>>> which included modifying the conf/core-site.xml, conf/hdfs-site.xml
>>> and conf/mapred-site.xml files as per the Quick Start instructions. I
>>> also formatted a new distributed-filesystem as per the instructions.
>>> To re-test in Standalone mode with 6 Gigs of RAM, I reversed the
>>> changes to the above three .xml files in /conf. However, I don't see a
>>> way to back-out the distributed-filesystem. Will the existence of this
>>> distributed-filesystem interfere with my Standalone tests?
>>>
>>> Regards,
>>>
>>> Sean Hudson
>>>
>>> 2013/1/18, Sean Hudson <[EMAIL PROTECTED]>:
>>>> Leo,
>>>>       I downloaded the suggested 1.6.0_32 Java version to my home
>>>> directory, but I am still experiencing the same problem (see error
>>>> below). The only thing that I have set in my hadoop-env.sh file is
>>>> the JAVA_HOME environment variable. I have also tried it with the
>>>> Java directory added to PATH.
>>>>
>>>> export JAVA_HOME=/home/shu/jre1.6.0_32
>>>> export PATH=$PATH:/home/shu/jre1.6.0_32
>>>>
>>>> Every other environment variable is defaulted.
>>>>
>>>> Just to clarify, I have tried this in Local Standalone mode and also
>>>> in Pseudo-Distributed Mode with the same result.
>>>>
>>>> Frustrating to say the least,
>>>>
>>>> Sean Hudson
>>>>
>>>> shu@meath-nua:~/hadoop-1.0.4> bin/hadoop jar hadoop-examples-1.0.4.jar
>>>> grep input output 'dfs[a-z.]+'
>>>> #
>>>> # A fatal error has been detected by the Java Runtime Environment:
>>>> #
>>>> # SIGFPE (0x8) at pc=0xb7fc51fb, pid=23112, tid=3075554208
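On Sean's question about backing out the formatted distributed filesystem: in Hadoop 1.x the formatted NameNode and DataNode data live under hadoop.tmp.dir, which defaults to /tmp/hadoop-${user.name} unless core-site.xml overrides it. Standalone mode does not read HDFS at all, so the leftover data should not interfere, but it can be removed. A minimal sketch, assuming the default location:

```shell
# Sketch: discard a formatted pseudo-distributed filesystem, assuming
# the default hadoop.tmp.dir of /tmp/hadoop-${user.name} (Hadoop 1.x).
# Stop any running daemons first (bin/stop-all.sh), then:
dfs_dir="/tmp/hadoop-$USER"
rm -rf "$dfs_dir"   # safe in standalone mode, which ignores HDFS
```

If core-site.xml was changed to point hadoop.tmp.dir or dfs.name.dir elsewhere before being reverted, remove that directory instead.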



Prashant Sharma 2013-01-25, 12:02

