I have a simple Hive select count(*) query, which results in the following exception. I am using Cloudera cdh3u1 (hadoop/hbase/hive). However, I am able to run "select * from t1" from the Hive CLI.

Here is the output after running "select count(*) from t1":

hive> select count(*) from t1;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
org.apache.hadoop.ipc.RemoteException: IPC server unable to read call parameters: java.lang.NoSuchMethodException: org.apache.hadoop.fs.permission.FsPermission$2.<init>()
        at org.apache.hadoop.ipc.Client.call(Client.java:1107)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
        at $Proxy4.setPermission(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at $Proxy4.setPermission(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.setPermission(DFSClient.java:855)
        at org.apache.hadoop.hdfs.DistributedFileSystem.setPermission(DistributedFileSystem.java:560)
        at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:123)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:839)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:833)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:833)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:807)
        at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:657)
        at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:123)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:130)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1063)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:900)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:748)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:209)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:286)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:513)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
Job Submission failed with exception 'org.apache.hadoop.ipc.RemoteException(IPC server unable to read call parameters: java.lang.NoSuchMethodException: org.apache.hadoop.fs.permission.FsPermission$2.<init>())'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MapRedTask
hive>

Thanks,

Looks like the user who runs the statement does not have the correct rights: org.apache.hadoop.fs.permission.FsPermission$2.<init>()

- Alex
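If insufficient rights are the problem, the stack trace gives a hint where to look: the failure happens in JobSubmissionFiles.getStagingDir, i.e. while preparing the per-user MapReduce staging directory on HDFS. A minimal sketch of how one might inspect it; the staging path below is an assumption (cdh3 typically keeps it under /user/<user>), so verify it against your cluster's mapreduce.jobtracker.staging.root.dir before changing anything:

```shell
#!/bin/sh
# Assumed staging location, not confirmed by the thread; adjust to your cluster.
STAGING="/user/${USER:-$(id -un)}"

if command -v hadoop >/dev/null 2>&1; then
  # Inspect owner and mode of the submitting user's directory.
  hadoop fs -ls /user
  hadoop fs -ls "$STAGING"
  # If the directory is owned by the wrong user, an HDFS superuser could
  # hand it back (hypothetical commands, shown commented out on purpose):
  # hadoop fs -chown -R "$(id -un)" "$STAGING/.staging"
  # hadoop fs -chmod 700 "$STAGING/.staging"
else
  echo "hadoop CLI not on PATH; would inspect $STAGING"
fi
```

The listing should show the submitting user as the owner of the staging directory; if it shows another owner or a restrictive mode, that would match the diagnosis above.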

On Fri, Dec 16, 2011 at 8:59 AM, jingjung Ng <[EMAIL PROTECTED]> wrote:
> [quoted text of the original message trimmed]



New Impetus webcast on-demand ‘Big Data Technologies for Social Media Analytics’ available at http://bit.ly/nFdet0.

Visit http://www.impetus.com to know more. Follow us on www.twitter.com/impetuscallingNOTE: This message may contain information that is confidential, proprietary, privileged or otherwise protected by law. The message is intended solely for the named addressee. If received in error, please destroy and notify the sender. Any use of this email is prohibited when received in error. Impetus does not represent, warrant and/or guarantee, that the integrity of this communication has been maintained nor that the communication is free of errors, virus, interception or interference.


Alexander Lorenz
http://mapredit.blogspot.com




Vivek Mishra 2011-12-18, 10:30
