This directory is intentionally read-only for regular users, who only need to read the logs and analyse them. Am I missing some configuration for Hive that will tell it to store only metadata elsewhere? I already have hive.metastore.warehouse.dir set to another location where I have write permission.

Can you perform a simple test of hadoop fs -cat hdfs:///logs/2014/03-24/actual_log_file_name.seq as the same user? Also, what are the configuration settings for the following?

hive.metastore.execute.setugi

hive.metastore.warehouse.dir

hive.metastore.uris
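These three properties normally live in hive-site.xml. As a sketch of what the relevant entries might look like (the values below are purely illustrative, including the hypothetical metastore host, not recommendations):

```xml
<!-- Illustrative values only; check the actual hive-site.xml on your cluster -->
<property>
  <name>hive.metastore.execute.setugi</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/oliver/warehouse</value>
</property>
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://metastore-host:9083</value>
</property>
```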

Thanks,
Rahman

On Mar 24, 2014, at 8:17 AM, Oliver <[EMAIL PROTECTED]> wrote:

CONFIDENTIALITY NOTICE: This message is intended for the use of the individual or entity to which it is addressed and may contain information that is confidential, privileged and exempt from disclosure under applicable law. If the reader of this message is not the intended recipient, you are hereby notified that any printing, copying, dissemination, distribution, disclosure or forwarding of this communication is strictly prohibited. If you have received this communication in error, please contact the sender immediately and delete it from your system. Thank You.

Hi Oliver,
In order to create external tables you must have write access to the folder. Also, when creating the external table, give the location of the folder in which your file is located. For example, if your file "actual_log_file_name.seq" is stored in the "03-24" folder, then try:

CREATE EXTERNAL TABLE mylogs (line STRING) STORED AS SEQUENCEFILE LOCATION 'hdfs:///logs/2014/03-24/';

Otherwise: I had the same problem, and I found that without write permission a user cannot create an external table. The other alternative is to copy the file under your own HDFS directory and then create the table there (not always practical).


Hi,

So, it depends on the configurations in my previous email and on the proxy settings for the Hive user in core-site.xml:

<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>users</value>
  <description>Allow the superuser hive to impersonate any members of the group users. Required only when installing Hive.</description>
</property>

where $HIVE_USER is the user owning the Hive services. For example, hive.

<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>$Hive_Hostname_FQDN</value>
  <description>Hostname from where superuser hive can connect. Required only when installing Hive.</description>
</property>


On 24 March 2014 16:45, Abdelrahman Shettia <[EMAIL PROTECTED]> wrote:

Yes, I can access that file with the same user using "hadoop fs -cat" as well as other tools (I've been using Pig up until this point).

I'm not setting this explicitly anywhere.

I have this set in my HiveQL script:

SET hive.metastore.warehouse.dir=/user/oliver/warehouse;

This directory already exists, since I created it before running the script.

Try setting these properties in core-site.xml. Using * as the value will allow everyone to impersonate Hive, and the cluster needs to be restarted afterwards.

<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>users</value>
  <description>Allow the superuser hive to impersonate any members of the group users. Required only when installing Hive.</description>
</property>

where $HIVE_USER is the user owning the Hive services. For example, hive.

<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>$Hive_Hostname_FQDN</value>
  <description>Hostname from where superuser hive can connect. Required only when installing Hive.</description>
</property>

Also, enable the following configuration in hive-site.xml:

hive.metastore.execute.setugi
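As a sketch, enabling it in hive-site.xml might look like this (in insecure mode, setting it to true makes the metastore perform DFS operations using the client's reported user and group, so HDFS permission checks apply to the connecting user rather than to the metastore's service user):

```xml
<property>
  <name>hive.metastore.execute.setugi</name>
  <value>true</value>
  <description>In unsecure mode, the metastore executes DFS operations
  using the client's reported user and group permissions.</description>
</property>
```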

In addition, please use only the directory path (rather than the file path) while you are creating the table, and it would be better to have 'hadoop' as the supergroup.
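To illustrate the directory-vs-file point, a sketch of the DDL using the directory from earlier in the thread (the table and column names are just examples):

```sql
-- LOCATION points at the directory containing the file, not the file itself
CREATE EXTERNAL TABLE mylogs (line STRING)
STORED AS SEQUENCEFILE
LOCATION 'hdfs:///logs/2014/03-24';
```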

Hope this helps.

Thanks

On Mon, Mar 24, 2014 at 9:59 AM, Oliver <[EMAIL PROTECTED]> wrote:
