`asciidoc` is also required if you want to build a tarball release at some point with `make prod`. Otherwise you will get this error:

make[1]: Leaving directory `/root/hue/apps'
make[1]: Entering directory `/root/hue/docs'
--- Generating sdk doc at /root/hue/build/docs/sdk/sdk.html
--- Generated /root/hue/build/docs/sdk/sdk.html
--- Generating release notes at /root/hue/build/docs/release-notes
/bin/bash: line 1: a2x: command not found
(the same "a2x: command not found" line repeats for each remaining file)
mv: cannot stat ‘release-notes/*.html’: No such file or directory
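A quick pre-flight check for the missing tool, as a minimal sketch (the `a2x` command ships with the `asciidoc` package on Debian/Ubuntu):

```shell
# Verify that a2x (part of asciidoc) is on the PATH before running `make prod`
if command -v a2x >/dev/null 2>&1; then
  echo "a2x found"
else
  echo "a2x missing: try 'sudo apt-get install asciidoc'"
fi
```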

215 Comments

Hi, I have installed Hadoop 2.5 manually on 4 Mac minis (1 namenode, 3 slaves). From the tutorial above, I didn't see how to connect Hue to Hadoop. Is it connected automatically without adding any configuration? Or does Hue have Hadoop integration built in, so we don't have to install Hadoop manually?

I have successfully installed Hue outside of the cluster with the tutorial above, after adding some other packages (libsasl2-dev python-ldap libmysqlclient-dev) that I had to install before the make apps command. But every time I successfully log in to Hue, I get this error message: http://10.42.11.43:50070/webhdfs/v1 Failed to access filesystem root. Do you have any suggestion?

I gave up on installing Hue outside the cluster, so I installed it on my namenode. It works fine, but now the error message has changed to "Current value: http://localhost:50070/webhdfs/v1
Filesystem root '/' should be owned by 'hdfs'".
And I can't upload any file to my HDFS; it's like a read-only HDFS. What should I do?

I just edited hue/desktop/conf/pseudo-distributed.ini, changing the line "default_hdfs_superuser=hdfs" to "default_hdfs_superuser=my_user", and that solved it, but I still can't upload anything to HDFS #sigh

[solved] ha ha ha, LOL my netbook(10″) just have a small screen, so the upload button won’t appear on my screen, Im sorry before to ask a stupid question like above, FYI, I use this Hue on hadoop cluster version 2.5.1 on the Ubuntu 14.04 LTS and it works fine, now it’s time to explore the HA of Hue

Hey! I have the same problem and am not able to solve it. I modified hue/desktop/conf/pseudo-distributed.ini.
The home screen shows:
Current value: http://localhost:50070/webhdfs/v1
Failed to access filesystem root
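For anyone hitting "Failed to access filesystem root", the usual fix is pointing Hue at the namenode and enabling WebHDFS. A sketch of the relevant hue.ini / pseudo-distributed.ini section (namenode-host is a placeholder for your actual NameNode hostname):

```ini
[hadoop]
  [[hdfs_clusters]]
    [[[default]]]
      # Placeholder: replace namenode-host with your NameNode's hostname or IP
      fs_defaultfs=hdfs://namenode-host:8020
      webhdfs_url=http://namenode-host:50070/webhdfs/v1
```

WebHDFS also has to be enabled on the cluster side (dfs.webhdfs.enabled in hdfs-site.xml).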

I have a cluster of two machines using Apache HBase and Apache Hadoop. I have to use Hue so that I can interact with HBase and HDFS through a GUI. I have installed it successfully on my machine (Ubuntu 14.04), but it shows nothing about HDFS or tables etc. and gives errors like:

1.oozie server is not running

2.could not connect to local:9090

HBase thrift server cannot be contacted
How do I configure Hue so that it connects to my running cluster?
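The "local:9090" error usually means the HBase Thrift server is not running, or Hue is not pointed at it. A hue.ini sketch (thrift-host is a placeholder, and the Thrift server must be started separately, e.g. with `hbase thrift start`):

```ini
[hbase]
  # (Name|host:port) pairs; thrift-host is a placeholder for the machine
  # where the HBase Thrift server runs (default port 9090)
  hbase_clusters=(Cluster|thrift-host:9090)
```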

Hi there,
I followed your instructions and it worked straight away!
I am using Ubuntu 14.04.1 with Hadoop 2.6.
I am currently working on a small Hadoop Cluster with 7 datanodes and a single namenode, mostly using pig so far but planning to start using other tools and most probably Spark to replace mapreduce.
Thanks a lot !!
Sebastien

Hi there,
I just wanted to update my previous comment in order to help other people make their Hue server work. I tried to access my Hue server from outside, i.e. from another computer, with no success, even with the proper config in the file pseudo-distributed.ini within /hue/desktop/conf.
I launched the command sudo ./build/env/bin/hue runserver and got:
Validating models…

0 errors found
Django version 1.4.5, using settings ‘desktop.settings’
Development server is running at http://127.0.0.1:8000/
Quit the server with CONTROL-C.

Then I did a bit of searching on the internet and found out that I had to run the following command:
sudo ./build/env/bin/hue runserver 0.0.0.0:8888
It then worked fine :o)
This may help a few of us who are not experts yet :o)
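Equivalently, the bind address can be set once in the ini file. A sketch using the `[desktop]` keys Hue reads (note the Django dev server invoked via `runserver` may still need the explicit address argument; these keys apply to the normal server startup):

```ini
[desktop]
  # Bind on all interfaces so other machines can reach the web UI
  http_host=0.0.0.0
  http_port=8888
```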

Thanks.
Sebastien
(ps: my cluster has now 7 datanodes, should end up with 15. I also started looking at openstreetmap files in order to process them within my cluster)

I am struggling with Oozie 4.1.0 and Hadoop 2.6.
I would like to install Oozie in order for Hue to be operational, I managed to set up Hue and it does work.
However, Oozie installation seems (at least to me) a hard way to go.
I tried the following command ./bin/oozie-setup.sh sharelib create -fs hdfs://master:9000
I then got the following error:[email protected]:~/oozie$ ./bin/oozie-setup.sh sharelib create -fs hdfs://master:9000
setting CATALINA_OPTS=”$CATALINA_OPTS -Xmx1024m”
the destination path for sharelib is: /user/covage/share/lib/lib_20150125183216

About the error: we are not the best resource for non-Hue questions, but did you check that your Oozie was compiled with (or has) the right Hadoop jars? If you use the regular 'hdfs' command to create a file, does that work too?

I have configured Hue in a pseudo-distributed cluster and can perform file browser operations. I have also configured Oozie and the Oozie server is running, but when I submit a Pig job, the following error occurs:

Error: E0501 could not perform authorization operation , User: labuser is not allowed to impersonate labuser.

labuser is the username used to log in to Hue.

I have changed the proxy user in the Hadoop core-site.xml to hue, and also in oozie-site.xml, as mentioned in the configuration, but this error is still displayed. If I change it to labuser, the error says hue is not allowed to impersonate hue.
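For reference, the impersonation settings this error points at look like the following. A sketch: "hue" here must match the unix user the Hue server actually runs as, and both Hadoop and Oozie need a restart after the change:

```xml
<!-- core-site.xml: allow the user running Hue to impersonate others -->
<property>
  <name>hadoop.proxyuser.hue.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hue.groups</name>
  <value>*</value>
</property>

<!-- oozie-site.xml equivalent -->
<property>
  <name>oozie.service.ProxyUserService.proxyuser.hue.hosts</name>
  <value>*</value>
</property>
<property>
  <name>oozie.service.ProxyUserService.proxyuser.hue.groups</name>
  <value>*</value>
</property>
```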

I'm using Hadoop 2.6 built from source with 1 namenode and 3 datanodes (on Ubuntu 14.04 LTS server). Every time I install Hue (3.7) outside the cluster, it cannot write to the tmp folder on HDFS. I can still create folders on HDFS through Hue, but I cannot upload any file to HDFS through Hue. When I install Hue on my namenode instead, everything works fine. Any suggestion?

FYI, I made a change in pseudo-distributed.ini to point at my namenode IP address, like this, when installing Hue outside the cluster:
fs_defaultfs=hdfs://10.42.11.117:8020
webhdfs_url=http://10.42.11.117:50070/webhdfs/v1

I have enabled WebHDFS in hdfs-site.xml on every node and added the hue proxyuser settings on every node, just like you described.
This is the error log I am talking about:
Current value: http://10.42.11.117:50070/webhdfs/v1
Failed to create temporary file “/tmp/hue_config_validation.8426362435317035138”

Then I tried creating a new folder instead of a file through the file browser, and it succeeded without any problem.
Then I tried to create a new file through the file browser in Hue and it errored; here is the log:
Cannot perform operation. Note: you are a Hue admin but not a HDFS superuser (which is “hadoop”).
(‘Connection aborted.’, gaierror(-2, ‘Name or service not known’))
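One plausible reading of the gaierror above: creating a folder only talks to the namenode, while writing a file is redirected by WebHDFS to a datanode by hostname, so the machine running Hue must be able to resolve every datanode's hostname. A sketch of the kind of entries needed (hostnames and IPs below are hypothetical):

```text
# /etc/hosts on the machine running Hue -- hypothetical names/addresses
10.42.11.117  namenode-host
10.42.11.118  datanode1-host
10.42.11.119  datanode2-host
```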

This is not always the case, at least during an update, e.g. yum update. I received these exact errors, and they were only resolved after combing through the hdfs-namenode logs and finding an error stating that the filesystem image had an old layout version and the service needed to be restarted with the upgrade option, e.g. /etc/init.d/hadoop-hdfs-namenode upgrade.

Hi, I am new to Hue and I am facing some problems when starting it.
If I go to the file browser, this error shows:
Cannot access: /user/hadoop. The HDFS REST service is not available. Note: You are a Hue admin but not a HDFS superuser (which is “hdfs”).

Yes, I am using hduser. Do you have a proper setup guide to install and run Hue on Ubuntu? I am using Ubuntu 14.04, on which I have installed Apache Hadoop 2.6.0, created an hduser, added the hduser to the hadoop group, and installed Hadoop under hduser. Then I installed Hue under hduser. After installing, I ran the Hue server, opened localhost:8000 for the Hue UI and logged in as hduser, but it shows an error when I go to the home page or file browser page. Please suggest a setup; I have followed your "configure on Ubuntu 14.04" guide. Please suggest something, or an installation video, because I have been trying for the last 7 days and couldn't resolve the issue.

Hi, I am facing this issue after doing everything as per your instructions. What should I do about it?

Configuration files located in /home/bigtapp/hue/desktop/conf

Potential misconfiguration detected. Fix and restart Hue.

hadoop.hdfs_clusters.default.webhdfs_url Current value: http://localhost:50070/webhdfs/v1
Failed to access filesystem root
desktop.secret_key Current value:
Secret key should be configured as a random string. All sessions will be lost on restart
Hive Editor Failed to access Hive warehouse: /user/hive/warehouse
Impala Editor No available Impalad to send queries to.
Oozie Editor/Dashboard The app won’t work without a running Oozie server
Pig Editor The app won’t work without a running Oozie server
Spark The app won’t work without a running Livy Spark Server

When I open http://localhost:50070/webhdfs/v1, the page shows this error:
{"RemoteException":{"exception":"UnsupportedOperationException","javaClassName":"java.lang.UnsupportedOperationException","message":"op=NULL is not supported"}}

I have Hadoop 2.6.0 installed in pseudo-distributed mode on my desktop, and I run scripts on the command line. I want Hue as my web UI to talk to my standalone installation. Please advise whether this is possible with the above Hue installation, and if so, what the conflict in the error below is.

Cannot access: /home/user/HA/data/tmp/usercache/user/appcache/application_1431511586495_0001/container_1431511586495_0001_01_000002/pig-job_1431511586495_0001.log. Note: You are a Hue admin but not a HDFS superuser (which is “user”).

[Errno 2] File /home/user/HA/data/tmp/usercache/user/appcache/application_1431511586495_0001/container_1431511586495_0001_01_000002/pig-job_1431511586495_0001.log not found

Are you also Jayakrishna from a week ago? If so, I would try logging into Hue as the "HA" user. That error looks like you are logged into Hue as the "admin" user, and your Hadoop setup makes it look like the Hue user doesn't have permission to look in the HDFS path "/home/user/HA/…". If that doesn't fix it, I would recommend taking this to the mailing list in case other people are running into the same problems: https://groups.google.com/a/cloudera.org/forum/#!forum/hue-user.

Hey, I am new to HBase and wanted to explore Hue for processing data. I have already configured HBase in pseudo-distributed mode. I have also installed Maven and verified it using the mvn -version command. Following this tutorial, when I reached the "make apps" command, it got stuck while downloading a dependency. The output is as follows:
cd /home/hduser/hue/maven && mvn install
[INFO] Scanning for projects…
[INFO]
[INFO] ————————————————————————
[INFO] Building Hue Maven Parent POM 3.8.1-SNAPSHOT
[INFO] ————————————————————————
[INFO]
[INFO] — maven-enforcer-plugin:1.0:enforce (default) @ hue-parent —
Downloading: https://repo.maven.apache.org/maven2/org/codehaus/plexus/plexus-interactivity-api/1.0-alpha-4/plexus-interactivity-api-1.0-alpha-4.jar
Any help would be appreciated.
Thanks in advance.

I am trying to set up a dev environment to use the Spark notebook. I was able to get it up without issues: when I run it using sudo ./build/env/bin/hue runserver it looks like it runs without any problems, and I am able to access the web UI at http://127.0.0.1:8000/. Please note that I don't have a Hadoop environment set up. Would it still be possible to use Hue only for Spark?

Validating models…

0 errors found
July 10, 2015 – 07:07:58
Django version 1.6.10, using settings ‘desktop.settings’
Starting development server at http://127.0.0.1:8000/
Quit the server with CONTROL-C.
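Running Hue without Hadoop is possible by blacklisting the apps that need services you don't run. A hue.ini sketch (the exact app names depend on your Hue version, so adjust the list accordingly):

```ini
[desktop]
  # Hide apps that require Hadoop-side services; keep spark available.
  # App names vary by Hue version -- this list is illustrative.
  app_blacklist=beeswax,impala,security,filebrowser,jobbrowser,rdbms,jobsub,pig,hbase,sqoop,zookeeper,metastore,oozie,indexer
```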

I am sure that I followed the demo and added a new action successfully using it.
I want to know whether the hue.ini configuration has an influence on Job Designer. At present, I just use Hue to manage my Hadoop jobs; I can upload and download files from HDFS through Hue. So what is the problem? Because I work in Beijing, I can't see your reply, so can we talk by email?
Thanks!

/about INFO:
desktop.secret_key Current value:
Secret key should be configured as a random string. All sessions will be lost on restart
SQLITE_NOT_FOR_PRODUCTION_USE SQLite is only recommended for small development environments with a few users.
Hive Editor Failed to access Hive warehouse: /user/hive/warehouse
Impala Editor No available Impalad to send queries to.
Oozie Editor/Dashboard The app won’t work without a running Oozie server
Pig Editor The app won’t work without a running Oozie server
Spark The app won’t work without a running Livy Spark Server

Hi.
I downloaded the same release you mentioned. I doubt autocompletion and showing table columns work in it. I saw a fix in apps/beeswax/src/beeswax/server/dbms.py, but it is not in the tarball release.
Regards,

Please help me with this error; I am getting it while building the Hue installer:

Installed /home/hue/hue/desktop/libs/notebook/src
make[2]: Leaving directory `/home/hue/hue/desktop/libs/notebook’
make ipdb[0.1dev-r1716] ipython[0.10] nose[0.11.3] coverage[3.7.1] nosetty[0.4] werkzeug[0.6] windmill[1.3] pylint[0.28.0]
make[2]: Entering directory `/home/hue/hue/desktop’
— Installing development tool: ipdb[0.1dev-r1716]
/home/hue/hue/build/env/bin/python2.7 /home/hue/hue/build/env/bin/easy_install -f http://archive.cloudera.com/desktop-sdk-python-packages/ \
-H pypi.python.org,archive.cloudera.com -qq ipdb==0.1dev-r1716
/home/hue/hue/build/env/local/lib/python2.7/site-packages/pkg_resources/__init__.py:2510: PEP440Warning: ‘xdiagnose (3.6.3build2)’ is being parsed as a legacy, non PEP 440, version. You may find odd behavior and sort order. In particular it will be sorted as less than 0.0. It is recommend to migrate to PEP 440 compatible versions.
PEP440Warning,
/home/hue/hue/build/env/local/lib/python2.7/site-packages/pkg_resources/__init__.py:2510: PEP440Warning: ‘python-debian (0.1.21-nmu2ubuntu2)’ is being parsed as a legacy, non PEP 440, version. You may find odd behavior and sort order. In particular it will be sorted as less than 0.0. It is recommend to migrate to PEP 440 compatible versions.
PEP440Warning,
Download error on http://archive.cloudera.com/desktop-sdk-python-packages/: [Errno -2] Name or service not known — Some packages may not be found!
Download error on https://pypi.python.org/simple/ipdb/: [Errno -2] Name or service not known — Some packages may not be found!
Couldn’t find index page for ‘ipdb’ (maybe misspelled?)
Download error on https://pypi.python.org/simple/: [Errno -2] Name or service not known — Some packages may not be found!
No local packages or download links found for ipdb==0.1dev-r1716
error: Could not find suitable distribution for Requirement.parse(‘ipdb==0.1dev-r1716′)
make[2]: *** [ipdb[0.1dev-r1716]] Error 1
make[2]: Leaving directory `/home/hue/hue/desktop’
make[1]: *** [/home/hue/hue/build/.devtools] Error 2
make[1]: Leaving directory `/home/hue/hue/desktop’
make: *** [desktop] Error 2
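The "Name or service not known" lines mean the build host cannot resolve external hostnames at all. If the machine sits behind an HTTP proxy, exporting the standard proxy variables before `make apps` may help, since easy_install honours them. A sketch (the proxy address below is hypothetical; replace it with your site's proxy):

```shell
# Hypothetical proxy address -- replace with your site's HTTP proxy
export http_proxy="http://proxy.example.com:3128"
export https_proxy="$http_proxy"
echo "$https_proxy"
```

If there is no proxy, check DNS on the build machine (e.g. /etc/resolv.conf) before retrying.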

Please help.
I have completed the Hue install process and hue runserver succeeds, but I can't connect to the Hue web UI. I tried 127.0.0.1:8000 and 127.0.0.1:8888, even myhost:8000 and myhost:8888. What's wrong?

I have installed Hue on my local Ubuntu 16.04 machine. I am struggling to point it to my remote HDP 2.5 cluster. I have made the required configurations on the cluster side, as well as in the hue.ini file on my local machine, following the [link](http://gethue.com/hadoop-hue-3-on-hdp-installation-tutorial/). But the Hive editor on the Hue portal says: "No databases found." and "Could not connect to :10000". Am I missing something? Thank you.
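The empty host in "Could not connect to :10000" suggests the HiveServer2 address is missing from the config. The relevant hue.ini section looks roughly like this (hiveserver2-host is a placeholder for wherever HiveServer2 runs on the HDP cluster):

```ini
[beeswax]
  # Placeholder: replace with the host actually running HiveServer2
  hive_server_host=hiveserver2-host
  hive_server_port=10000
```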

I am very new to big data technologies. I have set up Hadoop on my PC and was installing Apache Hue on Ubuntu too. I am facing an issue with Maven when I run the "make apps" command. Kindly guide me here; any help is appreciated.