
This happens because the parent directories do not exist yet either. Run:

$ hdfs dfs -mkdir -p /user/Hadoop/twitter_data

The -p flag creates any missing parent directories leading up to the given directory as well.
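A minimal sketch of creating the directory and confirming it exists (assuming HDFS is running and your user may write under /user):

```shell
# Create the target directory, including any missing parents, in one call.
hdfs dfs -mkdir -p /user/Hadoop/twitter_data

# Verify: list the parent directory; twitter_data should now appear.
hdfs dfs -ls /user/Hadoop
```

Without -p, mkdir fails if /user/Hadoop does not already exist; with it, the whole path is created in one step.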

As for the question you posed in the comments, open the NameNode web UI in your browser: http://<host name of the namenode>:<port number>/ (by default the web UI port is 50070 on Hadoop 1.x/2.x and 9870 on Hadoop 3.x).

If that does not help, you can also set the cluster up from scratch and then create the directory, using the following steps:

1) With Hadoop stopped, format the NameNode (note: formatting erases all existing HDFS metadata, so only do this on a fresh or disposable cluster):

$ hadoop namenode -format

2) Start Hadoop:

$ start-all.sh

3) Create the parent directory first, then create the subdirectory inside it:
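For example, using the paths from the question (a sketch; it assumes the daemons started in step 2 are up):

```shell
# First create the parent directory under /user...
hdfs dfs -mkdir /user/Hadoop

# ...then create the subdirectory inside it.
hdfs dfs -mkdir /user/Hadoop/twitter_data
```

This is equivalent to the single `hdfs dfs -mkdir -p /user/Hadoop/twitter_data` call above; the two-step form just makes each level explicit.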