Java – Hadoop Map-Reduce output file exception


I’m getting this error while running a single-node Hadoop cluster on an Amazon EC2 d2.2xlarge instance. I also can’t view my output. Can anyone give me the correct steps to fix this?

"Caused by: org.apache.hadoop.util.DiskChecker$DiskErrorException: Could not
 find any valid local directory for output/file.out"

Here are the steps I performed.

bin/hdfs dfsadmin -safemode leave                            
bin/hadoop fs -mkdir /inputfiles    
bin/hadoop dfsadmin -safemode leave    
bin/hadoop fs -mkdir /output    
bin/hdfs dfsadmin -safemode leave       
bin/hadoop fs -put input1 /inputfiles    
bin/hdfs dfsadmin -safemode leave   
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar wordcount /inputfiles /output

Solution

You should not create the output directory for a MapReduce job yourself: Hadoop creates it when the job runs, and the job will fail if the directory already exists.

Delete this command

bin/hadoop fs -mkdir /output  

and change the last command to

bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar wordcount /inputfiles /output1
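
If the /output directory from the earlier mkdir (or from a half-finished run) is still present on HDFS, you can also remove it first; hadoop fs -rm -r is the recursive delete in Hadoop 2.x:

bin/hadoop fs -rm -r /output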

Make sure you have permission to create output1 under / (the HDFS root).

If not, I prefer the directory structure below: /home/your_user_name/input as the input directory and /home/your_user_name/output as the output directory.
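
Once the job finishes successfully, the word counts are written to part files inside the output directory (part-r-00000 with the default single reducer, next to an empty _SUCCESS marker), so you can check the results with something like:

bin/hadoop fs -ls /output1
bin/hadoop fs -cat /output1/part-r-00000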
