Mongo-Hadoop simple test failed due to NPE
This is an open question I posted in the support forum, but since I didn't get any response I thought I'd try asking here.
I have an application that uses MongoDB as its data layer. Currently I'm using Mongo's map-reduce mechanism, but I'm facing some performance issues, so I thought about using Hadoop to implement the logic.
I've successfully run the Treasury yield example, and it occurred to me to create a simple project just to learn about the mongo-hadoop driver. So I created a project with the appropriate jar files on the build path and ran it.
Here is my Java code:
final Configuration conf = new Configuration();
MongoConfigUtil.setInputURI( conf, "mongodb://username:[email protected]/locations" );
MongoConfigUtil.setOutputURI( conf, "mongodb://localhost/test.out" );
System.out.println( "Conf: " + conf );
final Job job = new Job( conf, "word count" );
job.setJarByClass( WordCount.class );
job.setMapperClass( TokenizerMapper.class );
job.setCombinerClass( IntSumReducer.class );
job.setReducerClass( IntSumReducer.class );
job.setOutputKeyClass( Text.class );
job.setOutputValueClass( IntWritable.class );
job.setInputFormatClass( MongoInputFormat.class );
job.setOutputFormatClass( MongoOutputFormat.class );
System.exit( job.waitForCompletion( true ) ? 0 : 1 );
But I get this error:
Conf: Configuration: core-default.xml, core-site.xml
12/05/20 14:12:03 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
12/05/20 14:12:03 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
12/05/20 14:12:03 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
12/05/20 14:12:03 INFO mapred.JobClient: Cleaning up the staging area file:/tmp/hadoop-maximos/mapred/staging/maximos1261801897/.staging/job_local_0001
Exception in thread "main" java.lang.NullPointerException
    at java.util.concurrent.ConcurrentHashMap.get(ConcurrentHashMap.java:796)
    at com.mongodb.DBApiLayer.doGetCollection(DBApiLayer.java:116)
    at com.mongodb.DBApiLayer.doGetCollection(DBApiLayer.java:43)
    at com.mongodb.DB.getCollection(DB.java:81)
    at com.mongodb.hadoop.util.MongoSplitter.calculateSplits(MongoSplitter.java:51)
    at com.mongodb.hadoop.MongoInputFormat.getSplits(MongoInputFormat.java:51)
    at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:962)
    at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:979)
    at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:897)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
    at com.mongodb.hadoop.examples.wordcount.WordCount.main(WordCount.java:100)
What am I doing wrong? Is this a Mongo, Hadoop, or Mongo-Hadoop problem?
Solution
It looks like you forgot to specify the name of the collection you are reading the data from.
In the example, the line looks like this:
MongoConfigUtil.setInputURI( conf, "mongodb://localhost/test.in" );
However, in your code I see:
MongoConfigUtil.setInputURI( conf, "mongodb://username:[email protected]/locations" );
I'm not sure whether locations is a collection name or a database name. If it's a collection, try prefixing it with the database name; if it's a database, append .yourcollectionname to the end of it.
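To make the database.collection convention concrete, here is a small sketch using only plain JDK code (no Mongo driver involved). It splits the namespace part of an input URI the same way the connection string does in the working example ("test.in" = database test, collection in); the names mydb and locations are hypothetical:

```java
public class MongoNamespaceDemo {

    // Split a "database.collection" namespace, as used in the path of a
    // Mongo input URI such as "mongodb://localhost/test.in", into its
    // database and collection parts.
    static String[] splitNamespace(String namespace) {
        int dot = namespace.indexOf('.');
        if (dot < 0) {
            // Only one component, as in ".../locations" above:
            // the collection part is missing.
            return new String[] { namespace, null };
        }
        return new String[] { namespace.substring(0, dot),
                              namespace.substring(dot + 1) };
    }

    public static void main(String[] args) {
        // "locations" alone leaves the collection unresolved.
        String[] bad = splitNamespace("locations");
        System.out.println(bad[0] + " / " + bad[1]);   // prints "locations / null"

        // "mydb.locations" names both database and collection.
        String[] good = splitNamespace("mydb.locations");
        System.out.println(good[0] + " / " + good[1]); // prints "mydb / locations"
    }
}
```

So with a namespace like mydb.locations, both parts resolve, which is what the splitter needs before it can call getCollection.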