The Eclipse Hadoop plugin displays “java.io.EOFException” when trying to connect… here is a solution to the problem.
- I’m trying to set up Hadoop in Eclipse by following this page
- I’m using the Hadoop Eclipse plugin jar from here
- My core-site.xml looks like the following:
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:54310</value>
    <description>The name of the default file system. A URI whose scheme and authority determine the FileSystem implementation. The uri's scheme determines the config property (fs.SCHEME.impl) naming the FileSystem implementation class. The uri's authority is used to determine the host, port, etc. for a filesystem.</description>
  </property>
</configuration>
- My mapred-site.xml has the following:
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:54311</value>
  <description>The host and port that the MapReduce job tracker runs at. If "local", then jobs are run in-process as a single map and reduce task.</description>
</property>
I set up the Hadoop location under the Map/Reduce perspective in Eclipse:
Location Name: local
Map/Reduce Master
- Host: localhost
- port: 54310
DFS Master
- Host: localhost
- port: 54311
I get an error when I try to connect:
Error: Call to localhost/127.0.0.1:54311 failed on local exception: java.io.EOFException
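Before touching the plugin, it can help to confirm which daemon is actually listening on each port. This is not from the original post — just a minimal probe sketch, assuming the host and port numbers from the configs above:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Probe the NameNode and JobTracker ports from core-site.xml / mapred-site.xml.
for port in (54310, 54311):
    print(port, "open" if port_open("localhost", port) else "closed")
```

If only one of the two ports is open, or they are the reverse of what Eclipse expects, that points at a port mix-up rather than a plugin bug.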
- Can someone help me with this?
Thanks
Solution
This isn’t hard to fix. A Google search turns up https://issues.apache.org/jira/browse/MAPREDUCE-1280
Use the patched plugin jar from that issue in your Eclipse
$ cat mapred-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
<name>mapred.job.tracker</name>
<value>localhost:8021</value>
</property>
</configuration>
I set the Map/Reduce Master port to 8021 and the DFS Master port to 8020.
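A core-site.xml consistent with those Eclipse settings would look like this — a sketch, assuming 8020 as the HDFS NameNode port (this file is not shown in the original answer):

```xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>
```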
I think you made the same mistake I did: the DFS Master should use the fs.default.name port (54310 in your case) and the Map/Reduce Master should use the mapred.job.tracker port (54311), but you have them the other way around. Swap the port numbers in your Eclipse configuration and it should work.