Java – Access Hadoop from a remote machine


Access Hadoop from a remote machine

I set up Hadoop (pseudo-distributed) on a server VM and I am trying to use the Java API to access HDFS.

The fs.default.name on my server is set to hdfs://0.0.0.0:9000, since if it were hdfs://localhost:9000 it would not accept requests from remote hosts.
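That value corresponds to an entry like the following in the server's core-site.xml (a sketch of the setting as described above, not the actual file):

<configuration>
  <property>
    <!-- setting described in the question: bind to all interfaces -->
    <name>fs.default.name</name>
    <value>hdfs://0.0.0.0:9000</value>
  </property>
</configuration>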

I can connect to port 9000 on the server:

$ telnet srv-lab 9000
Trying 1*0.*.30.95...
Connected to srv-lab
Escape character is '^]'.
^C

This indicates to me that the connection should work. The Java code I use is:

import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

try {
    Path pt = new Path("hdfs://srv-lab:9000/test.txt");
    Configuration conf = new Configuration();
    conf.set("fs.default.name", "hdfs://srv-lab:9000");
    FileSystem fs = FileSystem.get(conf);
    BufferedReader br = new BufferedReader(new InputStreamReader(fs.open(pt)));
    String line = br.readLine();
    while (line != null) {
        System.out.println(line);
        line = br.readLine();
    }
} catch (Exception e) {
    e.printStackTrace();
}

But what I get is:

java.net.ConnectException: Call From clt-lab/1*0.*.2*2.205 to srv-lab:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

So, any hint as to why the connection is refused even though connecting via telnet works fine?

Solution

Your HDFS entry point is incorrect: fs.default.name on the server must be set to hdfs://srv-lab:9000. Change it and restart your cluster; that will solve the problem.
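As a minimal sketch (assuming srv-lab is the hostname remote clients resolve), the corrected entry in the server's core-site.xml would look like this; on Hadoop 2.x and later the same property is usually written under its newer name fs.defaultFS:

<configuration>
  <property>
    <!-- advertise the real hostname so remote clients can reach the NameNode -->
    <name>fs.default.name</name>
    <value>hdfs://srv-lab:9000</value>
  </property>
</configuration>

After editing core-site.xml, restart HDFS (for example with stop-dfs.sh followed by start-dfs.sh) so the NameNode picks up the new address.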
