Java – A loadFileSystems error occurred when calling a program that uses libhdfs

A loadFileSystems error occurred when calling a program that uses libhdfs; here is the problem and its solution.


The code below is a small libhdfs test program.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <fcntl.h>
#include "hdfs.h"

int main(int argc, char **argv)
{
    /* Connect to the NameNode on labossrv14, port 9000. */
    hdfsFS fs = hdfsConnect("hdfs://labossrv14", 9000);
    const char* writePath = "/libhdfs_test.txt";
    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY | O_CREAT, 0, 0, 0);
    if (!writeFile)
    {
        fprintf(stderr, "Failed to open %s for writing!\n", writePath);
        exit(-1);
    }
    /* Write the string including its terminating NUL byte. */
    const char* buffer = "Hello, libhdfs!";
    tSize num_written_bytes = hdfsWrite(fs, writeFile, (void*)buffer, strlen(buffer) + 1);
    if (hdfsFlush(fs, writeFile))
    {
        fprintf(stderr, "Failed to 'flush' %s\n", writePath);
        exit(-1);
    }
    hdfsCloseFile(fs, writeFile);
    hdfsDisconnect(fs);
    return 0;
}
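For reference, a build command along these lines is what I used to compile and link against libhdfs (the `HADOOP_HOME` and `JAVA_HOME` paths are assumptions based on my tarball install of Hadoop 2.5.2 and Oracle JDK 8; adjust them to your own layout):

```shell
# Assumed install locations; change these for your setup.
export HADOOP_HOME=/home/junzhao/hadoop/hadoop-2.5.2
export JAVA_HOME=/usr/lib/jvm/java-8-oracle

# Compile against the libhdfs header and link libhdfs plus the JVM library.
gcc libhdfs_test.c \
    -I"$HADOOP_HOME/include" \
    -L"$HADOOP_HOME/lib/native" -lhdfs \
    -L"$JAVA_HOME/jre/lib/amd64/server" -ljvm \
    -o libhdfs_test
```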

I eventually managed to compile this code, but the program fails at runtime. The error message is as follows.

loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsBuilderConnect(forceNewInstance=0, nn=labossrv14, port=9000, kerbTicketCachePath=(NULL), userName=(NULL)) error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
hdfsOpenFile(/libhdfs_test.txt): constructNewObjectOfPath error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)
Failed to open /libhdfs_test.txt for writing!

I experimented with this by following the official documentation, and found that the problem could be an incorrect CLASSPATH.
Below is my CLASSPATH, which combines the classpath generated by `hadoop classpath --glob` with the lib paths of the JDK and JRE.

export CLASSPATH=/home/junzhao/hadoop/hadoop-2.5.2/etc/hadoop:/home/junzhao/hadoop/hadoop-2.5.2/share/hadoop/common/lib/*:/home/junzhao/hadoop/hadoop-2.5.2/share/hadoop/common/*:/home/junzhao/hadoop/hadoop-2.5.2/share/hadoop/hdfs:/home/junzhao/hadoop/hadoop-2.5.2/share/hadoop/hdfs/lib/*:/home/junzhao/hadoop/hadoop-2.5.2/share/hadoop/hdfs/*:/home/junzhao/hadoop/hadoop-2.5.2/share/hadoop/yarn/lib/*:/home/junzhao/hadoop/hadoop-2.5.2/share/hadoop/yarn/*:/home/junzhao/hadoop/hadoop-2.5.2/share/hadoop/mapreduce/lib/*:/home/junzhao/hadoop/hadoop-2.5.2/share/hadoop/mapreduce/*:/contrib/capacity-scheduler/*.jar:/usr/lib/jvm/java-8-oracle/lib:/usr/lib/jvm/java-8-oracle/jre/lib:$CLASSPATH
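One thing worth checking (my assumption, not something the error message states directly): libhdfs launches the JVM through JNI, and JNI does not expand wildcard (`*`) entries in the CLASSPATH, so every jar must be listed explicitly. `hadoop classpath --glob` prints the fully expanded form, which can be captured directly rather than pasted by hand:

```shell
# Capture the fully expanded Hadoop classpath; JNI ignores `*` wildcards,
# so each jar must appear in CLASSPATH by its full path.
export CLASSPATH=$(hadoop classpath --glob)
```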

Does anyone have some good solutions? Thanks!
