Java – HDFS write results in “CreateSymbolicLink error (1314): A required privilege is not held by the client.”

Here is a solution to the problem.

An attempt was made to run an example MapReduce job for Apache Hadoop. The following exception occurred while running the job. Running hdfs dfs -chmod 777 / did not solve the problem.

15/03/10 13:13:10 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
15/03/10 13:13:10 WARN mapreduce.JobSubmitter: No job jar file set. User classes may not be found. See Job or Job#setJar(String).
15/03/10 13:13:10 INFO input.FileInputFormat: Total input paths to process : 2
15/03/10 13:13:11 INFO mapreduce.JobSubmitter: number of splits:2
15/03/10 13:13:11 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1425973278169_0001
15/03/10 13:13:12 INFO mapred.YARNRunner: Job jar is not present. Not adding any jar to the list of resources.
15/03/10 13:13:12 INFO impl.YarnClientImpl: Submitted application application_1425973278169_0001
15/03/10 13:13:12 INFO mapreduce.Job: The url to track the job: http://B2ML10803:8088/proxy/application_1425973278169_0001/
15/03/10 13:13:12 INFO mapreduce.Job: Running job: job_1425973278169_0001
15/03/10 13:13:18 INFO mapreduce.Job: Job job_1425973278169_0001 running in uber mode : false
15/03/10 13:13:18 INFO mapreduce.Job:  map 0% reduce 0%
15/03/10 13:13:18 INFO mapreduce.Job: Job job_1425973278169_0001 failed with state FAILED due to: Application application_1425973278169_0001 failed 2 times due to AM Container for appattempt_1425973278169_0001_000002 exited with exitCode: 1
For more detailed output, check application tracking page: http://B2ML10803:8088/proxy/application_1425973278169_0001/ Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1425973278169_0001_02_000001
Exit code: 1
Exception message: CreateSymbolicLink error (1314): A required privilege is not held by the client.

Stack trace:

ExitCodeException exitCode=1: CreateSymbolicLink error (1314): A required privilege is not held by the client.

    at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
    at org.apache.hadoop.util.Shell.run(Shell.java:455)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Shell output:

1 file(s) moved.

Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
15/03/10 13:13:18 INFO mapreduce.Job: Counters: 0
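
For context: when YARN launches a container it localizes resources by creating symbolic links, and on Windows the CreateSymbolicLink call requires the SeCreateSymbolicLinkPrivilege, which by default only an elevated (administrator) process holds — hence error 1314 from a normal prompt. A minimal standalone probe (no Hadoop dependencies; file names are arbitrary) can check whether the current account is allowed to create symlinks:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class SymlinkProbe {
    /** Returns true if this process can create a symbolic link. */
    public static boolean canCreateSymlink() {
        try {
            Path dir = Files.createTempDirectory("symlink-probe");
            Path target = Files.createFile(dir.resolve("target.txt"));
            // On Windows this is the call that fails with error 1314 when
            // the process lacks SeCreateSymbolicLinkPrivilege.
            Files.createSymbolicLink(dir.resolve("link.txt"), target);
            return true;
        } catch (IOException | UnsupportedOperationException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(canCreateSymlink()
                ? "This account can create symbolic links."
                : "Symlink creation failed; try an elevated (administrator) prompt.");
    }
}
```

If the probe fails from a normal command prompt but succeeds from one started with "Run as administrator", you are seeing the same privilege issue that breaks the container launch.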

Solution

Environment: Windows 8.1 + Hadoop 2.7.0 (built from source).

  1. Run the command prompt in administrator mode

  2. Execute etc\hadoop\hadoop-env.cmd

  3. Run sbin\start-dfs.cmd

  4. Run sbin\start-yarn.cmd

  5. Now try running your job
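
Separately, the two warnings at the top of the job output ("Implement the Tool interface…" and "No job jar file set") are unrelated to the 1314 error but worth fixing. A minimal driver sketch, assuming the Hadoop client jars are on the classpath — the class name and the omitted mapper/reducer wiring are illustrative, not taken from the original job:

```java
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class ExampleDriver extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        Job job = Job.getInstance(getConf(), "example job");
        // Ships the jar containing this class, addressing
        // "No job jar file set. User classes may not be found."
        job.setJarByClass(ExampleDriver.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Mapper/reducer/output-type setup would go here (omitted).
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner parses generic options such as -D and -conf,
        // addressing the "command-line option parsing" warning.
        System.exit(ToolRunner.run(new ExampleDriver(), args));
    }
}
```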
