Wednesday, October 14, 2009

Hadoop Error: "ls: Cannot access .: No such file or directory."

When we switched servers and I started a fresh Hadoop (pseudo-distributed) cluster on the new machine, I was greeted with the following error:

"ls: Cannot access .: No such file or directory."

In the logs, I found this exception:

"org.apache.hadoop.ipc.RemoteException: java.io.IOException: File could only be replicated to 0 nodes, instead of 1"

I went on a wild goose chase to see whether the cause was multiple NICs and the like, but no luck.
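Incidentally, a quick way to check whether any datanodes have actually registered with the namenode (which is usually what the "replicated to 0 nodes" message is complaining about) is the dfsadmin report:

    # Prints capacity and the list of live datanodes.
    # If "Datanodes available" is 0, the datanode never came up
    # or cannot reach the namenode.
    hadoop dfsadmin -report

In my case the datanode was fine, so the hunt continued.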

Since I was using the default configuration, the data directory lived under /tmp, and it was possible that some restrictions had been placed on /tmp. So I moved the data location to my home area and recreated the cluster. The logs no longer showed any exception, but the "ls" command still refused to show anything meaningful.
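For reference, here is roughly what that change looks like in a Hadoop 0.20-style core-site.xml; the path under the home directory is just an illustrative placeholder:

    <!-- core-site.xml: move Hadoop's working area out of /tmp.
         The value below is a hypothetical example; use any directory
         your user can write to. dfs.name.dir and dfs.data.dir default
         to subdirectories of hadoop.tmp.dir, so this one property
         relocates the HDFS data as well. -->
    <configuration>
      <property>
        <name>hadoop.tmp.dir</name>
        <value>/home/youruser/hadoop-data</value>
      </property>
    </configuration>

Note that after a change like this the namenode has to be formatted again (hadoop namenode -format) before restarting the daemons, which is what "recreated the cluster" amounts to here.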

After browsing what seemed like a million pages, I landed on this page, where Philip had written this:
"hadoop dfs -ls" defaults to /user/, so if that directory doesn't exist, it'll show you that error. Try "hadoop dfs -ls /".
Voila. That was true. "hadoop fs -ls /" showed me "/tmp" as the only location. I created the /user/<username> directory, and "hadoop fs -ls" started behaving correctly.
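In case it helps, the sequence looked roughly like this ($USER stands in for whatever user runs the Hadoop commands):

    hadoop fs -ls /               # shows only /tmp
    hadoop fs -mkdir /user        # create the parent directory
    hadoop fs -mkdir /user/$USER  # create the home directory for the current user
    hadoop fs -ls                 # now resolves to /user/$USER and lists correctly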

So if you get this error, try this before digging into any exceptions. It should save you a few hours.
