
After running start-dfs.sh, I can navigate to http://localhost:9870. The NameNode seems to be running just fine.

Then I click on "Utilities -> Browse the file system" and the browser shows this error:

Failed to retrieve data from /webhdfs/v1/?op=LISTSTATUS: Server Error

Digging into the logfile ($HADOOP_HOME/logs/hadoop-xxx-namenode-xxx.log), I find this:

2018-11-30 16:47:25,097 WARN org.eclipse.jetty.servlet.ServletHandler: Error for /webhdfs/v1/
java.lang.NoClassDefFoundError: javax/activation/DataSource
    at com.sun.xml.bind.v2.model.impl.RuntimeBuiltinLeafInfoImpl.<clinit>(RuntimeBuiltinLeafInfoImpl.java:457)
    at com.sun.xml.bind.v2.model.impl.RuntimeTypeInfoSetImpl.<init>(RuntimeTypeInfoSetImpl.java:65)
    at com.sun.xml.bind.v2.model.impl.RuntimeModelBuilder.createTypeInfoSet(RuntimeModelBuilder.java:133)

So a class is missing. Why is that, and how do I fix the problem?

Martin Andersson
  • The stack trace shows that you have the JAXB implementation on your class path but you are missing JAF. Add it and you will at least get past the NCDF error. – Alan Bateman Dec 01 '18 at 08:33
  • Yeah that's what I posted as an answer to my own question below. The URL is in the code snippet towards the bottom. – Martin Andersson Dec 02 '18 at 09:13

1 Answer


Java 9 deprecated the java.activation module. Java 11 removed it completely.

Java 9 and Java 10 users can add the module back by passing --add-modules to the JVM. Put this in $HADOOP_CONF_DIR/hadoop-env.sh (not tested):

export HADOOP_OPTS="${HADOOP_OPTS} --add-modules java.activation"
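If you are not sure which camp you are in, you can check whether your JDK still ships the module at all. java --list-modules (available since Java 9) prints every module in the runtime image:

# Prints a line like "java.activation@10.0.2" on Java 9 and 10;
# prints nothing on Java 11+, where the module was removed
java --list-modules | grep java.activation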

Java 11 users must first download the jar dependency and make it available on the classpath. But where does it go?

I found that putting the jar in any one of these locations will make Hadoop automagically pick it up, with the effect that the online file explorer starts working:

$HADOOP_HOME/share/hadoop/common
$HADOOP_HOME/share/hadoop/common/lib
$HADOOP_HOME/share/hadoop/mapreduce
$HADOOP_HOME/share/hadoop/mapreduce/lib
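The simplest variant is a plain copy into one of those folders (a sketch; the jar name matches the one we download further below):

# Assumes javax.activation-api-1.2.0.jar is in the current directory
cp javax.activation-api-1.2.0.jar "$HADOOP_HOME/share/hadoop/common/lib/"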

I am not sure what exactly the consequences are of putting the file in one folder versus another. But I like to confine my hacks as much as possible, and since I already have a separate configuration directory (i.e., not $HADOOP_HOME/etc/hadoop), I'd like to put the jar there. Keeping the jar in any location other than the four above also requires us to add that path to the HADOOP_CLASSPATH variable.

So, copy-paste into your terminal:

# Download the JAF API jar into a lib folder under the configuration directory
URL=https://jcenter.bintray.com/javax/activation/javax.activation-api/1.2.0/javax.activation-api-1.2.0.jar
wget "$URL" -P "$HADOOP_CONF_DIR/lib"
# Single quotes on purpose: the variables should be expanded when
# hadoop-env.sh is sourced, not when this line is written
echo 'export HADOOP_CLASSPATH+=" $HADOOP_CONF_DIR/lib/*.jar"' >> "$HADOOP_CONF_DIR/hadoop-env.sh"
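Then restart the daemons (stop-dfs.sh followed by start-dfs.sh) so the new classpath takes effect. To verify that the jar actually made it onto Hadoop's classpath, you can use the standard hadoop classpath command (the --glob flag expands wildcard entries):

# Should print the full path to the activation jar
hadoop classpath --glob | tr ':' '\n' | grep -i activation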

As a final note, I think it's safe to say that one cannot expect Hadoop to work well on anything but really old Java versions. Googling reveals that there are still open tickets for Java 9, 10 and 11, so essentially, this is a Hadoop problem. That said, although we solved the one problem of getting the online file explorer to work, there will surely be many other issues down the line.

Martin Andersson