
I have a cluster running Hadoop 2.6.5 (one master that works as both namenode and datanode, and two slaves) that I built using VirtualBox (each node has Xubuntu 16.04 installed).

The installation appears to be correct, since I ran a wordcount example and it completed successfully.

On master:50070 (where I see the namenode information), I get this:

"Max Non Heap Memory is -1 B." 

Do you know what this means? I couldn't find the answer, and I want to check it because after running wordcount I tried to run my own program and it was not successful, even though it runs fine on my single-node Hadoop installation.

I hope this is clear; please let me know if you need more information.

Thank you!

  • Please update your answer with the error you get when you run your program in cluster mode. – Ramesh Maharjan May 21 '17 at 01:43
  • But the `Max Non Heap Memory is -1 B.` message is always there when I open http://master:50070/dfshealth.html#tab-overview . I think it does not depend on the application that I ran. That URL shows me the following: `Security is off. Safemode is off. 2543 files and directories, 2081 blocks = 4624 total filesystem object(s). Heap Memory used 535.94 MB of 3.87 GB Heap Memory. Max Heap Memory is 3.87 GB. Non Heap Memory used 48.08 MB of 48.88 MB Committed Non Heap Memory. Max Non Heap Memory is -1 B. ` – mjbsgll May 21 '17 at 20:28

1 Answer


Even though I haven't used Hadoop that much, when I go to the web UI I get the same message, "Max Non Heap Memory is -1 B." As far as I know, it simply means that the maximum non-heap memory is undefined, i.e. no explicit limit is configured for that memory pool.

Link to source of my answer
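The value the namenode UI shows comes straight from the JVM's `MemoryMXBean`. Here is a minimal sketch (the class name `NonHeapCheck` is mine, and it assumes default JVM settings with no explicit metaspace cap) showing that `MemoryUsage.getMax()` for the non-heap pool reports -1 when no limit is set:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class NonHeapCheck {
    public static void main(String[] args) {
        MemoryUsage nonHeap =
                ManagementFactory.getMemoryMXBean().getNonHeapMemoryUsage();
        // getMax() is documented to return -1 when the maximum is undefined,
        // e.g. when no -XX:MaxMetaspaceSize (or similar cap) is configured.
        System.out.println("Max non-heap memory: " + nonHeap.getMax());
    }
}
```

Hadoop just formats that -1 as "-1 B" in the UI, so it is informational rather than an error.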

  • Oh, I see... it's not an error. Thank you! Now I need to find out why my application crashes on a multi-node cluster but not on a single-node one. – mjbsgll May 21 '17 at 20:51