How Do You Troubleshoot “Segmentation fault” Errors in Hadoop/HDFS?

Problem scenario
You get a "Segmentation fault" or "Segmentation fault (core dumped)" error when you run any "hdfs" command. (This is open-source Hadoop, not a proprietary vendor distribution.) What should you do?

Solution

Root cause
The most likely culprit is runaway recursion: each recursive call pushes a new frame onto the stack, and when the stack overflows, the operating system kills the process with a segmentation fault. With "hdfs" commands this usually traces back to a configuration problem in your Hadoop/HDFS installation (for example, a script or environment variable that causes a command to re-invoke itself). If the installation used to work, compare the current configuration against your backup configurations in Git. If it never worked, the problem will be harder to isolate and correct.
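A few quick checks can narrow things down before you resort to a reinstall. This is a minimal diagnostic sketch; the default config path and variable names below (e.g. HADOOP_CONF_DIR pointing at /usr/local/hadoop/etc/hadoop) are assumptions, so adjust them for your installation:

```shell
#!/bin/sh
# Diagnostic sketch; the HADOOP_CONF_DIR default is an assumption.
HADOOP_CONF_DIR="${HADOOP_CONF_DIR:-/usr/local/hadoop/etc/hadoop}"

# A missing or mispointed JAVA_HOME is a common cause of hdfs startup crashes.
echo "JAVA_HOME=${JAVA_HOME:-<not set>}"

# An undersized stack limit makes runaway recursion crash sooner.
echo "stack limit (kB): $(ulimit -s)"

# Allow core dumps so the next crash leaves a core file you can inspect,
# e.g. with: gdb "$(command -v hdfs)" core
# A backtrace full of identical repeated frames points to runaway recursion.
ulimit -c unlimited 2>/dev/null && echo "core dumps enabled"

# List the most recently modified config files; if the install used to work,
# compare these against your Git-tracked backups (git diff).
if [ -d "$HADOOP_CONF_DIR" ]; then
  ls -lt "$HADOOP_CONF_DIR" | head -n 5
else
  echo "config dir not found: $HADOOP_CONF_DIR"
fi
```

If the backtrace in the core file shows the same frame repeated thousands of times, that confirms the recursion theory and tells you which script or binary to focus on.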

Possible solution
If the configuration cannot be repaired, reinstalling Hadoop is often the fastest fix. You may want to see this posting for assistance.
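Before reinstalling, it is worth preserving the current configuration so you can diff it against the fresh install later. A minimal sketch, again assuming a default config location that you should adjust for your layout:

```shell
#!/bin/sh
# Back up the existing configuration before reinstalling.
# The HADOOP_CONF_DIR default is an assumption; adjust for your install.
HADOOP_CONF_DIR="${HADOOP_CONF_DIR:-/usr/local/hadoop/etc/hadoop}"
BACKUP_DIR="$HOME/hadoop-conf-backup-$(date +%Y%m%d)"

if [ -d "$HADOOP_CONF_DIR" ]; then
  cp -r "$HADOOP_CONF_DIR" "$BACKUP_DIR"
  echo "configuration backed up to $BACKUP_DIR"
else
  echo "nothing to back up: $HADOOP_CONF_DIR does not exist"
fi
```

After the fresh install works, a `diff -r` between the backup and the new config directory will often reveal exactly which setting caused the original crash.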
