How Do You Troubleshoot “Segmentation fault” Errors in Hadoop/HDFS?

Problem scenario
You get a "Segmentation fault" or "Segmentation fault (core dumped)" error when you run any "hdfs" command. What should you do?
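
For context, the failure looks the same no matter which subcommand you run (`hdfs dfs -ls /` is just one example):

```
$ hdfs dfs -ls /
Segmentation fault (core dumped)
```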

Solution

Root cause
There is probably an infinite loop or runaway recursion, usually caused by a configuration problem in your Hadoop/HDFS installation. Each recursive call pushes a new frame onto the stack; unbounded recursion eventually exhausts the stack limit, and the resulting invalid memory access is reported as a segmentation fault. If the installation used to work, compare the current configuration against backups in Git. If it never worked, the problem will be harder to track down.
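
A few diagnostic steps, offered as a sketch rather than a fixed procedure. These assume a Linux host with Hadoop under $HADOOP_HOME and a configuration directory tracked in Git (the paths shown are illustrative):

```
# Confirm the JVM itself runs; a broken JAVA_HOME is a common culprit.
echo "$JAVA_HOME"
"$JAVA_HOME/bin/java" -version

# Trace the hdfs wrapper script to see how far it gets before the crash.
bash -x "$(which hdfs)" version

# If the config directory is under Git, diff it against the last
# known-good state.
cd "$HADOOP_HOME/etc/hadoop"    # illustrative path
git diff HEAD

# Enable core dumps, reproduce the crash, and inspect the backtrace;
# thousands of repeated frames point to runaway recursion.
ulimit -c unlimited
hdfs dfs -ls /                  # any hdfs command that segfaults
gdb "$JAVA_HOME/bin/java" core  # core file name varies by system; 'bt' prints the stack
```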

Possible solution
Reinstall Hadoop.

(This applies to the open-source version of Hadoop, not to a specific vendor's distribution.)
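
As a rough sketch of a clean reinstall of open-source Apache Hadoop from a release tarball (the 3.3.6 version number and /opt install path below are illustrative; pick whatever release and location you need):

```
# Download and unpack a fresh release tarball.
cd /opt
curl -LO https://archive.apache.org/dist/hadoop/common/hadoop-3.3.6/hadoop-3.3.6.tar.gz
tar -xzf hadoop-3.3.6.tar.gz

# Point the environment at the new installation.
export HADOOP_HOME=/opt/hadoop-3.3.6
export PATH="$HADOOP_HOME/bin:$PATH"

# Verify the fresh binaries work before restoring any custom configuration;
# if the segfault is gone now but returns after you copy your old configs
# back, the problem is in those configs.
hadoop version
```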
