How Do You Know If YARN or Hadoop’s NameNode Services Are Running?

Problem scenario
How do you know if YARN or Hadoop’s NameNode services are running or not?

Solution
Use the “sudo jps” command.  It lists the running Java (JVM) processes; if the services are up, you will see entries such as NameNode and ResourceManager.  Some Linux users may not have sufficient permissions to run it, and on some systems jps is not installed at all.  
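
A quick check could look like the following (a minimal sketch; the daemon names assume a standard Hadoop/YARN deployment):

```shell
# List Java processes and look for the Hadoop/YARN daemons.
# NameNode belongs to HDFS; ResourceManager and NodeManager belong to YARN.
if command -v jps >/dev/null 2>&1; then
  sudo jps | grep -E 'NameNode|ResourceManager|NodeManager'
else
  # Fall back to ps if jps is unavailable; the bracketed first letter
  # keeps the grep process itself out of the results.
  ps -ef | grep -E '[N]ameNode|[R]esourceManager|[N]odeManager'
fi
```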

If the command is not found, go to the option below for your distribution of Linux:

If you are running a Red Hat derivative (e.g., Fedora or CentOS),

 » Read more..

How Do You Troubleshoot the Maven Build Failure of No “POM in this directory”?

Problem scenario
You need to run the following command on a CentOS/RedHat/Fedora Linux server in the /home/hadoop/ directory:

mvn package -Pdist,native -DskipTests -Dtar

The Maven build is failing.  When you run the command with the “-e” flag (to show full error details), you see this:

[INFO] BUILD FAILURE

[WARNING] The requested profile "dist" could not be activated because it does not exist.
[WARNING] The requested profile "native" could not be activated because it does not exist.
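
These warnings typically mean Maven could not find the pom.xml that defines the dist and native profiles, i.e., the command was run from the wrong directory.  A hedged first check (the source path in the comment is hypothetical):

```shell
# The dist and native profiles are defined in the Hadoop source tree's pom.xml.
# If there is no pom.xml in the current directory, Maven cannot see them.
ls pom.xml || echo "No pom.xml here -- cd into the unpacked Hadoop source first"
# Example (path is an assumption about where the source was unpacked):
# cd /home/hadoop/hadoop-src && mvn package -Pdist,native -DskipTests -Dtar
```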

 » Read more..

How Do You Troubleshoot an hdfs Command with the Error “java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)…ConnectionRefused”?

Problem scenario
When trying to run an hdfs command you see this message:

WARN ipc.Client: Failed to connect to server: localhost/127.0.0.1:54310: try once and fail.
java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:681)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:777)
at org.apache.hadoop.ipc.Client$Connection.access$3500(Client.java:409)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1542)
at org.apache.hadoop.ipc.Client.call(Client.java:1373)
at org.apache.hadoop.ipc.Client.call(Client.java:1337)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:787)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:398)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:163)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:155)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:335)
at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1700)
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1436)
at org.apache.hadoop.hdfs.DistributedFileSystem$27.doCall(DistributedFileSystem.java:1433)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1433)
at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:64)
at org.apache.hadoop.fs.Globber.doGlob(Globber.java:269)
at org.apache.hadoop.fs.Globber.glob(Globber.java:148)
at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1685)
at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:326)
at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:235)
at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:218)
at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:103)
at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:315)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:378)
ls: Call From contint/10.10.10.10 to localhost:54310 failed on connection exception: java.net.ConnectException: Connection refused;  » Read more..
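
"Connection refused" on localhost:54310 usually means nothing is listening on the NameNode port.  A few hedged checks (the port comes from the error above; the paths are assumptions about where Hadoop is installed):

```shell
# 1. Is a NameNode JVM running at all?
sudo jps | grep NameNode
# 2. Is anything listening on the port from the error message?
ss -ltn | grep 54310
# 3. Does fs.defaultFS in core-site.xml point at the same host:port?
grep -A1 'fs.default' /home/hadoop/hadoop/etc/hadoop/core-site.xml
# If the NameNode is simply not running, starting HDFS may resolve it:
# /home/hadoop/hadoop/sbin/start-dfs.sh
```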

How Does Azure Compare to AWS?

Problem scenario
You want to know some key differences between Azure and AWS.  You do not know where to begin.

Answer (Updated on 1/5/18.)
Advantages of Azure:  When you create a server, you have to click “Purchase,” so it is clear that you will be billed for the Azure server.  AWS, by contrast, almost hides the monetary transaction from the process of creating a server. 

 » Read more..

How Do You Copy a File into HDFS without the Error “No such file or directory”?

Problem scenario
You are trying to run a basic Hadoop command to copy a file into HDFS.  You get this error:

copyFromLocal: `hdfs://localhost:54310/user/…': No such file or directory

How do you copy a file from your OS into HDFS?

Solution
Do one of the following:
Option 1.  Run this command to create a new directory (substitute “jdoe” with the name of your user):

hdfs dfs -mkdir -p /user/jdoe/contint
# Now repeat your copy command
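
Once the directory exists, the copy should succeed.  For example (the file name and “jdoe” are placeholders):

```shell
# Create the user's directory in HDFS, copy a local file into it,
# then list the directory to verify the file arrived.
hdfs dfs -mkdir -p /user/jdoe/contint
hdfs dfs -copyFromLocal ./somefile.txt /user/jdoe/contint/
hdfs dfs -ls /user/jdoe/contint
```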

Option 2. 

 » Read more..

How Do You Install kubectl, kubeadm, and kubelet on a CentOS/RHEL/Fedora Server?

Problem scenario
You have a Red Hat derivative distribution of Linux.  You want to use some core Kubernetes utilities.  How do you install kubectl, kubeadm, and kubelet on Linux?

Solution
1.  Create a file named /etc/yum.repos.d/kubernetes.repo with the following content:

[kubernetes]
name=Kubernetes
baseurl=https://packages.cloud.google.com/yum/repos/kubernetes-el7-x86_64
enabled=1
gpgcheck=1
repo_gpgcheck=1
gpgkey=https://packages.cloud.google.com/yum/doc/yum-key.gpg
       https://packages.cloud.google.com/yum/doc/rpm-package-key.gpg

2.  Run this: sudo yum -y install kubelet kubeadm kubectl
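
To confirm the installation worked, a quick check such as this can help (output will vary by system):

```shell
# Verify each binary is installed and on the PATH.
for tool in kubectl kubeadm kubelet; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: installed"
  else
    echo "$tool: NOT FOUND"
  fi
done
# kubelet normally runs as a service rather than interactively:
# sudo systemctl enable --now kubelet
```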

 » Read more..

How Do You Troubleshoot a Fatal HDFS Error?

Problem scenario
You run an hdfs command and you get this:

[Fatal Error] core-site.xml:2:6: The processing instruction target matching "[xX][mM][lL]" is not allowed.
17/09/25 04:21:00 FATAL conf.Configuration: error parsing conf core-site.xml
org.xml.sax.SAXParseException; systemId: file:/home/hadoop/hadoop/etc/hadoop/core-site.xml; lineNumber: 2; columnNumber: 6; The processing instruction target matching "[xX][mM][lL]" is not allowed.
        at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
        at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
        at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2531)
        at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2519)
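
This parser error almost always means something (often a blank line or stray characters) precedes the <?xml ...?> declaration, which must start at line 1, column 1.  A hedged check (the path comes from the error message above):

```shell
# Show the first two lines; the XML declaration must be the very first
# thing in the file, with nothing (not even a blank line) before it.
head -2 /home/hadoop/hadoop/etc/hadoop/core-site.xml
# A well-formed file starts like this:
# <?xml version="1.0" encoding="UTF-8"?>
# <configuration>
```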
       

 » Read more..

How Do You Troubleshoot Nginx Not Being Removed from Ubuntu Linux Like It Should?

Problem scenario
You want to remove Nginx from an Ubuntu server.  You run this: sudo apt-get remove nginx

But you receive this error:

Reading package lists… Done
Building dependency tree
Reading state information… Done
You might want to run 'apt-get -f install' to correct these:
The following packages have unmet dependencies:
 nginx-dbg : Depends: nginx (= 1.12.1-1~xenial)
E: Unmet dependencies.
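
The nginx-dbg package depends on an exact nginx version, so removing nginx alone leaves apt with unmet dependencies.  Two hedged ways forward:

```shell
# Option A: remove the debug package together with nginx.
sudo apt-get remove nginx nginx-dbg
# Option B: let apt repair the dependency state first, as the error
# message itself suggests, then retry the removal.
# sudo apt-get -f install
# sudo apt-get remove nginx
```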

 » Read more..

How Do You Eliminate Highlighting of Text in MS Word 2016?

Problem scenario
Some text in MS Word 2016 is highlighted with a color as part of its formatting.  No matter what you do (including selecting it and clicking “No Color”), the text remains highlighted.  What should you do?

Solution
Select the text (with the mouse, or by holding Shift and pressing an arrow key), then press Ctrl and the Spacebar (hold Ctrl and tap the Spacebar).  This resets the character formatting, including the highlight, to the default. 

 » Read more..

How Do You Get around the Message “AUTHENTICATING FOR org.freedesktop.systemd1.manage-units”?

Problem scenario
You are running a yum command or a script on a RHEL server.  You get this prompt:

==== AUTHENTICATING FOR org.freedesktop.systemd1.manage-units ===
Authentication is required to manage system services or units.
Authenticating as: Cloud User (ec2-user)
Password:

No password seems to work.  How do you get past this prompt?

Solution
Cancel out of the prompt. 
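
The prompt appears when a non-root user asks systemd to manage a unit without elevated privileges.  One common approach after canceling (the service name below is a placeholder):

```shell
# Press Ctrl+C to dismiss the polkit password prompt, then rerun the
# command with sudo so no interactive authentication is needed:
sudo systemctl restart some-service   # "some-service" is hypothetical
```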

 » Read more..