Goal: You want to issue a command (e.g., an export command) that has a password in it. But you do not want anyone looking over your shoulder to see it. You do not want the password to be visible in the history of the shell.
Solution: Type this 'read -s foobar'
Then press enter. The cursor will go to the next line. Type in the password. The characters will not echo to the screen. The cursor will not advance. Press enter. Now you can use the variable like this: $foobar
For example, you can use export special_setting=connectionString@format:jdoe:$foobar
You will now have exported a setting in which "$foobar" was interpreted as the password you entered. The content of the variable will never be displayed. It is important to remember that the variable name in the first "read -s foobar" statement has no dollar sign ("$"), while references to the variable, for evaluation, use a dollar sign ("$"). This is a convention in bash scripts as well as interactive bash commands.
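Putting it together, a minimal sketch of the whole sequence looks like this (foobar and special_setting are just the example names used above):
read -s foobar # the variable name has no "$" here; type the password, which will not echo
export special_setting=connectionString@format:jdoe:$foobar # reference the variable with "$"
unset foobar # optional: clear the variable once it is no longer needed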
How to Troubleshoot PowerShell Downloading a File From the Internet
Problem scenario: When trying to use PowerShell to download a file, you get this error: 'Error calling "DownloadFile" with "2" argument(s): "An Exception occurred during a WebClient request."'
Solution: Make sure you have ample space on the disk and make sure the destination folder exists. A typo in the path or a destination folder that does not exist could cause this error. Insufficient disk space could also cause the problem. Is the file you are trying to download very big?
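For example, a quick pre-check along these lines can rule out the common causes (a sketch; the URL, destination folder, and drive letter are placeholders):
Test-Path "C:\downloads" # should return True if the destination folder exists
(Get-PSDrive C).Free / 1GB # free space on the C: drive in gigabytes
(New-Object System.Net.WebClient).DownloadFile("http://www.example.com/file.zip", "C:\downloads\file.zip") # retry the download once the folder and space check out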
How to Install Hadoop on an AWS Instance of RedHat Linux or an Azure Instance of CentOS
Updated 1/5/18
THESE DIRECTIONS ARE OUTDATED. They are here as a reference for legacy purposes only. For directions on how to install Hadoop on a RedHat or CentOS server, see this article.
Problem scenario
You want to install an open source version of Hadoop on a RedHat derivative distribution of Linux in a public cloud. How do you do this?
Solution
These directions will allow you to install Hadoop on a RedHat derivative (e.g., CentOS or RHEL) in a public cloud. These directions are for a single-node deployment. (For a multi-node deployment, see these directions.) They include a script, directions for how to run the script, and other necessary commands. These directions have been tested to work on RedHat 7.4 in AWS and CentOS 6.8 in Azure. The script was designed to install the open source version of Hadoop 2.8.1.
1. Log into CentOS or RHEL.
2. Run these interactively as they are not easily or safely scripted:
sudo adduser hadoop
sudo passwd hadoop #respond with the password of your choice
ssh-keygen -t rsa -P "" # press enter to the prompt
sudo cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
sudo chmod 0600 ~/.ssh/authorized_keys
ssh 127.0.0.1 # respond with 'y' to accept the fingerprint
cd /tmp
3. Create a script with the content below from "#!/bin/bash" to "Proceed with the manual steps". Beware that this script could overwrite some files. This script was intended for a new OS with no data or special configuration on it.
#!/bin/bash
# Written by continualintegration.com
yum install -y java-1.7.0-openjdk-devel wget
cd /home/hadoop
wget http://apache.claz.org/hadoop/common/hadoop-2.8.1/hadoop-2.8.1.tar.gz
tar xzf hadoop-2.8.1.tar.gz
mv hadoop-2.8.1 hadoop
echo '
export HADOOP_HOME=/home/hadoop/hadoop
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin' >> ~/.bashrc
source ~/.bashrc
sed -i '/export JAVA_HOME/c\export JAVA_HOME=/usr/.' /home/hadoop/hadoop/etc/hadoop/hadoop-env.sh
echo '<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. See accompanying LICENSE file.
-->
<!-- Put site-specific property overrides in this file. -->
<configuration>
<property>
<name>hadoop.tmp.dir</name>
<value>/app/hadoop/tmp</value>
<description>A base for other temporary directories.</description>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:54310</value>
<description>The name of the default file system. A URI whose
scheme and authority determine the FileSystem implementation. The
uri has a scheme that determines the config property (fs.SCHEME.impl) naming
the FileSystem implementation class. The uri has authority that is used to
determine the host, port, etc. for a filesystem.</description>
</property>
</configuration>
' > /home/hadoop/hadoop/etc/hadoop/core-site.xml
echo "Proceed with the manual steps"
4. sudo bash nameOfscript.sh # You must run the script above as a sudoer or the root user.
5. To get all the hdfs commands to work, one way is to allow root to SSH to the local server and run "sudo bash" to kick off scripts that start the Hadoop daemons. There are other ways, but here is how to do it with a reliance on the sudo user:
sudo su -
vi /etc/ssh/sshd_config # To allow the root user to log in (or use the sed one-liner shown after this list of commands)
#For CentOS with Azure, change "PermitRootLogin no" to "PermitRootLogin yes"
#For RHEL with AWS, change "#PermitRootLogin yes" to "PermitRootLogin yes"
service sshd stop
service sshd start
ssh-keygen -t rsa -P "" #press enter to the default prompt.
If you are using RedHat in AWS, use this command:
cat /home/ec2-user/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys
If you are using CentOS in Azure, use this command (but substitute centos with the username):
cat /home/centos/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys
Run these regardless of what OS or public cloud provider you are using:
cat /root/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys
ssh root@localhost
ssh root@localhost #do it again
exit # the second SSH session
exit # the first SSH session
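If you prefer not to edit sshd_config in vi, a sed one-liner like the following should accomplish the same change (a sketch; check the existing PermitRootLogin line in your /etc/ssh/sshd_config first):
sed -i 's/^#\?PermitRootLogin.*/PermitRootLogin yes/' /etc/ssh/sshd_config # enable root logins over SSH, then restart sshd as shown above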
6. Complete the configuration and start the NameNode service while logged in as root:
hdfs namenode -format
bash /home/hadoop/hadoop/sbin/start-dfs.sh
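To verify that the daemons started, a quick check (assuming the jps utility from the OpenJDK development package is on your path) is:
jps # should list NameNode, DataNode, and SecondaryNameNode processes
/home/hadoop/hadoop/bin/hdfs dfs -ls / # should return without an error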
THESE DIRECTIONS ARE OUTDATED. They are here as a reference for legacy purposes only. For directions on how to install Hadoop on a RedHat or CentOS server, see this article.
How to Install Suricata on an AWS Instance of SUSE Linux
These "directions" to install Suricata on SUSE Linux include a script and how to run it. The script was designed to install Suricata 3.1dev on an AWS instance of SUSE 12 SP1. This script was based on these here. This script requires that your AWS SUSE Linux server is in a security group that has access to the Internet. The script takes approximately five minutes to run. But bandwidth and resources on your instance may vary.
Step #1 Log in and become root.
Step #2 Run this command: vi /tmp/installer.sh
Step #3 Input this script starting with the "#!/bin/bash" line below. The final line is "&& ldconfig"
#!/bin/bash
#Written by continualintegration.com
zypper -n install gcc
zypper -n install zlib-devel
zypper -n install libtool make
zypper -n install libpcre1
zypper -n install autoconf
zypper -n install automake
zypper -n install gcc-c++
zypper -n install pcre-devel
zypper -n install libz1
zypper -n install file-devel
zypper -n install libnet1
zypper -n install libpcap1
zypper -n install libpcap-devel
zypper -n install libnet-devel
zypper -n install libyaml-devel
zypper -n install libyaml-0-2
zypper -n install git-core
zypper -n install wget
zypper -n install libcap-ng0
zypper -n install libcap-ng-devel
zypper -n install libmagic1
zypper -n install file-magic
zypper -n install mozilla-nss
zypper -n install mozilla-nss-devel
zypper -n install mozilla-nspr
zypper -n install mozilla-nspr-devel
zypper -n install mozilla-nss-tools
zypper -n install GeoIP
zypper -n install libGeoIP-devel
git clone git://phalanx.openinfosecfoundation.org/oisf.git \
&& cd oisf/\
&& git clone https://github.com/ironbee/libhtp.git -b 0.5.x \
&& ./autogen.sh \
&& ./configure --prefix=/usr/ --sysconfdir=/etc/ --localstatedir=/var/ \
--disable-gccmarch-native --enable-gccprotect \
--enable-geoip \
--with-libnss-libraries=/usr/lib64 \
--with-libnss-includes=/usr/include/nss3 \
&& make clean && make && make install-full \
&& ldconfig
Step #4 Run the script with this command: sudo bash /tmp/installer.sh
Step #5 Run this command to confirm Suricata is installed: suricata -V
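Optionally, you can do a quick smoke test (a sketch; the interface name eth0 is an assumption and may differ on your instance, and the configuration file path assumes the --sysconfdir=/etc/ setting used in the script):
suricata -c /etc/suricata/suricata.yaml -i eth0 # run Suricata against the named interface; stop it with Ctrl+C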
How To Get Hadoop Installed on Ubuntu When There Is a Java Error
Problem scenario: On Ubuntu, after installing Hadoop, when you enter the command '/usr/local/hadoop/bin/hadoop namenode -format' you get this error:
"Error: JAVA_HOME is not set and could not be found."
Solution: Verify Java is installed ('java -version'). If it is not installed, you can use 'apt-get install -y default-jre'.
Next, go to hadoop-env.sh. Find the export JAVA_HOME stanza. Change the '${JAVA_HOME}' value to '/usr' with no quotes.
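If you prefer a one-liner, a sed command similar to the one used in the RedHat directions above should work (a sketch; the path to hadoop-env.sh is an assumption based on the /usr/local/hadoop location in the problem scenario):
sed -i '/export JAVA_HOME/c\export JAVA_HOME=/usr' /usr/local/hadoop/etc/hadoop/hadoop-env.sh # point JAVA_HOME at /usr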

How Do You Get All the Security Updates and OS Updates for Your Ubuntu Server to Be Applied?
Problem scenario
When you log into Ubuntu, you see a message like this:
8 packages can be updated.
5 updates are security updates.
You run this: sudo apt-get -y update
This completes. But the next time you log in, you see the same message that 8 packages can be updated. What do you do?
Solution
Run this command: sudo apt-get -y upgrade
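Putting the two commands together, a typical sequence (a sketch, assuming you want all pending updates applied non-interactively) is:
sudo apt-get update # refresh the package lists
sudo apt-get -y upgrade # apply the pending package and security updates
sudo apt-get -y dist-upgrade # optional: also apply upgrades that add or remove dependencies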
If you are looking to deploy Hadoop to Ubuntu, see this article.
How to Install Suricata on an AWS Instance of Ubuntu Linux
Updated on 11/28/17
Problem scenario
You want to install Suricata on Ubuntu Linux. How do you do this?
Solution
These directions to install Suricata include a script and how to run it. The optional script in step #2.c was based on the directions here. The optional script in step #2.c requires that your AWS Ubuntu Linux server is in a security group that has access to the Internet. The optional script takes approximately five minutes to run, but that will vary with bandwidth and the resources on your instance. It is advisable to use step #2.a and skip steps #2.b, #2.c, and #3.
Step #1 Log in.
Step #2.a Run this command: sudo apt-get -y update; sudo apt-get -y install suricata
Now go to step #4. If for some reason you do not want to install it from the AWS .deb packages, skip step #2.a and go to step #2.b.
Step #2.b If you did not do step #2.a, run this command: vi /tmp/installer.sh
Step #2.c If you did not do step #2.a, do this step. Copy the content below into /tmp/installer.sh. The script starts with the "#!/bin/bash" line. The final line of the script is "ldconfig".
#!/bin/bash
apt-get install -y libpcre3
apt-get install -y libpcre3-dbg
apt-get install -y libpcre3-dev
sleep 5
apt-get install -y build-essential
apt-get install -y autoconf
apt-get install -y automake
apt-get install -y libtool
apt-get install -y libpcap-dev
apt-get install -y libnet1-dev
apt-get install -y libyaml-0-2
apt-get install -y libyaml-dev
apt-get install -y zlib1g
apt-get install -y zlib1g-dev
apt-get install -y libcap-ng-dev
apt-get install -y libcap-ng0
apt-get install -y make
apt-get install -y libmagic-dev
apt-get update
sleep 5
apt-get install -y libpcre3
apt-get install -y libpcre3-dbg
apt-get install -y libpcre3-dev
apt-get install -y build-essential
apt-get install -y autoconf
apt-get install -y automake
apt-get install -y libtool
apt-get install -y libpcap-dev
apt-get install -y libnet1-dev
apt-get install -y libyaml-0-2
apt-get install -y libyaml-dev
apt-get install -y zlib1g
apt-get install -y zlib1g-dev
apt-get install -y libcap-ng-dev
apt-get install -y libcap-ng0
apt-get install -y make
apt-get install -y libmagic-dev
apt-get update
cd /bin
mkdir suricata
cd suricata
wget http://pkgconfig.freedesktop.org/releases/pkg-config-0.28.tar.gz
tar -xvzf pkg-config-0.28.tar.gz
cd pkg-config-0.28
./configure --with-internal-glib
# to not downgrade glib (ldd --version shows 2.19)
make
make install
cd /bin/suricata
wget http://www.openinfosecfoundation.org/download/suricata-1.4.6.tar.gz
tar -xvzf suricata-1.4.6.tar.gz
cd suricata-1.4.6
./configure --prefix=/usr --sysconfdir=/etc --localstatedir=/var
make install-full
ldconfig
Step #3 Run the script with this command: sudo bash /tmp/installer.sh
Step #4 Run this command to confirm Suricata is installed: suricata -V
How to Use Ant (when You Receive an Error about build.xml Not Existing)
Problem Scenario: You want to use Ant to invoke your customized build.xml file. But when you run it, you get 'Target "build.xml" does not exist in the project ...'
Solution: Call ant with no arguments. Do not use "ant build.xml". Use "ant" by itself.
Miscellaneous: Most CentOS/RedHat repositories will have Ant available. As root, the command "yum install ant" should get it installed. You will have to respond "y" to some of the prompts. Here is a sample build.xml file that works with Ant.
<?xml version="1.0"?>
<project name="Continual Integration Project" default="info">
<target name ="info">
<echo>You correctly invoked Ant!</echo>
</target>
</project>
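A minimal way to try the sample above (assuming build.xml is in your current directory) is shown below; the -f flag is how Ant is pointed at a build file with a different name, and mybuild.xml is just a placeholder.
yum install -y ant # on CentOS/RedHat, as root
ant # runs the default "info" target of build.xml in the current directory
ant -f mybuild.xml # only needed if your build file is not named build.xml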
What Are the Minimum Packages Necessary for Kerberos on CentOS?
Question 1: What are the packages that must be installed on a CentOS server to have it serve as the Kerberos server?
Answer 1: These three packages: krb5-server krb5-libs krb5-auth-dialog
Question 2: What are the packages that must be installed on a CentOS server to have it serve as the Kerberos client?
Answer 2: These three packages: krb5-workstation krb5-libs krb5-auth-dialog
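For example, the corresponding install commands on CentOS (run as root, answering "y" to any prompts) would look like this:
yum install krb5-server krb5-libs krb5-auth-dialog # on the Kerberos server
yum install krb5-workstation krb5-libs krb5-auth-dialog # on each Kerberos client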
How Can Jenkins Invoke PsExec?
Problem scenario: When using Jenkins, PsExec never seems to work when called as a batch command in a .bat file or PowerShell script. Jenkins logs an error about the PsExec command not being found.
Solution / workaround: PsExec seems to never work if called by a PowerShell script or a batch script via Jenkins. The only exception seems to be if a Scheduled Task calls a .bat file or PowerShell script that uses PsExec. First, create a Scheduled Task that calls a .bat file or PowerShell script that invokes PsExec. Second, create a Jenkins job that has a Build Step of a Windows Batch command or Windows PowerShell command (either will suffice). To have the PowerShell or batch command invoke the Scheduled Task, use a command like this:
SCHTASKS /RUN /TN foobar
where "foobar" is the name of the Scheduled Task that you want to run. Now the Jenkins job will call the Scheduled Task which will in turn invoke PsExec.
Alternatively, you may want to go to Services and change the user for the Jenkins service. The service will need to be restarted after you enter the credentials. Finally, PowerShell scripts are known to potentially act differently when they are executed by a local user versus from a remote server. Scheduled Tasks can be a workaround when the remote execution of a PowerShell script fails but the script works when it is run locally. Don Jones has experienced this problem. This problem seems to happen mostly with very complex or involved PowerShell scripts (e.g., those that install big applications).
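For reference, creating such a Scheduled Task from the command line might look like this (a sketch; the task name foobar matches the example above, and the path to the script that invokes PsExec is a placeholder):
SCHTASKS /CREATE /TN foobar /TR "powershell.exe -ExecutionPolicy Bypass -File C:\scripts\run-psexec.ps1" /SC ONCE /ST 00:00
SCHTASKS /RUN /TN foobar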