What is an Ingress Resource in Kubernetes?

Question
What is an Ingress resource in Kubernetes?

Answer
It is a Kubernetes resource for exposing multiple services through a single, externally reachable IP address (page 135 of Kubernetes in Action by Luksa). In TCP/IP networking generally, "ingress" refers to inbound traffic entering a network. Kubernetes supports other ways of mapping IP addresses to services (e.g., NodePort or LoadBalancer services). NodePort operates on layer 4 of the OSI seven-layer model (according to this posting), whereas an Ingress operates at the application layer (HTTP).
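As a sketch, a minimal Ingress manifest might look like the following; the resource name, host, and backing Service name and port are hypothetical placeholders, not values from a real cluster:

```yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: example-ingress          # hypothetical name
spec:
  rules:
  - host: app.example.com        # hypothetical host; requests for this host...
    http:
      paths:
      - path: /
        pathType: Prefix
        backend:
          service:
            name: example-service   # ...are routed to this (assumed) Service
            port:
              number: 80
```

An Ingress controller (such as the Nginx ingress controller) must be running in the cluster for a manifest like this to have any effect.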

How Do You Install Splunk in a Docker Container?

Problem scenario
You want to run Splunk from a Docker container. What do you do?

Solution
Prerequisites
Install Docker. If you need assistance, see this posting.

Procedures
1. Run this command: docker pull splunk/splunk:latest

2. Run this command, but replace "simpleword" with the password that you want the administrator account for the web UI to have:

docker run -d -p 8000:8000 -e 'SPLUNK_START_ARGS=--accept-license' -e 'SPLUNK_PASSWORD=simpleword' splunk/splunk:latest

3. Once the container is running, open a web browser to port 8000 of the Docker server (e.g., http://localhost:8000) and log in as "admin" with the password you chose in step #2.

What is a Deployment in Kubernetes?

Question
What is a deployment in Kubernetes?

Answer
A deployment is a resource that is designed for “deploying applications and updating them declaratively” (page 261 of Kubernetes in Action by Luksa). You may hear the phrase “Deployment controller.” This is a reference to a component of the Controller Manager in the Kubernetes Control Plane (page 262 of Kubernetes in Action by Luksa).
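As a hedged sketch, a minimal Deployment manifest might look like this; the resource name, labels, replica count, and container image are placeholders, not values from any particular environment:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-deployment    # hypothetical name
spec:
  replicas: 3                 # desired number of pod copies
  selector:
    matchLabels:
      app: example            # must match the pod template's labels below
  template:
    metadata:
      labels:
        app: example
    spec:
      containers:
      - name: web
        image: nginx:latest   # any container image would do here
```

Changing the image field in a manifest like this and re-applying it is what triggers the declarative, rolling update behavior the Deployment controller provides.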

What are Higher-level Resources and Constructs Compared to Lower-level Resources and Constructs?

Question
You have read about high-level (or higher-level) and low-level (or lower-level) resources and constructs in the context of Kubernetes. What does this designation mean?

Answer
From a high level to a low level, we may see a data center filled with racks, which hold physical servers. Similarly, from a high level to a low level, we may see Kubernetes nodes supporting pods that contain individual Docker containers.

What is a Service in Kubernetes?

Question
A Kubernetes cluster will have a pod running on a node. A ReplicationController will create a copy (or copies) of a pod to ensure it is available. What is a service in the context of a Kubernetes cluster?

Answer
“A Service in Kubernetes is an abstraction which defines a logical set of Pods and a policy by which to access them.” This quote was taken from https://kubernetes.io/docs/tutorials/kubernetes-basics/expose/expose-intro/

A service provides a single, static IP address through which client requests are routed to the pods behind it (page 48 of Kubernetes in Action by Luksa).
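As a sketch, a minimal Service manifest could look like the following; the Service name, label selector, and ports are assumptions for illustration:

```yaml
apiVersion: v1
kind: Service
metadata:
  name: example-service   # hypothetical name
spec:
  selector:
    app: example          # requests go to pods labeled app=example
  ports:
  - port: 80              # the Service's own stable port
    targetPort: 8080      # the (assumed) port the pods listen on
```

The selector is what makes the Service a "logical set of Pods": any pod carrying the matching label becomes an endpoint of the Service.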

What is the Difference between a Readiness Probe and a Liveness Probe in Kubernetes?

Question
What is a difference between a readiness probe and a liveness probe, besides how the corresponding fields (readinessProbe and livenessProbe) are used in defining a Pod in YAML?

Answer
At most, a failed liveness probe will result in the restart of a container. At most, a failed readiness probe will result in the removal of the pod from a service's endpoints (page 150 of Kubernetes in Action).
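The distinction can be seen in a single Pod manifest; this is a minimal sketch in which the pod name, image, and probe endpoints (/healthz and /ready) are hypothetical:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: probe-demo            # hypothetical name
spec:
  containers:
  - name: web
    image: nginx:latest       # placeholder image
    livenessProbe:            # failure here -> the container is restarted
      httpGet:
        path: /healthz        # assumed health endpoint
        port: 80
      initialDelaySeconds: 15
    readinessProbe:           # failure here -> pod removed from service endpoints
      httpGet:
        path: /ready          # assumed readiness endpoint
        port: 80
      periodSeconds: 5
```

In other words, the liveness probe governs the container's lifecycle, while the readiness probe only governs whether the pod receives traffic.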

How Do You Troubleshoot the kubectl Message “Error from server (NotAcceptable): the server was unable to respond with a content type that the client supports”?

Problem scenario
You run a kubectl command, but you see this message:
Error from server (NotAcceptable): the server was unable to respond with a content type that the client supports

What should you do?

Solution
1. Run this command: kubectl version

There should be a "Client Version" and a "Server Version", and the two probably differ. kubectl is supported within one minor version (older or newer) of the API server; install a kubectl release whose minor version is within one of the server's and rerun your command.

How Do You Set up Nginx as an HTTP Load Balancer for Other Instances of Nginx Running in Docker?

Problem scenario
You have many Docker containers running Nginx. You want users to go to one website and be routed automatically to the different underlying Nginx instances in the Docker containers. How do you create a single website for web clients to go to, with a reverse proxy balancing the load behind the scenes?

Solution
Overview
This example uses four Docker containers, each running a free version of Nginx.
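As a sketch, the load-balancing container's nginx.conf could declare the four backend containers in an upstream block; the container hostnames and port here are assumptions (they presume all five containers share a Docker network on which the names resolve):

```nginx
# Hypothetical /etc/nginx/nginx.conf for the front-end (reverse proxy) container
events {}
http {
    upstream nginx_backends {
        server nginx1:80;    # the four backend Nginx containers;
        server nginx2:80;    # these hostnames are placeholders that
        server nginx3:80;    # would resolve on a shared Docker network
        server nginx4:80;
    }
    server {
        listen 80;
        location / {
            proxy_pass http://nginx_backends;   # round-robin by default
        }
    }
}
```

With no other directives, Nginx distributes requests across the upstream servers in round-robin fashion.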

How Do You Set up Nginx as an HTTP Load Balancer So Client Requests (from Web Browsers) Go to Certain Nginx Servers More Frequently Than Others?

Problem scenario
You have some Nginx servers with ample resources and others with minimal resources. Based on geographic location and data center bandwidth and costs, you want to send larger fractions of the web traffic from client workstations (requests from web browsers) to some Nginx servers than to others. You do not want round-robin, equal distribution of traffic; you want customized HTTP load balancing that reflects the unequal configurations. How do you distribute this traffic proportionately, according to your desired specifications?
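One way to sketch this is with the weight parameter on the upstream servers; the hostnames below are placeholders, and the weights are arbitrary examples:

```nginx
# Hypothetical nginx.conf fragment on the load balancer
events {}
http {
    upstream weighted_backends {
        server bignginx.example.com weight=5;     # receives 5 of every 8 requests
        server mediumnginx.example.com weight=2;  # receives 2 of every 8 requests
        server smallnginx.example.com;            # weight defaults to 1
    }
    server {
        listen 80;
        location / {
            proxy_pass http://weighted_backends;
        }
    }
}
```

The weight values are relative: traffic is divided in proportion to each server's weight over the sum of all weights.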

How Do You Delete a Kubernetes Cluster in AWS?

Problem scenario
You try to manually delete the EC2 instances (both the worker nodes and the master node of Kubernetes), but this does not work: the EC2 instances are re-created (the Auto Scaling Groups that kops configured replace terminated instances). How do you delete the cluster?

Solution
Prerequisite
Install and configure the AWS CLI; if you need assistance, see this posting.

Procedures
Run these three commands, but replace "contint.k8s.local" with the name of your cluster:

export KOPS_CLUSTER_NAME=contint.k8s.local
export KOPS_STATE_STORE=s3://$KOPS_CLUSTER_NAME-state
kops delete cluster --name contint.k8s.local --yes …