What is Kubernetes?



Kubernetes (also known as k8s or “kube”) is an open source container orchestration platform that automates many of the manual processes involved in deploying, managing, and scaling containerized applications.

Kubernetes was originally developed and designed by engineers at Google, drawing on lessons from Google’s internal Borg system, and was donated to the Cloud Native Computing Foundation (CNCF) in 2015. Red Hat® was one of the first companies to work with Google on Kubernetes, even prior to launch, and has become the second-leading contributor to the Kubernetes upstream project.

A working Kubernetes deployment is called a cluster, which is a group of hosts running Linux® containers. You can visualize a Kubernetes cluster as two parts: the control plane and the compute machines, or nodes.

A diagram showing the infrastructure of a Kubernetes cluster

Each node is its own Linux environment, and could be either a physical or virtual machine. Each node runs pods, which are made up of containers. 
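To make that concrete, here is a minimal sketch of a pod manifest with a single container. The name, label, and image are placeholders chosen for illustration, not part of any specific deployment:

    apiVersion: v1
    kind: Pod
    metadata:
      name: hello-web            # hypothetical name used for illustration
      labels:
        app: hello-web           # label that other objects can use to find this pod
    spec:
      containers:
      - name: web
        image: nginx:1.25        # any container image available to your cluster
        ports:
        - containerPort: 80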

The control plane is responsible for maintaining the desired state of the cluster, such as which applications are running and which container images they use. Compute machines actually run the applications and workloads. The control plane takes commands from an administrator (or DevOps team) and relays those instructions to the compute machines.

This handoff relies on a multitude of services to automatically decide which node is best suited for each task. The scheduler allocates resources and assigns pods to a node that can fulfill the requested work, while Kubernetes services decouple work definitions from the pods themselves and route requests to the right pod, no matter where it moves in the cluster or whether it has been replaced.
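As a rough sketch of how that decoupling looks in practice, a Kubernetes service selects pods by label rather than by name or location, so requests keep reaching the workload even as individual pods are rescheduled or replaced. The names below reuse the illustrative placeholders from the pod example above:

    apiVersion: v1
    kind: Service
    metadata:
      name: hello-web            # hypothetical service name
    spec:
      selector:
        app: hello-web           # routes traffic to any pod carrying this label
      ports:
      - port: 80                 # port the service exposes inside the cluster
        targetPort: 80           # port the pod's container listens on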

Kubernetes runs on top of an operating system (Red Hat Enterprise Linux, for example) and interacts with pods of containers running on the nodes.

The desired state of a Kubernetes cluster defines which applications or other workloads should be running, along with which images they use, which resources should be made available to them, and other such configuration details.
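In practice, that desired state is expressed declaratively. The sketch below, assuming the same hypothetical hello-web application, shows a Deployment that asks for three replicas of a given image with modest resource requests; Kubernetes then works continuously to keep the cluster matching this description:

    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: hello-web
    spec:
      replicas: 3                # desired state: three copies should always be running
      selector:
        matchLabels:
          app: hello-web
      template:
        metadata:
          labels:
            app: hello-web
        spec:
          containers:
          - name: web
            image: nginx:1.25    # which image the workload should use
            resources:
              requests:
                cpu: 100m        # resources the scheduler should set aside
                memory: 128Mi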

There is little change to how you manage containers using this type of infrastructure. Your involvement just happens at a higher level, giving you better control without the need to micromanage each separate container or node. 

Where you run Kubernetes is up to you: bare metal servers, virtual machines (VMs), public cloud providers, private clouds, or hybrid cloud environments. One of Kubernetes’ key advantages is that it works on many different kinds of infrastructure.


Docker can be used as a container runtime that Kubernetes orchestrates (newer Kubernetes releases use Container Runtime Interface-compatible runtimes such as CRI-O or containerd, but the flow is the same). When Kubernetes schedules a pod to a node, the kubelet (the agent on each node that makes sure the expected containers are running) instructs Docker to launch the specified containers.

The kubelet then continuously collects the status of those containers from Docker and reports that information back to the control plane. Docker pulls the required container images onto that node and starts and stops the containers as instructed.
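As an illustration of what that reporting looks like, the kubelet records the state of each container in the pod’s status, which you can read back through the Kubernetes API. The values below are placeholders; the field names are standard pod status fields:

    # Trimmed view of a running pod's status as reported by the kubelet
    status:
      phase: Running
      containerStatuses:
      - name: web
        image: nginx:1.25
        ready: true              # container passed its readiness checks
        restartCount: 0
        state:
          running:
            startedAt: "2024-01-01T00:00:00Z"   # placeholder timestamp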

The difference when using Kubernetes with Docker is that an automated system asks Docker to do those things instead of the admin doing so manually on all nodes for all containers.


Kubernetes can help you deliver and manage containerized, legacy, and cloud-native apps, as well as those being refactored into microservices. 

In order to meet changing business needs, your development team needs to be able to rapidly build new applications and services. Cloud-native development starts with microservices in containers, which enables faster development and makes it easier to transform and optimize existing applications. 

Application development with Kubernetes

Production apps span multiple containers, and those containers must be deployed across multiple server hosts. Kubernetes gives you the orchestration and management capabilities required to deploy containers, at scale, for these workloads.

Kubernetes orchestration allows you to build application services that span multiple containers, schedule those containers across a cluster, scale those containers, and manage their health over time. With Kubernetes you can also take practical steps toward better IT security, such as enforcing consistent policies across every node in the cluster.
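Health management, for example, is typically configured with probes on each container. The sketch below, again using the hypothetical hello-web pod, adds a liveness probe (restart the container if it stops responding) and a readiness probe (only route traffic to it while it responds):

    apiVersion: v1
    kind: Pod
    metadata:
      name: hello-web
    spec:
      containers:
      - name: web
        image: nginx:1.25
        livenessProbe:           # kubelet restarts the container if this check fails
          httpGet:
            path: /
            port: 80
          initialDelaySeconds: 5
          periodSeconds: 10
        readinessProbe:          # traffic is routed only while this check passes
          httpGet:
            path: /
            port: 80
          periodSeconds: 5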

Kubernetes also needs to integrate with networking, storage, security, telemetry, and other services to provide a comprehensive container infrastructure.

A diagram explaining how Kubernetes works

Once you scale this to a production environment and multiple applications, it's clear that you need multiple, colocated containers working together to deliver the individual services. 

Linux containers give your microservice-based apps an ideal application deployment unit and self-contained execution environment. And microservices in containers make it easier to orchestrate services, including storage, networking, and security.

This significantly multiplies the number of containers in your environment, and as those containers accumulate, the complexity also grows.

Kubernetes fixes a lot of common problems with container proliferation by sorting containers together into "pods." Pods add a layer of abstraction to grouped containers, which helps you schedule workloads and provide necessary services—like networking and storage—to those containers. 
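A common illustration of that grouping is a pod with two cooperating containers that share a volume. In the sketch below (hypothetical names and images), one container periodically writes content that the other serves over HTTP; both see the same emptyDir scratch storage and share the pod’s network:

    apiVersion: v1
    kind: Pod
    metadata:
      name: web-with-sidecar     # hypothetical name for illustration
    spec:
      volumes:
      - name: shared-data
        emptyDir: {}             # scratch storage shared by both containers
      containers:
      - name: content-generator
        image: busybox:1.36
        command: ["sh", "-c", "while true; do date > /data/index.html; sleep 5; done"]
        volumeMounts:
        - name: shared-data
          mountPath: /data
      - name: web
        image: nginx:1.25
        volumeMounts:
        - name: shared-data
          mountPath: /usr/share/nginx/html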

Other parts of Kubernetes help you balance loads across these pods and ensure you have the right number of containers running to support your workloads.
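One of those parts is the Horizontal Pod Autoscaler, which adjusts the number of pod replicas to match demand. A minimal sketch, assuming the hypothetical hello-web Deployment shown earlier:

    apiVersion: autoscaling/v2
    kind: HorizontalPodAutoscaler
    metadata:
      name: hello-web
    spec:
      scaleTargetRef:            # the Deployment whose replica count is managed
        apiVersion: apps/v1
        kind: Deployment
        name: hello-web
      minReplicas: 3
      maxReplicas: 10
      metrics:
      - type: Resource
        resource:
          name: cpu
          target:
            type: Utilization
            averageUtilization: 70   # scale up when average CPU use exceeds 70%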

With the right implementation of Kubernetes, and with the help of other open source projects like Open vSwitch, OAuth, and SELinux, you can orchestrate all parts of your container infrastructure.

Kubernetes is an open source project, and as such there is no formal support structure around the technology itself, at least not one you’d trust your business to run on. If you had an issue with your implementation of Kubernetes while running in production, you’d likely be frustrated. And your customers would be, too.

Think of Kubernetes like a car engine. An engine can run on its own, but it becomes part of a functional car when it’s connected with a transmission, axles, and wheels. Just installing Kubernetes is not enough to have a production-grade platform. Kubernetes needs additional components to become fully functional. You’ll need to add authentication, networking, security, monitoring, logs management, and other tools. That’s where Red Hat OpenShift comes in—it’s the complete car. 

Red Hat OpenShift is Kubernetes for the enterprise. It includes all the extra pieces of technology that make Kubernetes powerful and viable for the enterprise, including registry, networking, telemetry, security, automation, and services. Red Hat OpenShift includes Kubernetes as a central component of the platform and is a CNCF-certified Kubernetes offering.

With Red Hat OpenShift Container Platform, your developers can make new containerized apps, host them, and deploy them in the cloud with the scalability, control, and orchestration that can turn a good idea into new business quickly and easily. If you’re looking to deploy or move your Kubernetes workloads to a managed cloud service, OpenShift is also available as a cloud-native Kubernetes platform on Amazon Web Services (AWS), Microsoft Azure, Google Cloud, IBM Cloud, and other providers. 

Building on a foundation of OpenShift, you can use Red Hat Advanced Cluster Management and Red Hat Ansible® Automation Platform together to efficiently deploy and manage multiple Kubernetes clusters across regions, including public cloud, on-premises, and edge environments.


Use case: Building a cloud platform to offer innovative banking services

Emirates NBD, one of the largest banks in the United Arab Emirates (UAE), needed a scalable, resilient foundation for digital innovation. The bank struggled with slow provisioning and a complex IT environment. Setting up a server took 2 months, while making changes to large, monolithic applications took more than 6 months.

Using Red Hat OpenShift Container Platform for container orchestration, integration, and management, the bank created Sahab, the first private cloud run at scale by a bank in the Middle East. Sahab provides applications, systems, and other resources for end-to-end development, from provisioning to production, through an as-a-Service model. 

With its new platform, Emirates NBD improved collaboration between internal teams and with partners using application programming interfaces (APIs) and microservices. And by adopting agile and DevOps development practices, the bank reduced app launch and update cycles.
