

Docker Containers – What Business Leaders Need to Know

Ranjan Dailata
Solution Architect at Aezion

Ranjan Dailata is an Aezion Solution Architect and Aezion Labs Lead.

Docker Containers have become an essential element of modern, high-performance IT operations practices, particularly in the cloud computing era. This article explains what containers are and why they are important to your business, whether you are responsible for managing a single server or running IT operations at scale.

Docker Containers Background

IT operations are responsible for managing and maintaining the efficient and reliable computing infrastructure that supports the range of computing tasks performed by a business. These tasks are facilitated through enterprise resource planning applications that support Human Resources, Finance, Customer Relationship Management, Project Management, Operations Management and Workflow, Logistics, Reporting and Analytics, and more. While these applications differ in function, they all share a common dependence on efficient, reliable, and responsive computing resources. These resources include the operating system, processor, RAM, storage, and networking elements. Historically, these individual elements were organized and managed as physical server units, and later as virtual machines with the advent of virtualization technology.

Virtual Machines improved overall computing resource and IT operations efficiency by increasing the sharing of physical hosts and of host files and libraries. This reduction in physical servers and increased utilization of host files and libraries led to reductions in Capital and Operations Expenditure, and improvements in Developer and Customer Experience.

Containers extend the efficiency trajectory of Virtual Machines by allowing applications to run in a dramatically simplified and lightweight environment compared to physical servers and virtual machines. Containers disassociate dedicated application dependencies from shareable OS elements. These shareable elements are abstracted and packaged as single-instance, shareable resources, which further improves resource utilization.

Containers and Docker

Containers were introduced as an extension of the Linux Operating System in 2001. They are an evolution and formalization of namespace isolation and resource governance techniques found in other Unix-family operating systems, such as Unix chroot, BSD Jails, and Solaris Zones. The Docker Containers specification presented a common packaging model, toolset, and deployment model that dramatically simplified containerization and application deployment on Linux hosts. The specification was realized as Docker images, which package an application's files and libraries while sharing the host operating system. This evolution led to further improvement in computing resource utilization – maximizing resource sharing by eliminating VM-related overhead – and significant improvement in IT operations and applications management. The result is further improvement in Capital Expenditure, Operations Expenditure, and Customer Experience.
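
To make the packaging and deployment model more concrete, here is a minimal sketch using the Docker SDK for Python (the docker package), which wraps the same Docker API that the command-line client uses. The image tag example-app:latest and the port mapping are placeholders chosen for illustration; the sketch assumes a Dockerfile already exists in the current directory, a Docker engine is running locally, and a recent version of the SDK is installed.

import docker

# Connect to the local Docker engine using environment defaults.
client = docker.from_env()

# Build an image from the Dockerfile in the current directory.
# "example-app:latest" is a placeholder tag for this illustration.
# Recent SDK versions return the image plus a build log stream.
image, build_logs = client.images.build(path=".", tag="example-app:latest")

# Run the image as a container. detach=True returns immediately with a
# Container handle; the port mapping exposes container port 8080 on the host.
container = client.containers.run(
    "example-app:latest",
    detach=True,
    ports={"8080/tcp": 8080},
)

print(container.name, container.status)

# Tear down when finished.
container.stop()
container.remove()

The workflow itself (build an image once, then run it anywhere a Docker engine is available) is what the common packaging model standardizes across hosts.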

The benefits of Docker Containers were introduced to Windows hosting environments with Windows Server 2016. To support this initiative, Microsoft established a partnership with Docker to extend the Docker API and toolset to support containers operating on Windows Server hosts. The Microsoft extensions permit the same Docker client to manage both Linux and Windows Server containers – extending Docker's utility to Windows Server while preserving the DevOps efficiencies and user experience made possible by Docker. This initiative by Microsoft created a true win-win scenario for all parties.

Why Docker Containers are Important

Docker Containers are important for small and large IT operations. To understand this, let’s review the DevOps benefits of Docker-based containers:

1. Application performance improvements. This is enabled by sharing a single Operating System kernel across multiple containers. The result is more efficient and granular application packaging, which enables fast container startup because the startup package is smaller and OS components are excluded from the container startup process.

2. Faster provisioning. Containers are dramatically faster to provision because they are significantly lighter-weight to build and define than Virtual Machine images, and they are provisioned via software on pre-provisioned infrastructure.

3. Efficient resource utilization. Containers are also more efficient at resource utilization than Virtual Machines with their siloed OSs and OS-based resources.

4. Simple high availability. Containers can run on different underlying hardware, so if one host goes down, traffic can be re-routed from the Edge to live application containers running elsewhere.

5. Smooth scaling. Containers enable smooth scaling without downtime or architectural changes (see the sketch after this list). Scaling is difficult with VM-centric hosting, which requires reboots, and often rearchitecting, to resize.

6. Configuration consistency. Every container can be exactly the same. The hosting platform is a large, resource-sharing matrix: containers are provisioned automatically on identical infrastructure managed via consistent, automated tools, which minimizes server sync issues.
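
As a small illustration of the scaling and consistency points above, the sketch below uses the Docker SDK for Python to start three identical containers from the same image and then remove one. The image name example-app:latest is again a placeholder, and a running local Docker engine with that image already built is assumed.

import docker

client = docker.from_env()

# Start three identical replicas from the same image ("example-app:latest" is
# a placeholder). Each replica gets the same files, libraries, and
# configuration, so there is nothing to drift out of sync between instances.
replicas = [
    client.containers.run("example-app:latest", detach=True, name=f"example-app-{i}")
    for i in range(3)
]

# Scaling down is just as direct: stop and remove one replica, with no host
# reboot and no change to the remaining containers.
replicas[-1].stop()
replicas[-1].remove()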

These are direct benefits if you are responsible for managing a large IT operation: you and your DevOps team can experience them in your day-to-day operations. However, these benefits also apply if you are responsible for administering a single server or even a single website. This is because best-of-breed hosting providers such as Azure or AWS (a) have platform economics that produce lower costs for comparable small-to-large scale server deployments, and (b) have largely adopted containers, so by utilizing one of them you indirectly experience these benefits.

View the previous post in this series: Machine Learning – What Business Leaders Need to Know
