CONTAINERS – Motivations and Challenges – Why is it so damn popular?

In our previous posts, we discussed what a container is and how it works. Now, let us explore why it is becoming so popular and attracting attention from so many quarters. As per our analysis, the main reason is the perceived benefits and wide applicability of containers. A VM's role is limited to IT operations and infrastructure, but a container's role is far more widespread, spanning development, operations and infrastructure alike. The following explanations make this clear:

Developers love containers because they make it easy to move an application and its dependencies across environments, freeing developers from setting up and configuring everything repeatedly. This saves both set-up time and the troubleshooting time lost to environment mismatches. Once an image is created, it can be quickly moved from a developer's machine to a test environment, from staging to production, and from a machine in a data center to a virtual machine in a private or public cloud. Best of all, a container image can be stored, versioned and easily shipped, making it simple to recreate a running environment quickly.
Since a container packages an application together with the platform it needs to run, without a full underlying OS, its small footprint and quick provisioning make it hugely popular in DevOps and CI/CD environments for achieving faster release cycles.
A recent survey (conducted by O’Reilly Media in collaboration with Ruxit) reveals that container technologies, especially Docker, are being adopted rapidly to make application deployment easier, faster, friendlier and more flexible for developers. The major use of containers will be to enable developers to work with more agility, especially when they are building applications that need to be frequently updated and require predictability and consistency throughout the application life cycle.

Infrastructure architects see great potential in container technology because it addresses many challenges inherent in VM-based virtualization. Containers do much of what a virtual machine can do, with far less resource overhead. A container need not be much larger than the application it contains, whereas a VM may need a gigabyte of memory just to boot, and that gigabyte stays allocated whether it is used or not, at a cost you continue to bear.
Additionally, container-based deployment has the potential to save a data center or cloud provider tens of millions of dollars annually in power and hardware costs. Does companies' eagerness to adopt this technology as early as possible make sense now?

Application/solution architects, too, are excited about using containers to achieve scale, improve performance and realize microservices-based architectures. Containers' efficiency can also be leveraged in data analytics, high-performance computing (HPC) and other highly parallel, short-lived workloads.

The growing popularity of containers is also accompanied by some hype. To separate hype from reality, we should ask ourselves the following questions:

  • What is a container suited for, and what is it trying to replace?
  • Does it provide all the required tools and support?
  • What’s the market view about it?

Every technology innovation solves some core problem of IT or business, and these technologies mature and become robust over time based on real-world usage. The same is true of containers. Since container technology is seen as an alternative to VM-based virtualization, its maturity is being measured against the solutions available in the VM space.
Usually, for any new technology, the initial discussions focus only on the potential benefits it can bring to a business. Robustness comes into the picture only when enterprises start thinking seriously about adopting it, and that is exactly what is happening with container technology. Though both virtual machines (VMs) and containers are forms of virtualization that allow multiple applications to share the same hardware, their technical characteristics are very different.
Containers are undoubtedly a more efficient and economical way to do most of what you can do with VMs and hypervisors. They are seen as the next generation of virtualization but, as of today, the technology is still in its infancy. Large-scale enterprise deployments of containers face challenges because proven solutions for administrative control and management are lacking. In contrast, VM-based workloads are well supported by robust management solutions.
Organizations that have already adopted containers have discovered that deploying the dozens, hundreds or thousands of containers that make up an application requires advances in both management and orchestration to track and manage them all, a requirement the available tools do not fully support. There is also a lack of certification and digital structures to meet compliance needs.

A survey conducted in 2015 by O'Reilly Media in collaboration with Ruxit highlighted various challenges faced by enterprises running container-based workloads (please refer to Fig 1 below).

Fig 1: Challenges faced by enterprises running container-based workloads – Technology maturity & unavailability of tools top the list

From the survey results, it is clear that technology maturity is the biggest challenge (it is difficult to keep track of all the container-related tools and projects released recently). The unavailability of proven tools for container orchestration at large scale, and the lack of proper management and automation tools that integrate into the existing enterprise ecosystem, are other major hurdles. Research by Forrester has also highlighted further concerns such as security and variable performance. These are the areas where strong solutions are needed to support large-scale enterprise deployments of containers.

To conclude on maturity, core container technology is getting more robust and feature-rich by the day and is closing in on what a VM can provide. Linux-based containers are ahead in the race: except for a few features such as file mounts and loading kernel modules for hardware encryption, they provide roughly 80% of a VM's functionality.

But apart from the core engine, a complete ecosystem of tools for managing and orchestrating containers is needed to propel them into the mainstream. The future looks promising: Docker is moving in that direction and will play a key role in providing the supporting tools needed to manage containers properly. Container technology is also backed by big players like IBM, Google, Amazon and Microsoft, and an open standard for container format and runtime is being formulated under the OCI (Open Container Initiative).

Our next post will talk about starting your Container journey, the Hexaware way.

Posted by Kailash Oza
September 14th, 2016