
Docker - Build, Ship, Run - is the ultimate solution for developers who need to get all their apps running seamlessly on the same servers. Moving a software application through its development cycles can be a challenge, and the challenges become serious when you move into production. Issues come in different forms and sizes: tracking dependencies, updating only those components that need to change, scaling the application and so on.

The buzz around Docker has been building for about two years, and it has now evolved into a roar. This is because more and more companies have begun to use Docker.

What is Docker?

Docker is an open source tool that makes it easier to deploy apps with the help of containers. Applications are packaged in containers and then shipped to whichever platform they are meant to run on. Containers make it easier for a developer to bundle an application with all its necessary dependencies and ship it out as a single easy-to-use package.

Docker containerization is a method by which an application is broken down into easily manageable, functional and individually packaged pieces, complete with all their dependencies. Scaling the application and updating each of its components thus becomes easy.

The biggest advantage of Docker is the assurance it gives the developer that the application will run smoothly on any Linux machine, even if that machine has customized settings that differ from the machine on which the code was originally written and deployed.

Another advantage of Docker is that applications can use the Linux kernel that already exists on the host computer. This improves application performance while reducing image size considerably, and it gives you cloud-like flexibility: the application works on any infrastructure that can run containers. The result is a simple, standardized interface for developers and system administrators alike.
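
You can see this kernel sharing directly: a container reports the same kernel release as its host. A quick sketch (the image tag is just an example):

# kernel release on the host
uname -r

# kernel release inside a container - same kernel, no guest OS
sudo docker run ubuntu:14.04 uname -r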

Docker supports both Linux and Microsoft Windows. The big shift happened when Docker announced a commercial partnership with Microsoft in September 2016. With this major development, developers can use the Commercially Supported Docker Engine (CS Docker Engine) and Docker Datacenter to create hybrid Windows applications.

Docker and Microsoft signed an agreement under which CS Docker Engine is made available to Windows Server 2016 free of cost. This not only helps modernize application environments, but makes it quicker and easier to build, ship and run distributed applications.

With the aid of Docker Datacenter, developers can create a heterogeneous environment for both Windows and Linux workloads. Under the agreement, Docker Datacenter provides an integrated platform on which IT professionals can apply all the necessary policies to their applications without compromising agility or portability.

With the integration of Docker Datacenter and Windows Server Containers, IT professionals can modernize their application strategy, ship up to 7 times faster, and give the operations team the flexibility to migrate to the cloud seamlessly and quickly.

Containerizing legacy applications in Windows Server Containers using Docker's capabilities makes it easier for everyone in the team - developers, testers and the deployment team - to operate in a full DevOps environment.

Containers vs. Virtual Machines

At first glance, you may feel that containers and virtual machines have the same features and functionality, because both provide isolated environments for running an app. But look deeper and you can see that they are quite different from each other.

The underlying framework of each is different. Think of virtual machines as self-contained houses and containers as apartments, and you will instantly see how they differ. A self-contained house does not depend on anyone else for basic necessities like plumbing, heating and electricity.

That is not the case with apartments, which rely on shared facilities. In the same way, Docker containers share the underlying resources of the Docker host. Developers build a Docker image, take the basics from that image, and add whatever else they need to run their application.

A VM, however, doesn't work that way. It is a self-contained, fully functional operating system, although developers can strip out components depending on the application they are building.

At the most basic level, VMs can themselves act as Docker hosts; Docker doesn't care where you run its containers. Docker container-based services can interact with VM-based services seamlessly, provided you set up the right networking between them.

Why Should You Use Docker - Some Benefits

1. Removes Environmental Inconsistencies

All your applications enjoy the same environment. The developer no longer has to complain that the app works locally but breaks when moved to other servers or live sites. Any improvements or changes made in one environment carry over to the others, because the build process takes place only once.

2. Helps in Sharing and Distributing Content

Docker is integrated with a software sharing and distribution mechanism that allows for sharing and using container content. This eases the tasks of both the developer and the operations team. This adds to the portability across machines, making Docker extremely popular among developers.

3. Lightweight in Structure

Docker is lightweight because its containers are lightweight. Since containers run without the extra weight of a hypervisor, you can run more of them on any given hardware combination.

The lightweight nature of Docker makes it easy to manage whatever dynamic workloads come your way. It is fast, viable and decidedly economical compared to hypervisor-based virtual machines.

The reason Docker is so lightweight is that Docker images are made of read-only templates, with each image built up as a series of layers. These layers are combined with the help of union file systems to form a single image.
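
You can inspect these layers yourself with Docker's history command. A minimal sketch (the image name is just an example):

# list the read-only layers that make up an image, newest first
sudo docker history ubuntu:14.04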

4. Proper Version Control and Component Reuse

If needed, you can roll back to a previous version of a Docker image, thanks to Docker's layered image file system. Docker itself is not a version control system, but it does track the whole virtual chroot, binaries and so on.

Also, you can reuse all the components from the preceding layers of the container.
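
In practice, rolling back means tagging images as you build them and re-running an older tag. A hedged sketch (the image name, tags and container name are illustrative):

# tag each build so older versions remain available
sudo docker build -t myapp:1.1 .

# if 1.1 misbehaves, stop it and run the previous tag again
sudo docker stop myapp && sudo docker rm myapp
sudo docker run -d --name myapp myapp:1.0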

5. Provides Isolation of Resources

Isolation of resources adds to the security of running containers on a given host. Namespaces are the technology Docker uses to provide isolated workspaces (called containers). Each time you run a container, a set of namespaces is created for it, adding a layer of isolation that makes each container a separate but secure application whose access is limited to its own namespaces. This resource isolation also makes it easy to identify issues and fix them.
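
You can observe this isolation from inside a container: thanks to the PID namespace, none of the host's processes are visible. A small sketch (the image tag is an example):

# inside the container only its own processes are visible,
# and the container's entry process runs as PID 1
sudo docker run --rm ubuntu:14.04 ps aux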

6. Faster Shipment of Apps

Docker Hub and Docker Engine play an important role in eliminating the integration work that consumes most developers' time. This makes it possible to ship apps faster than expected - almost 7 times faster.

While Docker Engine is a portable runtime and packaging tool, Docker Hub is a cloud service used for sharing applications and automating workflows.

7. High Scalability for Apps

Designers enjoy scalability right from day one. For example, if you want to host your application in the cloud using Amazon EC2 Container Service (ECS), you get full scalability because you can configure how many instances of the application should run in the cluster. ECS allows you to configure clusters of EC2 machines easily.
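
With the AWS CLI, changing the number of running instances of a service is a one-liner. A hedged sketch (the cluster and service names are hypothetical):

# scale a containerized ECS service out to 10 running tasks
aws ecs update-service --cluster my-cluster --service my-service --desired-count 10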

8. Easy Portability across Machines

Docker makes distributed apps dynamic and portable. They can run anywhere - in the cloud, on premises, on a private cloud, a public cloud, bare metal and so on.

9. Open Source Technology

Docker containers are based on open standards, which means anyone can contribute to the Docker tool and tweak it to meet their needs, especially if the features they want are not already available.

10. High Security by Default

One of Docker's defining features is that it is secure. Docker containers provide an extra layer of protection to each application, and they isolate applications both from one another and from the underlying infrastructure.

11. Empowers Innovation among Developers

GitHub makes source code shareable, thereby encouraging innovation among developers. Docker Hub and the commercial hub spread that innovation to the way developers package, deploy and manage applications.

12. Eliminates Maintenance Cost for Enterprises

Because Docker offers so many functionalities out of the box, it effectively eliminates the extra cost involved in supporting and maintaining existing applications. Moreover, Docker's runtime cost is negligible - close to zero.

13. Integrates with a Number of Infrastructure Tools

Docker can be easily integrated into a variety of infrastructure tools like Amazon Web Services, Ansible, IBM Bluemix, Jenkins, Google Cloud Platform, Oracle Container Cloud Service and Microsoft Azure, to name a few.

Use Cases of Docker

1. CI/CD

Continuous Integration and Continuous Deployment (CI/CD) are the most common use cases of Docker. Merging development with CI testing and CD allows developers to build code collaboratively and test it in any environment, the biggest advantage being the ability to catch bugs as early as possible.

Docker integrates with tools like Jenkins and GitHub, making it easy for developers to push code to GitHub, have the tests trigger a build in Jenkins, and finally push the resulting image to a Docker registry.

Docker Hub also supports webhooks - user-defined HTTP callbacks that let you build integrations that respond to events in a Docker Hub repository. When an event is triggered, an HTTP POST with a JSON payload is sent to the webhook's configured URL. In this way, you can easily trigger CI builds or update bug-tracking systems.
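
To get a feel for the mechanism, you can simulate such a delivery with curl. This is a hedged sketch: the endpoint URL is hypothetical and the payload is abridged to a couple of representative fields.

# simulate a Docker Hub webhook delivery to your own CI endpoint
curl -X POST https://ci.example.com/hooks/docker \
  -H 'Content-Type: application/json' \
  -d '{"push_data": {"tag": "latest"}, "repository": {"repo_name": "example/myapp"}}'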

BBC News is a perfect example of pairing a new CI environment with Docker containers. They eliminated the 30-minute wait their jobs initially faced in the scheduling queue.

About 500 developers had 26,000 jobs running in parallel, and the locked-down nature of the previous CI environment not only made it extremely difficult to make changes, but also added that 30-minute wait to get jobs scheduled.

They tried other approaches, such as sideloading, to supplement the languages and versions missing from the CI environment. That did not work, and it put an even heavier load on the BBC News team. This led them to adopt Docker while migrating to the cloud, consolidating new CI environments and building internal CD tooling, which allowed them to build, ship and run applications anywhere.

2. DevOps

The traditional barrier that once existed between developers and the IT operations team has become virtually non-existent with the DevOps movement. About 44% of enterprises have already adopted this approach within their organization.

Docker gets the developers and the operations team working together, so each side understands the challenges faced by the other and both can apply DevOps practices at a mature level.

Chris Buckley, Director of DevOps at Business Insider, uses Docker to create a local development environment: they build and ship into development, move on to QA and then finally to production. This ensures that the exact same stack runs on the user's computer as was intended.

3. Infrastructure Optimization

Virtualization introduced a new way of managing applications, but the heavyweight nature of VMs - eating into storage capacity and incurring high VM and hypervisor licensing costs - became a cause for concern.

Enter Docker, and these problems are quickly solved, because it allows seamless transportation of applications across platforms and environments. Managing applications across cloud services, virtual servers and physical servers becomes easy with Docker.

Swisscom, a Swiss telecommunications company, went from 400 VMs to 20 VMs, cutting resource usage drastically while making far better use of their infrastructure. Their challenge was to deploy products faster, and Docker gave them the technology to do so, reducing costs while maximizing available resources.

4. Docker Containers as a Service (CaaS) for the Government

Docker can help enterprises modernize their application architecture. With the pressure government organizations face, Docker lets them build, ship and run applications quickly and cost-effectively, deploying scalable services securely on a wide variety of platforms while improving flexibility and maximizing capacity.

The US Government is a perfect example of using Docker to run applications successfully. They were able to modernize their systems and applications, making the components (and IT services) of their systems easily transportable and shareable with other agencies within the government.

Some Important Docker Services

Now that we have seen the advantages and the areas where Docker has really proved its worth, let's take a quick look at some of the important services and projects it offers.

1. Docker Engine

Docker Engine is the foundation of the modern application platform and is responsible for creating and running Docker containers. It is supported on Linux, Windows, macOS and the cloud. Lightweight, open source and powerful, it comes with an integrated workflow for building and containerizing applications.

Thanks to its built-in orchestration, it forms the core of Docker. It has a very simple user interface and makes it easy to move from a single container on a single host to multiple applications across a number of hosts.
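
That built-in orchestration is swarm mode, which shipped with Docker Engine 1.12. A minimal sketch of spreading a service across hosts (the service name and image are illustrative):

# turn this Engine into a swarm manager
sudo docker swarm init

# run a replicated service, then scale it out
sudo docker service create --replicas 3 --name web -p 80:80 nginx
sudo docker service scale web=5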

2. Docker Datacenter (Docker for the Enterprise)

Open and flexible, Docker Datacenter gives developers and the IT operations team an integrated platform where container management and deployment services come together for end-to-end, agile application portability.

Datacenter makes it easy to manage, monitor and secure images, both within the registry and once they are deployed across clusters, keeping enterprise software secure as applications move from development to a production runtime.

3. Docker Hub

Docker Hub functions as a hosted registry service that helps you store, manage, share and integrate images across various developer workflows. It is safe, secure and allows for quick collaboration.

You can automate and deploy your workflow by connecting Continuous Integration (CI), Continuous Delivery (CD) or Git systems; each time an image is shared it goes through an integration test. Enterprises wanting a SaaS-hosted service can use a Docker Cloud subscription to store, distribute and manage their images.
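
Sharing an image through Docker Hub boils down to three commands. A quick sketch (the account, image and tag names are hypothetical):

# authenticate, tag the local image for your Hub repository, then push
sudo docker login
sudo docker tag myapp:1.0 exampleuser/myapp:1.0
sudo docker push exampleuser/myapp:1.0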

4. Docker Compose

Docker Compose is a tool developers use to define and run multi-container Docker applications; a single command brings the whole application up. Compose also isolates environments from each other on a single host using project names, so you can even run multiple copies of the same application side by side.

When Compose re-creates a container, it copies the data volumes from the old container to the new one, so none of your data is lost; it also reuses the configuration it applied previously.

This is a major benefit because you don’t have to waste time making changes to the environment. Customizing the composition for different environments is also an easy process with Compose.
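
To make this concrete, here is a minimal sketch of a two-service Compose setup in the version-1 file format that was current at the time; the service names, images and password are illustrative:

# write a minimal docker-compose.yml and bring the stack up
cat > docker-compose.yml <<'EOF'
web:
  image: php:5.6-apache
  ports:
    - "8080:80"
  links:
    - mysql
mysql:
  image: mysql:latest
  environment:
    MYSQL_ROOT_PASSWORD: root
EOF
sudo docker-compose up -d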

How to Deploy a LAMP Stack in Docker

LAMP is an open source web platform for deploying dynamic web sites and servers. The acronym stands for Linux, Apache, MySQL, and PHP/Python/Perl. With its four layers, LAMP provides a solid foundation on which developers can deploy high-performance web applications. It is called a stack because the main components are layered one on top of the other, and all of them are available in standard Linux distributions. The components of LAMP are Free or Open Source Software (FOSS).

The benefits come in three different ways:

  • FOSS applications are free to download, so people can access them without making any kind of payment.
  • FOSS licenses are open, so there are very few restrictions on developing and deploying LAMP-based projects.
  • FOSS gives people access to the source code so they can make necessary changes to an application, keeping the development process easy, simple and flexible.

There are two ways to build a LAMP stack with Docker containers:

  • Download a preconfigured LAMP stack Docker image from Docker Hub.
  • Pull a base image from the Hub and configure the components one by one.

This post takes the second approach.

Here are the steps and snippets:

Installation

To install the Docker LAMP stack, first install Docker, for example on Ubuntu 14.04, or use any other installation method of your choice.

Docker Engine is supported on Linux, Windows, Mac OS and the cloud.

# install Docker and check the version
wget -qO- https://get.docker.com/ | sh
docker --version

Docker is treated like any other service:

sudo service docker status
sudo service docker start

Test the installation:

sudo docker run ubuntu:14.04 /bin/echo 'Hello world'

Overview of Docker

Docker containers are built on top of Docker images to run applications and servers. Here there are separate containers for Apache and MySQL, and the images can stay minimal or grow complex depending on usage.

After creating the images, the developer can run a container on top of the image. Here are typical Docker commands for doing it:

# show available images
sudo docker images

# show running containers, or all containers
sudo docker ps
sudo docker ps -a

# to stop / start a container
sudo docker stop <name | id>
sudo docker start <name | id>

# to see info about a container, or specific info
sudo docker inspect <name | id>
sudo docker inspect -f '{{ .NetworkSettings.IPAddress }}' <name | id>

# to delete a container
sudo docker rm <name | id>

# to delete all containers
sudo docker rm -f $(sudo docker ps -a -q)

# to delete an image
sudo docker rmi <name | id>

Docker LAMP Containers

Docker containers exit when their main process stops, so make sure each container stays running. Start with the official MySQL image from Docker Hub.

# pull the docker image
sudo docker pull mysql

# run the container on top of the image
sudo docker run -p 3900:3306 --name mysql -e MYSQL_ROOT_PASSWORD=root -d mysql:latest

# test connection (if running MySQL locally, stop it first)
# password is 'root' (see above run command)
mysql -uroot -p -h 127.0.0.1 -P 3900

In the steps above, the MySQL image provided by Docker was pulled and a container was run on top of it. The -p option binds a local port (3900) to the port the container exposes (3306).

Following the image's instructions, the default MySQL root password was passed into the container. At this point there is an active container running a MySQL server, which can be verified with the following command.

sudo docker ps

Log in to the container's database and create a new user:

mysql -uroot -p -h 127.0.0.1 -P 3900
mysql> CREATE USER 'php'@'%' IDENTIFIED BY 'pass';
mysql> GRANT ALL PRIVILEGES ON *.* TO 'php'@'%' WITH GRANT OPTION;
mysql> FLUSH PRIVILEGES;
mysql> exit;

For the Apache and PHP container, we can build an image from a Dockerfile. To get a better understanding of how images are built, browse existing images on Docker Hub and study their Dockerfiles and repos. The Apache Dockerfile can look like this:

FROM ubuntu:14.04

RUN apt-get update
RUN apt-get install -y apache2
RUN apt-get install -y php5 php5-common php5-cli php5-mysql php5-curl

EXPOSE 80

CMD ["/usr/sbin/apache2ctl", "-D", "FOREGROUND"]

The Dockerfile specifies the OS, the commands to run to set up the container, the ports to open, and the command to run when the container starts; here, the Apache service is run in the foreground. From the Dockerfile the image can be built, where name:tag is specified with the -t option and the trailing period tells Docker where the Dockerfile is located.

sudo docker build -t jessecascio/local:apache .

After the image has been built, it can be seen with the images command:

sudo docker images

Now that the image has been created, a container can be run on top of it. Since this container is part of the stack, it should be linked to the mysql container.
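
The original run command is not shown in this excerpt; here is a plausible reconstruction, hedged and consistent with the options discussed just below (the local web-root path is hypothetical):

# publish port 8080, link to the mysql container,
# and mount a local web root into the container
sudo docker run -p 8080:80 --name apache --link mysql:mysql \
  -v ~/www:/var/www/html -d jessecascio/local:apache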

# verify the two containers are running, and link exists

sudo docker ps
sudo docker inspect -f "{{ .HostConfig.Links }}" apache

The run command above includes the --link option, which specifies the container to link to along with an optional alias, and the -v option, which maps a local directory to a directory in the container. Links are not required, but they are useful: linking containers makes Docker share information about the linked containers via Linux environment variables and the /etc/hosts file.

Now that the Apache server container is running, this URL brings up the Apache welcome page in the browser: http://localhost:8080/

Since the /var/www/html directories are linked, a test.php can be added:

# in /var/www/html/test.php

<?php
phpinfo();

Verify in browser:

http://localhost:8080/test.php

Make a database connection to check the connection between containers.

# in /var/www/html/db.php
<?php
// Could also use getenv('MYSQL_PORT_3306_TCP_ADDR')
// But recommended to use the host entry which survives server restart
$dsn = 'mysql:host='.gethostbyname('mysql');
$usr = 'php';
$pwd = 'pass';
try {
    $dbh = new PDO($dsn, $usr, $pwd);
} catch (PDOException $e) {
    die('Connection failed: ' . $e->getMessage());
}
echo 'Connection made!!!';

Now the two Docker containers are running and communicating. Finally, we have created a two-container Docker LAMP stack!

Credits: JessesNet

Docker Conference

If you want to take part in Docker conferences and learn more about using Docker in your applications, you can attend the DockerCon events held every year. DockerCon 2016 was a 2.5-day conference held in Seattle, WA from June 19th to 21st, where a number of speakers talked about Docker, its previous editions, and the growing Docker community and ecosystem.

Whether you are a developer, a system administrator, a C-level executive or part of a DevOps operations team, you can easily participate in these conferences. DockerCon 2017 will be held in Austin, TX from April 17th to 20th and is expected to be one of the largest tech conferences of its kind. It is the perfect opportunity to meet developers and technical experts and exchange ideas about Docker with them.

To register for DockerCon 2017, please visit this link: https://dockercon.smarteventsc...

In Closing

Docker has developed into a useful containerization tool for developing applications. The friction between developers and the operations team has become practically non-existent. It is now easier to build an app without worrying about how it will run on each engineer's computer, because it runs the same on whichever platform you need - Mac, Linux or Windows.

Docker has been successful in meeting the needs of many enterprises: their apps become easily portable, quickly packaged, deployed and managed. And the good news is that you can deploy Docker containers in cloud environments and integrate them with DevOps workflows without any hiccups.

Docker allows you to get more applications running on the same hardware than other technologies do. Want to know more about how you can implement Docker for your next app development project? We've got an expert team to help you.

Contact Us Today!
