In this blog, I will review a common problem: an application works in the development environment but not in the UAT environment. I will explain the components involved in packaging an application with Docker, and look at both the cause of this problem and its solution.
| Problem | Cause | Solution |
| --- | --- | --- |
| An application works in the development environment but not in the UAT environment. | The development environment has an upgraded version of the software, while the UAT servers have an older version, are missing libraries, or use different configurations. | Using a Docker Container, package the code along with the configurations, tools and other frameworks required to run it. This way, the code will work in any environment. |
In case your application is broken into several microservices, you can have various Docker containers for each micro-service.
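As a sketch of what that can look like, a hypothetical application split into a web front end and a leave-management service could be described with a Docker Compose file. The service and image names below are illustrative assumptions, not from an actual project:

```yaml
# docker-compose.yml -- one container per microservice (illustrative sketch)
services:
  web:
    image: myrepo/webapp:1.0           # front-end microservice
    ports:
      - "8080:8080"                    # publish the web port on the host
  leave-management:
    image: myrepo/leavemanagement:1.0  # back-end microservice
```

Each service runs in its own container, so the microservices can be built, deployed and scaled independently.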
What is Docker?
Docker is an open-source tool designed to create, deploy and run applications with ease by using containers. Docker fits in the deployment phase of the DevOps pipeline.
DevOps can be defined as a culture that primarily focuses on improved collaboration, communication and integration between Development and Operations teams.
DevOps improves collaboration and productivity by:
- Automating infrastructure provision
- Automating workflows for building, testing and deploying applications
- Continuously measuring application performances
What is a Container?
A Container is a package that includes everything needed to run a software application except the Operating System.
Containers versus Virtual Machines (VM)
Every Virtual Machine has its own Operating System, which is why its boot-up process takes longer. Virtual Machines share the host's hardware with other VMs on the same host.
Containers virtualize at the Operating System level: each container gets its own allocation of CPU, memory, block I/O and network stack, but shares the host's Operating System kernel.
Containers boot quickly, offer increased efficiency and better utilization, and are portable.
What is a Dockerfile?
A Dockerfile is a text document that contains all the commands a user could call on the command line to assemble an image.
Every Dockerfile starts from a base image and builds on top of it.
For example, in the Dockerfile below, we take the base image "tomcat" and add our web application WAR file:
FROM tomcat
ADD LeaveManagementApp.war /usr/local/tomcat/webapps
CMD ["catalina.sh", "run"]
- A Docker Image is built from the Dockerfile.
- Docker Images are made up of multiple layers, each of which is a read-only file system.
- A layer is created for each instruction in the Dockerfile and placed on top of the previous layer.
docker build -t leavemanagementimage:1.0 .
Using the docker build command, a Docker Image can be created. Note that image names must be lowercase, and the trailing `.` is the build context (the directory containing the Dockerfile).
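Once the build completes, Docker's `history` subcommand lists the image's layers, one row per Dockerfile instruction, which makes the layering described above visible (the image name below is written lowercase, as Docker requires):

```shell
# Show the layers of the built image, one row per Dockerfile instruction
docker history leavemanagementimage:1.0
```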
Once the Docker Image is built, it can be stored and shared through Docker Hub. Just like GitHub, we can create an account in Docker Hub, create public or private repositories, and maintain the Docker Images there.
Using the command below, we can tag our image with our Docker Hub repository name:
docker tag leavemanagementimage:1.0 myrepo/leavemanagementimage:1.0
And then finally push the image to Docker Hub:
docker push myrepo/leavemanagementimage:1.0
Now that the image is available on Docker Hub, you can run it anywhere. If you use it on a new machine that doesn't have it yet, the Docker client will automatically download it from Docker Hub.
Docker Containers are encapsulated environments in which you run applications. A Container is defined by its image and only has access to the resources defined in that image.
The machine where the container has to run should have a Docker client installed so that Docker commands can be executed.
You can use the docker pull command to get the image from Docker Hub to the local machine:
docker pull myrepo/leavemanagementimage:1.0
Use the docker run command to fetch the image (if it is not already present locally) and create a new container from it. For a Tomcat-based image, you would typically also publish Tomcat's default port with `-p 8080:8080`:
docker run -itd myrepo/leavemanagementimage:1.0
Once the container has been created and started with the run command, it can be stopped, paused or restarted as required.
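The lifecycle subcommands look like this (`<container-id>` is a placeholder; use `docker ps` to find the actual container ID or name):

```shell
docker ps                      # list running containers and their IDs
docker stop <container-id>     # gracefully stop the container
docker start <container-id>    # start it again
docker pause <container-id>    # freeze all processes in the container
docker unpause <container-id>  # resume them
```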
By using Docker, you can package your application and all its dependencies together in the form of Containers, which ensures your application works seamlessly in any environment.
If you have any questions, please leave your comments or suggestions below. For more blogs, check out https://blog.appsassociates.com/.
Ravi Teja Kalaga is a Senior Consultant in the Integrations Practice at Apps Associates. He has worked on Java/J2EE, Struts, Spring frameworks and integration tools like MuleSoft, Dell Boomi, Tibco, and has been providing integration solutions to various clients. He is a certified MuleSoft Integration Developer and Dell Boomi Architect.