Developers ranked Docker as the #1 most wanted platform, #2 most loved platform, and #3 most broadly used platform in the 2019 Stack Overflow Developer Survey. Nearly 90,000 developers from around the world responded to the survey. So we asked the community why they love Docker, and here are 10 of the reasons they shared:
“I love docker because it takes environment specific issues out of the equation – making the developer’s life easier and improving productivity by reducing time wasted debugging issues that ultimately don’t add value to the application.” @pamstr_
“Docker completely changed my life as a developer! I can spin up my project dependencies like databases for my application in a second in a clean state on any machine on our team! I can’t not imagine the whole ci/cd-approach without docker. Automate all the stuff? Dockerize it!” @Dennis65560555
There are over one million Dockerfiles on GitHub today, but not all Dockerfiles are created equally. Efficiency is critical, and this blog series will cover five areas for Dockerfile best practices to help you write better Dockerfiles: incremental build time, image size, maintainability, security and repeatability. If you’re just beginning with Docker, this first blog post is for you! The next posts in the series will be more advanced.
Important note: the tips below follow the journey of ever-improving Dockerfiles for an example Java project based on Maven. The last Dockerfile is thus the recommended Dockerfile, while all intermediate ones are there only to illustrate specific best practices.
In a development cycle, when building a Docker image, making code changes, then rebuilding, it is important to leverage caching. Caching lets you skip build steps that don’t need to be run again.
However, the order of the build steps (Dockerfile instructions) matters: when a step’s cache is invalidated by changed files or by modified lines in the Dockerfile, the cache of every subsequent step is invalidated as well. To optimize caching, order your steps from the least to the most frequently changing.
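For the Maven-based Java example this series follows, a minimal sketch of that ordering might look like the following (the base image tag, module layout and Maven goals here are illustrative assumptions, not taken from the series itself):

```dockerfile
# Sketch: order instructions from least to most frequently changing
FROM maven:3.6-jdk-8 AS builder
WORKDIR /app

# pom.xml changes rarely, so copy it first and resolve dependencies;
# this layer stays cached across most code changes
COPY pom.xml .
RUN mvn dependency:go-offline

# Source code changes often, so copy it last; edits here only
# invalidate the steps from this point onward
COPY src ./src
RUN mvn package -DskipTests
```

Because the dependency-resolution layer is only invalidated when pom.xml changes, day-to-day code edits rebuild just the final two steps.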
Docker Hub is home to the world’s largest library of container images. Millions of individual developers rely on Docker Hub for official and certified container images provided by independent software vendors (ISV) and the countless contributions shared by community developers and open source projects. Large enterprises can benefit from the curated content in Docker Hub by building on top of previous innovations, but these organizations often require greater control over what images are used and where they ultimately live (typically behind a firewall in a data center or cloud-based infrastructure). For these companies, building a secure content engine between Docker Hub and Docker Trusted Registry (DTR) provides the best of both worlds – an automated way to access and “download” fresh, approved content to a trusted registry that they control.
Ultimately, the Hub-to-DTR workflow gives developers a fresh source of validated and secure content to support a diverse set of application stacks and infrastructures, all while staying compliant with corporate standards. Here is an example of how this is executed in Docker Enterprise 3.0:
DTR allows customers to set up a mirror to grab content from a Hub repository by constantly polling it and pulling new image…
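Conceptually, the mirror automates what an operator would otherwise script by hand. The sketch below shows that manual equivalent; the registry address dtr.example.com and the repository names are placeholders for illustration, not part of the documented workflow:

```bash
# Manual equivalent of what a DTR pull mirror automates (placeholder names)
docker pull alpine:3.9                                    # fresh, approved content from Docker Hub
docker tag alpine:3.9 dtr.example.com/approved/alpine:3.9
docker push dtr.example.com/approved/alpine:3.9           # now served from the trusted registry
```

With mirroring configured in DTR, this pull-tag-push loop happens automatically whenever new images appear in the upstream Hub repository.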
Modern applications can come in many flavors, consisting of different technology stacks and architectures, from n-tier to microservices and everything in between. Regardless of the application architecture, the focus is shifting from individual containers to a new unit of measurement which defines a set of containers working together – the Docker Application. We first introduced Docker Application packages a few months ago. In this blog post, we look at what’s driving the need for these higher-level objects and how Docker Enterprise 3.0 begins to shift the focus to applications.
Since our founding in 2013, Docker – and the ecosystem that has thrived around it – has been built around the core workflow of a Dockerfile that creates a container image, which in turn becomes a running container. Docker containers helped drive the growth and popularity of microservices architectures by allowing independent parts of an application to be turned on and off rapidly, and scaled independently and efficiently. The challenge is that as microservices adoption grows, a single application is no longer a handful of machines but dozens of containers that can be divided amongst different development teams.
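As a rough illustration of what that higher-level object looks like in practice, the experimental docker app CLI plugin from this era scaffolds an application package from a Compose file. The commands below are a hedged sketch; exact subcommand names and generated file names varied between experimental releases:

```bash
# Hedged sketch of creating and inspecting a Docker Application package
docker app init hello-app      # scaffolds hello-app.dockerapp (metadata, a compose file, parameters)
docker app inspect hello-app   # summarizes the services and parameters that make up the application
```

The point is the shift in unit: the artifact being shared and deployed is the whole application definition, not an individual container image.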
Over the past two years Docker has worked closely with customers to modernize portfolios of traditional applications with Docker container technology and Docker Enterprise, the industry-leading container platform. Such applications are typically monolithic in nature, run atop older operating systems such as Windows Server 2008 or Windows Server 2003, and are difficult to transition from on-premises data centers to the public cloud.
The Docker platform alleviates each of these pain points by decoupling an application from a particular operating system, enabling microservice architecture patterns, and fostering portability across on-premises, cloud, and hybrid environments.
As the Modernizing Traditional Applications (MTA) program has matured, Docker has invested in tooling and methodologies that accelerate the transition to containers and decrease the time necessary to experience value from the Docker Enterprise platform. From the initial application assessment process to running containerized applications on a cluster, Docker is committed to improving the experience for customers on the MTA journey.
Enterprises develop and maintain exhaustive portfolios of applications. Such apps come in a myriad of languages, frameworks, and architectures, developed by both first- and third-party development teams. The first step in the containerization journey is to determine which applications are…
Last week, Docker hosted our 4th annual Mid-Atlantic and Government Docker Summit, a one-day technology conference held on Wednesday, May 29 near Washington, DC. Over 425 attendees from the public and private sectors came together to share and learn about the trends driving change in IT, from containers and cloud to DevOps. Specifically, the presenters shared content on topics including Docker Enterprise, our industry-leading container platform, Docker Kubernetes Service, Container Security and more.
Attendees were a mix of technology users and IT decision makers: everyone from developers, system admins and architects to senior leaders and CTOs.
Highlights included a keynote by Docker’s EVP of Customer Success, Iain Gray, and a fireside chat between Nick Sinai, the former US Deputy CTO and a partner at Insight Venture Partners, and Suzette Kent, the current US Federal CIO.
The fireside chat highlighted top-of-mind issues for Kent and how they align with the White House IT Modernization Report, specifically the modernization of current federal IT infrastructure and preparing and scaling the workforce. Kent mentioned, “The magic of IT modernization is marrying the technology with the people and the…
Delivered as part of Docker Enterprise 3.0, Docker Desktop Enterprise is a new developer tool that extends the Docker Enterprise Platform to developers’ desktops, improving developer productivity while accelerating time-to-market for new applications.
It is the only enterprise-ready desktop platform that enables IT organizations to automate the delivery of legacy and modern applications using an agile operating model with integrated security. With work performed locally, developers can leverage a rapid feedback loop before pushing code or Docker images to shared servers or continuous integration infrastructure.
Imagine you are a developer and your organization has a production-ready environment running Docker Enterprise. To ensure that you don’t use any APIs or incompatible features that will break when you push an application to production, you would like to be certain your working environment exactly matches what’s running in Docker Enterprise production systems. This is where Docker Enterprise 3.0 and Docker Desktop Enterprise come in: a cohesive extension of the Docker Enterprise container platform that runs right on developers’ systems. Developers code and test locally using the same tools they use today, and Docker Desktop Enterprise helps them quickly iterate and then produce a containerized service that is…
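One simple, hypothetical way to sanity-check that alignment is to compare the engine and Kubernetes versions reported locally against those of the production cluster; this is just an illustrative check, not a Docker Desktop Enterprise feature:

```bash
# Compare the local environment against the production cluster (illustrative)
docker version --format '{{.Server.Version}}'   # local engine version
kubectl version --short                         # local client and target cluster Kubernetes versions
```

Docker Desktop Enterprise addresses this more directly with Version Packs, which switch the local engine and Kubernetes versions to match a target Docker Enterprise environment.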
There are many tutorials and guides available for getting started with Kubernetes. Typically, these detail the key concepts and outline the steps for deploying your first Kubernetes cluster. However, when organizations want to roll out Kubernetes at scale or in production, the deployment is much more complex, and there is a new set of requirements around both the initial setup and configuration and the ongoing management – often referred to as “Day 1 and Day 2 operations.”
Docker Enterprise 3.0, the leading container platform, includes Docker Kubernetes Service (DKS) – a seamless Kubernetes experience from developers’ desktops to production servers. DKS makes it simple for enterprises to secure and manage their Kubernetes environment by abstracting away many of these complexities. With Docker Enterprise, operations teams can easily deploy, scale, backup and restore, and upgrade a certified Kubernetes environment using a set of simple CLI commands. In this blog post, we’ll highlight some of these new features.
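To give a rough sense of what those lifecycle operations look like, the commands below sketch the shape of the cluster CLI introduced with Docker Enterprise 3.0. Treat the exact subcommand names, arguments and file names as assumptions for illustration rather than an exact transcript:

```bash
# Hedged sketch of Day 1 / Day 2 lifecycle operations (names are assumptions)
docker cluster create --file cluster.yml              # provision a certified Kubernetes cluster from a declarative spec
docker cluster backup mycluster --file backup.tar.gz  # capture cluster state for disaster recovery
docker cluster update mycluster --file cluster.yml    # apply changes such as scaling nodes or upgrading versions
```

A declarative file plus a handful of verbs is what lets operations teams treat cluster deployment, backup, restore and upgrades as repeatable commands rather than bespoke projects.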
A real Kubernetes cluster deployment will typically involve design and planning to ensure that the environment integrates with an organization’s preferred infrastructure, storage and networking stacks. The design process usually requires cross-functional expertise to determine the instance…
We’ve talked a lot about how Docker Enterprise supports and simplifies Kubernetes. But how are organizations actually running Kubernetes on Docker Enterprise? What have they learned from their experiences?
Here are three of their stories:
When you visit the doctor’s office or hospital, there’s a very good chance McKesson’s solutions and systems are helping make quality healthcare possible. The company ranks number 6 in the Fortune 100 with $208 billion in revenue, and provides information systems, medical equipment and supplies to healthcare providers.
The technology team built the McKesson Kubernetes Platform (MKP) on Docker Enterprise to give its developers a consistent ecosystem to build, share and run software in a secure and resilient fashion. The multi-tenant, multi-cloud platform runs across Microsoft Azure, Google Cloud Platform and on-premises systems, supporting several use cases:
Kubernetes is a powerful orchestration technology for deploying, scaling and managing distributed applications, and it has taken the industry by storm over the past few years. However, due to its inherent complexity, relatively few enterprises have been able to realize the full value of Kubernetes, with 96% of enterprise IT organizations unable to manage Kubernetes on their own. At Docker, we recognize that much of Kubernetes’ perceived complexity stems from a lack of intuitive security and manageability configurations that most enterprises expect and require for production-grade software.
Docker Kubernetes Service (DKS) is a Certified Kubernetes distribution that is included with Docker Enterprise 3.0 and is designed to solve this fundamental challenge. It’s the only offering that integrates Kubernetes from the developer desktop to production servers, with ‘sensible secure defaults’ out-of-the-box. Simply put, DKS makes Kubernetes easy to use and more secure for the entire organization. Here are three things that DKS does to simplify (and accelerate) Kubernetes adoption for the enterprise:
DKS is the only Kubernetes offering that provides consistency across the full development lifecycle from local desktops to servers. Through the use of Version Packs, developers’ Kubernetes environments running…
Following on the heels of DockerCon SF, the team is packing their bags and heading to Barcelona for KubeCon + CloudNativeCon EU from May 20-23. Docker employees, community members and Docker Captains will be there speaking about and demonstrating Docker and Kubernetes.
Stop by Booth G14 to learn more about Docker Kubernetes Service (DKS), which is part of the recently announced Docker Enterprise 3.0. Docker Enterprise 3.0 is the only container platform that provides a simple and integrated desktop-to-cloud experience for both Docker and Kubernetes.
Get involved in and learn more about some of the projects Docker has been working on with the Kubernetes community:
Also, there is an opportunity to join Docker and Microsoft in contributing to the Cloud Native Application Bundle (CNAB) specification – an…
We started working with Microsoft five years ago to containerize Windows Server applications. Today, many of our enterprise customers run Windows containers in production. We’ve seen customers containerize everything from 15-year-old Windows .NET 1.1 applications to new ASP.NET applications.
If you haven’t started containerizing Windows applications and running them in production, here are five great reasons to get started:
Extended support for Windows Server 2008 ends in January 2020. Rewriting hundreds of legacy applications to run on Windows Server 2016 or 2019 is a ridiculously expensive and time-consuming headache, so you’ll need to find a better way – and that’s Docker Enterprise.
You can containerize legacy Windows applications with Docker Enterprise without needing to rewrite them. Once containerized, these applications are easier to modernize and extend with new services.
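For instance, an already-built ASP.NET site can often be containerized with a Dockerfile as small as the sketch below; the base image tag and the PublishedSite folder are illustrative assumptions rather than a prescribed recipe:

```dockerfile
# Sketch: containerize an existing ASP.NET app without changing its code
FROM mcr.microsoft.com/dotnet/framework/aspnet:4.8-windowsservercore-ltsc2019

# Copy the pre-built site into the default IIS web root; the base image
# already provides IIS and the .NET Framework runtime
COPY ./PublishedSite/ /inetpub/wwwroot
```

From there, the containerized application can be extended with new services without touching the original code.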
The recently announced Kubernetes 1.14 includes support for Windows nodes. With Docker Enterprise, you will soon be able to use either orchestrator – Swarm or Kubernetes – to run Windows nodes.
Once you…
Fresh off the heels of DockerCon and the announcement of Docker Enterprise 3.0, an end-to-end and dev-to-cloud container platform, I wanted to share some thoughts on what we mean when we say “complete container platform”.
A complete solution has to meet the needs of different kinds of applications and users – not just cloud native projects but legacy and brownfield applications on both Linux and Windows, too. At a high level, one of the goals of modernization – the leading reason organizations are adopting container platforms – is to rid ourselves of technical debt. Organizations want the freedom to create their apps based on the “right” stack and running in the “right” place, even though what’s “right” may vary from app to app. So the container platform running those applications should be flexible and open to support those needs, rather than rigidly tying application teams to a single OS or virtualization and cloud model.
To deliver high-velocity innovation, your developers are a key constituency for the container platform. That means the container platform should extend to their environment, so that developers are building and testing on the same APIs that will be used…
If you missed DockerCon in San Francisco this year or were unable to watch the livestream, no need to worry – we have you covered. You can catch all the demos, get the latest announcements and find out what is next for the Docker ecosystem by watching the replay sessions on demand.
On Tuesday, we kicked off the first day of DockerCon with product announcements, demos and customer guest speakers. During the session, we presented Docker Enterprise 3.0, the only desktop-to-cloud enterprise container platform enabling organizations to build and share any application and securely run it anywhere – from hybrid cloud to the edge. Additionally, we announced this year’s winners of the Customer Innovation Awards, featuring Carnival, Citizens Bank, Liberty Mutual, Lindsay Corporation and Nationwide.
On stage, the Docker team also demonstrated Docker Applications, Docker Kubernetes Service (DKS) and new features and capabilities in Docker Desktop Enterprise – all designed to accelerate the application development and deployment pipeline. The keynote closed with a demonstration from R.O.S.I.E., the robot built by two Liberty Mutual engineers using Docker.
To learn firsthand everything featured…
This week at DockerCon 2019, we shared our strategy for helping companies realize the benefits of digital transformation through new enterprise solution offerings that address the most common application profiles in their portfolios. Our new enterprise solution offerings include the Docker platform plus the new tooling and services needed to migrate your applications. Building on the success and experience of the Modernizing Traditional Applications (MTA) program and Docker Enterprise 3.0, we are excited to expand our solutions and play an even greater role in our customers’ innovation strategy by offering a complete and comprehensive path to application containerization.
When you hear about different application profiles, you may think about different languages or frameworks or even different application architectures like microservices and monoliths. But one of the benefits of containerization is that all application dependencies are abstracted away and what you have is a container that can be deployed consistently across different infrastructure.
In our work with many enterprise organizations, we’ve validated that the successful adoption of a container strategy is just as much about the people and processes as it is about the technology. There are three behavioral patterns that matter, and that is dependent on what…
Today at DockerCon, we’re excited to announce Docker Enterprise 3.0 – the only desktop-to-cloud enterprise container platform enabling organizations to build and share any application and securely run it anywhere – from hybrid cloud to the edge.
With Docker Enterprise 3.0, developers can rapidly build multi-service container-based applications right from their desktop and package them in a standardized format that can be shared seamlessly and run anywhere. In addition, Docker Enterprise 3.0 expands its container platform leadership position with the introduction of new capabilities for automated lifecycle management and enhanced security.
Here are some of the highlights that you can look forward to in Docker Enterprise 3.0.
Enterprises are looking for ways to quickly adapt to new competitive challenges and changing customer requirements through the introduction of new applications. Docker Enterprise 3.0 introduces a number of capabilities that help organizations accelerate application delivery.
Docker Desktop Enterprise is a new developer tool that extends the Docker Enterprise Platform to developers’ desktops, improving developer productivity while accelerating time-to-market for new applications.