This pace of change often means a project you worked on a few months ago is now using “outdated” tools and technologies compared to your current project. And the older a project gets, the harder it is to maintain a development environment that supports its older libraries and runtime requirements.
In the past, developers have tried to solve these problems with configuration management tools, library dependency and versioning tools, full-on virtual machines to duplicate entire development environments, and more. But configuration drift is a problem that version management can’t always solve, and duplicating your entire development environment is the easiest way to introduce configuration drift (among other things).
Docker is virtualization at the application level, encapsulating a single application process along with all of its configuration, runtime environment, and dependencies. It helps you solve the “works on my machine” problem by nearly eliminating the need to configure the machine on which the application runs. You deploy the application as an immutable binary object, and all of its configuration and runtime environment come with it.
That means you no longer have to worry about what version of Node.js your old project is using. You don’t need to re-install Babel.js v5 for an old project, and then v6 again for a new project. You can test the latest and greatest webpack, browserify and other tooling with zero conflict in your current projects.
In this two-day course, you’ll learn the core of what Docker is, how it works, how you can take advantage of it as a developer working on multiple projects or with multiple technologies, and how to use Docker effectively and efficiently in your day-to-day development efforts.
Day 1: The Basics of Docker
Learn the basics of what Docker is, how it works, and how you will interact with it. Build and understand the difference between an image and a container, and learn how to execute your own code within a container.
What is Docker?
An introduction to Docker and application virtualization, correcting common misunderstandings and introducing key terminology while answering the question, “what does Docker do for me?”
- Images and containers
- Core command-line tools
Why are there different versions of Docker? Which should you use? Nothing teaches faster than experience, so let’s get started by installing Docker and executing your first container.
- Docker for Mac / Docker for Windows vs Docker Toolbox vs Docker Engine
- Run the “hello world” example container
- Install Docker and run “hello world” example
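As a preview of that first exercise, the “hello world” run boils down to a couple of commands once Docker is installed (these require a working Docker installation, so they are shown here only as a sketch):

```shell
# Verify the Docker client and daemon are installed and talking to each other
docker version

# Pull and run the official "hello-world" image; Docker downloads it
# automatically from Docker Hub if it isn't already on your machine
docker run hello-world
```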
Image and Container Management
Working with Docker involves more than just running your application. You need to know what application containers and images are available on your machine, how to find new images, and how to remove old images that you no longer need.
- Finding and downloading images from Docker Hub
- One image, multiple containers
- Cleaning up containers and images
- Download an image, run a container, clean it all up
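A typical session with the core management commands looks something like the following (the image and container names are only examples, and the commands assume a working Docker installation):

```shell
# Download an image from Docker Hub without running it
docker pull nginx

# List running containers; -a includes stopped ones, too
docker ps -a

# List the images stored on your machine
docker images

# Remove a stopped container by name or ID, then remove its image
docker rm my-old-container
docker rmi nginx
```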
Docker for Local Services
Docker can eliminate the need to install MySQL, MongoDB, and other commonly used applications and services directly on your computer. Learn how to take advantage of this to keep your system clean and running smoothly.
- Nginx and port mapping
- MongoDB and data volumes
- Node.js and basic networking
- Run Nginx in front of Node.js, connecting to MongoDB
(pre-built docker images will be provided)
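As a sketch of what these exercises involve, running a local service with port mapping and a data volume can be as simple as the commands below (the ports, names, and volume paths are illustrative, not the course’s exact setup):

```shell
# Map container port 27017 to the same port on the host, and keep
# the data in a named volume so it survives the container
docker run -d --name local-mongo \
  -p 27017:27017 \
  -v mongo-data:/data/db \
  mongo

# Nginx works the same way: expose port 80 in the container as 8080 locally
docker run -d --name local-nginx -p 8080:80 nginx
```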
Build Your First Image
Now that you’re familiar with managing Docker containers, it’s time to build your own, starting with a simple example just to see how this works.
- Required and common instructions
- The container lifecycle
- Build your own image and run a container
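To give a sense of scale, a first Dockerfile really can be this small (the base image and command here are placeholders, not necessarily the course’s exercise):

```
# Dockerfile — every image starts FROM a base image
FROM alpine:3.19

# CMD defines what runs when a container starts from this image
CMD ["echo", "Hello from my first image"]
```

You would build and run it with `docker build -t my-first-image .` followed by `docker run my-first-image`.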
Building a basic container is the first step, but now it’s time to run some code in Docker. You’ll start with a simple command shell in the container to manually run a small Node.js script.
- Choosing a base image
- Getting code into the image
- Running Node.js with an interactive shell
- Containerize a simple command-line Node.js app and run it
(pre-built code will be provided)
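As a hedged sketch, a Dockerfile for a simple command-line Node.js app generally follows this shape (the base image tag and file names are assumptions, not the course’s exact files):

```
FROM node:lts

# Copy the application code into the image
WORKDIR /app
COPY . .

# Run the script by default; override with `docker run -it ... sh`
# to get an interactive shell instead
CMD ["node", "index.js"]
```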
Day 2: Developing In Docker
Turn your basic knowledge of how Docker works into an effective development platform. Learn how to edit the code your container runs, debug within a container, run your build tools and other development processes, and automate multiple containers to run a full application stack.
Development vs Production Images
A production environment should not be littered with development concerns. Learn how to correctly keep them separate, while ensuring your development environment is in sync with production.
- Installing development tooling
- Environment variables and other configuration
- Build a development image from a production image
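One common pattern, and roughly the shape of this exercise, is to build the development image FROM the production image, layering the development concerns on top (the image names and tooling here are illustrative):

```
# Dockerfile.dev — extends the production image with development concerns
FROM my-app:production

# Development-only tooling (example: nodemon for automatic restarts)
RUN npm install --global nodemon

# Development configuration via environment variables
ENV NODE_ENV=development

CMD ["nodemon", "index.js"]
```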
Installing npm modules
Rebuilding a Docker image and watching npm run endlessly can be incredibly frustrating. Learn the techniques to cut down the time spent with npm, and re-use already installed modules across containers.
- Data volumes and caching for speed
- Installing new modules directly in a container
- Native build modules (bcrypt, etc)
- Update the Dockerfile to cache node modules
- Install mongoose from npm in a container
- Reuse already installed modules in another container instance
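The caching technique relies on Docker’s layer cache: copy the package manifest first, install, and only then copy the rest of the code, so editing code doesn’t invalidate the npm install layer. A sketch (file names assumed):

```
FROM node:lts
WORKDIR /app

# Copy only the manifests first; this layer (and the npm install below)
# is re-used from cache as long as these files are unchanged
COPY package.json package-lock.json ./
RUN npm install

# Code changes only invalidate the layers from here down
COPY . .
CMD ["node", "index.js"]
```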
Editing Code in a Container
Running code in a container is a necessary part of working with Docker. But how do you edit the code that is running, without having to rebuild the container every time?
- Interactive shell
- BYOE (Bring Your Own Editor)
- Edit a simple express.js project and show it running in your browser with your changes
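The usual answer, and likely the technique behind this exercise, is to mount your project directory into the container as a data volume, so edits on the host appear immediately inside the container (the paths, ports, and image name are illustrative):

```shell
# Mount the current directory over the app directory in the container;
# your editor runs on the host, the code runs in the container
docker run -it --rm \
  -v "$(pwd)":/app \
  -p 3000:3000 \
  my-express-app
```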
Debugging in a Container
If you write code, you will also need to debug that code. Learn how to connect to an existing application instance or start a new instance with the debugger already attached.
- Command-line debugging
- “Remote” debugging with VS Code
- Attach to existing app instance
- Start app with debugger
- Debug an issue in Node.js code, correcting the code and verifying the fix
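For reference, attaching a debugger typically means exposing the Node.js inspector port from the container, roughly like this (the port and image name are assumptions):

```shell
# Start the app with the inspector listening on all interfaces,
# and publish the port so VS Code on the host can attach to it
docker run -it --rm \
  -p 9229:9229 \
  my-node-app \
  node --inspect=0.0.0.0:9229 index.js
```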
Build Tools in a Container
Debugging isn’t the only development process you’ll run in a container. Learn how to run your build tools, test runners, and other tooling inside the same container as your application.
- Gulp / Grunt
- New shell, same container
- Jasmine unit tests
- Babel.js and Browserify
Multiple Containers, One Command
Manually running individual containers becomes tedious quickly – especially as an application grows and begins to use multiple services. Learn how to use docker-compose to automate an entire application stack with a single command.
- Creating a docker-compose.yml file
- Start, stop and tear down docker-compose containers
- Network MongoDB and Node.js app
- Run a complete application stack – front-end and back-end – in docker-compose
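A docker-compose.yml for a stack like this generally looks as follows (the service names, ports, and images are illustrative, not the course’s exact files):

```yaml
version: "3"
services:
  app:
    build: .
    ports:
      - "3000:3000"
    depends_on:
      - mongo
  mongo:
    image: mongo
    volumes:
      - mongo-data:/data/db
volumes:
  mongo-data:
```

With a file like this in place, `docker-compose up` starts the whole stack and `docker-compose down` tears it back down.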
About The Instructor
My name is Derick Bailey.
I’ve spent a lifetime (more than 20 years) building software for leading financial organizations and airlines, the U.S. government, and more. I’ve created critical systems for large healthcare organizations, saving them millions of dollars a year. I’ve helped small-town, independently owned businesses improve their bottom line with simple software solutions and complex process automation.
And I’ve helped countless developers: through sites like Stack Overflow, by writing sample code for Microsoft’s MSDN library, and by publishing dozens of leading open source projects, including Marionette.js, the most widely used add-on for Backbone.js.
Now, I want to help you and your team get up to speed with a technology that is changing the very nature of software development, deployment and production management.
Docker is the future of software development, and I’m here to help you turn its potential into productivity.
– Derick Bailey
– Creator of Marionette.js
– Owner of WatchMeCode.net