When agile appeared, it solved (some of) the problems we were facing at the time. It overturned the idea that months-long iterations were the way to go; we learned that delivering often provides numerous benefits. It taught us to organize teams around all the skills required to deliver an iteration, as opposed to horizontal departments organized around technical expertise (developers, testers, managers, and so on). It taught us that automated testing and continuous integration are the best way to move fast and deliver often, and it gave us test-driven development, pair programming, daily stand-ups, and more. A lot has changed since the waterfall days.
And yet, while agile changed the way we develop software, it failed to change how we deliver it.
Now we know that what we learned through agile is not enough. The problems we face today are not the same as those we faced back then, and so the DevOps movement emerged. It taught us that operations are as important as any other skill, and that teams need to be able not only to develop software but also to deploy it. And by deploy, I mean reliably deploy, often, at scale, and without downtime. In today's fast-paced industry operating at scale, operations require development and development requires operations. DevOps is, in a way, the continuation of agile principles that, this time, brings operations into the mix.
What is DevOps? It is a cross-disciplinary community of practice dedicated to the study of building, evolving, and operating rapidly changing, resilient systems at scale. It is as much a cultural as a technological change in the way we deliver software, from requirements all the way to production.
Let's explore the technological changes introduced by DevOps that, later on, evolved into DevOps 2.0.
By adding operations to existing (agile) practices and teams, DevOps united previously siloed parts of organizations and taught us that most (if not all) of what we do after committing code to a repository can be automated. However, it failed to introduce a real technological change. With it, we got more or less the same as we had before, only automated. Software architecture stayed the same, but we were able to deliver automatically. Tools remained the same, but we used them to their fullest. Processes stayed the same, but with less human involvement.
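To make "everything after the commit can be automated" concrete, the whole path from commit to production can be reduced to a scripted sequence. A minimal, hypothetical sketch follows; the step names and echoed messages are placeholders, where a real pipeline would invoke your build tool, test runner, and deployment tooling:

```shell
#!/bin/sh
# Hypothetical post-commit pipeline: each phase is a command, so the whole
# sequence runs without human involvement. Abort immediately if any step fails.
set -e

build()     { echo "build: compiling and packaging"; }   # e.g. docker build ...
run_tests() { echo "test: running automated suite"; }    # e.g. unit + integration tests
deploy()    { echo "deploy: releasing to production"; }  # e.g. rolling update

build
run_tests
deploy
```

Because `set -e` stops the script on the first failing step, a broken build or a failed test suite never reaches the deploy phase, which is the essential property of any commit-triggered pipeline.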
DevOps 2.0 is a reset. It tries to redefine (almost) everything we do and to deliver the benefits that modern tools and processes offer. It introduces changes to processes, tools, and architecture, and it enables continuous deployment at scale and self-healing systems.
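The self-healing idea boils down to a loop that probes each service and replaces instances that fail the probe. A minimal, hypothetical shell sketch of that decision (the probe command and the recovery action are illustrative placeholders, not any specific tool's API):

```shell
#!/bin/sh
# Self-healing in miniature: run a health probe and, on failure, trigger a
# recovery action. In a real cluster the probe would be an HTTP health check
# and the recovery a restart/reschedule of the container; here both are
# stand-ins so the logic is visible.
heal() {
  if "$@" >/dev/null 2>&1; then
    echo "healthy"               # probe passed; nothing to do
  else
    echo "restarting instance"   # probe failed; replace the instance
  fi
}

heal true    # simulate a probe that succeeds
heal false   # simulate a probe that fails
```

A real system would run this check periodically and cluster-wide, but the core contract is the same: detect failure, then recover without waking anyone up.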
In this blog series, I'll focus on tools, which, consequently, influence processes and architecture. Or is it the other way around? It's hard to say; most likely, each has an equal impact on the others. Nevertheless, today's focus is tools. Stay tuned.
The DevOps 2.0 Toolkit
If you liked this article, you might be interested in the book The DevOps 2.0 Toolkit: Automating the Continuous Deployment Pipeline with Containerized Microservices.
The book is about techniques that help us architect software in a better and more efficient way: microservices packed as immutable containers, tested and deployed continuously to servers that are automatically provisioned with configuration management tools. It's about fast, reliable, and continuous deployments with zero downtime and the ability to roll back. It's about scaling to any number of servers, designing self-healing systems capable of recovering from both hardware and software failures, and about centralized logging and monitoring of the cluster.
In other words, the book encompasses the full microservices development and deployment lifecycle using some of the latest and greatest practices and tools: Docker, Ansible, Ubuntu, Docker Swarm, Docker Compose, Consul, etcd, Registrator, confd, Jenkins, nginx, and so on. We'll go through many practices and even more tools.
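To give a flavor of what "packed as immutable containers" means in practice, here is a minimal, hypothetical Dockerfile; the base image, paths, and start command are illustrative, not taken from the book:

```dockerfile
# Everything the service needs is baked into the image at build time, so the
# exact same artifact runs in testing, staging, and production.
FROM ubuntu:14.04
COPY ./app /opt/app
CMD ["/opt/app/run.sh"]
```

Because the image is never modified after it is built, deploying a new release means starting a new image, and rolling back means starting the previous one.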