As enterprise applications changed over time to serve different business needs and computer systems evolved from the monolithic architecture of mainframes into client-server and distributed systems, mechanisms for communications between different tiers of computer systems also had to develop.
As a longtime data professional, I’ve seen these evolutions take place throughout my career. Early on, I saw the XML revolution take place with service-oriented architecture (SOA), especially in enterprise IT organizations. Protocols such as the Distributed Component Object Model (DCOM) and Common Object Request Broker Architecture (CORBA) led to the concept of SOA, which many point to as the precursor of the modern microservices architecture. Another difference I’ve seen with my clients is that microservices are more widespread among organizations of different types and sizes than SOA or the Simple Object Access Protocol (SOAP) ever were.
SOA then evolved into an asynchronous messaging model using publishers and subscribers to exchange data through queuing systems, which allowed systems to scale. Microservices architectures build on top of this model, with one key difference: RESTful APIs handle nearly all communications. RESTful APIs attempt to reduce the complexity of having computers talk to each other. Additionally, a standard communications protocol lets developers build interfaces more easily, and those interfaces become a core part of software deployments.
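To make the RESTful model concrete, here is a minimal sketch in Python using only the standard library: a toy “inventory” service exposes one resource as JSON over HTTP, and a consumer fetches it with a plain GET request. The service name, route, and JSON shape are illustrative assumptions, not part of any real system.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# A hypothetical "inventory" microservice exposing a single RESTful resource.
class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/items/42":
            body = json.dumps({"id": 42, "name": "widget", "in_stock": True}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

# Bind to an ephemeral port and serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), InventoryHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any other service, written in any language, can consume the resource
# over plain HTTP -- this is the standard protocol REST provides.
with urlopen(f"http://127.0.0.1:{port}/items/42") as resp:
    item = json.loads(resp.read())

server.shutdown()
```

Because the contract is just HTTP and JSON, the consumer doesn’t need to know what language or framework the inventory service is built with, which is exactly what lowers the cost of having services talk to each other.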
As microservices architectures grew in popularity, both infrastructure and development frameworks grew to support the model. On the infrastructure side, Kubernetes, a cloud-native container orchestration platform, provides core capabilities such as software-defined networking (most notably load balancing) and desired state management. Public cloud offerings from Microsoft, Amazon, and Google have also enabled microservices development by allowing organizations of any size to take advantage of cost-effective design patterns, such as
serverless functions and software-defined routing. Because each microservice can be developed and built independently, organizations can also embrace DevOps to enable continuous integration and
continuous delivery (CI/CD).
Microservices Example
Microservices break down applications into their core functionality, and each of these functions is called a
service. Each service is built and deployed independently, which allows services to fail without completely taking down other related services.
I often think about when I use an online store. When I search for a product, the search function is its own service. When I place items into a shopping cart, it’s another service. If there’s a recommendation engine, this is yet another service offered to me as the customer, running with a graph database for its data store. The microservices design also allows increased scalability for an application.
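The fault isolation described above can be sketched in a few lines: if the recommendation service is down, the storefront should render without the recommendations panel rather than fail entirely. The URL and function name below are hypothetical, chosen only to illustrate the pattern.

```python
import json
from urllib.error import URLError
from urllib.request import urlopen

# Hypothetical internal endpoint for a separate recommendation microservice.
RECS_URL = "http://recs.internal:8080/recommendations/123"

def fetch_recommendations(url: str, timeout: float = 0.5) -> list:
    """Call the recommendation service, degrading gracefully if it's down.

    A failure here should not take down the storefront: the page simply
    renders without the "you may also like" panel.
    """
    try:
        with urlopen(url, timeout=timeout) as resp:
            return json.loads(resp.read())
    except (URLError, OSError, ValueError):
        return []  # fall back to "no recommendations" instead of an error page

recs = fetch_recommendations(RECS_URL)
```

The search and shopping-cart services would make the same kind of call with their own fallbacks, so a single failing dependency degrades one feature instead of the whole application.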
Microservices Architecture Framework vs. Monolithic vs. SOA
As mentioned earlier, microservices aren’t fundamentally new. For various reasons—including performance, costs, and flexibility—organizations moved from monolithic applications to a model built around web services. Each service in an SOA was organized around a specific business process, just as microservices are. SOA differed from microservices in that groups of applications communicated through an enterprise service bus, which adhered to a specific communications protocol.
Although SOA grew in acceptance, it was hampered by its complexity. In my experience working on SOA systems, the complexity made applications hard to build and nearly impossible to debug. Still, SOA had a lasting impact: it pushed applications to become more discrete and application development teams to become smaller and more focused on specific business functions.
The Role of Containers in Microservices
Container infrastructure, such as that orchestrated by Kubernetes, has made microservices much more viable. If you think back to physical servers, you had to have a physical server in your data center for each service. This model was never scalable as application complexity grew, and it gave way to virtualization.
Virtualization enables software-defined infrastructure, but the resulting virtual machines (VMs) are still heavy—each computing environment must contain an entire copy of the guest operating system. This means each virtual machine is quite large in terms of storage and takes considerable time and network bandwidth to move.
Therefore, developers largely moved to containers, a model companies such as Netflix helped popularize. The modern container movement started with Docker and then moved to Kubernetes. Docker is a container runtime designed to allow a workload to share a kernel with the host operating system, which means you need to place only your application code and any library dependencies in a container. Like open-source modules, these container images are published to either public or private registries.
One of the interesting benefits of Docker is that an individual developer can have the whole stack of the application on their laptop, allowing them to develop on their machine, promote to testing environments in the cloud, and ultimately graduate into production. Since the software stack is the same in all those places, you should rarely run into the “It worked on my machine!” problem frequently seen in the past when problems turned up in production. The ease of deploying infrastructure on public clouds, such as Azure and AWS, further allows small teams to focus on writing code rather than managing hardware.
Key Advantages of Using Microservices Architecture
Microservices provide many benefits to organizations. Though the application architecture mainly focuses on technical outcomes, the software development process allows more development to be done concurrently. The loose coupling of individual microservices also allows a different development cycle for each service, meaning upgrades and new features can be deployed more quickly.
Beyond the development benefits, the infrastructure supporting microservices is more inherently fault-tolerant than older systems. In older hardware architectures, to make systems highly available, you had to manage clustering, load balancing, and—in some cases—integration with back-end storage. This architectural style introduced a great deal of complexity to both applications and infrastructure. With modern cloud-native systems like Kubernetes,
high availability is built into the platform. If your container (or Kubernetes Pod) crashes, it’s redeployed onto another node to ensure the application stays online.
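The desired-state idea behind that self-healing can be sketched as a simple reconciliation loop: compare what should be running with what is, then compute the actions needed to converge. This is a deliberately simplified model of what a Kubernetes controller does, not the real Kubernetes API; the data shapes here are illustrative assumptions.

```python
# A simplified sketch of desired-state reconciliation, the idea behind
# Kubernetes controllers: compare what should exist with what does, then act.
def reconcile(desired_replicas: int, running: list) -> list:
    """Return the actions needed to converge the running set on the desired count."""
    actions = []
    if len(running) < desired_replicas:
        # Too few instances (e.g., a Pod crashed): schedule replacements.
        for i in range(desired_replicas - len(running)):
            actions.append(("start", f"pod-{len(running) + i}"))
    elif len(running) > desired_replicas:
        # Too many instances (e.g., after scaling down): stop the extras.
        for pod in running[desired_replicas:]:
            actions.append(("stop", pod))
    return actions

# One Pod has crashed: 3 desired, 2 running -> the controller starts a replacement.
actions = reconcile(3, ["pod-0", "pod-1"])
```

The real platform runs this compare-and-act loop continuously, which is why a crashed container comes back on another node without an operator intervening.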
Why DevOps Benefits From Microservices
DevOps and microservices have evolved together, driven by consumer demand for various internet services, changing user requirements, and the ever-evolving security threat landscape. The way DevOps benefits from microservices starts with the approach organizations take to development. Typically, each service has its own dedicated DevOps team working with a reduced surface area, which lets the team become more attuned to fixing and improving its service and managing its application stack.
The automation built into Kubernetes and other container platforms can also allow automated testing, packaging, and deployment for each service. This automated testing approach can provide shorter feedback loops, improving the time required to fix bugs in the codebase. Deploying new releases is as simple as pushing the new build into the container registry and updating the production Pod definition to refer to the container image.
As DevOps continues to evolve, performance monitoring tools should also grow alongside these efforts. Learn how enterprises can gain end-to-end IT operations visibility across their on-premises and cloud instances with SolarWinds Observability Self-Hosted (formerly known as Hybrid Cloud Observability). With flexible, node-based licensing, it offers total cost of ownership advantages through a comprehensive full-stack solution.