Virtualization – How We Got Here and Does It Have a Future?
August 14, 2019
In the more than 20 years I've spent in the IT industry, I've seen many changes, but nothing has had a bigger impact than virtualization.
I remember sitting in a classroom back in the early 2000s as a server vendor introduced VMware. They explained how it let us segment underused resources on Intel servers into something called a "virtual machine" and then run separate environments in parallel on the same piece of hardware—technology witchcraft.
This led to a seismic shift in the way we built our server infrastructure. At the time, our data centers and computer rooms were full of individual, often over-specified and underutilized servers, each running a single operating system and application, all of them consuming space, power, and cooling, and costing a small fortune. Add in deployments that took months, and projects were slow, innovation suffered, and responses to business demands dragged.
Virtualization revolutionized this, allowing us to shrink our server estates and lower the cost of data centers and infrastructure. It also let us deploy new applications and services more quickly, better meeting the needs of the enterprise.
Today, virtualization is the de facto standard for how we deploy server infrastructure. It's hard to believe it was once an odd, cutting-edge concept. And while it's the standard deployment model, the reasons we virtualize have changed over the last 20 years. It's no longer about resource consolidation—it's more about simplicity, efficiency, and convenience.
But virtualization (especially server-based) is also a mature technology, designed for Intel servers running Windows and Linux inside the sacred walls of our data center. While it has served us well in making our data centers more efficient and flexible, reducing cost and ecological impact, does it still have a part to play in a world rapidly moving away from these traditional ways of working?
Over the next few weeks, we'll explore virtualization: where we are today, the problems it has created, where it's heading, and whether it remains relevant in our rapidly changing technology world.
In this series, we’ll discuss:
- The Problems of Virtualization – Where we are today and the problems 20 years of virtualization have caused, such as management, control, and VM sprawl.
- Looking Beyond Server Virtualization – When we use the word virtualization, our thoughts immediately turn to Intel servers, hypervisors, and virtual machines. But the future power of virtualization lies in changing the definition. If we think of it as abstracting software's dependency on specific hardware, it opens a range of new opportunities.
- Virtualization and the Drive to Infrastructure as Code – How the shift to a more software-defined world is going to cement the need for virtualization. Environments reliant on engineered systems are inflexible and slow to deploy, and they're going to become less useful and less prevalent. We need to be able to define our architecture in new ways and deliver it rapidly, consistently, and securely (the short sketch after this list illustrates the idea).
- The Virtual Future – As we desire increasing agility, flexibility, and portability across our infrastructure and need more software-defined environments, automation, and integration with public cloud, virtualization (although maybe not in the traditional way we think of it) is going to play a core part in our futures. The more our infrastructure is software, the more able we'll be to deliver the future so many enterprises demand.
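To make "infrastructure as code" concrete before we get there, here's a minimal, hypothetical sketch in Python: the environment is described as plain data, and a stubbed `provision()` function stands in for the real tooling (Terraform, Pulumi, a hypervisor API) that would reconcile that description with reality. `VmSpec`, `provision()`, and the image name are all illustrative assumptions, not any real tool's API.

```python
# A minimal sketch of the infrastructure-as-code idea: describe the
# desired environment as data, then let a provisioner make it real.
# Everything here is illustrative; no real tool's API is used.
from dataclasses import dataclass


@dataclass(frozen=True)
class VmSpec:
    """Declarative description of a virtual machine we want to exist."""
    name: str
    cpus: int
    memory_gb: int
    image: str  # golden OS image to clone from


def provision(specs: list[VmSpec]) -> None:
    """Stand-in for real tooling that reconciles description with reality.

    A real provisioner would create, resize, or leave alone each VM;
    here we only print the desired state.
    """
    for spec in specs:
        print(f"ensuring {spec.name}: {spec.cpus} vCPU, "
              f"{spec.memory_gb} GB RAM, image={spec.image}")


if __name__ == "__main__":
    # The whole web tier is just data: versionable, reviewable, repeatable.
    web_tier = [
        VmSpec(name=f"web-{i}", cpus=2, memory_gb=4, image="ubuntu-22.04")
        for i in range(3)
    ]
    provision(web_tier)
```

The point isn't the specific code; it's that the same description yields the same environment every time it's applied, which is exactly the rapid, consistent, repeatable delivery the series will dig into.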