The Virtualization Imperative: Document Transcript

November 8, 2007
The Virtualization Imperative
by Galen Schreck
for IT Infrastructure & Operations Professionals
Making Leaders Successful Every Day
For IT Infrastructure & Operations Professionals
November 8, 2007

The Virtualization Imperative
Why Virtualization Is The Key To IT Flexibility
This is the first document in the “Virtualization Imperative” series.

by Galen Schreck
with Natalie Lambert, Robert Whiteley, Andrew Reichman, Simon Yates, and Rachel A. Dines

EXECUTIVE SUMMARY

Virtualization has wide-ranging effects throughout your IT infrastructure. Even though virtualization has been making technology easier to use since the earliest days of computing, it has become even trendier in the past few years. Why the sudden attention? Recent innovations in server and client virtualization are producing more direct business value than prior innovations in system architecture or programming languages, which were largely confined to IT. This report examines the progress being made in virtualized infrastructure and how it is the key to building a more change-ready and agile foundation for your business.

TABLE OF CONTENTS

Virtualization Is Based On An Old Idea
There Are Four Key Areas Of Virtualization Within IT Infrastructure
Server Virtualization Drives Consolidation And Flexibility
Storage Virtualization Has Traveled A Rocky Road
Network Virtualization Creates Dynamic Connectivity And Reusable Services
Client Virtualization Is Changing Application Delivery
RECOMMENDATIONS: Virtualization Is A Process, Not A Technology Fad
WHAT IT MEANS: Virtualization Will Keep Moving To Higher Levels

NOTES & RESOURCES

For this report, Forrester spoke with industry experts, technology vendors, and end user companies.

Related Research Documents
“Large Enterprises Lead In Demand For Storage Virtualization,” January 17, 2007
“Storage Virtualization Emerges . . . Slowly,” January 17, 2007
“Decoding Virtualization’s Present And Future,” January 9, 2007
“Organic IT Challenges IT Organizational Practices,” August 18, 2005
“Organic IT 2004: Cut IT Costs, Speed Up Business,” May 18, 2004

© 2007, Forrester Research, Inc. All rights reserved. Forrester, Forrester Wave, RoleView, Technographics, and Total Economic Impact are trademarks of Forrester Research, Inc. All other trademarks are the property of their respective companies. Forrester clients may make one attributed copy or slide of each figure contained herein. Additional reproduction is strictly prohibited. For additional reproduction rights and usage information, go to www.forrester.com. Information is based on best available resources. Opinions reflect judgment at the time and are subject to change. To purchase reprints of this document, please email resourcecenter@forrester.com.
VIRTUALIZATION IS BASED ON AN OLD IDEA

Despite its appearance of rising to stardom almost overnight, virtualization in fact dates back to the earliest days of computing and is really an ongoing phenomenon with an evolving role in corporate IT infrastructure. Through virtualization, we are now able to build ever more complex systems and applications with a minimum of new effort. At the core of virtualization is the concept of abstraction, which has been a key part of hundreds of critical innovations, such as modern programming languages like C++ and Java and our old friend, the Domain Name System (DNS). How is this possible? Abstraction allows us to refer to objects like memory locations, servers, or disk drives in generic terms. By freeing ourselves from hard-wired associations, we make our systems more receptive to change. For example:

· Higher-level languages make programmers more productive. Instead of issuing low-level machine-readable instructions in assembly language, programmers can use higher-level commands to easily build complicated applications. Performing even simple mathematical calculations requires multiple CPU instructions — never mind the number required to open a window in a modern operating system. Yet programmers can perform actions like this with a handful of abstracted commands. Behind each abstracted command are layers of underlying code that handle the machine-specific details of opening a window. Sure, these layers of code are less than perfectly efficient, but modern CPUs are cheap compared with the cost of developing an application in machine code.

· DNS makes servers interchangeable. In addition to replacing hard-to-remember IP addresses with people-friendly names, DNS serves an even more important function. In the event of server failure, scheduled maintenance, or system migration, administrators can point the DNS name of the original server to another machine. If we hard-coded IP addresses into our applications, maintaining these relationships would soon become impossible: instead of changing one entry in the DNS database whenever a server moved to another address, programmers would have to find every reference to its IP address and update it.

Virtualization builds on the basic element of abstraction by using it to build complex systems. This has led to products like virtual servers, disks, and firewalls. They act just like their physical counterparts, except that they exist only in software — and because they exist in software, they can be created at the click of a button. The applications that use them can’t tell the difference between the physical and the virtual, because they provide all the same services and respond just like the real thing.
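The DNS argument above can be illustrated with a toy name registry: callers resolve a stable name at connect time instead of hard-coding an address, so repointing one entry repairs every caller at once. This is a minimal sketch of the indirection idea only, not how real DNS resolvers or zone files work; all names and addresses here are hypothetical.

```python
# Toy name registry illustrating DNS-style indirection.
registry = {"db.example.com": "10.0.0.5"}

def connect(name):
    """Resolve the name at call time and 'connect' to whatever it points at."""
    return f"connected to {registry[name]}"

# Every caller uses the name, never the raw address.
assert connect("db.example.com") == "connected to 10.0.0.5"

# The server moves: one registry update fixes every caller at once.
registry["db.example.com"] = "10.0.0.9"
assert connect("db.example.com") == "connected to 10.0.0.9"
```

Had callers stored `"10.0.0.5"` directly, every one of them would need to be found and edited after the move — exactly the maintenance burden the abstraction removes.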
IT Needs Virtualization Because It Makes Systems More Change-Ready

Forrester’s research on virtualization has spanned multiple technologies, addressed the effects of virtualization and strategies for implementation, and positioned virtualization as a key technology in Organic IT — Forrester’s long-term vision for data center architecture.1 We also predicted that virtualization would be a key element of that scale-out data center’s command-and-control brain, the fabric operating system.2 Obviously, a lot has happened in the intervening years, and while the underlying technologies and suppliers may have changed, several constants remain:

· Commoditization. Virtualization enables the shift from expensive, proprietary hardware solutions toward commodity hardware by aggregating smaller or less reliable systems together. For example, cheaper and less reliable disks may be combined into one virtual disk that features better performance and reliability than any single component. Furthermore, virtualization tends to make hardware more generic at the component level, which in this example means that proprietary drive-level features cease to matter. Sure enough, the disks themselves have become a commodity, and product differentiation occurs at the storage array or software level of the stack.

· Simplification. Virtualization makes it easier to build complex systems. The layer of abstraction in the DNS example compartmentalizes the impact of IP address changes. By compartmentalizing complexity instead of passing it up the infrastructure stack, we are able to reuse entire modules without changing them. In the past, a module might have been a subroutine or Java applet; today a module can be an entire virtual server and all the applications that run on it.

· Automation. The problem with automation has always been that it is notoriously fragile. For example, an administrator writes a script to automate a series of tasks, but as soon as something changes — even slightly — the script no longer works correctly. There was simply no point in automating too much, because the complexity of maintaining hard-wired automation would outweigh the benefits. But as virtualization simplifies and modularizes more of our infrastructure, more advanced automation becomes possible.

· Flexibility. Prior to server virtualization, it was impossible to take a snapshot of one server and redeploy it on a different machine for disaster recovery purposes — the image probably wouldn’t even boot until administrators adjusted the operating system configuration and device drivers. Today, however, the abstraction created by server virtualization products allows us to relocate a virtual machine (VM) with a few clicks on a screen.
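The flexibility point above — that a VM is just software state, so "moving" it is copying that state — can be sketched with a toy model. The classes and fields here are entirely hypothetical; real hypervisors such as VMware ESX or Xen expose far richer management APIs.

```python
# Toy model: a VM is data (its state), so relocation is copying that state
# to another host. Purely illustrative; not any hypervisor's real API.
import copy

class Host:
    def __init__(self, name):
        self.name, self.vms = name, {}

def snapshot(host, vm_id):
    """Capture the VM's full state as a portable image."""
    return copy.deepcopy(host.vms[vm_id])

def restore(host, vm_id, image):
    """Bring the captured image up on another host, unmodified."""
    host.vms[vm_id] = image

a, b = Host("host-a"), Host("host-b")
a.vms["web01"] = {"os": "linux", "memory_mb": 512, "apps": ["httpd"]}

# Relocate web01 from host A to host B with no reconfiguration.
restore(b, "web01", snapshot(a, "web01"))
del a.vms["web01"]
assert "web01" in b.vms and b.vms["web01"]["apps"] == ["httpd"]
```

On physical hardware the "image" would be entangled with device drivers and firmware; the hypervisor's abstraction is what makes the state portable in the first place.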
THERE ARE FOUR KEY AREAS OF VIRTUALIZATION WITHIN IT INFRASTRUCTURE

Virtualization is taking place within four key IT domains. But it is also maturing at four different rates, because each domain is managed separately, and virtualization is implemented in a different manner and for different reasons in each. In fact, most domains have more than one flavor of virtualization — some of which are complementary. The key IT infrastructure domains being virtualized are:

· Servers. Firms are using server virtualization to — among other things — consolidate infrastructure and application software onto fewer, but increasingly more powerful, machines.

· Storage. Virtualized storage allows you to create a dynamic pool of storage capacity regardless of the physical location of the disk or the type of media used.

· Networks. Virtualization allows network administrators to operate multiple devices in the network as a single asset and manage the use of network resources more efficiently.

· End user clients. Virtualization on the client focuses primarily on simplifying client software management and cordoning off sensitive applications for security purposes.

But virtualization is certainly not limited to IT infrastructure. Many other forms of virtualization have taken place higher up in the IT stack, but they are beyond the scope of this report. They include business service management (BSM), software-as-a-service (SaaS), and service-oriented architecture (SOA).

SERVER VIRTUALIZATION DRIVES CONSOLIDATION AND FLEXIBILITY

Server virtualization is one of the best-known forms of virtualization and one that has been experiencing rapid adoption over the past several years — mostly on x86 systems.3 Server virtualization products all share one common attribute: They allow one physical server to appear as multiple machines that exist only in software. These VMs are created by a layer of software called a hypervisor, which is available from a number of sources including VMware and XenSource (recently acquired by Citrix). What are the basic things to know about server virtualization?

· Hypervisors are not a new concept. Mainframes and proprietary Unix-based systems from vendors like HP, IBM, and Sun have long had the ability to partition a physical machine into multiple virtual ones. What is new, however, is the ability to partition a commodity Intel- or AMD-based machine the same way. The low utilization of x86 environments, their widespread proliferation, and their commodity economics are the driving factors behind the rapid adoption of server virtualization.
· Most x86 server virtualization tools come from VMware today. There are competing x86 server virtualization tools from companies like Microsoft, SWsoft, Virtual Iron, and XenSource, but VMware powers nearly all of the production x86 virtualization environments today, thanks to its product maturity and high performance. The next version of Windows Server, for example, will eventually have a built-in hypervisor.4

· Server virtualization is not for every app. Manufacturers of mainframes and proprietary Unix systems evolved their hardware to run VMs with no performance overhead, but x86 servers are just catching up. As a result, virtualization can impose some overhead on applications — making it important to size and test systems correctly. Some highly utilized applications, like databases, may be better off left on dedicated hardware. Others, like email servers, can be virtualized but require extra server capacity to function correctly.

What Does Server Virtualization Do For IT?

Many early adopters of x86 server virtualization began using it for test and development systems. Firms soon found that it let them survive with fewer systems, and it simplified setting up test environments from a library of stored images. While virtualization is still useful for consolidating underutilized servers, the technology has improved and now delivers:

· Flexible workload management. Virtual servers can easily be relocated to other physical machines when workloads spike, and some products allow administrators to move applications without even disconnecting users. By automating live migration and setting thresholds for when it should happen, administrators can protect application response time without over-provisioning hardware.

· Basic high-availability and disaster recovery services. The ability to copy a VM to a different server and run it with no modifications enables VMs to provide better availability than commodity servers alone.5 Granted, VMs are not as fault-tolerant as highly redundant — and expensive — servers with advanced replication tools. The difference is that VMs now bring 80% of these same benefits to inexpensive x86 systems, where typically you would have no high availability at all.

· Rapid server migration and consolidation. Typical x86 servers are extremely underutilized — utilization is frequently in the single digits. These excess servers take up data center space, power, and cooling capacity. Consolidation has been a painful process in the past because running apps on the same server involved application compatibility issues, huge amounts of planning, and disruptions to application availability. VMs are extremely popular because they allow each application to run in its own isolated instance of the operating system (OS), with different configurations if necessary. Disruptions are minimal because the snapshot and migration process can be automated by tools from Microsoft, PlateSpin, and VMware.
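The threshold-driven live migration described under flexible workload management can be sketched as a simple policy function: when a host's utilization crosses a limit, pick the least loaded host as the migration target. The function, host names, and 80% threshold are all hypothetical illustrations, not any vendor's actual algorithm.

```python
# Hedged sketch of threshold-triggered live migration placement.
def pick_migration(hosts, threshold=0.80):
    """Return (overloaded_host, target_host), or None if no action is needed.

    hosts: mapping of host name -> CPU utilization in [0, 1].
    """
    hot = [h for h, load in hosts.items() if load > threshold]
    if not hot:
        return None                              # everyone under threshold
    source = max(hot, key=lambda h: hosts[h])    # most overloaded host
    target = min(hosts, key=lambda h: hosts[h])  # least loaded host
    return (source, target) if target != source else None

loads = {"esx-1": 0.92, "esx-2": 0.35, "esx-3": 0.60}
assert pick_migration(loads) == ("esx-1", "esx-2")
assert pick_migration({"esx-1": 0.50, "esx-2": 0.40}) is None
```

A production scheduler would also weigh memory pressure, affinity rules, and migration cost, but the core idea — act only when a measured load crosses a configured threshold — is the same.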
What’s Next For Server Virtualization?

Forrester expects virtual machines to be the default platform for most new x86 servers in the future. As hardware improvements bring VM performance nearer to that of bare metal, you will be able to steadily increase the percentage of applications that run virtualized. We already see early evidence of this trend:

· Hypervisors will soon become a commodity. Already, companies like Microsoft and VMware are beginning to focus more on management tools, applications, and automation for the underlying virtualization engine. We expect that even high-end hypervisors will be free or nearly free, as in the case of Windows Server and Xen. Instead, customers will pay for increased capabilities like replication, automation, HA/DR, and other advanced features.

· The size of the hypervisor has shrunk dramatically. VMware, for example, has unveiled a 32 MB hypervisor. So what? Now that VMware has stripped virtualization to its bare essentials, the hypervisor can easily be included on new servers and booted from firmware. With virtualization resident in the server at the time of purchase, usage will become nearly ubiquitous. Similarly, with the next version of Windows Server including a hypervisor, we expect that most x86 servers will be virtualized just two to three years from now.

STORAGE VIRTUALIZATION HAS TRAVELED A ROCKY ROAD

Advanced storage virtualization has been available in various forms for years, but it still suffers from very low adoption due to performance concerns and the disruption of installing it. The most basic form of storage virtualization is the redundant array of inexpensive disks (RAID), which is at the core of just about every disk-based storage system. But after the creation of RAID, virtualization technology split into many different flavors and architectures, which continue to struggle for dominance. As a result:

· Storage virtualization tools have several typical architectures. Today, most storage virtualization products can be classified according to where they reside — on the array controller, in a network switch, on the host, or in an appliance. Where the virtualization lives matters because it determines certain architectural details that are beyond the scope of this report.6 Controller-based virtualization has slightly higher adoption because it is integrated into the vendor’s disk array and is therefore 100% supported by a single vendor. Other forms of virtualization are add-on products that should be qualified by each storage vendor you connect to them. But even in qualified heterogeneous environments, vendor finger-pointing can result.

· Despite different architectures, storage virtualization tools have similar capabilities. Unlike server virtualization, which allows one server to appear as many, storage virtualization allows many arrays to appear as one. In other words, administrators can compose a virtual volume that is a composite of multiple physical arrays. The capability to span physical arrays was the original selling point for storage virtualization because it allows you to more fully utilize array capacity, and it can also simplify migration to new equipment.

· There are plenty of storage virtualization vendors — but few customers. Major storage suppliers like EMC, Hitachi Data Systems, IBM, Network Appliance, and Sun all offer some flavor of storage virtualization. Furthermore, a new crop of array hardware vendors like 3PAR and Compellent offer their own integrated versions of virtualization. Still more software and appliance vendors like DataCore, FalconStor, and LSI offer add-on storage virtualization solutions that bring virtualization to existing storage. Despite storage virtualization’s long track record, customer adoption remains low — just 22% of the large US businesses we’ve surveyed.7

What Does Storage Virtualization Do For IT?

Although customer adoption is low, many firms have realized that storage virtualization delivers real benefits to IT. The business case can be complicated by firms’ existing investments and the risk that comes with introducing a new technology. The abstraction provided by storage virtualization reduces the cost of IT by providing:

· Simplified management. Storage virtualization consolidates day-to-day management tasks and utilization information into one console. Without it, when provisioning storage or extending an existing volume, administrators typically have to visit the proprietary management tool provided by the appropriate array vendor. Keeping track of all the storage and switching between consoles can consume a lot of an administrator’s time, because most firms have multiple storage vendors and multiple classes of storage.

· Improved utilization.
In addition to creating virtual volumes that can span physical arrays, many storage virtualization products have a capability called thin provisioning.8 In short, thin provisioning allocates physical storage only when data is actually written to it. This eliminates the kind of waste that results when application owners request more storage than they actually need, preventing others from using it in the meantime. It also allows you to oversubscribe the storage infrastructure, purchasing storage only when arrays are truly full.

· Painless migration. Storage virtualization is sometimes used for its migration capabilities alone. By providing a layer of abstraction between applications and the storage resources they access, administrators can transition users off older arrays while their applications are still up and running.

· High-end features often found only on expensive arrays. Many storage virtualization products — many of them third-party appliances — have high-end features like replication and mirroring that can be applied to cheaper arrays that lack them. This threatened to move more valuable features off the disk array, which is a big reason why storage virtualization was not immediately embraced by major storage suppliers.
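The thin-provisioning behavior described above can be sketched as a volume that advertises its full provisioned size while backing blocks with physical storage only on first write. This is a hypothetical model for illustration, not any vendor's implementation.

```python
# Minimal sketch of thin provisioning: capacity is promised up front,
# physical blocks are allocated lazily on first write.
class ThinVolume:
    def __init__(self, provisioned_blocks):
        self.provisioned = provisioned_blocks  # size the volume advertises
        self.blocks = {}                       # physical blocks, lazy

    def write(self, block_no, data):
        if block_no >= self.provisioned:
            raise IndexError("write beyond provisioned size")
        self.blocks[block_no] = data           # allocate on first write only

    def physical_used(self):
        return len(self.blocks)

vol = ThinVolume(provisioned_blocks=1_000_000)  # looks like a huge volume
vol.write(0, b"superblock")
vol.write(42, b"data")
assert vol.physical_used() == 2                 # only written blocks are backed
```

The gap between `provisioned` and `physical_used()` is exactly the oversubscription the report describes: many such volumes can share one array, and the array is grown only as real writes accumulate.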
What’s Next For Storage Virtualization?

For years, customers have had a mixed reaction to storage virtualization products. Early on, many firms were not interested in purchasing critical infrastructure components from startups, but growth remained slow even after trusted storage vendors marketed their own products. Forrester believes that:

· Storage virtualization itself is not the application. On its own, storage virtualization still doesn’t have mass market appeal. Storage virtualization products still require custom tailoring to ensure performance, scalability, and accurate capacity and performance management. Forrester believes that most storage virtualization as we know it today will slowly be superseded by products that use virtualization to deliver their application but may not even feature it as their primary capability.

· Application-based storage virtualization will increase. Applications like archiving, unstructured content discovery, and databases can all benefit from virtualization — and many even include it today. These systems may be hardware or software, but their primary application won’t be virtualization. EMC’s Centera, an archival storage system, relies on virtualization to provide its key function — but virtualization is not the application; archiving is. Similarly, HP’s Reference Information Manager for Databases, a database archiving application, requires virtualization to do its job — but never mentions the technology in the product data sheet.

· Future storage subsystems will contain virtualization — but not at the block level. Beyond five years from now, we expect to see object-based storage subsystems. Rather than storing a single block, they might accept an entire file or database record, along with metadata that governs policies like security and retention. In this way, storage subsystems themselves will become more application-oriented, and virtualization will become a natural part of how information is managed — not a product or a feature.

NETWORK VIRTUALIZATION CREATES DYNAMIC CONNECTIVITY AND REUSABLE SERVICES

The network is the odd man out in your virtual infrastructure stack. Enterprises have built networks as a shared utility for years. They’re well utilized and already embrace an abstracted architecture, thanks to the Open Systems Interconnection (OSI) model. Moreover, bandwidth has been virtualized for more than a decade with virtual local area networks (VLANs) in the data center and virtual private networks (VPNs) out to remote sites and users. But there’s still room for automation and simplicity as:

· Management tools span network components. Advancements in 10 gigabit Ethernet continue to standardize network connectivity and provide native server and storage interconnects on
a single IP fabric. Today’s enterprise networking vendors already offer the ability to manage multiple physical devices — like an entire stack of switches or multiple router chassis — as a single logical device. The catch? You need additional management tools like Opsware’s Network Automation System to truly apply abstracted policies across heterogeneous network components.

· Network resource virtualization provides dynamic components. Several years back, we came across Inkra Networks, a company ahead of its time. Inkra’s idea was simple: create reusable networking services, like firewalls, that can be created on demand and wired virtually. In other words, a single appliance was able to appear as multiple firewalls, load balancers, and VLANs, with all the wiring in software. Although Inkra didn’t make it, we do see products like 3Leaf Systems’ V-8000 and Cisco’s VFrame marching on.9 These tools transform your commoditized transport into a single pool of resources that can be dynamically applied across the data center, much like you would with virtualized servers.

· VMware shakes things up with a virtualized switch. Perhaps the most disruptive form of network virtualization is VMware’s virtual switch, a standard component of its ESX server. The virtual switch acts just like a physical switch that directs traffic among unsuspecting VMs. Why is this disruptive? Because it changes the job of scaling networks from adding more ports to allocating more server CPU cycles. But there’s a downside. Virtual switches are still hamstrung by the amount of physical bandwidth you supply to the server. For example, you could have a 10 gigabit virtual switch within your ESX server that communicates with the outside world via a single, oversubscribed 1 gigabit network interface card (NIC). Still, the power and flexibility of doing networking functions in software will dramatically change network costs.

What Does Network Virtualization Do For IT?

IT is finally moving beyond the age-old network adage of “just throw bandwidth at it.” As users become more mobile, applications become more distributed, and infrastructure further consolidates, the network becomes a critical asset to hold everything together. Latency — not bandwidth — is the key design criterion. Network virtualization helps by:

· Removing potential bottlenecks. The network plays a critical role in enabling server, storage, and desktop virtualization. Not only do these require rapidly moving large chunks of data, but they also need dynamic remapping of network resources like VLANs, ports, and IP addresses. Virtualized network infrastructure provides the necessary application programming interfaces (APIs) to automate these tasks and provide high-performance, flexible network topologies. Neglecting the network could thwart many virtualization benefits like rapid provisioning of servers, resource clustering, and native failover mechanisms.

· Simplifying network design and management. Embracing commodity network components and adopting Ethernet everywhere reduces the number of moving parts in your network. For example, collapsing separate routers and switches into a converged Layer 2/3 device streamlines your core network architecture. And the maturation of virtualized security and acceleration hardware further reduces the hundreds of appliances cluttering wiring closets and data center perimeters.

· Giving the network a services orientation. Exposing intelligence in your network infrastructure via a virtualized management layer provides high-performance, distributed services that would otherwise be done in centralized middleware. You can enable developers to “program” your network for complex services like dynamic partitioning for guest access or configuring enterprisewide quality-of-service levels. Cisco even goes so far as to host application functions like identity management, presence awareness, and location awareness as reusable Web services.

What’s Next For Network Virtualization?

Network virtualization will be reinvigorated by vendors focusing on orchestration tools that complement virtualization throughout the data center. As companies struggle with network performance bottlenecks, we see that:

· Network virtualization is a means to an end. As with storage, virtualization is not the application itself. Instead, it is applied as a critical component in building more agile data centers — a way of continuing to build the network as a shared utility. The problem? Only Cisco is creating virtualization-ready equipment. No other large networking vendors have stepped up with products relevant in a virtual environment. We predict that 2008 will be the coming-out party for the remaining top-tier networking vendors like HP ProCurve and Nortel. Otherwise, the virtualization ecosystem will commoditize their gear and relegate them to simple plumbing.

· A blurring of server and switch functions will reshape the vendor landscape. It’s back to the future with network switching.
Originally, switching started as an application running on servers, but the need for higher-performance networks demanded dedicated hardware on purpose-built appliances. Hence the birth of Cisco. But the line is blurring again as VMware continues to add more intelligent switching features to its server virtualization software. In fact, Cisco has already acknowledged the shift and recently voted with its wallet, plunking down $150 million to join VMware’s board. Expect Cisco to further integrate with — or even license technology to — VMware.

CLIENT VIRTUALIZATION IS CHANGING APPLICATION DELIVERY

Unlike other forms of virtualization, client virtualization is rarely based on aggregating resources. Some forms run directly on the client, with the main advantage being isolation or abstraction; other forms run on servers or blade PC hardware. Client virtualization is primarily focused on simplifying the delivery and management of the OS and applications for end users through a variety of techniques:

· Local application virtualization and streaming. One of the latest trends in client virtualization is virtualizing the entire application. This form of virtualization includes products like Altiris Software Virtualization Solution, Citrix Presentation Server with Application Streaming, and Microsoft SoftGrid. Software deployments no longer require months of compatibility testing, and there is no need to install new software by loading a DVD or downloading the application over the network. This technology allows you to deliver software to users on demand. The code may be cached on their machines so that they can work offline, but it remains synched with the copy stored in the data center. This means that application upgrades are applied only to the application library, and the changes are streamed out to users when necessary. Furthermore, each streamed application runs in its own isolated sandbox, eliminating application compatibility issues.

· Hosted application virtualization. Also known as presentation virtualization, this variant of client virtualization involves multiple users accessing shared applications hosted on servers in the data center. Products like Citrix Presentation Server and Microsoft Terminal Server allow users to connect to applications through a thin client or software client. Although this sounds similar to hosted desktop virtualization, there is no hypervisor involved, and not every application runs well on Terminal Services, as many applications were not designed for the Windows Server OS. Furthermore, users are generally not able to modify or personalize an application that is delivered via hosted application virtualization.

· Hosted desktop virtualization.
There are a variety of hosted desktop virtualization products, such as Citrix Desktop Server and VMware Virtual Desktop Infrastructure. As with server virtualization, they use a hypervisor running in the data center on servers or a rack of blades. The data center hosts a customized desktop environment with OS, applications, and user data for each user, who connects to the virtual desktop through a thin client or any PC. This scenario simplifies client management and essentially offers the best features of a thin client environment — apps hosted in a secure environment and accessed remotely from a low-cost dumb terminal — and the best features of the traditional fat client — a customizable, personalized user experience with a one-to-one relationship between users and apps.

· Local desktop virtualization. In contrast to hosted desktop virtualization, local desktop virtualization, from vendors such as Microsoft, Parallels, and VMware, creates a hypervisor layer of abstraction directly on the local hardware, allowing the user to run applications that depend on a specific operating system on any hardware. Like local application virtualization, this PC environment executes in its own bubble, which mitigates conflicts between apps that live inside the VM and those running directly on the PC. Furthermore, local desktop virtualization allows multiple desktop environments to run on a single machine.
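To make the streaming model above concrete, here is a minimal sketch of the cache-and-sync behavior described for local application virtualization: apps are streamed on demand while online, kept synched with the library copy in the data center, cached for offline use, and run in per-application sandboxes. The class and method names are hypothetical illustrations, not any vendor's API.

```python
class StreamingClient:
    """Illustrative sketch of local application virtualization with streaming:
    apps are fetched on demand while online, cached for offline use, and run
    in per-application sandboxes. Hypothetical names, not a vendor API."""

    def __init__(self, library):
        self.library = library  # app name -> version published in the data center
        self.cache = {}         # app name -> version cached on this PC

    def launch(self, app, online):
        if online:
            if app not in self.library:
                raise KeyError(f"{app!r} is not published in the application library")
            # Stream the current version; the cache stays synched with the library,
            # so upgrades applied centrally reach users on their next online launch.
            self.cache[app] = self.library[app]
        elif app not in self.cache:
            # Never used while connected, so there is nothing to run offline.
            raise RuntimeError(f"{app!r} is not in the local cache")
        # Each streamed app runs in its own isolated sandbox, which is what
        # eliminates compatibility conflicts with other apps and the local OS.
        return {"app": app, "version": self.cache[app], "sandbox": f"sandbox-{app}"}
```

An online launch refreshes the cached copy, while an offline launch succeeds only for apps previously used online — mirroring the report's point that upgrades get applied to the application library and streamed out when necessary.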
What Does Client Virtualization Do For IT?

Due to the diversity of client virtualization technologies, not every virtualization architecture delivers every benefit. In this section, we call out the differences between presentation, application, hosted desktop, and local desktop virtualization. A few benefits of virtualizing clients include:

· Manageability. Most large organizations have difficulty ensuring compatibility between all of their desktop applications and their standard OS build. This is further complicated by patches, which can't feasibly be tested against every single desktop configuration. Virtualized applications don't suffer from this problem, because each application is isolated from the user's operating system by a thin layer of software that prevents conflicts. Desktop virtualization products offer this capability to a lesser extent. Both virtualization techniques allow applications to run in a protected environment, so problematic applications can be isolated.

· Migration. Each type of client virtualization simplifies migration in a unique way. In the case of presentation or hosted desktop virtualization, the systems being migrated are actually in the data center, so migration does not require access to users' PCs and can be done with minimal disruption. In local desktop or application virtualization scenarios, users' settings, whether software or desktop, are stored away from their PCs. For example, users can be given a new physical PC, and the next time they access the network, their applications or entire desktop will be streamed back to their PC — even preserving personal settings.

· Instant access. Every type of client virtualization offers instant access to applications — but in slightly different ways.
Hosted desktop and application virtualization allow users to connect instantly over the network to their applications. If users unplug from the network during a hosted application virtualization session, they will not be able to keep working. But users with a hosted desktop can suspend their VM and resume it the next time they log on. Users of local application virtualization can access applications while connected to the network, and those apps will continue to run from a local cache after they unplug. But if an application was never used while connected to the network, it won't be available from the cache later on.

· Security. Due to their architectures, hosted application and desktop virtualization are probably the best known for their security. Because these technologies remotely display an application or a desktop environment running in the data center, administrators can make it impossible to copy data out of the system. Hosted application virtualization is popular with companies that have outsourced functions like call centers, where customer service representatives must access corporate applications. Hosted desktop virtualization is popular with companies that have offshore developers who need personalized environments and access to corporate applications and resources. By using hosted desktop and application virtualization,
these applications and desktop environments are hosted inside the firewall, making them easier to manage and secure.

What's Next For Client Virtualization?

Forrester expects a lot of activity in the client virtualization space. While hosted application virtualization is an established market, we expect desktop and local application virtualization to have a lasting effect on:

· Client management. Most client management tools are focused on the desktop — literally at the endpoint. For years, we've run agents on users' PCs so that we can distribute software, patches, and changes. But local application virtualization with streaming capability sends software to the PC and runs it in a bubble that shields it from compatibility issues with the local operating system. In this scenario, patching and management of the applications take place in the data center; the user's PC could be running an off-the-shelf version of Windows straight from Dell or HP. Similarly, where PCs run local desktop virtualization, corporate applications could live in a secure, company-supplied version of the OS, while the underlying PC could be a user's home PC or a corporate laptop. In either case, the need for management agents on the physical desktop decreases, allowing us to rely on native application update capabilities, while other security resides within the virtualization product.

· Mobility. The ability to take our entire desktop environment with us in a VM is a new frontier. It would allow users to carry nothing more than a USB stick, iPod, or other memory device that could be inserted into a kiosk, public terminal, or other temporary PC. By virtualizing our desktop environments, we are no longer wedded to a single device — all our apps, settings, and data can be carried with us or accessed from any Internet-enabled device.
Certainly migration becomes easier, but we can also move between a thin client on our desktop, an ultra-slim portable for travel, and a home computer in the kitchen.

· The end of the corporate-provisioned machine. Desktop virtualization changes the corporate computing model. No longer do machines need to be acquired and provisioned to users by IT. Now users can buy their own PCs and use either a hosted desktop or a virtualized desktop to do their work. This gives users the flexibility they want from their machines and lets IT sleep at night, knowing that these former "rogue" machines are securely enabling employee productivity.
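The hosted desktop model described earlier — a personalized VM per user running in the data center, reachable from a thin client or any PC, and suspended between sessions — can be sketched as a simple connection broker. This is an illustrative simplification with hypothetical names, not Citrix's or VMware's actual API.

```python
class HostedDesktopBroker:
    """Illustrative connection broker for hosted desktop virtualization:
    each user owns one VM in the data center, personalization persists,
    and the VM is suspended between sessions rather than powered off."""

    def __init__(self):
        self.desktops = {}  # user -> state of that user's virtual desktop

    def connect(self, user, device):
        # First logon provisions a customized desktop; later logons, from any
        # device, resume the same VM with its apps and data intact.
        vm = self.desktops.setdefault(
            user, {"state": "suspended", "apps": [], "device": None}
        )
        vm["state"] = "running"
        vm["device"] = device
        return vm

    def disconnect(self, user):
        # Suspend rather than power off, so work is waiting at the next logon.
        self.desktops[user]["state"] = "suspended"
```

Because the desktop never leaves the data center, this broker pattern is what underlies the security and migration benefits above: administrators can move or lock down VMs without ever touching the user's device.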
RECOMMENDATIONS

VIRTUALIZATION IS A PROCESS, NOT A TECHNOLOGY FAD

While individual implementations of virtualization might be considered technology fads, the process of virtualization is universal in IT. And it almost always changes things for the better:

· Virtualization usually comes with a performance price — try to afford it. For example, programs written in assembly language run a lot faster than those written in Visual Basic. Likewise, virtual networking can add some WAN overhead, and virtual servers generate more CPU overhead. At first, these forms of virtualization had a noticeable effect on performance. But as time went on, the overhead became minimal, and some environments were better positioned for change as a result. Do all of your users really need a PC? How many could run on a bladed desktop and use a thin client that costs a fraction of the price to purchase and support? If they need a laptop for a trip, they can sign one out and copy their virtual environment onto it.

· Institute a testing program for virtualized technologies. Not every type of virtualization will be a winner in your environment. Set up a lab to determine which ones can have a positive effect. This also means pushing your vendors to move ahead with you, since they need to see real demand before they support new technologies. Customers implementing VMware's virtual servers discovered that big software companies like Microsoft and Oracle were eventually willing to support them — once they pushed.

· Work on making your IT organization more change-ready. Virtualization technologies tend to break rules in ways that can make them tricky to support in a rigid IT organization. VMware servers can create virtual network switches, and virtual network hardware allows for one-click provisioning of services. But which administrators have control?
If you have a lot of silos in your IT organization, chances are you'll have a hard time taking advantage of virtualization's flexibility. One solution is to require IT groups to delegate a certain amount of authority to other groups. While the network group may own the network topology, it might then empower server administrators to modify switches within a virtual server. These workflows can be further automated using a variety of service management tools that handle record-keeping and any necessary approval processes.10

WHAT IT MEANS

VIRTUALIZATION WILL KEEP MOVING TO HIGHER LEVELS

At some point in the life of a given virtualization scheme, there will be nothing more to virtualize — at least not practically. For example, after C++ and Java, higher-level programming languages never really gained much acceptance. To abstract any higher, a programming language would have to be capable of determining the most efficient algorithm to solve a problem. We just haven't invented that computer yet. As a result, we find other, easier things to virtualize — like data centers.
A virtual data center is what you get when you take virtual server, network, storage, and application resources and combine them with a model that contains all of their interdependencies. The model would act like a blueprint that allows these resources to be bundled together with their applications and data and brought online at another facility. In essence, the data center would become analogous to a server, and it would run any such blueprint that was instantiated. We don't have such a blueprint today, although a lot of the required information could be found in a configuration management database (CMDB).

ENDNOTES

1 Organic IT, Forrester's vision for next-generation data center architecture, offers firms massive IT cost savings and business agility — if they can get past the confusion of ideas and offerings. IT executives must deploy technology for virtualization, automation, and self-management, while implementing infrastructure best practices of standardization, abstraction, and integration. See the May 18, 2004, "Organic IT 2004: Cut IT Costs, Speed Up Business" report.

2 Tomorrow's scale-out data centers need a new command-and-control brain. The goal? Drop an app component on a fabric OS and watch it automatically scale up and down to meet demand. See the September 24, 2003, "The Fabric Operating System" report.

3 Adoption of server virtualization has accelerated dramatically in North America since 2005, with 51% of enterprises now using or piloting the technology. In tandem, interest in server virtualization has grown worldwide, with a notable leap in Asia Pacific. VMware is far and away the top pick of respondents, even among firms just now piloting and able to consider an established Microsoft product. See the February 7, 2007, "Server Virtualization Accelerates In North America" report.
4 Viridian is the code name for Microsoft's Windows Server virtualization solution, which is designed to go head-to-head with VMware's ESX server. Although Viridian is considered part of Windows Server 2008, only the beta of Windows Server virtualization will be available with the RTM of Windows Server 2008 at the end of 2007. The final release is due out within six months after the Windows Server 2008 RTM, although some of the original features have been pared down. See the August 1, 2007, "What Does Windows Server 2008 Mean To My Organization?" report.

5 According to a recent Forrester study, 49% of enterprises surveyed that are implementing or interested in x86 server virtualization indicate that improving disaster recovery/business continuity continues to be a very important motivation for adoption. See the October 24, 2007, "X86 Server Virtualization For High Availability And Disaster Recovery" report.

6 Storage virtualization provides a common management layer across all the resources under the control of the virtualized environment and centrally manages replication and data protection activities. Disk volumes are presented by each array to the virtualization layer at inception and then provisioned to the constituent hosts within the virtualization manager. See the January 17, 2007, "Storage Virtualization Emerges . . . Slowly" report.
7 Twenty-two percent of large US firms reported they had deployed storage virtualization, while 18% of European and Asia Pacific firms had done the same. See the January 17, 2007, "Large Enterprises Lead In Demand For Storage Virtualization" report.

8 Thin provisioning has been around for some time, but as buyers focus more attention on the utilization and power consumption profiles of their storage environments, vendors are jumping into the game at a rapid pace. Where it was previously mainly niche vendors that had thin provisioning offerings, the list of storage giants playing in this space has grown and is likely to keep growing. The major players in thin provisioned storage today include 3PAR, Compellent, EMC, Hitachi Data Systems, HP, Isilon, Network Appliance, and Sun StorageTek. For a detailed analysis of thin provisioning and a description of major suppliers, see the July 23, 2007, "Trim The Fat In Storage With Thin Provisioning" report.

9 Inkra Networks' assets were acquired by Cisco Systems and Nortel Networks to build out their utility data center offerings.

10 Larger organizations typically have 100 or more people who will touch the service desk tool suite in some capacity. Technicians will log, dispatch, and resolve incidents. Specialists inside and outside of the service desk will get to the bottom of problems and recommend the changes required to fix them. Users in IT and the business will need to approve and schedule changes with the lowest possible impact to business-critical processes. Changes to the configuration across thousands and thousands of IT assets will need to be recorded and potentially audited. See the February 17, 2006, "The Forrester Wave™: Service Desk Management Tools, Q1 2006" report.
Making Leaders Successful Every Day

Headquarters
Forrester Research, Inc.
400 Technology Square
Cambridge, MA 02139 USA
Tel: +1 617.613.6000
Fax: +1 617.613.5000
Email: forrester@forrester.com
Nasdaq symbol: FORR
www.forrester.com

Research and Sales Offices
Australia, Brazil, Canada, Denmark, France, Germany, Hong Kong, India, Israel, Japan, Korea, The Netherlands, Switzerland, United Kingdom, United States

For a complete list of worldwide locations, visit www.forrester.com/about.

For information on hard-copy or electronic reprints, please contact the Client Resource Center at +1 866.367.7378, +1 617.617.5730, or resourcecenter@forrester.com. We offer quantity discounts and special pricing for academic and nonprofit institutions.

Forrester Research, Inc. (Nasdaq: FORR) is an independent technology and market research company that provides pragmatic and forward-thinking advice to global leaders in business and technology. For more than 24 years, Forrester has been making leaders successful every day through its proprietary research, consulting, events, and peer-to-peer executive programs. For more information, visit www.forrester.com.

42943