Presentation Transcript

  • 1. Albert Chong, Kyle Jansen, Chan Hao Yee, Liang Jun Jie
  • 2. Introduction
    • Virtualization is a software technology that allows a computer to perform the tasks of multiple computers by distributing the resources of a single computer across multiple environments.
    • Virtual servers and virtual desktops let you host multiple operating systems and multiple applications locally and in remote locations, freeing you from physical and geographical limitations.
  • 3. Continued…
    • Today’s computer hardware was originally designed to run only a single operating system and a single application, but virtualization breaks that bond, making it possible to run multiple operating systems and multiple applications on the same computer at the same time, increasing the utilization and flexibility of the hardware (a short illustrative sketch follows this slide).
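    • To make this concrete, the minimal sketch below (Python) lists the guest machines sharing one physical host. It is only an illustration: it assumes a Linux host running a QEMU/KVM hypervisor with the libvirt Python bindings installed, and the connection URI qemu:///system is just the common local default.
      # List the virtual machines sharing a single physical host.
      # Assumes libvirt-python is installed and a local QEMU/KVM hypervisor
      # is reachable at qemu:///system (adjust the URI for other setups).
      import libvirt

      conn = libvirt.openReadOnly("qemu:///system")  # read-only access is enough for listing
      try:
          for dom in conn.listAllDomains():
              state, max_mem_kib, mem_kib, vcpus, _cpu_time = dom.info()
              status = "running" if dom.isActive() else "stopped"
              print(f"{dom.name():20s} {status:8s} {vcpus} vCPU(s) {mem_kib // 1024} MiB")
      finally:
          conn.close()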
  • 4. Background
    • On February 8, 1999, VMware introduced the first x86 virtualization product, "VMware Virtual Platform", based on earlier research by its founders at Stanford University. VMware filed for a patent on their techniques in October 1998, which was granted as U.S. Patent 6,397,242 on May 28, 2002. VMware and similar virtualization software for the x86 must employ binary translation techniques to trap and virtualize the execution of certain instructions. These techniques incur some performance overhead as compared to a VM running on a natively virtualizable architecture such as the IBM System/370 or Motorola MC68020.
    • Kevin Lawton started the Plex86 project (originally called "freemware") to create free software for x86 virtualization. The focus of this project has since changed to support only Linux as a guest operating system, but prior to that, Lawton published the paper Running Multiple Operating Systems Concurrently on an IA32 PC Using Virtualization Techniques, which analyzes which aspects of the x86 architecture are hard to virtualize and outlines some techniques to overcome these difficulties.
    • Microsoft offers three Windows-based x86 virtualization products: Microsoft Virtual PC and Microsoft Virtual Server, based on technology acquired from Connectix, as well as Hyper-V.
    • Open source alternatives include QEMU and VirtualBox.
    • The research systems Denali, L4, and Xen explored ways to provide high-performance virtualization of x86 by implementing a virtual machine that differs from the raw hardware. Operating systems are ported to run on the resulting virtual machine, which does not implement the hard-to-virtualize parts of the actual x86 instruction set. This technique is known as paravirtualization. As of version 3.0, Xen also supports full virtualization with an unmodified guest OS, provided hardware-assisted virtualization support (i.e., Intel VT or AMD-V) is available (see the detection sketch at the end of this slide).
    • Virtualization is a proven concept that was first developed in the 1960s to partition large mainframe hardware. Today, computers based on the x86 architecture face the same problems of rigidity and underutilization that mainframes faced in the 1960s. VMware invented virtualization for the x86 platform in the 1990s to address underutilization and other issues, overcoming many challenges in the process. Today, VMware is the global leader in x86 virtualization, and its success is building momentum for virtualization across x86 computing.
    • The Standard Performance Evaluation Corporation (SPEC) has created a working group to develop industry-standard methods for comparing the performance of virtualization technologies. Current members of the working group include AMD, Dell, Fujitsu Siemens, Hewlett-Packard, Intel, IBM, Sun Microsystems, SWsoft (now Parallels) and VMware. SPEC is currently seeking input from the IT community to better understand what kinds of measurements would make the most useful industry benchmarks.
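    • As a rough illustration of the hardware-assisted virtualization mentioned above, the sketch below checks whether a CPU advertises Intel VT-x or AMD-V. It assumes a Linux host, where /proc/cpuinfo exposes the vmx (Intel) and svm (AMD) feature flags; other operating systems report this differently.
      # Rough check for hardware-assisted virtualization (Intel VT-x / AMD-V).
      # Assumes a Linux host: /proc/cpuinfo lists per-CPU feature flags.
      def hardware_virtualization_flags(cpuinfo_path="/proc/cpuinfo"):
          flags = set()
          with open(cpuinfo_path) as f:
              for line in f:
                  if line.startswith("flags"):
                      flags.update(line.split(":", 1)[1].split())
          return flags & {"vmx", "svm"}  # vmx = Intel VT-x, svm = AMD-V

      if __name__ == "__main__":
          found = hardware_virtualization_flags()
          if found:
              print("Hardware-assisted virtualization flags present:", ", ".join(sorted(found)))
          else:
              print("No vmx/svm flag found; only binary translation or paravirtualization applies.")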
  • 5. Benefits
    • Virtualization frees you from physical and geographical limitations. In addition to energy savings and lower capital expenses due to more efficient use of your hardware resources, you get high availability of resources, better desktop management, increased security, and improved disaster recovery processes when you build a virtual infrastructure.
    • With virtualization, you can save time, money and energy while achieving more with the computer hardware you already have.
    • Server Consolidation and Infrastructure Optimization: Virtualization makes it possible to achieve significantly higher resource utilization by pooling common infrastructure resources and breaking the legacy “one application to one server” model.
    • Physical Infrastructure Cost Reduction: With virtualization, you can reduce the number of servers and related IT hardware in the data center. This leads to reductions in real estate, power and cooling requirements, resulting in significantly lower IT costs.
    • Improved Operational Flexibility & Responsiveness: Virtualization offers a new way of managing IT infrastructure and can help IT administrators spend less time on repetitive tasks such as provisioning, configuration, monitoring and maintenance.
    • Increased Application Availability & Improved Business Continuity: Eliminate planned downtime and recover quickly from unplanned outages with the ability to securely back up and migrate entire virtual environments with no interruption in service.
    • Improved Desktop Manageability & Security: Deploy, manage and monitor secure desktop environments that end users can access locally or remotely, with or without a network connection, on almost any standard desktop, laptop or tablet PC.
  • 6. Criticism
    • Server failure. Large-scale consolidation may put many key processes, applications and services in the same proverbial basket. Consequently, fewer physical servers bear the workload -- and a physical failure has much more significant consequences. 
    • Over-provisioning. Starting consolidation without a clear picture of an application's function, workload or profile may lead to infrastructures that are out of balance and over-provisioned.
    • Operational process. Many IT organizations implement proactive monitoring systems and some formal change control processes, but few have advanced operational processes to manage crucial aspects of a smooth-running virtual environment.
    • Service levels. Virtualization technology requires new skills; for example, the ability to identify whether a problem originates in the physical or virtual environment. Without staff skills to address problems, service levels may suffer.
  • 7. Opinion
    • In our opinion, virtualization delivers on these promises: it saves time, money and energy, lets an organization achieve more with the hardware it already has, and frees it from physical and geographical limitations.
    • Realizing these benefits, however, takes preparation. We recommend the following practices:
    • Reiterate the need to follow existing change control processes.
    • Document server and application configurations.
    • Determine relationships and dependencies between servers and applications and other parts of the infrastructure.
    • Implement tools that provide alerts when configurations are changed. 
    • Become familiar with the ITIL configuration management process and associated technologies, and implement them.
    • Understand the workload/profile of each key application and profile them with appropriate tools.
    • Establish performance baselines of existing servers and applications before consolidation (see the measurement sketch at the end of this slide).
    • Use management tools to help model workloads and virtualized infrastructures.
    • Become familiar with and implement the ITIL capacity management process.
    • Understand the organization's server architecture.
    • Develop proficiency with virtualization technology in use.
    • Build proficiency with Windows and third-party management and performance tools.
    • Understand operations practices as defined in the ITIL.
    • Observe the networking and storage technologies in use.
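    • As a minimal sketch of the baseline idea above, the snippet below (Python) samples CPU and memory utilization on an existing server and appends the readings to a CSV file. It assumes the third-party psutil package is available; the sample count, interval and output path are arbitrary placeholders, not prescribed values.
      # Collect a simple performance baseline (CPU and memory) before consolidation.
      # Assumes the third-party psutil package is installed (pip install psutil).
      import csv
      import time

      import psutil

      def collect_baseline(path="baseline.csv", samples=60, interval_s=5):
          """Append `samples` utilization readings, one every `interval_s` seconds."""
          with open(path, "a", newline="") as f:
              writer = csv.writer(f)
              writer.writerow(["timestamp", "cpu_percent", "memory_percent"])
              for _ in range(samples):
                  cpu = psutil.cpu_percent(interval=interval_s)  # blocks for interval_s seconds
                  mem = psutil.virtual_memory().percent
                  writer.writerow([time.strftime("%Y-%m-%d %H:%M:%S"), cpu, mem])

      if __name__ == "__main__":
          collect_baseline()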
  • 8. Bibliography
    • http://www.vmware.com/overview/why.html
    • http://www.computerworld.com/action/article.do?command=viewArticleBasic&taxonomyId=18&articleId=9005255&intsrc=hm_topic