This document discusses practical steps organizations can take to mitigate security risks introduced by virtualization. It outlines seven steps, including securing virtual machine managers, establishing a known and trusted state, and gaining visibility and control over changes. The author argues that configuration control is important for virtual environments, and that Tripwire Enterprise can help implement the seven steps by integrating with systems to maintain visibility and control over the data center.
The benefits of employing virtualization in the corporate data center are compelling – lower operating
costs, better resource utilization, increased availability of critical infrastructure to name just a few. It is an
apparent “no brainer” which explains why so many organizations are jumping on the bandwagon. Industry
analysts estimate that between 60 and 80 percent of IT departments are actively working on server
consolidation projects using virtualization. But what are the challenges for operations and
security staff when it comes to managing and securing the new virtual enterprise? New
technology invariably brings added complexity and new management challenges.
Over the last 18 months, Prism Microsystems, a leading security information and event management
(SIEM) vendor, has worked closely with a set of early-adopter customers and prospects to extend
the capability of EventTracker with deep support for virtualization, enabling our customers
to get the same level of security for the virtualized enterprise as they have for the non-virtualized
enterprise. This white paper examines the technology and management challenges that result from
virtualization, and how EventTracker addresses them.
Securing virtualization in real world environments (Arun Gopinath)
This document discusses the security implications of server virtualization. While virtualization provides benefits like reduced costs and improved management, it also introduces new security risks. Specifically, a breach of one virtual server could potentially impact multiple virtual servers running on the same physical hardware. Traditional security tools are not designed to address the unique security challenges of virtualized environments. The document argues that organizations must understand these new risks and take steps to secure virtualized environments in order to fully realize the benefits of virtualization.
This technical brief discusses the challenges of virtualizing critical infrastructure like Active Directory (AD) and Microsoft Exchange. It explains that visibility into both the virtual and physical environments is needed to accurately diagnose and resolve performance issues. The brief recommends using a solution like Quest's vFoglight, which provides extensive monitoring of virtual and physical components, allowing administrators to quickly detect, diagnose, and resolve problems affecting AD and Exchange availability and performance.
Virtual versions of servers, applications, networks and storage can be created through virtualization. Its main types include operating system virtualization (VMs), hardware virtualization, application-server virtualization, storage virtualization, network virtualization, administrative virtualization and application virtualization.
This document provides an overview of virtualization. It defines virtualization as separating a resource or request for a service from the underlying physical delivery of that service. Virtualization allows for more efficient utilization of IT infrastructure by running multiple virtual machines on a single physical server. There are two main approaches to virtualization - hosted architectures which run on top of an operating system, and hypervisor architectures which install directly on hardware for better performance and scalability. Virtualization can provide benefits like server consolidation, test environment optimization, and business continuity.
Virtualization refers to the creation of virtual versions of hardware platforms, operating systems, storage devices and network resources. There are different types of virtualization including hardware virtualization, which creates virtual machines that act like physical computers running their own guest operating systems. Other types are desktop virtualization, software virtualization, memory virtualization, storage virtualization, data virtualization, and network virtualization. Virtualization provides benefits like consolidating resources and isolating systems.
The process of virtualization enables the creation of virtual forms of servers, applications, networks and storage. The four main types of virtualization are network virtualization, storage virtualization, application virtualization and desktop virtualization.
Virtualization 2.0: The Next Generation of Virtualization (EMC)
In this paper, Frost & Sullivan define virtualization 2.0 and show the enhanced benefits that the latest virtualization platforms can deliver to the business.
You will learn how virtualization 2.0 can:
- Improve your business agility, productivity, and application performance
- Deliver the new benefits of next-generation virtualization platforms, including capacity management, predictive analytics, and data protection
NCCE 2011 - Virtualization 101: The Fundamentals of Virtualization (ncceconnect)
This document discusses the fundamentals of virtualization. It contrasts traditional computing with a virtualized environment and describes the two types of hypervisors: type 1 runs directly on hardware, while type 2 runs on a host OS. Features of virtualization include transparently sharing resources across VMs, live migration, centralized management, and high availability. Virtualization helps address server sprawl by allowing consolidation ratios of 10-40 virtual servers per physical server, reducing costs for power, cooling, space, and hardware while improving operational efficiency and avoiding downtime through features like high availability.
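The cost side of the consolidation ratios mentioned above is simple arithmetic. The sketch below works through an example; all figures (server count, ratio, power draw, electricity price) are illustrative assumptions, not numbers from any of the documents summarized here.

```python
# Hypothetical figures to illustrate the consolidation math described above.
physical_servers = 200      # servers before consolidation (assumed)
consolidation_ratio = 20    # VMs per physical host, within the 10-40 range cited
watts_per_server = 500      # assumed average draw per physical server
cost_per_kwh = 0.12         # assumed electricity price in USD

hosts_after = -(-physical_servers // consolidation_ratio)  # ceiling division
servers_retired = physical_servers - hosts_after

# Energy no longer drawn by the retired machines, over one year.
annual_kwh_saved = servers_retired * watts_per_server * 24 * 365 / 1000
annual_savings = annual_kwh_saved * cost_per_kwh

print(f"Hosts after consolidation: {hosts_after}")
print(f"Annual power savings: ${annual_savings:,.0f}")
```

Under these assumptions, 200 servers collapse onto 10 hosts and power savings alone approach six figures per year, before counting cooling, space, and hardware refresh costs.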
Microsoft offers several virtualization technologies including application, server, presentation, storage, and desktop virtualization. Key server virtualization technologies include Hyper-V and Virtual Server 2005 which allow consolidating servers to reduce costs and improve manageability. System Center provides tools for managing virtualized environments.
The document discusses virtualization and its role in enabling cloud computing. It describes how virtualization abstracts physical computing resources into logical units, allowing single physical machines to appear as multiple virtual machines. This enables more efficient utilization of hardware resources. The document outlines different types of virtualization including server, network, storage and discusses how virtualization of CPU, memory and I/O allows virtual machines to run concurrently on a single physical host.
Virtualization Explained | What Is Virtualization Technology? | Virtualizatio... (Simplilearn)
In this presentation on virtualization, we explain what virtualization technology is and how it helps us in both professional and personal work. In this virtualization tutorial, we will see how virtualization takes place, what software makes virtualization possible and manages the different virtual instances, along with the benefits of virtualization.
The topics covered in this what is virtualization presentation are:
1. What Is Virtualization?
2. What Is a Virtual Machine(VM)?
3. Role and Types of Hypervisor
4. Types of Virtualization
5. Benefits of Virtualization
Virtualization is the process of creating a virtual layer that allows one or more operating systems to run on a single physical system; the physical system is known as the host and each virtual operating system as a guest. This virtual layer is created by software known as the hypervisor, which also manages resource distribution among the virtual machines.
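The resource distribution role described above boils down to bookkeeping: the hypervisor tracks host capacity and refuses to place a guest it cannot satisfy. The minimal sketch below models that idea; the class and field names are invented for illustration and do not correspond to any real hypervisor's API.

```python
# A minimal sketch of the resource bookkeeping a hypervisor performs.
class Hypervisor:
    def __init__(self, cpus, memory_gb):
        self.free_cpus = cpus
        self.free_memory_gb = memory_gb
        self.guests = {}

    def start_guest(self, name, cpus, memory_gb):
        # Admit the guest only if the host can still satisfy the request.
        if cpus > self.free_cpus or memory_gb > self.free_memory_gb:
            return False
        self.free_cpus -= cpus
        self.free_memory_gb -= memory_gb
        self.guests[name] = (cpus, memory_gb)
        return True

host = Hypervisor(cpus=16, memory_gb=64)
assert host.start_guest("web01", cpus=4, memory_gb=16)
assert host.start_guest("db01", cpus=8, memory_gb=32)
assert not host.start_guest("big01", cpus=8, memory_gb=32)  # only 4 CPUs left
```

Real hypervisors add overcommit, scheduling, and memory ballooning on top of this admission logic, but the core host/guest accounting is the same.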
In a general sense, virtualization is the creation of a virtual, rather than an actual, version of something.
For example, Google Earth presents a virtual image of Earth that captures a great deal of detail about the planet.
From a computing perspective, you may have already done some virtualization if you have ever partitioned a hard disk drive into more than one “virtual” drive.
Virtualization in a computing environment can be present in many different forms, some of which are:
Hardware virtualization
Storage and data virtualization
Software virtualization
Network virtualization
Virtualization 101 presents a history of virtualization and defines key concepts. It describes how virtual machines isolate operating systems and applications from each other and from the physical hardware. Benefits include ease of deployment, mobility, backup/recovery, and hardware independence. Server virtualization partitions physical servers, while desktop virtualization hosts desktops centrally. Application virtualization protects operating systems from application changes. Major virtualization vendors include Citrix, Microsoft, and VMware.
The process of creating a virtual version of something, be it an operating system, a storage device, a server, or network resources, is known as virtualization. With virtualization, enterprises have succeeded in consolidating administrative tasks, enhancing scalability, managing workloads, and reducing operational complexity.
Virtualization allows multiple operating systems to run simultaneously on a single physical server using a hypervisor. This reduces costs by improving hardware utilization, lowering maintenance needs, and providing continuous server uptime. There are two main hypervisor types: native hypervisors have direct access to server hardware while hosted hypervisors run within an operating system. Virtualization offers advantages like zero downtime maintenance, dynamic resource allocation, and automated backups.
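The native/hosted split above can be made concrete with a simple lookup over well-known products. The mapping below is illustrative (the function name is invented for this sketch), though the classifications themselves are standard.

```python
# Well-known hypervisors grouped by the two types described above.
HYPERVISOR_TYPES = {
    "VMware ESXi": "native",        # type 1: runs directly on server hardware
    "Microsoft Hyper-V": "native",
    "Xen": "native",
    "Oracle VirtualBox": "hosted",  # type 2: runs within a host operating system
    "VMware Workstation": "hosted",
}

def is_bare_metal(name):
    """True if the named hypervisor runs directly on hardware (type 1)."""
    return HYPERVISOR_TYPES.get(name) == "native"

print(is_bare_metal("Xen"))               # True
print(is_bare_metal("Oracle VirtualBox")) # False
```

The practical consequence of the split: native hypervisors are preferred in data centers for performance and isolation, while hosted hypervisors suit desktops and test labs where convenience matters more.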
This covers the key concepts of virtualization: its types, why virtualization is used, its use cases and benefits, and examples of virtualization.
This presentation explains the basics of virtualization: what is server virtualization? Why is it important? How is it done? What are the limitations and risks associated with it?
This is a summary of virtualization. It covers the benefits and different types of virtualization, for example server virtualization, network virtualization, and data virtualization.
This document discusses server virtualization concepts, including the advantages of virtualization, its different types, and virtualization products. It begins by defining virtualization, then covers the reasons for virtualizing, core concepts including hypervisor types, and the advantages. It describes operating system, desktop, application, service, and user virtualization, and closes with examples of popular products and technologies, including VMware ESX/ESXi, vMotion, and vSphere.
Virtualization allows multiple operating systems to run simultaneously on a single physical machine through the use of a hypervisor layer. It provides benefits like server consolidation, application consolidation, sandboxing, and mobility. The main technologies that enable virtualization are the hypervisor and virtual machines. Virtualization can be implemented through full virtualization, para-virtualization, software virtualization, or hardware virtualization. It has become a widely used technology in areas like desktops, servers, and cloud computing.
Cloud computing allows users to access shared computing resources over the internet. It utilizes virtualization which involves partitioning physical resources and allocating them to virtual machines. This improves resource utilization, enables multi-tenancy, and makes resources scalable and flexible. Virtualization allows multiple operating systems and applications to run concurrently on a single physical server through virtual machines. It provides benefits like hardware independence, migration of virtual machines, and better fault isolation. Security challenges in virtualized cloud environments include issues around scaling, diversity, identity management and sensitive data lifetime.
This document defines virtualization and cloud computing. Virtualization refers to creating virtual versions of hardware and resources that allow multiple operating systems to run on a single physical system by sharing underlying hardware. A hypervisor manages virtual machines (VMs), which are isolated runtime environments. Cloud computing delivers hosted services over the internet, providing on-demand access to resources that can be rapidly provisioned. It offers software, platform, and infrastructure services. Virtualization is an element of cloud computing that allows for efficient sharing of computing resources.
Virtualization, A Concept Implementation of Cloud (Nishant Munjal)
This presentation walks through deploying virtualization in a Linux environment and accessing it from another machine, after introducing the virtualization concept.
Virtualization is a technique that allows a single physical instance of an application or resource to be shared among multiple organizations or tenants (customers).
Virtualization is a proven technology that makes it possible to run multiple operating systems and applications on the same server at the same time.
Virtualization is the process of creating a logical (virtual) version of a server operating system, a storage device, or network services.
The technology behind virtualization is known as a virtual machine monitor (VMM), or hypervisor, which separates compute environments from the actual physical infrastructure.
This document discusses the importance of security for virtual environments. While virtualization can reduce costs, many organizations fail to properly manage and monitor their virtual infrastructures, jeopardizing performance and availability. The document outlines four key factors for securing virtual environments: 1) Treat virtual machines like physical servers and conduct regular security audits, 2) Isolate network traffic between virtual machines, 3) Control access to virtual servers carefully, and 4) Provide training to staff on virtualization security. Proper security practices are needed to avoid breaches and downtime and realize the full benefits of virtualization.
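The four factors above lend themselves to an automated per-VM audit. The sketch below encodes them as a checklist run against a simple inventory record; the dictionary field names are invented for illustration and do not correspond to any real inventory or audit tool.

```python
# The four-factor checklist above, expressed as an automated audit.
CHECKS = {
    "audited_recently": "VM has had a security audit within the review window",
    "isolated_network": "VM traffic is isolated on its own network segment",
    "access_controlled": "Administrative access to the VM is restricted",
    "staff_trained": "Operators have completed virtualization security training",
}

def audit(vm):
    """Return the descriptions of all checks the VM record fails."""
    return [desc for key, desc in CHECKS.items() if not vm.get(key)]

vm = {
    "name": "web01",
    "audited_recently": True,
    "isolated_network": False,   # fails the network-isolation factor
    "access_controlled": True,
    "staff_trained": True,
}
for failure in audit(vm):
    print(f"{vm['name']}: FAIL - {failure}")
```

Running such a check across the whole VM inventory on a schedule is one way to keep the "treat VMs like physical servers" principle from decaying as new machines are cloned and deployed.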
This document discusses cybersecurity threats and strategies. It contains the following key points:
1) Cybercrime poses a serious threat to financial services through account takeovers and data breaches at companies that store personal information. Education of both banks and customers is important to increase awareness of threats.
2) New technologies like biometrics and behavioral analytics show promise in improving security, but cybercriminals are also innovative so defenses must remain dynamic.
3) Adopting a big data approach to security analytics allows detection of complex patterns and threats that were previously difficult to identify from fragmented data sources. This has potential to automate some security monitoring and response.
Virtualization is a technology that has greatly benefited businesses around the globe. The technology has a significant impact on the modern IT landscape and today plays a key role in the development and delivery of cloud computing solutions.However, the adoption of this advanced technology has major security implications on businesses
today. The adoption of Virtualization has openeddoors to a broad range of challenges for businesses in the industry. Especially, for organizations
that are PCI regulated and required to comply with PCI DSS Standards, the challenges in this area only seem to grow.
PAS: Leveraging IT/OT - Convergence and Developing Effective OT CybersecurityMighty Guides, Inc.
Michael Jacobs recommends three key pieces of advice for a CISO to make an OT/ICS environment more secure:
1. Conduct an accurate inventory of all ICS assets to understand the environment. This can begin with a physical walkthrough and mapping network configurations.
2. Establish good working relationships with OT staff to ensure they take cybersecurity seriously and buy into initiatives.
3. Develop a risk-based threat model to guide prioritized implementation of security controls that address real threats, are practical for the OT environment, and minimize safety and operational risks. Understanding the environment and people is crucial to effective OT security.
The adoption of cloud technologies has resulted in organizations accelerating their cloud migration process. But, doing so without taking necessary precautionary measures into account can make organizations vulnerable to the ever-evolving cyber-attacks.
Organizations moving to virtualized platforms need to carefully examine the impact on overall security policy. While virtualization can provide cost savings, it also brings new security risks that must be mitigated. When servers and applications are consolidated onto fewer physical hosts, there is a risk that a single vulnerability or failure could impact multiple systems. Implementing proper access controls, monitoring, and security best practices throughout the virtual infrastructure is important to reduce risks. CIOs must develop strategies to extend existing security policies and controls to the new virtual environment.
The document discusses five key security trends affecting security strategy: 1) Targeted attacks have revealed risks beyond just data exposure, requiring protection against these sophisticated attacks. 2) Data center transformation to software-defined services requires different security tailored to virtual/cloud constructs rather than traditional models. 3) Cloud security demands a strategy to keep data secure and compliant both in the cloud and to/from it. 4) Data protection must extend to intellectual property, risk management, and proof of due care. 5) Specialized environments like IoT shift security's role to protecting connected devices and their generated data.
IT Security at the Speed of Business: Security Provisioning with Symantec Dat...Symantec
Today’s data centers are transitioning into software-defined data centers (SDDC). In the SDDC, the core elements of the infrastructure—storage, server and compute, networking, databases, and business applications—are virtualized and delivered as services. The deployment, provisioning, configuration, management and operation of the entire infrastructure is abstracted from hardware and implemented through software. The infrastructure resources across the stack are application-centric, and customers have the ability to provision IT assets across their public cloud, private cloud, and on-premise domains. These SDDC capabilities are intended to enhance an enterprise’s ability to quickly respond to new opportunities and emerging threats.
Australia's early adopters of network virtualization_ReportBlake Douglas
- Network virtualization is being adopted in Australia to enable automation of network provisioning and security use cases like micro-segmentation. Early adopters include large telcos, Macquarie Telecom, Zettagrid, Global Speech Networks, and the Australian Bureau of Statistics.
- Automation allows network templates to reduce configuration time and enable self-service options. Security uses like micro-segmentation improve visibility of east-west traffic and allow distributed firewalls.
- Various organizations are using network virtualization for automation to more easily deploy customer solutions, extend networks between data centers, and gain infrastructure efficiencies.
Automation of Information (Cyber) Security by Joe HessmillerJoe Hessmiller
The focus is on physical and logical security vulnerabilities. Yes, locks and malware sandboxes are important. BUT, the biggest potential risk comes from inside. From the people who can - intentionally or unintentionally - expose the organization to the greatest risks. This presentation is about automating the process to control those risks.
Virtualize More in 2012 with HyTrust-Boost Data Center Efficiency and Consoli...HyTrust
Virtualize More in 2012 with HyTrust discusses virtualization security best practices and guidance. It recommends planning security into virtual environments by considering compliance requirements, new cloud roles, and security strategy. When virtualizing, organizations should strive for equal or better security than traditional infrastructures using virtualization-aware security solutions, privileged identity management, and vulnerability management. The presentation provides business drivers for increasing virtualization securely in 2012 to proactively protect systems and data.
This document discusses Fornetix, a company that provides advanced encryption key management software. It summarizes:
- Fornetix addresses the security dilemma of managing multiple incompatible key management systems by different vendors through its Key Orchestration solution, which supports a variety of devices, systems, servers and applications.
- Key Orchestration reduces complexity, improves security and lowers costs by replacing separate key management systems with a single interoperable platform.
- Fornetix demonstrated it could reduce the time to rekey an encryption system for a global satellite network from 48 hours to 30 minutes using its automated key management capabilities.
Cyber security is the body of technologies and process which practices protection of network, computers, data and programs from unauthorized access, cyber threats, attacks or damages
Because IP video cameras are networked, partnering with a technology vendor who knows networking technologies is critical. This is a skill that many traditional video surveillance firms lack thus increasing the reliability of the network security service provider.
This document discusses cybersecurity risks and strategies for insurers. It notes that as cyber threats have increased, insurers must gain a deeper understanding of cyber risks to develop effective cyber liability policies. Insurers need to maintain the confidentiality, integrity, and availability of systems and data. The document recommends that insurers take proactive approaches to cybersecurity, such as developing long-term security programs, investing in cybersecurity, and integrating cyber risks into enterprise risk management. It also discusses emerging threats, the importance of data integrity, and how technologies like keyless signature infrastructure can help address issues.
Hosted cloud environments, such as infrastructure as a service (IaaS) or platform as a service (PaaS), offer major IT and business benefits that organizations are looking to realize.
Organizations may decide to migrate some part of their IT operations to a hosted cloud environment to realize any number of benefits.
Critical Insight
Security remains a large impediment to realizing cloud benefits. Numerous concerns still exist around the ability for data privacy, confidentiality, and integrity to be maintained in a cloud environment.
Even if adoption is agreed upon, it becomes hard to evaluate vendors that have strong security offerings and even harder to utilize security controls that are internally deployed in the cloud environment.
Security Perception: Cloud can be secure although unique security threats and vulnerabilities create concerns for consumers.
Balancing Act: Securing an IaaS or PaaS environment is a balancing act of determining whether the vendor or the consumer is responsible for meeting specific security requirements.
Structured CSP Selection Process: Most security challenges and concerns can be minimized through our structured process (CAGI) of selecting the trusted CSP partner.
Impact and Result
The business is adopting a hosted cloud environment and it must be secured, which includes:
Ensuring business data cannot be leaked or stolen.
Maintaining privacy of data and other information.
Securing the network connection points.
Determine a balancing act between yourself and your CSP—through contractual and configuration requirements, determine what security requirements your CSP can meet and cover the rest through internal deployment.
This blueprint and associated tools are scalable for all types of organizations within various industry sectors.
The Secure Path to Value in the Cloud by Denny HeaberlinCloud Expo
Even as cloud and managed services grow increasingly central to business strategy and performance, challenges remain. The biggest sticking point for companies seeking to capitalize on the cloud is data security. Keeping data safe is an issue in any computing environment, and it has been a focus since the earliest days of the cloud revolution. Understandably so: a lot can go wrong when you allow valuable information to live outside the firewall. Recent revelations about government snooping, along with a steady stream of well-publicized data breaches, only add to the uncertainty.
In his session at 16th Cloud Expo, Denny Heaberlin, Security Product Manager with Windstream's Cloud and Data Center Marketing team, discussed how to manage these concerns and how choose the right cloud vendor, an essential part of any cloud strategy.
The top 3 security concerns for enterprises are mobile security, cloud security, and human error. Mobile security is challenging as mobile devices accessing business information can be compromised if lost or stolen. Cloud security is a concern as companies lose visibility and control over their data in the cloud. Most security breaches are caused by human error through misconfigurations, not system flaws. CIOs must implement security strategies and policies to address these growing threats to protect companies' sensitive data and systems from cyber attacks and breaches.
The document discusses smart security strategies for smart mobile devices. It defines smart mobile devices and outlines their business benefits, including increased productivity and improved customer service. However, it also notes risks like data breaches and issues around network security and managing devices. The document recommends strategies like implementing policies and standards, providing education, reviewing security regularly through audits, and recognizing that security is only as strong as its weakest link.
Full-RAG: A modern architecture for hyper-personalizationZilliz
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
UiPath Test Automation using UiPath Test Suite series, part 5DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 5. In this session, we will cover CI/CD with devops.
Topics covered:
CI/CD with in UiPath
End-to-end overview of CI/CD pipeline with Azure devops
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slackshyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer’s life and facilitate a rapid transition from concept to production-ready applications.He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
A tale of scale & speed: How the US Navy is enabling software delivery from l...sonjaschweigert1
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
TrustArc Webinar - 2024 Global Privacy SurveyTrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
UiPath Test Automation using UiPath Test Suite series, part 6DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 6. In this session, we will cover Test Automation with generative AI and Open AI.
UiPath Test Automation with generative AI and Open AI webinar offers an in-depth exploration of leveraging cutting-edge technologies for test automation within the UiPath platform. Attendees will delve into the integration of generative AI, a test automation solution, with Open AI advanced natural language processing capabilities.
Throughout the session, participants will discover how this synergy empowers testers to automate repetitive tasks, enhance testing accuracy, and expedite the software testing life cycle. Topics covered include the seamless integration process, practical use cases, and the benefits of harnessing AI-driven automation for UiPath testing initiatives. By attending this webinar, testers, and automation professionals can gain valuable insights into harnessing the power of AI to optimize their test automation workflows within the UiPath ecosystem, ultimately driving efficiency and quality in software development processes.
What will you get from this session?
1. Insights into integrating generative AI.
2. Understanding how this integration enhances test automation within the UiPath platform
3. Practical demonstrations
4. Exploration of real-world use cases illustrating the benefits of AI-driven test automation for UiPath
Topics covered:
What is generative AI
Test Automation with generative AI and Open AI.
UiPath integration with generative AI
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices desire to take full advantage of the features
available on those devices, but many of the features provide convenience and capability but sacrifice security. This best practices guide outlines steps the users can take to better protect personal devices and information.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfMalak Abu Hammad
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
Removing Uninteresting Bytes in Software FuzzingAftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing xml documents, and Binutil's readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format). Our preliminary results show that AFL+DIAR does not only discover new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024
Virtualization Security Risks
Gene Kim
CTO, Tripwire, Inc.

Practical Steps to Mitigate Virtualization Security Risks

WHITE PAPER

Configuration Control for Virtual and Physical Infrastructures
Contents

3   Executive Summary
4   The Unique Information Security Challenges of Virtualization
4   Virtualization Is Already Here
4   Gene Kim's Practical Steps to Mitigate the Security Risks of Virtualization
6   When We Hear These Things, We Know We Have a Problem
7   Securing the Virtual Machine Managers (VMMs)
7   What We Can Do About It: Seven Practical Steps
11  Business Value of Good Information Security Controls
12  Avoid the Dark Side of Virtualization with Configuration Control
12  Establishing the Known and Trusted State
12  Gaining Visibility Into and Control of Change
13  The Value of Configuration Control
13  The Role of Tripwire Enterprise in the Seven Practical Steps
16  Conclusion
2 | WHITE PAPER | Practical Steps to Mitigate Virtualization Security Risk
Executive Summary

The prospect of increased agility and the increasing cost and complexity of IT has contributed to the rapid adoption of virtualization technologies. Virtualization makes it possible to build and deploy IT releases and changes into production faster and more economically than ever. Some virtualization experts claim that virtualized environments are fundamentally no less secure than physical environments. However, others claim that virtualization can enable better security. Who is correct? Both claims can be correct, but only under certain conditions.

Every day, information security practitioners live with the reality that they are a single change away from a security breach that could result in front page news, brand damage, or regulatory fines. These issues are clearly not confined to security, but impact business at the highest level. Consequently, security practitioners strive to implement IT controls to mitigate issues such as the risk of fraud, loss of confidential customer information, disruption of critical business services and data integrity, and inaccurate financial reporting.

Security must be baked in from conception, not addressed later as an afterthought. But since virtualization is already here, what steps can we take to implement effective security controls? Where do we start, and in what order? And how do we do this in a way that creates value rather than the perception of information security creating bureaucratic barriers to getting real work done?

These are the types of questions that I've been trying to answer since 1999, when I started studying high performing IT operations and information security organizations. At this point, I can confidently say that I've seen the best and worst of information security. The high performing organizations I've studied consistently had the best security, the best compliance posture, the greatest ability to make changes quickly and successfully, and optimal efficiency.

In this paper, I describe seven practical steps that IT organizations can take to mitigate the unique security challenges of virtualization. While many of these steps are solid best practices that apply to both physical and virtualized environments, some are directed specifically at virtualized environments.

Achieving a known and trusted state is a challenging task for even the most technically adept and process-focused organizations. Tripwire, the recognized leader of configuration control with over 6,000 customers worldwide, enables organizations to fully realize the benefits of both their virtual and physical environments by ensuring that the entire data center achieves and maintains a known and trusted state. Tripwire specifically addresses the security of virtual environments with CIS- and VMware-issued policies aimed directly at securing VMware ESX Servers, the hypervisor most used to virtualize machines. In addition, Tripwire® Enterprise integrates with critical systems—such as change management and asset management solutions—allowing us to maintain full visibility and control into the data center and any changes made to it.
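The paper's central idea, maintaining a known and trusted state and detecting drift from it, can be sketched in a few lines. The Python fragment below is an illustrative simplification rather than Tripwire's actual mechanism; the function names and baseline format are invented for this example:

```python
import hashlib
from pathlib import Path


def snapshot(paths):
    """Record a baseline: the SHA-256 digest of each monitored file."""
    return {str(p): hashlib.sha256(Path(p).read_bytes()).hexdigest()
            for p in paths}


def detect_drift(baseline, paths):
    """Compare the current state against the baseline; return files that changed."""
    current = snapshot(paths)
    return sorted(f for f, digest in current.items()
                  if baseline.get(f) != digest)
```

Any file whose hash no longer matches the baseline is surfaced as a change to investigate, which is the visibility-into-change principle the paper develops, reduced to its simplest form.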
The Unique Information Security Challenges of Virtualization

Every day, information security practitioners live with the reality that they are a single change away from a security breach that could result in front page news, brand damage, or regulatory fines. These issues are clearly not confined to security, but impact business at the highest level. Consequently, security practitioners strive to implement IT controls to mitigate issues such as the risk of fraud, loss of confidential customer information, disruption of critical business services and data integrity, and inaccurate financial reporting.

Effectively balancing risk with controls is made even more difficult by the constant pressure on IT to respond quickly to urgent business needs. Most business functions now require IT in order to conduct operations. In fact, almost every business decision requires at least one change by IT—a trend that continues to grow.

The resulting need for increased agility and the increasing cost and complexity of IT has contributed to the rapid adoption of virtualization technologies. Virtualization makes it possible to build and deploy IT releases and changes into production faster and more economically than ever before. Some virtualization experts claim that virtualized environments are fundamentally no less secure than physical environments. Others claim that virtualization can enable better security. Both claims can be correct, but only under certain conditions. The reality is that when information security controls are improperly implemented or neglected in virtualized environments, real security risks and exposures are created faster than ever. This is the potential dark side of virtualization, and the information security controls that adequately controlled risks before virtualization may no longer suffice.

Virtualization Is Already Here

Virtualization enables rapid deployment of computing resources, potentially allowing insecure IT infrastructure to be deployed throughout the organization faster than ever. The unfortunate truth is that the people who deploy this infrastructure often circumvent existing security and compliance controls when doing so. Worse still, the risk these deployments introduce is only discovered when a security breach occurs, an audit finding is made, or the organization loses confidential data or critical functionality.

For better or for worse, virtualization is here. Tripwire surveyed 219 IT organizations and found that 85 percent were already using virtualization, with half the remaining organizations planning to use virtualization in the near future. Furthermore, VMware found that 85 percent of their customers are using virtualization for mission-critical production services. In other words, inadequate information security controls may already be jeopardizing critical IT services with risk introduced by virtualization.

Most information security practitioners now attribute the majority of security failures to misconfiguration resulting from human error. According to Gartner, "the security issues related to vulnerability and configuration management get worse, not better, when virtualized."1 Gartner also asserts that, "Like their physical counterparts, most security vulnerabilities [with virtual machines (VMs)] will be introduced through misconfiguration and mismanagement."2 Why? Because, among other reasons, insecure virtual server images can be replicated far more easily than before, and once deployed, require great effort to discover and bring back to a known and trusted state.

Analysts have published some startling predictions on these information security implications: Gartner predicts that "Through 2009, 60 percent of production VMs will be less secure than their physical counterparts" and that "30 percent of deployments [will be associated] with a VM-related security incident."3 The good news is that it doesn't have to be this way.

Gene Kim's Practical Steps to Mitigate the Security Risks of Virtualization

There is nearly universal agreement that information security and IT operations must properly manage virtualized servers the same way as physical servers. Security must be baked in from conception, not addressed later as an afterthought. But, if virtualization is already here, what steps can we take to implement effective security controls? Where do we start, and in what order? And how do we do this in a way that creates value rather than the perception of information security creating bureaucratic barriers to getting real work done?

These are the types of questions that I've been trying to answer since 1999, when I started studying high-performing IT operations and information security organizations.
4 | whIte PaPer | Practical Steps to Mitigate Virtualization Security risk
I can confidently say that I've seen the best and worst of information security. The high performing organizations I've studied consistently had the best security, the best compliance posture, the greatest ability to make changes quickly and successfully, and optimal efficiency.

What I learned was that high performing IT organizations have figured out how to build sustainable security controls that integrate into daily IT operational processes and deliver value to other business stakeholders. In these high performers, information security simultaneously enables the business to respond more quickly to urgent business needs and helps provide stable, secure, and predictable IT services.

How these information security organizations achieved their "good to great" transformation has been codified in the handbook Visible Ops Security: Achieving Common Security and IT Operations Objectives in 4 Practical Steps. Although Visible Ops Security is not dedicated solely to the topic of virtualization, it describes and examines the core chronic issue that exists in every IT organization, and helps explain why virtualization is so compelling. Visible Ops Security also describes why security has so much to gain or lose through virtualization, and how security can meaningfully integrate into and add value to an IT organization's virtualization strategy.

This paper describes typical information security risks that practitioners will face with virtualization, presents seven practical steps to secure the virtualized environment, and discusses the business value of implementing these steps.
When We Hear These Things, We Know We Have a Problem

Here are just a few of the real-life experiences that illustrate the issues and risks introduced by virtualization.

The Challenge: Virtualization may bypass secure configuration standards
The Reality: In an attempt to be proactive, information security spent two quarters building secure configuration standards for Windows, Linux and eight other OS platforms. But one year later, we did a vulnerability scan and we found that no one is using those configuration standards, and we're more insecure than ever. We did some analysis, and we finally figured out why: the server engineering groups were using virtualization to clone insecure builds, making us more insecure than ever.

The Challenge: Security network scans may never detect insecure virtual servers
The Reality: How did those insecure virtual servers get into production? It turns out that there are virtual machine administrators in virtually every IT group, and that they are adding and removing virtual servers on a daily basis without us even noticing, and we certainly didn't see them during our weekly network scans.

The Challenge: Information security must dig deeper to document virtualized IT environments for compliance requirements
The Reality: Now that more and more of the production applications are being deployed and run in a virtualized environment, it's getting more and more difficult to keep track of all the IT infrastructure that is deployed. Worse, with all the increased regulatory and contractual requirements for IT controls, it's getting even harder to comply. Why? We're finding out at the last minute about applications that don't meet compliance requirements for IT controls, and we as the security team have to spend more and more time in a reactive mode trying to get things in a state where the auditors won't generate more findings.

The Challenge: Information security relied upon the slow rate of physical server deployment to "stay secure"
The Reality: We used to be able to head off many of these releases and deployments at the pass when we spotted boxes of servers showing up at the loading dock, or when large cabling projects started. We'd do some digging, and usually find that there was an upcoming deployment we didn't know about. We'd then assign information security resources to do a security review to make sure the right IT controls were put in place before it went into production. Virtualization took away that safety net, and we've got to figure out how to overcome this issue.

The Challenge: High risks due to virtualization
The Reality: Some of us are starting to lose sleep at night because of the potential risk of loss of confidential information. I said, "Look, you can't put private health information out on the public Internet on this infrastructure." There are real mission-critical services that contain confidential data that must be adequately secured. But when we try to mitigate these risks, we're viewed as hysterical, paranoid, and an obstacle.
What We Can Do About It: Seven Practical Steps

The issues and indicators described above are typical when information security is not adequately plugged into IT operational and software and service development processes. These issues are all magnified by virtualization because it enables such rapid change.

Our goal is to start gaining control at the relevant parts of the virtualization lifecycle, and start generating value for the relevant parties. By doing this, we replicate the observed high performing attributes, which are codified in Visible Ops Security:

• Business aligned – High performing information security teams understand how security advances and protects business goals. Low performing teams focus on things the business doesn't care about, like the improbable or irrelevant and other technological minutia. Often other groups in the organization consider these low performing teams paranoid and fear-mongers.

• Plugged in – High performing information security teams integrate into the right functional groups even though they don't have direct operational responsibility. Low performers aren't present where the work is done and often expend effort helping the wrong people, reinforcing the perception that information security is irrelevant.

• Adding value – High performing information security teams provide value to business and IT process owners, and they know what they need from these process owners in return. Low performers don't help advance the operational objectives of their colleagues, nor do they clearly articulate what they want people to do differently to meet information security requirements. Consequently, these low performers are often viewed as incompetent.

The following steps are adapted from Visible Ops Security. More information on the handbook is available at the end of this paper.

Securing the Virtual Machine Managers (VMMs)

The first three steps focus on gaining situational awareness and controlling configurations and changes at the virtualization layer—the virtual machine managers (VMMs), hypervisors, and host OSes.

Step 1: Gain Situational Awareness

Our first required step is to build situational awareness, defined in military parlance as "the ability to identify, process, and comprehend the critical elements of information about what is happening to the team with regard to the mission."4 In the virtualization context, we must learn where virtualization technologies are being used, what they are being used for, and who is responsible for their management. This information will help us answer these important questions:

• What IT services are being enabled by virtualization (e.g., e-commerce, point of sale, financial reporting, order entry, etc.)?

• Who are the business and IT units, and how are they organized (e.g., the centralized IT services group, an IT outsourcer, etc.)?

• What are the relevant regulatory and contractual requirements for the business process enabled by virtualization (e.g., SOX-404, PCI DSS, FISMA, etc.)?

• What are the technologies and IT processes being used (e.g., VMware Fusion, Citrix XenServer, Microsoft Virtual Server, etc.)?

• Are there any high-level risk indicators from the past (e.g., repeat audit findings, frequent outages, etc.)?

With answers to these questions, we can establish an opinion on the magnitude of the business and technology risks so that we can better prioritize our efforts.

Step 2: Reduce and Monitor Privileged Access

Once we know where virtualization technologies are being used and who is responsible for managing them, we must integrate information security into the access management procedures. Our goal is to reduce access wherever possible and to ensure that some form of effective access control exists.

Excessive access and privileges make it possible for people to make uncontrolled changes to critical systems. Such changes expose IT systems to risk of disrupted service and create unnecessary vulnerabilities for malicious and criminal acts that could jeopardize the organization. The potential for introducing risks from excessive user access and privileges is especially evident in the VMM. Often the VMM resides on a host operating system, which has privileged user accounts that can modify security configuration settings, modify virtual machines, as well as activate and deactivate virtualized computing environments.
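A detective control for these privileged accounts can be approximated with a periodic reconciliation of the VMM's account list against the roster of authorized administrators. The following is a minimal sketch, not a production tool — the account names and set-based roster format are invented for illustration, and a real implementation would pull accounts from /etc/passwd, LDAP, or Active Directory:

```python
def reconcile_privileged_accounts(current: set, authorized: set):
    """Split the VMM's privileged accounts into ghosts and missing entries.

    ghosts:  accounts present on the VMM with no authorized owner
             (candidates to disable or delete)
    missing: authorized staff with no account (not yet provisioned,
             or already revoked)
    """
    ghosts = current - authorized
    missing = authorized - current
    return ghosts, missing

# Hypothetical data: roster from HR, accounts read from the VMM host
authorized_admins = {"vm_admin_chen", "vm_admin_patel", "svc_backup"}
current_accounts = {"vm_admin_chen", "svc_backup", "jdoe_temp"}

ghosts, missing = reconcile_privileged_accounts(current_accounts, authorized_admins)
# ghosts -> {"jdoe_temp"}: cannot be reconciled to authorized staff
# missing -> {"vm_admin_patel"}
```

Run on a schedule and fed by HR hires and terminations, a check like this also supports periodic re-accreditation reviews.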
The necessary actions in this step include implementing the following preventive controls:

• Document all the virtualization administrators who have privileged access to the VMM and ensure we can reconcile them back to authorized staff. Any ghost accounts that cannot be reconciled to authorized staff should be disabled or deleted.

• Work with virtualization managers to reduce the number of administrators to the minimum needed. This number will vary, but we know that if we have 25 administrators, we have too many—in other words, everybody has root. On the other hand, one is too few because if something happens to that one person, nobody can get root.

• Ensure that when personnel changes occur from actions such as hiring, firing, and transferring, access is appropriately revoked or assigned.

To ensure that these preventive controls are working, we must have the following detective control requirements:

• Monitor privileged VMM user account adds, removes and changes, wherever they are stored (e.g., /etc/passwd, LDAP, Active Directory). Privileged accounts include service accounts that perform system maintenance tasks, such as backup accounts and accounts that manage the enterprise batch scheduler.

• Reconcile each privileged VMM user account add, remove and change with an authorized change order from the virtualization manager. This reconciliation process may be manual (a signed paper form) or automated (for example, a BMC Remedy work order).

• Reconcile each VMM privileged account with an authorized user. For example, reconcile the account with an HR record. Alternatively, reconcile the accounts with an authorized service; for example, an authorized Unicenter backup program.

• Routinely re-accredit accounts—quarterly or yearly, depending on turnover—to ensure that management can reconcile privileged accounts to reports from HR and payroll.

Step 3: Define and Enforce Virtualization Configuration Standards

As with any complex application, VMMs have configuration and logical security settings that are designed to limit the risk of human error, fraud, and security incidents by ensuring that the technology only performs as designed. Examples include proper password settings for the system BIOS, hypervisor host operating system settings and permissions, network configuration settings, and virtual machine policies. However, risk can be introduced if these settings are improperly configured.

Our goal in this step is to ensure that all these VMM configuration settings are properly defined, implemented and verified. We can use guidance from respected third parties and vendors, including:

• VMware ESX Server 3.x Benchmark Version 1.0, from the Center for Internet Security (CIS)

• VMware Infrastructure 3, Security Hardening, from VMware

In order to operationalize this, we need the following preventive controls:

• Work with IT management and virtualization managers on a policy that defines which virtualization security standard(s) should be used.

• Mandate that all virtualization technologies use these secure configuration settings, and create a plan for deploying virtualization technologies with the secure settings.

• Define a time limit for initial implementation and express expectations around how quickly corrective actions must be taken when configurations are non-compliant.

In order to ensure that these VMM controls function correctly, we must have the following detective control requirements:

• Assess and continuously monitor VMM configuration settings wherever they are stored (for example, Unix or Windows files, or Windows registry settings).

• Test configuration settings against your internal security policies, external compliance requirements, and industry best practices. Report on any variances.

• Verify that corrective actions for non-compliant configurations are properly implemented in the required timeframe.

Securing the Virtual Machines

The remaining four steps address gaining control over configurations and changes at the virtual machine layer, which includes the guest OSes, the VMMs, and the applications they run.

Step 4: Integrate and Help Enforce Change Management Processes

Once VMMs are in a known and trusted state, all changes made to the VMM should be authorized, scheduled, and substantiated by change management.
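Substantiating that changes stayed within policy ultimately reduces to comparing current VMM settings against the approved configuration. A minimal sketch follows — the setting names and values are placeholders, not any vendor's checklist:

```python
def variance_report(current: dict, approved: dict) -> dict:
    """Report every setting that departs from the approved configuration."""
    variances = {}
    for setting, expected in approved.items():
        actual = current.get(setting, "<missing>")
        if actual != expected:
            variances[setting] = (expected, actual)
    return variances

# Hypothetical hardening settings for a VMM host
approved = {"ssh_root_login": "no", "min_password_len": 12, "telnet_enabled": "no"}
current = {"ssh_root_login": "yes", "min_password_len": 12}

report = variance_report(current, approved)
# report -> {"ssh_root_login": ("no", "yes"),
#            "telnet_enabled": ("no", "<missing>")}
```

Each entry in the report is either an authorized change to reconcile with a change order or a policy violation to correct.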
We will do this by:

• Helping assess the potential impact of changes on information security and operations.

• Improving procedures for change authorization, scheduling, implementation, and substantiation.

• Ensuring that change requests comply with information security requirements, corporate policy, and industry standards.

To accomplish these objectives, we must:

• Get invited to the Change Advisory Board (CAB) meetings. These meetings are the forums for assessing the risks of proposed changes, approving or denying change requests, reviewing the status of changes being planned, agreeing on implementation schedules, and reviewing the success of implemented changes. By being part of these meetings we ensure that we have a say in the review and approval process for VMM changes and that VMM changes are subject to the change approval process.

• Build and electrify the fence. We need a detective control that assesses configuration settings against internal and external standards, gives visibility to changes made in the VMM, helps determine whether the change was properly authorized and conforms to required standards, and provides relevant forensics data to support an investigation in the event of a security breach.

• Ensure "tone from the top" and help define the consequences. Auditors use the term "tone from the top" to express the fact that words and actions from the boardroom on down set the tone for the behavior of everyone in the enterprise. We must convince top management to set the appropriate tone from the top regarding information security as: "The only acceptable number of unauthorized changes is zero. Senior executives will not tolerate people circumventing the change management process." Your Chief Information Officer (CIO) or VP of Operations may be able to accomplish this by simply sending an e-mail message to all organizational units that expresses the zero tolerance policy, explains the potential damage unauthorized change can cause, and specifies the consequences for those who intentionally circumvent policy.

• Substantiate that the electric fence is working. To prove compliance with change management processes, we need to prepare for audits in advance. We'll need the following evidence: change requests and their approvals, changes detected on all relevant IT systems, reconciliations of detected changes to approved change requests, and any corrective actions undertaken for unauthorized changes.

By taking these actions, we will have integrated information security into the necessary preventive change management processes. We also will have created detective controls to ensure that those preventive controls are working. And we will have created evidence proactively to prove to auditors that effective change controls exist.

Step 5: Create a Library of Trusted Virtualized Server Builds

Virtualization makes it easier and faster than ever to deploy infrastructure on demand and without adequate controls. The results of these rapid, poorly controlled deployments include security breaches, compliance and audit findings, and other potential negative outcomes.

In this step, we create a library of known, trusted, and approved virtual images that can be used and re-used, making it easier to deploy an authorized, secure configuration rather than an unauthorized, insecure configuration. These secure builds combine mandatory and recommended configurations to reduce the likelihood of operational and information security failures that create vulnerabilities—vulnerabilities that an intruder can exploit.

To create this library of trusted builds, we must document the standards we will apply and maintain for these builds. This requires us to:

• Develop standards that specify how to secure and harden the builds we release into production or check into the definitive software library (DSL). Configuration standards for information security are published by trusted external organizations such as the Center for Internet Security (CIS), DISA, the SANS Institute, and virtualization vendors. As these external standards evolve, we will revise existing documents and/or create new ones so that they can be used across the enterprise.

• Work with the server provisioning and virtualization teams to build a library of standardized and secure virtualized server builds. We will want to integrate independent configuration standards and checklists, as well as take the standard steps of reducing security risks by:

− Turning off unnecessary features and modules that are enabled by default

− Disabling unneeded services (e.g., HTTP, DNS, and SMB)

− Disabling unneeded open network ports
− Deleting or disabling unnecessary user accounts

− Changing default passwords

• Ensure that necessary passwords are changed before systems move from development to production—for example, developers who know ODBC and application passwords for a new order entry system no longer need these passwords when the system enters production.

• Include standard monitoring agents in each trusted build.

Once we've defined the policies and standards that create the library of approved virtual image builds—our preventive controls—we need the following detective controls to ensure the preventive controls are working:

• Verify virtual image configurations against known internal and external standards to ensure they are in a known, approved, and secure state.

• Monitor the approved virtual image library to ensure that all adds, removes, and changes conform to internal and external standards.

• Reconcile all adds, removes, and changes to an authorized change order. Reconciliation may be done manually or may be automated. For example, we could reconcile manually with a signed change order from a virtualization manager, or automatically by reconciling with a BMC Remedy work order.

Step 6: Integrate Into Release Management Testing and Acceptance Procedures

To better safeguard the production environment, information security requires standardization and documentation, implementation controls like checklists, and continual control of production variance. Release management shares many of these key objectives. While development often focuses on specific components, release management focuses on collections of components and whether the components work together. In this step, we engage with release management to ensure that they take information security requirements into account when testing release packages.

Release management is often driven by checklists and templates, so we must ensure that security requirements are added to their lists. In order to do this, we must:

• Develop templates for release management and interface with them, QA, and project management to ensure that information security and regulatory compliance requirements are methodically collected at the start of each project.

• Establish an agreed-upon protocol for when and how release management should engage information security. This protocol should include criteria such as those defined in Step 2 with the Project Management Organization (PMO)—for example, when releases include code that involves authorization, encryption, financial transactions, and compliance requirements.

• Integrate automated security testing tools into the release testing process and run these tools against code, builds, and releases. Use vulnerability scanning and management testing tools, even if they could potentially crash applications during testing—it's better to find vulnerabilities in pre-production rather than in production. Use the same tools in the pre-production and production environments to prepare IT operations for potential problems when these tools are used in the production environment.

• Use detective controls to compare releases and virtual images being deployed against known and trusted states to mitigate the risks introduced by human error, missed steps, misconfigurations, and other sources.

In some situations, the security testing conducted by QA is sufficient for us to approve a release. In other cases, we need to conduct independent security testing. In either case, arming QA with the same tools we use reduces findings for security testing because corrections are made by QA—typically at a lower cost, with less stress, and with higher success rates for releases.

The preventive controls are the release testing protocols, including checklists and test procedures. To ensure that these preventive controls are working, we need the following detective controls:

• Verify that deployed image configurations match the approved and tested builds. In other words, make sure approved and tested builds are in a known, approved, and secure state by testing them against known internal and external standards.

• Detect all changes made to the test environments.

• Reconcile changes to an authorized change order either manually or automatically. For example, reconcile with a signed change order or with a BMC Remedy work order.

Step 7: Ensure Virtualization Activities Go Through Change Management

Information security must work with change management and the virtualization managers to ensure that activating and deactivating a virtual computing environment is defined as a change.
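The reconciliation this implies — every observed activation or deactivation must match an authorized, scheduled change — can be sketched in a few lines. The event and schedule formats below are assumptions for illustration, not any vendor's API:

```python
def find_unreconciled(events, scheduled_changes):
    """Return activation/deactivation events with no authorized,
    scheduled change behind them (candidates for sprawl)."""
    return [e for e in events if e not in scheduled_changes]

# Hypothetical inputs: events observed on the VMM, approved schedule
scheduled = {("vm-orderentry-03", "activate"),
             ("vm-reporting-01", "deactivate")}
observed = [("vm-orderentry-03", "activate"),   # reconciled
            ("vm-test-xyz", "activate")]        # undocumented activation

violations = find_unreconciled(observed, scheduled)
# violations -> [("vm-test-xyz", "activate")]
```

Anything this check flags is either an emergency change to be documented after the fact or virtualization sprawl to be investigated.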
Consequently, these actions must be treated as any change would be—they must be authorized, scheduled, and audited by change management.

To underscore why virtualization actions should be viewed as a type of change, consider the following scenario: A business has an application critical for a revenue-generating business process. This business process is in scope for SOX-404, and the potential consequences of an unauthorized deactivation of the computing environment could include jeopardizing financial reporting, revenue, and information security objectives. Clearly, this type of change must be authorized and scheduled before being implemented.

In addition to stating this policy requirement about what change is in the context of virtualized environments, information security must work with IT management to set the "tone from the top" about zero tolerance for unauthorized changes. Information security, change management, and virtualization managers will likely need to answer the following questions and be in agreement on their answers:

• Under what conditions are virtual machine activations, deactivations, and restarts changes that require approval? Consider, for example, whether changes require approval if the change delivers a new IT service, enables a service that has security or regulatory requirements, or introduces outage risk to a mission-critical service.

• Who must approve standard and emergency changes for virtual machines?

The preventive controls are the policies that define how virtualization actions should interface with change management processes. In order to ensure that these controls are working, we will also need a corresponding detective control to substantiate that the policy is being followed. This detective control will monitor all virtualization activations and deactivations and ensure that they are reconciled with an authorized and scheduled change. Such monitoring lets information security ensure that virtualization activity that could introduce information security risk is adequately reviewed and mitigated. In addition, monitoring and vetting activations and deactivations helps control unauthorized virtualization sprawl—the uncontrolled and unauthorized activation of virtual servers released undocumented into the computing environment.

Business Value of Good Information Security Controls

The 2006 and 2007 ITPI IT Controls Performance Studies were conducted to establish the link between controls and operational performance. The studies revealed that, in comparison with low-performing organizations, high-performing organizations were more effective and efficient. The studies found that the same high performers have superior information security effectiveness. The 2007 IT controls study found that when high performers had security breaches:

• The security breaches were far less likely to result in events such as financial, reputation, and customer loss. High performers were half as likely as medium performers and one-fifth as likely as low performers to experience security breaches that result in loss.

• The security breaches were far more likely to be detected using automated controls instead of finding breaches through external sources such as newspaper headlines or a customer complaint. High performers automatically detected security breaches 15 percent more often than medium performers and twice as often as low performers.

• Security access breaches were detected far more quickly. High performers had a mean time to detect measured in minutes, compared with hours for medium performers and days for low performers.

However, the value of good information security is not just about better loss recovery capabilities. Instead, when information security controls are built into daily IT operations, the entire IT organization is more effective and efficient. These high performing IT organizations also had the following attributes:

• Production system changes fail half as often.

• Releases cause unintended failures half as often.

• Emergency change requests occur with one quarter the frequency.

• Repeat audit findings occur with one quarter the frequency.

• Unplanned work and firefighting is cut in half.

• Server-to-system-administrator ratios are two times higher.

In short, these studies confirmed that high performing IT organizations have figured out how to simultaneously advance the goals of information security and IT operations. These IT organizations take proactive and decisive steps to promote teamwork. The information security group in these organizations works
with IT operations to manage production systems efficiently and securely, integrates with development to streamline the introduction of new systems into production, and properly manages risks to systems without introducing unnecessary controls or significantly impeding development efforts.

Avoid the Dark Side of Virtualization with Configuration Control

So far we've discussed and presented the preventive and detective controls that information security can use to secure and control virtual environments. We've also discussed the characteristics of high-performing organizations. One of these characteristics is that these organizations take a proactive approach that builds information security into IT operations; namely, by employing many of the preventive and detective controls described in the seven practical steps to secure virtualized environments.

The preventive controls described are more hands-on activities that we must undertake, such as attending CAB meetings, determining policy, establishing protocol, and helping set "tone from the top." The detective controls described in the seven steps equate to configuration control—the process of assessing the state of IT configurations against known standards and effectively combining that assessment with change auditing to maintain a known and trusted state throughout virtual and physical environments.

Applying these detective controls to the physical environment is a known best practice, but it is every bit as critical that we apply these controls to secure the virtual environment. Only by doing this can we avoid the dark side of virtualization and fully realize the cost savings and flexibility virtual environments offer.

Establishing the Known and Trusted State

Mature organizations tend to have some set of policies against which they test the configuration of their data center to determine the "goodness" of the data center's state. Business needs, performance requirements, regulations, and any number of internal and external forces may drive these policies. With mandates to secure personal and confidential data coming from regulations such as PCI, HIPAA, Sarbanes-Oxley, and others, standards-developing organizations (CIS, NIST, DISA, and vendors) have defined industry standards. These standards define specific benchmarks against which configurations may be tested and constitute best practices for both securing the data center and optimizing operations.

Configuration assessment is the process in configuration control of assessing and validating the state of the data center's configurations by proactively testing against these industry standards, as well as against an organization's internal policies and best practices. Configuration assessment compares the settings and configurations of IT infrastructure elements—for example, minimum password length, directory permissions, and network security settings—against those settings and configurations defined by these policies. The result of the assessment should be a report that indicates how each element measures up, along with detailed, actionable information that points out what configuration setting(s) specifically cause an element to be out of compliance with internal or external policies. Armed with this information, the organization can either adjust individual policies to better reflect its goals and needs, or correct the settings for any out-of-compliance elements. Once compliance issues have been addressed and corrected, another configuration assessment should be automatically run to verify that the data center configuration has achieved a known and trusted state.

Gaining Visibility Into and Control of Change

The moment IT puts a system or device into production is the moment potential change can occur, so just because the data center achieves a known and trusted state following the configuration assessment doesn't ensure it will maintain that state. In fact, it's almost a given that within weeks, configurations and settings for most data center elements will have departed from that state.

Security professionals widely recognize that IT configuration integrity—having the data center in a known and trusted state—is fundamental to a sound security strategy. Change auditing gives visibility into, and control over, the configuration changes that cause data center elements to depart from a known and trusted state. To provide that visibility and control, the configuration control solution must take a snapshot, or establish a baseline, of the data center configurations in a known and trusted state.
13. between the baseline setting state and the new, changed set-
ting, confirming if the change was within policy and authorized.
Depending on the severity and priority of the differences, the solution notifies appropriate individuals through a variety of alert types, and in some cases may even automatically roll back changes to the previously known, good state.

In addition, the configuration control solution generates reports that flag the detected differences resulting from changes in configuration files or system files. These reports provide relevant information in an appropriate format to each person along the management chain. For example, the CISO may receive a dashboard report that gives him or her an at-a-glance sense of the overall health of the data center. The further down the management chain, the more detailed the information, down to the point where the report identifies the specific configuration that changed, its current value, and its expected value—details that allow technical staff to go in and immediately correct an issue caused by improper and unauthorized change.

The Value of Configuration Control

With the complex nature of today’s data center, which now increasingly includes virtualized environments, having a single point of control for gaining visibility into and ensuring consistency of IT system configurations is a must. A single point of control for configuration assessment and change audit ensures that in the face of dynamic environments—application upgrades, automatically installed patches, user-made system setting changes, virtual machine managers, and virtual machines—we can still deliver high availability and performance and comply with operational and security standards. Configuration control helps organizations:

• Mitigate security risk from both internal and external sources;
• Lower costs by optimizing IT infrastructure resources;
• Reduce unplanned work for IT staff, freeing them for more strategic projects;
• Increase availability by identifying potential issues before they cause outages;
• Speed mean time to repair (MTTR) with details that let IT zero in on the exact cause of an issue; and
• Reduce the time, effort, and cost of regulatory compliance audit activities.

The Role of Tripwire Enterprise in the Seven Practical Steps

As I mentioned at the beginning of this section, the detective controls in the seven practical steps for securing the virtual environment include configuration control. The role of configuration control is to:

• Allow organizations to enforce their configuration policies, so that all configurations within the hypervisor and virtual machines comply with internal and external policies. External policies include configuration requirements as described by CIS, VMware, and DISA, and also include operational policy to ensure performance and availability of IT systems.
• Help develop a library of trusted virtualization builds, and help monitor testing and production environments against these builds to ensure images match.
• Ensure that activating and deactivating a virtual machine and/or hypervisor is captured as a change and is therefore subject to the rigors of the change and configuration management processes.

Tripwire Enterprise is the recognized leader in configuration control, and in the next sections we’ll revisit each of the prescriptive steps to see how Tripwire helps address those detective control requirements.

Step 1: Tripwire’s Role in Gaining Situational Awareness

Getting a complete picture of your data center elements is best accomplished through an asset management application. From there, configuration control steps in, harvesting data about these elements, including system files, configurations, and all their associated settings and metadata. This is the starting point for implementing configuration control.

If Tripwire Enterprise agents are included in standard VM templates and are configured to start automatically, they will notify the TE Console when a new standard build is deployed. Tripwire Enterprise can integrate with asset management systems, capturing data about all elements in the data center—the servers, routers, switches, applications, hypervisors, databases, and more—and capturing the details about any settings applied to the configurations and system files associated with those
elements. These details are crucial for creating an image of the entire data center that is used by configuration assessment and change auditing solutions. Tripwire Enterprise captures data on virtual environment elements, including the virtual machine manager (VMM), sometimes called the hypervisor depending on the particular implementation. It also captures data on guest OSes and any applications and databases installed on top of them.

In order to manage your virtual environment, you first need to determine what virtual elements are out there. Tripwire helps capture that virtual information, and it captures the same information about the physical elements from one point of control.

Step 2: Tripwire’s Role in Reducing and Monitoring Privileged Access

While organizations often have strictly enforced policies about who can create and modify physical elements of the data center, they often fail to take the same strict approach to managing the introduction and modification of virtual elements. These virtual elements probably warrant greater attention and enforcement of access policy because when someone makes an undesirable change to a single virtual element, that change can ripple through and impact the other virtual machines running on the same host machine. Undesirable change in virtual environments can cause problems that spread exponentially.

Tripwire Enterprise helps monitor user access to physical and virtual machines. With Tripwire, IT can manage user account adds, removes, and changes, and reconcile them with authorized change orders from virtualization managers. By monitoring the permissions, groups, and access controls in VMware ESX Server and within virtual machines, Tripwire enables organizations to gain better visibility and control over their virtual environments. In addition, if you use a tool such as Active Directory to manage your VMware administrator group, Tripwire Enterprise also monitors for changes such as adds, removes, and modifications within the Active Directory grouping to ensure visibility into and control of access and permissions change.

Step 3: Tripwire’s Role in Defining and Enforcing Virtualization Configuration Standards

In this step, we use configuration assessment to proactively establish that the virtual elements of the data center—the VMM, hypervisor, host OS, guest OS, and applications—are initially configured according to industry standards. It is also important, and sometimes more valuable, to ensure that the virtual elements comply with our own organizational policies.

First, Tripwire establishes a trusted state by performing a comprehensive configuration assessment of our virtual—and physical—elements against these defined policies and their associated benchmark tests. Once we’ve established the trusted state, we automatically take a snapshot of the configurations in this known, good state. We’ve now created a baseline against which to test the configurations in the future.

Next, Tripwire Enterprise automatically combines the result of configuration assessment with change auditing to monitor these virtual elements for change that departs from that known and trusted state. When Tripwire detects undesirable, out-of-compliance change, it alerts IT and provides reports they can drill down into for detailed information about specifically what changed, when it changed, and who made the change. Once corrective measures have been taken, IT can follow up to see if these measures were taken in the required timeframe. Tripwire can even roll back the change or provide prescriptive remediation guidance to keep the virtual element in compliance.

Tripwire provides over 100 out-of-the-box assessments against policies issued by the Center for Internet Security (CIS) and other security standards-developing organizations such as NIST and DISA. These assessments contain thousands of tests against standards benchmarks designed to ensure the integrity and security of the data center, including its virtual elements. Tripwire even includes the VMware ESX Server 3.x Benchmark Version 1.0 defined by CIS, and VMware Infrastructure 3, Security Handling defined by VMware—specific policies designed for VMware ESX Server, the most popular hypervisor on the market. These assessment tests serve as a jumpstart for organizations with incomplete or no security policies, but may also be modified by more mature organizations with greater experience defining sound security policy to meet specific business objectives and regulatory requirements.

Step 4: Tripwire’s Role in Integrating and Helping Enforce Change Management Processes

In this step, we assume that the VMMs are in a known and trusted state. Now configuration control enables us to determine what change was made to the VMM by continuously comparing the configuration in a previously known and trusted state against the current configuration. When change is detected that fails to comply with standards and policy, appropriate staff are notified and forensics data is collected to provide an audit trail
in the event a security breach or compliance audit occurs. This audit trail also provides IT critical information for remediating the issue and identifying the who, what, when, and where of the change that circumvented policy and introduced risk.

Tripwire’s configuration control solution continuously monitors the entire enterprise data center for changes that take IT systems out of a known and trusted state. When unauthorized or detrimental change occurs, including change to VMMs and other virtual elements in the data center, Tripwire immediately flags the change for further investigation and provides very specific information on what values changed within what files or configurations.

Tripwire also helps close the loop on the change management process by helping IT reconcile changes discovered through continuous monitoring with authorized change tickets. When changes are made that circumvent the change management process, Tripwire provides details about who made the change so senior executives can enforce the zero-tolerance policy for unauthorized changes.

Tripwire further generates reports that serve as an audit trail for change to data center elements, so that when the organization is audited, it can easily prove compliance with relevant standards.

Step 5: Tripwire’s Role in Creating a Library of Trusted Virtualized Server Builds

Configuration assessment helps you create this library of trusted and approved virtual images. By testing these virtual images against security, industry, and internal policy, IT can verify that the images comply with corporate standards. When new virtual machines are deployed based on these trusted builds, the likelihood of introducing risk decreases.

Tripwire’s out-of-the-box configuration assessments provide thousands of tests needed to verify that a specific build is trusted. These tests include benchmarks developed by CIS and VMware that specifically harden the builds for virtualized machines based on VMware ESX Server. Tripwire also includes a comprehensive library of tests for regulatory compliance with the Payment Card Industry Data Security Standard (PCI DSS), Sarbanes-Oxley (SOX), and the Federal Information Security Management Act (FISMA). In addition, Tripwire enables organizations to modify or disable specific out-of-the-box assessments to meet their particular needs. Tripwire’s Golden Policy feature even captures and preserves “golden” internal best-practice IT configuration settings as defined by the organization. By testing all configurations against these benchmarks, IT can easily create a library of trusted virtual images that may be used each time a new virtual machine is deployed. And, because Tripwire monitors all change, it can monitor the approved virtual image library to make sure that when images are added, removed, or modified, these changes are authorized.

Step 6: Tripwire’s Role in Integrating into Release Management Testing and Acceptance Procedures

Emulating the production environment is one of the greatest challenges faced by release management. To ensure that the production environment functions as expected after a release, release testing must be done on a configuration that exactly matches the production configuration. By testing the configuration of the testing environment against the production environment configuration, IT can detect and correct any small differences in the testing environment that invalidate testing and cause a release to fail.

Tripwire helps release management configure their testing environment so that it exactly matches the production environment they are trying to emulate. Release management can then assess their testing environment against that trusted state, eliminating the many small variables that can invalidate testing or cause their patch, system update, or other release activity to fail. When QA testers use Tripwire to verify that the testing environment matches the production environment, releases succeed and security issues are discovered before they impact production systems. In addition, Tripwire detects any release as a change to the production environment, verifying that the release does not jeopardize the known and trusted state of the data center.

Step 7: Tripwire’s Role in Ensuring Virtualization Activities Go Through Change Management

Change to virtual machines must follow established change management processes and workflow if organizations are to avoid virtual sprawl. The best way to ensure that they follow these processes is by viewing activation, de-activation, and configuration modifications as change that must be subject to change management processes. Configuration control ideally detects these changes in virtual environments and provides sufficient detail to reconcile changes made with authorized change requests.
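The reconciliation idea—matching each detected change against an approved, scheduled change request—can be sketched generically. This is an illustration rather than Tripwire's actual mechanism; the ticket fields (`element`, `approved`, `scheduled`) and the 24-hour change window are hypothetical assumptions for the example.

```python
from datetime import datetime, timedelta

def reconcile(detected_changes, change_tickets, window_hours=24):
    """Match each detected change to an approved ticket for the same element
    whose scheduled time falls within the change window; anything unmatched
    is flagged as unauthorized for follow-up."""
    authorized, unauthorized = [], []
    for change in detected_changes:
        match = next(
            (t for t in change_tickets
             if t["element"] == change["element"]
             and t["approved"]
             and abs(change["when"] - t["scheduled"]) <= timedelta(hours=window_hours)),
            None,
        )
        (authorized if match else unauthorized).append(change)
    return authorized, unauthorized
```

For example, a change detected on a VM with an approved ticket in its window would land in the authorized list, while a change to a hypervisor with no matching ticket would be flagged for investigation.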
Tripwire monitors the entire data center and detects changes made to any elements, including the VMM, hypervisor, host OS, guest OS, and almost all applications or services running on virtual machines. By comparing detected change against scheduled, authorized changes, Tripwire is able to determine if a change is unauthorized and, if so, notifies appropriate IT staff. Because Tripwire defines activating and de-activating virtual machines and configuration modifications as changes, it detects these activities, reconciles them with a change ticket, and additionally determines whether the change complies with security, compliance, and operational policy. When change is unauthorized, Tripwire provides information on who made the change, so senior executives can take appropriate action to discourage further circumvention of change processes and enforce the zero-tolerance policy for unauthorized change. And when changes introduce risk into the environment, Tripwire alerts IT to the issue and provides detailed information so they can immediately correct it.

Conclusion

When organizations use the detective controls offered by Tripwire’s leading configuration control solution in conjunction with the preventive controls described in the seven practical steps, they avoid the dark side of virtualization and experience the benefits of a secure data center: increased availability, decreased time for recovery, reduced unplanned work, higher performance, decreased risk, less time and effort for audits, and overall lower costs to deliver IT services. And they can accomplish all of this through a single point of configuration control—Tripwire Enterprise.

About Gene Kim

Gene Kim is CTO and founder of Tripwire, Inc. In 1992, he co-authored Tripwire® while at Purdue University with Dr. Gene Spafford. Since then, Tripwire solutions have been adopted by over 6,000 enterprises worldwide. Gene began studying high-performing IT operations and security organizations in 1999, which led him to co-found the IT Process Institute (ITPI) in 2004. In conjunction with the ITPI, Gene co-authored “The Visible Ops™ Handbook: Implementing ITIL in 4 Practical and Auditable Steps,” which has sold over 75,000 copies.

He was a principal investigator on the IT Controls Performance Study project, and in 2008 co-authored “Visible Ops Security,” a handbook describing how to link IT security and operational objectives in four practical steps by integrating security controls into IT operational, software development, and project management processes.

Gene currently serves on the Advanced Technology Committee for the Institute of Internal Auditors, where he is part of the GAIT task force, which has created guidance on how to scope IT general controls for SOX-404. In 2007, Gene was presented the Outstanding Alumnus Award by the Department of Computer Sciences at Purdue University for achievement and leadership in the profession.

About Visible Ops Security

Visible Ops Security derives from years of operational experience, customer engagements, and rigorous research and benchmarking performed by the IT Process Institute. Working with top-performing organizations to tease out what differentiates them from medium and low performers, Visible Ops Security has found that high-performing security teams have unique cultural characteristics. Based on this research, Visible Ops Security identifies four phases for integrating information security into development and operations so that it becomes business as usual. The steps for each phase offer a prescriptive sequence of measurable actions, supported by true-life examples that readers can easily identify with and use to help build momentum and support. By working together, development, security, and IT are in a better position to achieve common objectives and demonstrate business value. For more information on Visible Ops Security and the ITPI, visit http://www.itpi.org.

1 Gartner, Inc., “Security Considerations and Best Practices for Securing Virtual Machines,” Neil MacDonald, March 2007.
2 Gartner, Inc., “How to Securely Implement Virtualization,” Neil MacDonald, November 2007.
3 Ibid.
4 http://www.dirauxwest.org/tCtf/situational_awareness5.htm