The document discusses vFabric Data Director, a platform that provides database-as-a-service capabilities. It enables database-aware virtualization, automates database lifecycle management, and provides self-service database provisioning. This reduces costs while improving agility, automation, and service quality for database management.
Technology solutions for Copy Data Management - Jürgen Ambrosi
The growth of data held on corporate systems forces IT to contend with the resulting increase in complexity and in the cost of keeping technology up to date.
One of the main drivers of data growth, however, is the ever more frequent demand for fast, reliable copies of that data, whether for new projects or for routine needs such as testing, archiving, backup, disaster recovery, and reporting. Moreover, there is often no full control over who can access the storage to make such copies, with clear exposure to the risk of data theft.
To meet this need, Veritas presents Velocity, its Copy Data Management solution, which delivers data copies quickly with automated, controlled access, avoiding needless copy sprawl and the resulting exposure to fraud.
The webinar series in collaboration with Veritas Technologies continues.
In this second session we looked at Veritas's Software Defined Storage solutions.
IT is today one of the business areas most affected by exponential data growth. As a result, IT managers face rising costs and complexity when implementing storage solutions to contain growing data volumes.
At the same time, they must choose solutions able to meet the ever-higher performance levels demanded by new business applications while preserving the functionality of legacy ones.
Deploying high-performance NAS hardware or adopting a patchwork of storage types is no longer ideal in terms of cost and management impact. New technologies, developed precisely to improve efficiency and contain costs, make it possible to build infrastructures that maximize the use of the storage already present in the data center and enable the adoption of Object Storage solutions.
To this end, Veritas presents its line of Software Defined Storage solutions.
The 10 Best PostgreSQL Replication Strategies for Your Enterprise - EDB
This webinar will help you understand the differences between the various replication approaches, recognize what each strategy requires, and get a clear picture of what can be achieved with each one. You should then be better placed to determine which PostgreSQL replication types your system really needs.
- How physical and logical replication work in PostgreSQL
- Differences between synchronous and asynchronous replication
- Advantages, disadvantages, and challenges of multi-master replication
- Which replication strategy is better suited to different use cases
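The first two bullets above can be made concrete. As a hedged sketch (host, user, and publication names are illustrative, not from the talk): physical streaming replication is configured on the server side, logical replication is driven by SQL commands, and `synchronous_standby_names` is what turns asynchronous streaming into synchronous.

```sql
-- Physical (streaming) replication: postgresql.conf on the primary
-- wal_level = replica
-- synchronous_standby_names = 'FIRST 1 (standby1)'  -- omit for asynchronous

-- Logical replication (requires wal_level = logical on the publisher):
CREATE PUBLICATION my_pub FOR ALL TABLES;          -- run on the publisher

CREATE SUBSCRIPTION my_sub                         -- run on the subscriber
    CONNECTION 'host=primary.example dbname=app user=repl'
    PUBLICATION my_pub;
```

With `synchronous_standby_names` set, a commit waits for the named standby to confirm receipt; without it, replication is asynchronous and the primary never blocks on the standby.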
Speaker:
Borys Neselovskyi, Regional Sales Engineer DACH, EDB
------------------------------------------------------------
For more #webinars, visit http://bit.ly/EDB-Webinars
Download free #PostgreSQL whitepapers: http://bit.ly/EDB-Whitepapers
Read our #Postgres Blog http://bit.ly/EDB-Blogs
Follow us on Facebook at http://bit.ly/EDB-FB
Follow us on Twitter at http://bit.ly/EDB-Twitter
Follow us on LinkedIn at http://bit.ly/EDB-LinkedIn
Reach us via email at marketing@enterprisedb.com
Cloud Migration Paths: Kubernetes, IaaS, or DBaaS - EDB
Moving to the cloud is hard, and moving Postgres databases to the cloud is even harder. Public cloud or private cloud? Infrastructure as a Service (IaaS), or Platform as a Service (PaaS)? Kubernetes for the application, or for the database and the application? This talk will juxtapose self-managed Kubernetes and container-based database solutions, Postgres deployments on IaaS, and Postgres DBaaS solutions of which EDB’s DBaaS BigAnimal is the latest example.
Technology solutions supporting the OpenStack and Container world - Jürgen Ambrosi
Enterprise interest in solutions such as containers and cloud-based platforms like OpenStack is amply confirmed by the positive trend analysts have observed. The benefits of adopting these solutions in IT lie in the possibility of building more agile, scalable, and cost-effective architectures able to meet ever more stringent business requirements and withstand competitive pressure. Veritas presents its software-defined storage solutions, Veritas HyperScale for OpenStack and Veritas HyperScale for Containers, as enabling platforms for introducing these new technologies while also guaranteeing enterprise-class reliability.
Join us to discover how Ivanti File Director can consolidate the management of your on-premises and cloud storage, delivering user profile data on-demand to physical and multi-user virtual workstations. We will also cover our modern device management capabilities by means of Intune.
Migrate Existing Applications to AWS without Re-engineering - Buurst
Migrating existing applications to the cloud can take weeks, if not months to complete. By moving your existing applications to AWS, you can take immediate advantage of: security, reliability, instant scalability and elasticity, isolated processes, reduced operational effort, on-demand provisioning and automation. But how do you migrate your existing applications to AWS without re-architecting?
In this webinar, we covered:
-Best Practices for migrating applications to AWS
-Design & Architectural considerations for cloud storage - including security and data protection
-How to design cloud storage for applications on AWS
-Lessons Learned from thousands of application migrations to AWS
-Demo: how to migrate an existing application to AWS without re-architecting
What the Enterprise Requires - Business Continuity and Visibility - Cloudera, Inc.
Cloudera Enterprise BDR delivers centralized disaster recovery for data and metadata, enabling you to prepare for disaster by moving data to your secondary site automatically. Cloudera Navigator 1.0 provides data governance capabilities such as verifying access privileges and auditing access to all data stored in Hadoop, which are critical for customers that are in highly regulated industries and have stringent compliance requirements.
This presentation will teach you how to:
- Centrally configure and manage replication workflows for files (HDFS) and metadata (Hive)
- Consistently meet or exceed SLAs and RTOs through simplified management and process automation
- Track access permissions and actual accesses to all data objects in Hive, HBase, and HDFS
- Answer the questions:
- Who has access to which data object(s)
- Which data objects were accessed by a user
- When was a data object accessed and by whom
- What data assets were accessed using a service
- Which device was used to access the data
12 Architectural Requirements for Protecting Business Data in the Cloud - Buurst
Designing a cloud data system architecture that protects your data when operating business-critical applications and workloads in the cloud is of paramount importance to cloud architects today. Ensuring high availability for your company’s applications and protecting business data is challenging and somewhat different than in traditional on-premises data centers.
For most companies with hundreds to thousands of applications, it’s impractical to build all of these important capabilities into every application’s design architecture. The cloud storage infrastructure typically only provides a subset of what’s required to properly protect business data and applications.
So how do you ensure your business data and applications are architected correctly and protected in the cloud?
In this webinar, we covered:
-Best Practices for protecting business data in the cloud
-How To design a protected and highly-available cloud system architecture
-Lessons Learned from architecting thousands of cloud system architectures
Debunking Common Myths of Hadoop Backup & Test Data Management - Imanis Data
These slides are from a webinar where Hari Mankude, CTO at Talena, discussed key concepts associated with Hadoop data management processes around scalable backup, recovery and test data management.
An overview of December 2009 enhancements to Veritas Storage Foundation, Veritas Cluster File System and Veritas Cluster Server, Symantec’s storage management and high availability solutions.
This release enables organizations to capitalize on new storage technology – such as solid state drives (SSDs) and thin provisioning – and improves performance and scalability. In addition, near-instantaneous recovery of applications is now possible with Veritas Cluster File System, allowing fast failover of structured information and near-linear scalability.
Software Defined Storage - Open Framework and Intel® Architecture Technologies - Odinot Stanislas
This presentation covers in detail the notion of an "SDS Controller": in short, the software layer intended, in time, to control all storage technologies (SAN, NAS, disk-based distributed storage, flash...) and to expose them to cloud orchestrators and hence to applications. Lots of good content.
Demartek evaluated the Lenovo Storage S3200 SAN for SQL Server database performance. Read this report to learn how well the S3200 did and why you should consider it for your business!
WARDA is the software platform that allows luxury, fashion & retail companies to manage their digital assets, improving activities across product development, marketing, visual merchandising, and e-commerce.
Digital assets are a major source of value that often goes unrealized because they are unrecoverable, inaccessible, unshared, and managed in very personal, ad hoc ways. Digital data is critical to the business and yields significant efficiency when managed properly.
WARDA is an Italian software house with significant experience in this field, gained directly from major projects for companies in the fashion, luxury, and retail markets.
I gave this talk in Hanoi and HCMC this year to present trends in analytics and applications of machine and deep learning in broad areas. Hopefully you will find something interesting in it.
Avi Pfeffer, Principal Scientist, Charles River Analytics at MLconf SEA - 5/2... - MLconf
Practical Probabilistic Programming with Figaro: Probabilistic reasoning enables you to predict the future, infer the past, and learn from experience. Probabilistic programming enables users to build and reason with a wide variety of probabilistic models without machine learning expertise. In this talk, I will present Figaro, a mature probabilistic programming system with many applications. I will describe the main design principles of the language and show example applications. I will also discuss our current efforts to fully automate and optimize the inference process.
Workshop #12: Research toolbox: Exploring innovation opportunities, emotion a... - ux singapore
This workshop will help you select the best research methods for transformational projects – where innovation, desirability, and real-world relevance are essential. You will also practice a selection of techniques for involving users in designing products and services.
Lightning Talk #2: Sustaining Transformation in Government Agencies by Gerry ... - ux singapore
The presenters have worked on a range of projects with the Department of Justice, and some of them have been transformative. One of the projects that they will be sharing about is the implementation of a new Jury Management System – while ostensibly an “IT” project, it enabled the organisation to reimagine the way it viewed the citizenry, re-engage them in a more positive fashion, and even “export” the solution to other jurisdictions to turn it into a revenue stream.
Join Gerry and Julian as they share with you a truthful account of these projects, exploring which elements are crucial to embarking on the journey of transformation, and discussing the traps and the pitfalls that can derail your efforts.
VMworld 2013: Maximize Database Performance in Your Software-Defined Data Center - VMworld
VMworld 2013
Mark Achtemichuk, VMware
Michael Webster, VMware
Learn more about VMworld and register at http://www.vmworld.com/index.jspa?src=socmed-vmworld-slideshare
VMworld 2013: Dell Solutions for VMware Virtual SAN - VMworld
VMworld 2013
Sheetal Kochavara, VMware
Bryan Martin, Dell Inc.
Learn more about VMworld and register at http://www.vmworld.com/index.jspa?src=socmed-vmworld-slideshare
Azure SQL Database Managed Instance is a new flavor of Azure SQL Database that is a game changer. It offers near-complete SQL Server compatibility and network isolation, making it easy to lift and shift databases to Azure (you can literally back up an on-premises database and restore it into an Azure SQL Database Managed Instance). Think of it as an enhancement to Azure SQL Database that is built on the same PaaS infrastructure and maintains all its features (i.e., active geo-replication, high availability, automatic backups, database advisor, threat detection, intelligent insights, vulnerability assessment, etc.) but adds support for databases up to 35 TB, VNET, SQL Agent, cross-database querying, replication, etc. So you can migrate your databases from on-premises to Azure with very little migration effort, a big improvement over the current Singleton and Elastic Pool flavors, which can require substantial changes.
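The lift-and-shift path described above (back up on-premises, restore into Managed Instance) uses SQL Server's native backup format. A minimal sketch, assuming the backup file has already been copied to Azure Blob Storage; the storage account, container, database name, and SAS token here are placeholders:

```sql
-- On the Managed Instance: register the storage container (SAS token elided)
CREATE CREDENTIAL [https://myaccount.blob.core.windows.net/backups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<sas-token>';

-- Native restore of the on-premises backup into the Managed Instance
RESTORE DATABASE MyDb
FROM URL = 'https://myaccount.blob.core.windows.net/backups/mydb.bak';
```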
Windows Server 2012 R2 at VMUG.org in Leeds - Simon May
A brief overview of what's coming in Windows Server 2012 R2 that I delivered at VMUG recently. Details on virtualisation improvements, storage improvements, VDI and much more
Examining Technical Best Practices for Veritas and AWS Using a Detailed Refer... - Veritas Technologies LLC
What is the safest, most efficient way to move data to and manage information on the Amazon Web Services (AWS) platform--and how can Veritas solutions help? In this deep dive technical session, Veritas experts will answer these questions by walking you through a detailed reference architecture for AWS. This includes exploring the best model for deploying, integrating, and managing Veritas solutions with AWS services; learning how Veritas can help you move and manage data most effectively on the AWS platform; and seeing a live, fully-functioning reference architecture in action.
From Backup & Recovery solutions to 360° Data Management - Jürgen Ambrosi
Modernizing Data Protection solutions is today an imperative driven by the rapid emergence of phenomena such as the Digital Transformation (or Revolution), the exponential growth in data volumes seen today and expected in the near future, the adoption of cloud and new applications, and the GDPR.
Organizations can no longer rely on backup solutions that are inefficient, expensive, and very often complex. As a result, they are turning to new data protection strategies.
We will explore Veritas's natively integrated "360° Data Management" platform, an integrated platform offering data protection, high availability, and visibility. Its first key element is a unified Data Protection solution with a single console for physical, virtual, and cloud environments, able to act proactively to identify where the data of interest resides and which strategic data must be quickly protected and securely preserved, limiting the volume retained to only what is needed to sustain business services.
VMworld 2013
Chris Greer, FedEx
Richard McDougall, VMware
Learn more about VMworld and register at http://www.vmworld.com/index.jspa?src=socmed-vmworld-slideshare
Caching for Microservices Architectures: Session I - VMware Tanzu
In this 60 minute webinar, we will cover the key areas of consideration for data layer decisions in a microservices architecture, and how a caching layer, satisfies these requirements. You’ll walk away from this webinar with a better understanding of the following concepts:
- How microservices are easy to scale up and down, so both the service layer and the data layer need to support this elasticity.
- Why microservices simplify and accelerate the software delivery lifecycle by splitting up effort into smaller isolated pieces that autonomous teams can work on independently. Event-driven systems promote autonomy.
- Where microservices can be distributed across availability zones and data centers for addressing performance and availability requirements. Similarly, the data layer should support this distribution of workload.
- How microservices can be part of an evolution that includes your legacy applications. Similarly, the data layer must accommodate this graceful on-ramp to microservices.
Presenter: Jagdish Mirani, a Product Marketing Manager in charge of Pivotal’s in-memory products.
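The scaling and distribution points above boil down to keeping a fast lookaside store between the service and its data source. A minimal, hedged sketch of the cache-aside pattern with a per-entry TTL (an in-process stand-in for the kind of caching layer the webinar discusses; the class and function names are illustrative, not from the talk):

```python
import time

class TTLCache:
    """Minimal in-process cache with a per-entry time-to-live."""
    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock          # injectable clock, handy for testing
        self._store = {}            # key -> (value, expiry timestamp)

    def get_or_load(self, key, loader):
        """Cache-aside: return the cached value if fresh, else load and cache it."""
        entry = self._store.get(key)
        now = self.clock()
        if entry is not None and entry[1] > now:
            return entry[0]                      # fresh hit
        value = loader(key)                      # miss or expired: go to the source
        self._store[key] = (value, now + self.ttl)
        return value

calls = []
def load_profile(user_id):
    calls.append(user_id)           # stands in for a slow service/database call
    return {"id": user_id, "name": f"user-{user_id}"}

cache = TTLCache(ttl_seconds=30)
a = cache.get_or_load(7, load_profile)   # miss: hits the "service"
b = cache.get_or_load(7, load_profile)   # hit: served from cache
print(a == b, len(calls))  # True 1
```

The injectable clock is deliberate: expiry behavior can then be tested without sleeping, which matters once the TTL logic moves into a distributed cache.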
VMworld 2013: Big Data Platform Building Blocks: Serengeti, Resource Manageme... - VMworld
VMworld 2013
Abhishek Kashyap, Pivotal
Kevin Leong, VMware
Learn more about VMworld and register at http://www.vmworld.com/index.jspa?src=socmed-vmworld-slideshare
Key Architecture and Performance Principles to Optimize Data Management - Jana Lass
These slides are from a webinar that featured a discussion of ways companies can address shifts in big data infrastructure and design the appropriate data management architecture for optimal performance and scale. It covers lessons gleaned from real customer challenges and implementations, offering attendees practical advice on the kinds of design decisions that can optimize the protection and management of modern data platforms, from Hadoop to NoSQL databases.
VMworld 2013: Software-Defined Storage: The VCDX Way - VMworld
VMworld 2013
Wade Holmes VCDX, VMware
Rawlinson Rivera VCDX, VMware
Learn more about VMworld and register at http://www.vmworld.com/index.jspa?src=socmed-vmworld-slideshare
Welcome to the first live UiPath Community Day Dubai! Join us for this unique occasion to meet our local and global UiPath Community and its leaders. You will get a full view of the MEA region's automation landscape and the AI-powered automation capabilities of UiPath. Hosted by our local partner Marc Ellis, you will also enjoy a half-day packed with industry insights and networking with automation peers.
📕 Curious on our agenda? Wait no more!
10:00 Welcome note - UiPath Community in Dubai
Lovely Sinha, UiPath Community Chapter Leader, UiPath MVPx3, Hyper-automation Consultant, First Abu Dhabi Bank
10:20 A UiPath cross-region MEA overview
Ashraf El Zarka, VP and Managing Director MEA, UiPath
10:35 Customer Success Journey
Deepthi Deepak, Head of Intelligent Automation CoE, First Abu Dhabi Bank
11:15 The UiPath approach to GenAI with our three principles: improve accuracy, supercharge productivity, and automate more
Boris Krumrey, Global VP, Automation Innovation, UiPath
12:15 Discover how Marc Ellis leverages tech-driven solutions in recruitment and managed services.
Brendan Lingam, Director of Sales and Business Development, Marc Ellis
A tale of scale & speed: How the US Navy is enabling software delivery from l... - sonjaschweigert1
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATOs (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
Elevating Tactical DDD Patterns Through Object Calisthenics - Dorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
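Two of the calisthenics constraints map directly onto tactical DDD building blocks: "wrap all primitives" yields value objects, and "first-class collections" keeps collection behavior out of callers. A minimal Python sketch of that alignment (the order-line domain here is invented for illustration, not taken from the talk):

```python
class Quantity:
    """Wrap the primitive: the invariant lives with the value object."""
    def __init__(self, amount: int):
        if amount < 0:
            raise ValueError("quantity cannot be negative")
        self._amount = amount

    def add(self, other: "Quantity") -> "Quantity":
        return Quantity(self._amount + other._amount)  # immutable: return a new value

    def __eq__(self, other):
        return isinstance(other, Quantity) and self._amount == other._amount


class OrderLines:
    """First-class collection: callers never touch the raw list."""
    def __init__(self):
        self._lines = []

    def add(self, sku: str, qty: Quantity) -> None:
        self._lines.append((sku, qty))

    def total_quantity(self) -> Quantity:
        total = Quantity(0)
        for _, qty in self._lines:
            total = total.add(qty)
        return total


lines = OrderLines()
lines.add("SKU-1", Quantity(2))
lines.add("SKU-2", Quantity(3))
print(lines.total_quantity() == Quantity(5))  # True
```

Because `Quantity` validates on construction and `OrderLines` owns all iteration over its list, the "is my design effective?" question becomes mechanical: domain rules have exactly one home each.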
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
DevOps and Testing slides at DASA ConnectKari Kakkonen
My and Rik Marselis slides at 30.5.2024 DASA Connect conference. We discuss about what is testing, then what is agile testing and finally what is Testing in DevOps. Finally we had lovely workshop with the participants trying to find out different ways to think about quality and testing in different parts of the DevOps infinity loop.
The Metaverse and AI: how can decision-makers harness the Metaverse for their...Jen Stirrup
The Metaverse is popularized in science fiction, and now it is becoming closer to being a part of our daily lives through the use of social media and shopping companies. How can businesses survive in a world where Artificial Intelligence is becoming the present as well as the future of technology, and how does the Metaverse fit into business strategy when futurist ideas are developing into reality at accelerated rates? How do we do this when our data isn't up to scratch? How can we move towards success with our data so we are set up for the Metaverse when it arrives?
How can you help your company evolve, adapt, and succeed using Artificial Intelligence and the Metaverse to stay ahead of the competition? What are the potential issues, complications, and benefits that these technologies could bring to us and our organizations? In this session, Jen Stirrup will explain how to start thinking about these technologies as an organisation.
Generative AI Deep Dive: Advancing from Proof of Concept to ProductionAggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
PHP Frameworks: I want to break free (IPC Berlin 2024)Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
Removing Uninteresting Bytes in Software FuzzingAftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing xml documents, and Binutil's readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format). Our preliminary results show that AFL+DIAR does not only discover new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
zkStudyClub - Reef: Fast Succinct Non-Interactive Zero-Knowledge Regex ProofsAlex Pruden
This paper presents Reef, a system for generating publicly verifiable succinct non-interactive zero-knowledge proofs that a committed document matches or does not match a regular expression. We describe applications such as proving the strength of passwords, the provenance of email despite redactions, the validity of oblivious DNS queries, and the existence of mutations in DNA. Reef supports the Perl Compatible Regular Expression syntax, including wildcards, alternation, ranges, capture groups, Kleene star, negations, and lookarounds. Reef introduces a new type of automata, Skipping Alternating Finite Automata (SAFA), that skips irrelevant parts of a document when producing proofs without undermining soundness, and instantiates SAFA with a lookup argument. Our experimental evaluation confirms that Reef can generate proofs for documents with 32M characters; the proofs are small and cheap to verify (under a second).
Paper: https://eprint.iacr.org/2023/1886
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
2. 2
Key Customer Challenges around Databases
A database provision or clone request today has a lead time of days or weeks.

Automate Database Lifecycle Management
- 1000's of databases to manage with few DBAs
- Long lead time for database services for developers
- Recurring tasks: backup, HA/DR, patching, cloning, performance tuning, security, monitoring, testing

Unified Platform for Database Virtualization
- Heterogeneous database environments are common
- No single tool to virtualize and manage the database infrastructure
- Requires a unified platform for diverse databases

Support Growing Database Infrastructure
- Thousands of under-managed and under-secured databases
- Fragmented database environment, split between corporate IT and shadow IT
- Difficult to enforce policy and compliance
4. 4
vFabric Data Director Powers Database-as-a-Service for your Cloud

Enables database-aware virtualization on VMware vSphere and provides database-as-a-service for heterogeneous databases:
- Reduce CapEx through database-aware virtualization
- Increase IT agility by automating database lifecycle management
- Accelerate analytics and application development through self-service and automation
5. 5
Key Benefits

OpEx savings: automate database lifecycle management
- Minutes to provision, back up, or restore a database
- Single pane of glass to monitor and manage

CapEx savings: reduce hardware costs and license spend by more than 50%
- Consolidate servers by 4X - 20X
- License savings of 2X - 4X

Quality of service
- Scale dynamically
- One-click high availability

Security
- Complete isolation between systems on the same host
- Protection for databases and applications against network-based threats
- Higher levels of consolidation
6. 6
vFabric Data Director Platform Architecture

vFabric Data Director runs on the cloud infrastructure platform (VMware vSphere) and serves DBAs, application developers, and IT admins.

Capabilities, grouped into lifecycle services and infrastructure services:
- Provisioning, backup/restore, clone, one-click HA
- Resource management, security management, template management, monitoring
- Database catalog, patch management, database ingestion

An Integration Gateway (REST API) connects Data Director to enterprise services such as Application Director, Cloud Foundry, and EMC Data Domain.
8. 8
Accelerate Database-aware Virtualization

Challenge: DBAs require specific knowledge of the virtualization infrastructure and tools to deploy a high-performance virtualized database environment.

Solution: Data Director provides a unified platform with integrated capabilities that lets DBAs quickly virtualize databases on vSphere.

Key benefits:
- Simplify database virtualization using built-in workflows
- Standardize the environment using database templates
- Migrate from physical to virtual using database ingestion (P2V)
- Manage the virtual database infrastructure with a single tool
9. 9
Improve IT Agility through Automated Lifecycle Management

Today, high availability requires multiple IT teams (DBA, storage admin, sys admin, network admin) and multiple systems; it is costly and complex, and typically not provided to tier 2/3 databases.

With Data Director:
- Automate lifecycle management and enforce policies using vSphere capabilities
- Speed up application testing and troubleshooting
- Easy HA and dynamic scaling
- Maintain compliance using seamless patch management
10. 10
Accelerate Application Development through Self-service

Provisioning today involves multiple IT teams (DBA, storage admin, sys admin, network admin) and multiple systems, takes days to weeks, and leads to shadow IT.

With Data Director self-service:
- Fast database provisioning
- Integrated database refresh using database ingestion
- Productivity gains for DBAs
12. 12
Database Virtualization Example Scenario

Physical environment:
- 3 different operating systems (Windows, Linux, Solaris)
- 9 different Oracle versions
- 51 Oracle databases on 51 servers

After the virtualization process:
- Standardized on 2 operating systems and 3 Oracle versions
- Resource isolation for the different databases
13. 13
Key Use Cases
- Standardize database deployment through pre-defined templates, profiles, and policies
- Flexible disk layout based on I/O patterns enables high-performance database deployment
- An automated ingestion process helps migrate databases from physical to virtualized environments
14. 14
Deploy Standard Operating Environments

Use case: application databases are deployed on three operating systems and 9 Oracle versions, and are categorized by scale into large and small.

Solution:
- Integrated template management helps enforce a standardized environment
- Set the resource profile and database configuration profile
- Expose the complete set of database configuration parameters

Benefit: improved DBA/developer productivity.
15. 15
Deploy High Performance Databases

Use case: DBAs usually deploy databases across multiple datastores, and often lack the knowledge to tune configurations for a high-performance database in a virtualized environment.

Solution:
- DBAs can define the disk layout and "Eager Zeroed Thick" or "Thin" provisioning in the template
- Data Director can match the disk layout definition to the provided datastores, for example placing the data disk on SSD and the archive log on a normal disk
- Supports multiple data and log disks

Benefit: achieve high-performance database deployment.
16. 16
Migrating Databases from Physical to Virtual

Use case: migrate a database from a physical server into a virtual environment.

Solution, a two-step process:
1. Use database ingestion to move the database (e.g. from a backup file share) into the Data Director catalog
2. Apply custom templates to deploy the database from the catalog

Benefit: simplified P2V with integrated ingestion.
18. 18
Fast Database Provisioning

Use case: provision databases quickly to create development and test environments.

Solution:
- Management portal for fast deployment of databases
- Database templates to deploy certified database environments

Benefits: automated, standardized provisioning of databases within minutes.
19. 19
Simplify and Automate Backup and Restore

Traditional backup and restore involves multiple IT teams, multiple systems, many scripts, and many places for errors. Backup and restore with Data Director takes one team and one system, with a fully automated, error-free process.

DBAs and IT admins:
- Define the backup policy, including frequency, method, retention, PITR enablement, etc.
- Allocate the storage
- Authorize users

Self-service users:
- Pick a policy; the backup process executes automatically
- Pick an available backup or snapshot to restore
- Pick a point in time to recover, if PITR is enabled
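The policy attributes listed above can be sketched as a simple structure. This is a hedged illustration only: the field names and the `restorable_points` helper are hypothetical, not the actual Data Director API or schema.

```python
# Hypothetical sketch of a backup policy with the attributes the slide names:
# frequency, method, retention, and point-in-time-recovery (PITR) enablement.
# All names here are illustrative, not the real product schema.
from datetime import datetime, timedelta

backup_policy = {
    "frequency_hours": 24,    # run one backup per day
    "method": "snapshot",     # e.g. snapshot vs. full dump
    "retention_days": 14,     # keep two weeks of backups
    "pitr_enabled": True,     # allow recovery to an arbitrary time point
}

def restorable_points(backups, policy, now):
    """Return the backup timestamps still inside the retention window."""
    cutoff = now - timedelta(days=policy["retention_days"])
    return [b for b in backups if b >= cutoff]

now = datetime(2012, 6, 1)
backups = [now - timedelta(days=d) for d in (1, 7, 20)]
print(len(restorable_points(backups, backup_policy, now)))  # prints 2: the 20-day-old backup has expired
```

A self-service user would then pick one of the remaining restore points, or any time point between them when PITR is enabled.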
20. 20
Creating Copies of Production Databases

Use case:
- Create database copies quickly and efficiently
- Maintain close-to-real-time copies
- Mask sensitive data and destroy the copy after the job is done

Solution:
- Create a golden clone
- Linked clones for creating copies instantly
- Clone refresh
- Post-clone scripts and a retention policy

Benefits: easily create and maintain production test environments; save on storage cost.
21. 21
Application Testing and Troubleshooting

Use case:
- Quickly recreate a failed environment
- Use a single tool to monitor and gather end-to-end statistics
- Real Application Testing (RAT) environments

Solution:
- Use a golden clone to quickly create a repro environment
- Dashboard provides single-pane-of-glass monitoring
- Centralized log collection

Benefits: faster application testing and problem resolution.
22. 22
High Availability and Dynamic Scale

Use case:
- HA for tier 2 and tier 3 databases
- Dynamic scaling of resources
- Replication

Solution:
- Enable HA in the database template
- vSphere 5.0 and Distributed Resource Scheduling
- Add resources as needed
- A master vPG (vPostgres) database with slaves forms a replication system

Benefits: single-click HA eliminates complexity; better recovery SLAs for tier 2 and tier 3 databases; increased application performance.
23. 23
Automate Database Cloning and Flexible Movement

Flow: backups of the production database are imported into the Data Director catalog, from which linked or full clones produce dev and new databases.

- Import backups (rather than cloning production directly) to reduce the impact on production
- Save the database to the catalog with pre-loaded data
- Linked clones improve storage efficiency
- A full clone copies the entire machine to ensure the environment is identical
- Self-defined refresh policy and expiration time
- Post-clone scripts to mask data

Benefits:
- Automate the production-to-QA database clone and refresh
- Automate packaged application deployment
- Facilitate the data movement process
26. 26
Multi-Tenancy and Isolation

Use case: applications owned by different organizations (e.g. Finance and HR) need security isolation across organizations and resource isolation between applications within an organization.

Solution:
- Create a hierarchy using Organizations and Database Groups (DBGs)
- Assign resources granularly using Resource Bundles
- Create DRS groups based on licensed hardware
- Use role-based access control

Benefits: integrated organization hierarchy, resource management, security management, and license enforcement.
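The tenancy model above can be sketched as a small data structure. The class and field names below are hypothetical illustrations of the Organization / Database Group / Resource Bundle hierarchy, not the product's actual object model.

```python
# Hypothetical sketch of the Organization -> Database Group (DBG) hierarchy
# with granular Resource Bundles, as described on the slide.
from dataclasses import dataclass, field

@dataclass
class ResourceBundle:
    cpu_ghz: float
    memory_gb: int
    storage_gb: int

@dataclass
class DatabaseGroup:
    name: str
    bundle: ResourceBundle   # resource isolation per application

@dataclass
class Organization:
    name: str                # security isolation boundary
    groups: list = field(default_factory=list)

finance = Organization("Finance", [
    DatabaseGroup("DBG-1", ResourceBundle(8.0, 32, 500)),
    DatabaseGroup("DBG-2", ResourceBundle(4.0, 16, 250)),
])
hr = Organization("HR", [DatabaseGroup("DBG-1", ResourceBundle(2.0, 8, 100))])

# Total memory granted within one organization, regardless of the other tenant
total_mem = sum(g.bundle.memory_gb for g in finance.groups)
print(total_mem)  # prints 48
```

The point of the hierarchy is that quotas are summed and enforced per organization, while each DBG's bundle isolates applications from one another inside it.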
27. 27
Enabling Database-as-a-Service

Use case: database developers want to quickly provision and maintain development databases.

Solution: a role-based self-service portal for database provisioning, backup, and cloning.

Benefits: faster time to market; agile IT while maintaining control.
31. 31
DBAs Can Define Best Practices Based on Performance SLAs
- Customize separate datastores for OS, data, redo log, and backup disks
- Eager-zeroed-thick vs. thin provisioning per data/log disk
- Configurable mount points

This gives great flexibility in the storage layout for a database VM.
32. 32
Users Still Get What They Need

All Oracle and SQL Server parameters are exposed through the UI. Users can still configure options and parameters, guided by policy and best practices.
33. 33
Ensure License Compliance using Policy

Ensuring that every database runs on a licensed node in the vSphere cluster is critical from a compliance perspective. Data Director enables DBAs to set policies that ensure a database is always deployed on a licensed node.

NOTE: refer to the individual licensing policy of the database vendor when using this feature.
34. 34
Deployment of Replicated Environments Made Easy

Replication is inherently hard to set up and manage. Data Director 2.7 introduces easy setup of vPostgres replication:
- Create as many slaves as you want with a few clicks
- Promote a slave to master from the UI (failover)
- Monitor the progress of replication
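For context, vPostgres is based on PostgreSQL, so the automation above corresponds to standard PostgreSQL streaming replication. A minimal sketch of the settings such a setup manages for you, in PostgreSQL 9.x syntax with a placeholder host name:

```
# primary: postgresql.conf (PostgreSQL 9.x streaming replication)
wal_level = hot_standby      # write enough WAL for a hot standby
max_wal_senders = 3          # allow up to three slaves to stream WAL

# slave: recovery.conf
standby_mode = 'on'                                  # keep replaying WAL
primary_conninfo = 'host=primary.example port=5432'  # placeholder host
```

Failover (promoting a slave to master) then amounts to ending recovery on the chosen slave and repointing the remaining slaves at it, which is exactly the multi-step procedure the one-click UI hides.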
35. 35
Integrate with vSphere Single Sign-On
- Register the Single Sign-On service through the web portal
- Support direct LDAP or AD user login
- Import users from the SSO server only; no need to create native user credentials
36. 36
Other Key Enhancements
- SQL Server snapshot and restore: enables developers to quickly revert changes without DBA intervention
- Enhanced template building: select an OS template, then build it into a DB template
- SQL Server-aware HA on Windows VMs
- SQL Server named instance support
- User-specified names for the VM host
38. 38
Most Customers Already Virtualizing Business Critical Apps

Percentage of workload instances that are virtualized:

Workload               Jan 2010   Jun 2011   Mar 2012
Microsoft Exchange        38%        41%        47%
Microsoft Sharepoint      53%        56%        57%
Microsoft SQL             43%        47%        52%
Oracle Middleware         25%        34%        41%
Oracle DB                 25%        28%        35%
SAP                       18%        28%        40%

Source: VMware customer surveys, Jan 2010, Jun 2011, Mar 2012. Respondents reported the total number of instances of each workload deployed in their organization and the percentage of those instances that are virtualized.
39. 39
Oracle Support for VMware

Oracle MyOracleSupport (MetaLink) note 249212.1 defines Oracle's VMware support policy most broadly:
- "Oracle will only provide support for issues that either are known to occur on the native OS, or can be demonstrated not to be as a result of running on VMware"
- VMware does not modify the native operating system
- Oracle RAC is included for 11.2.0.2 and above (updated Nov 8, 2010)

VMware support:
- Will accept accountability for any Oracle-related issue reported by a customer
- Will help drive the issue to resolution
- www.vmware.com/support/policies/oracle-support.html

Oracle does not certify infrastructure:
- Oracle does not certify anything below the OS, for example server hardware or storage
40. 40
Oracle Database Licensing – CPU-Based SKU Considerations

In all of the following situations, license the full machine. Once the machine is fully licensed, you can deploy unlimited VMs.

Standard Edition One: licensed by socket, limited to two sockets, must license the full machine. $5,800 x 2 = $11,600
Standard Edition: licensed by socket, limited to four sockets, must license the full machine(s). $17,500 x 4 = $70,000
Enterprise Edition: licensed by core, applying the x86 core factor of 0.5, must license the full machine. $47,500 x 16 x 0.5 = $380,000

Pricing per the Oracle Technology Global Price List, October 20, 2011.
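The per-SKU arithmetic above can be made explicit. This is a sketch of the slide's own worked examples (list prices from the October 20, 2011 price list as quoted); the function names are ours, and real quotes vary.

```python
# Worked example of the slide's per-SKU Oracle license arithmetic.
# Prices are the 2011 list prices quoted on the slide, not current figures.

def se_one_cost(sockets, price=5800):
    # Standard Edition One: per socket, max two sockets, full machine
    assert sockets <= 2
    return price * sockets

def se_cost(sockets, price=17500):
    # Standard Edition: per socket, max four sockets, full machine(s)
    assert sockets <= 4
    return price * sockets

def ee_cost(cores, price=47500, x86_factor=0.5):
    # Enterprise Edition: per core, with the x86 core factor applied
    return price * cores * x86_factor

print(se_one_cost(2))   # prints 11600
print(se_cost(4))       # prints 70000
print(ee_cost(16))      # prints 380000.0
```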
41. 41
Licensing – Hard versus Soft Partitioning

Oracle licenses with hard and soft partitioning of physical systems:
- Hard = "fixed": allows sub-system licensing
- Soft = "fluid": requires full-system licensing
- Once fully licensed, a soft-partitioned server can run multiple instances of Oracle at no additional charge
- This was largely academic in the physical realm
- In the virtual realm it can cause concern: how many databases can a customer squeeze onto one system? Only the virtualization solution limits the number of databases on a fully licensed box

Oracle counts VMware as a soft-partition technology, so you must license the "entire server".
42. 42
Soft Partition Licensing – Example in a Typical Blade Configuration

A VMware vSphere Distributed Resource Scheduler cluster with Oracle: four blades, each with 4 CPU cores; each VM is 2 vCPU (2-core).

Solution:
- Both Oracle hosts must be licensed for all 8 cores on the 2 blades
- Same as physical or virtual: (8 cores) x (0.5 x86 factor) = 4 licenses
- The Oracle DB VM is free to move (vMotion) back and forth between Host 1 and Host 2
- Do not allow Oracle DB VMs to migrate to App Host 3 or App Host 4
- Create logical VMware vCenter(TM) clusters to isolate the Oracle hosts and comply, answering the objection "But Oracle tells me I have to license every server in the ESX cluster!"
- See Gartner Research Document ID #G00165003 for similar guidance
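The cluster arithmetic above is worth making concrete. A minimal sketch, assuming the slide's numbers (four blades, 4 cores each, Oracle pinned to two blades) and our own function name:

```python
# Sketch of the slide's soft-partition licensing arithmetic: with VMware
# counted as a soft partition, every host an Oracle VM can run on must be
# fully licensed, regardless of how many vCPUs the VM actually uses.

def ee_licenses(hosts_running_oracle, cores_per_host, x86_factor=0.5):
    total_cores = hosts_running_oracle * cores_per_host
    return total_cores * x86_factor

# Four-blade DRS cluster, 4 cores per blade; Oracle VMs isolated to two blades.
print(ee_licenses(2, 4))   # prints 4.0 -- same license count as the physical case

# Without the logical vCenter cluster, all four blades would need licensing:
print(ee_licenses(4, 4))   # prints 8.0 -- double the cost for the same workload
```

This is why the slide recommends isolating Oracle hosts into their own logical cluster: the license count is driven by where the VM is *allowed* to run, not where it currently runs.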
43. 43
Oracle Licensing Comparison – Customer Example

[Diagram: a physical deployment of Siebel, Loadstar, and OBIEE across prod, stage, and dev/test tiers on dedicated 4-, 8-, and 12-core servers, compared with a virtualized deployment consolidated onto 12 ESX hosts of 8 cores / 96 GB each. Enterprise Edition pricing per the Oracle Technology Global Price List, October 20, 2011.]
44. 44
Database Backup and Restore

Today: multiple IT teams (DBA, storage admin, sys admin, operator) and multiple systems, with many scripts and many places for error.
With Data Director: one team (the DBA) and one system, with a fully automated, error-free process.
Key benefits: more OpEx savings, higher QoS, fewer errors and risks.

45. 45
High Availability

Today: multiple IT teams (DBA, storage admin, sys admin, network admin) and multiple systems, with many scripts and many places for error.
With Data Director: one team (the DBA) and one system, with a fully automated, error-free process.
Key benefits: more OpEx savings, higher QoS, fewer errors and risks.

46. 46
Database Provisioning

Today: multiple IT teams (DBA, storage admin, sys admin, network admin) and multiple systems, with many scripts and many places for error.
With Data Director: one team (the DBA) and one system, with a fully automated, error-free process.
Key benefits: more OpEx savings, higher QoS, fewer errors and risks.
47. 47
Backup & Restore

Traditional backup and restore: multiple IT teams, multiple systems, many scripts, many places for errors.
Backup and restore with Data Director: 1 team, 1 system, a fully automated, error-free process.
Results: more OpEx savings, higher QoS, fewer errors/risks.

48. 48
One Click High Availability Leads to Higher Quality of Service

Traditional HA setup: multiple IT teams, multiple systems, many scripts, many places for errors.
HA with Data Director: 1 team, 1 system, a fully automated, error-free process.
Results: more OpEx savings, higher QoS, fewer errors/risks.

49. 49
Self-service Database Provisioning

Traditional database provisioning: multiple IT teams, multiple systems, many scripts, many places for errors.
Database provisioning with Data Director: 1 team, 1 system, a fully automated, error-free process.
Results: more OpEx savings, higher QoS, fewer errors/risks.
The benefits of database virtualization are well understood; business and IT see the cost savings and want to accelerate database virtualization. The challenge for IT today is that DBAs require specific knowledge about virtualization, and they need to build their own tools to manage a virtualized database environment. With different database environments and no common tool or process, it becomes challenging to quickly virtualize the database estate. vFabric Data Director provides a unified platform that lets DBAs focus on the key aspects of virtualizing the database rather than on the underlying vSphere platform.
When IT decides to virtualize their environment, they go through a Discover, Analyze, Convert process. After the discover and analyze phases, IT comes up with a set of requirements for the virtual environment.
Now let’s look at the benefits in a little bit more detail and look at some of the use cases that Data Director helps to solve.
We typically think of application workloads as falling into two categories: tier 1/mission-critical workloads and lower-tiered workloads. For tier 1 applications, avoid overcommitment of processor resources; that is, maintain a 1:1 ratio of physical cores to vCPUs and avoid over-allocating vCPUs, trying to match the exact workload. For lower-tiered applications, manage processor overcommitment as a function of typical utilization.
Oracle counts VMware as a soft-partition technology, so you must license the "entire server".