I. Data is taxing companies in three ways: it strains infrastructure through copying, has a high financial price tag, and many companies are unaware of this "data tax".
II. The data tax hits companies hard through increased IT costs, slowed application development, and decreased business opportunities. Measuring before and after Delphix shows median app development throughput can increase 2x.
III. Even companies that believe their processes are sound rarely measure how long developers wait for environments. Delphix asks them to take that measurement, which usually reveals better options, like Delphix, that can significantly reduce the data tax.
Building the future using Newforma software for design and construction proce... – Newforma
Newforma makes software that architects, engineers, contractors, and owner/operators of buildings and infrastructure use to manage information and collaborate more successfully.
Lately, it seems that talk of "the cloud" is everywhere you turn. But what does "the cloud" mean and how can it help your nonprofit function more efficiently? We will help you get a better understanding of what it means, the advantages/disadvantages of using it, and the future of the cloud, so you can fully research and evaluate if moving more items to the cloud is right for your organization. Presented by Richard Dietz of Nonprofit R+D
Modernize and Simplify IT Operations Management for DevOps Success – DevOps.com
Whether your organization is already leveraging cloud-based tools or is in the midst of a transformation, you may face the challenge of a hybrid set of workload deployments, both cloud-based and on-premises. Join us for this webinar to explore the common operational challenges many DevOps teams face today, how IT operations best practices can be applied within a DevOps methodology, and how modern operations management tools can help you carry out those best practices to meet your goals on an ongoing basis.
Success Stories In MEP Engineering Project Delivery – Newforma
In mechanical, electrical, and plumbing engineering companies across the country, engineers are distracted from their real work. Instead, they’re managing email, logging submittals and RFIs, keeping track of file transfers, compiling and managing punch lists, and more.
If you’re charged with making sure your teams are operating at maximum effectiveness – and are motivated to do their best work – look into Newforma today.
5 technologies that shorten the document lifecycle & increase productivity – Nikec Solutions
As the business world moves faster, lawyers are expected to respond to client demands swiftly and consequently need to produce documents faster than ever. Given the different processes involved between the creation and the completion of a document (initial scanning, editing, reviewing, approval, and sharing) – i.e. the document lifecycle – teams must stay sharp and consistent to meet those shorter response times.
Digitalisation has revolutionised the speed at which we work with documents, but the latest technologies can enhance this further. Here are some of the instrumental technologies that will help optimise your document lifecycle without requiring an army of IT experts to manage them.
The AWS-based Enterprise Digital Transformation Platform (EDTP) is architected as an Event Processing Digital Center around which all of the business's current and future data sources, consumers, services, and processes interact. The purpose of the platform is to enable business innovation and agility by providing semantically cohesive, structurally flexible, harmonized data across processes and systems, and by bringing functions and capabilities to the data instead of moving the data.
Webinar: What Microsoft's Cloud Services Can Do For Your Nonprofit (2016-10-27) – TechSoup
Get to know Microsoft's full suite of donated and discounted cloud solutions for nonprofits, including Office 365, Microsoft Azure, Power BI, and more. Tech Impact's Linda Widdop discusses the benefits of cloud adoption for nonprofits. This 60-minute webinar will help you understand the different options and features of cloud solutions.
This informative slide deck contains two presentations: the first by Simon Hudson of Cloud2 on Microsoft SharePoint best practice and moving to the cloud, followed by an overview of performance monitoring and best practice for SharePoint applications by Mick McGuinness of Application Performance Ltd.
Rethinking Data Availability and Governance in a Mobile World – Hao Tran
The Briefing Room with Malcolm Chisholm and Druva
The emergence of the mobile workforce has left an indelible mark on the enterprise; every employee is now mobile, and business data continues to be dispatched to the far reaches of the enterprise. While this has added enormous opportunity for increased productivity, it has also muddied the waters when it comes to controlling and protecting valuable data assets. As companies quickly evolve to address the new set of challenges posed by this shift in data usage, IT must ensure that all data, no matter where it’s generated or stored, is available and governed just as if it were still safely behind the corporate firewall.
Register for this episode of The Briefing Room to hear veteran Analyst Malcolm Chisholm as he explains the myriad challenges that mobile data introduces when addressing regulations and compliance needs, requiring new approaches to data governance. He’ll be briefed by Dave Packer of Druva, who will outline his company’s converged data protection strategy, which brings data center class capabilities to backup, availability and governance for the mobile workforce. He will share strategies to meet regional data residency, data recovery, legal hold and eDiscovery requirements and more.
How To Leverage Cloud Computing for Business & Operational Benefit - CAMP IT – Skytap Cloud
CAMP IT Presentation by Brett Goodwin, VP Marketing & Business Development at Skytap, Inc. Presented 2.22.2013. Focus: How To Leverage Cloud Computing for Business & Operational Benefit
Curiosity Software and RCG Global Services Present - Solving Test Data: the g... – Curiosity Software Ireland
This webinar was co-hosted by Curiosity and RCG Global Services on January 20th, 2022. Watch the webinar on demand: https://www.curiositysoftware.ie/solving-test-data-webinar
Outdated test data management practices are today a sinkhole for testing and development time. They stifle release velocity, risk costly legislative non-compliance, and yet still do not provide the data needed to protect releases from damaging bugs. To achieve true quality at speed, the test data paradigm must shift. Enterprises must move beyond slowly copying large sets of production data to a limited number of out-of-date test environments.
In this webinar, Global Head of Quality Engineering at RCG, Niko Mangahas, draws on extensive project experience to define the test data challenges facing enterprises today. He then helps you identify the right test data solution for your organisation, setting out principles for effective requirements gathering and program design. Niko then hands over to veteran test data inventor, Huw Price, who demoes some of the latest techniques for making complete and compliant data available on-the-fly during parallel testing, development, and CI/CD.
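As a hedged sketch of the "complete and compliant data on-the-fly" idea described above: the snippet below is not Curiosity's or RCG's actual tooling, just a minimal Python illustration of generating deterministic synthetic rows plus stable masking, so tests get realistic data without copying production. All names and fields are invented for the example.

```python
import hashlib
import random

def mask_email(email: str) -> str:
    """Replace the local part with a stable hash so joins across tables still line up."""
    local, _, domain = email.partition("@")
    digest = hashlib.sha256(local.encode()).hexdigest()[:8]
    return f"user_{digest}@{domain}"

def synthetic_customers(n: int, seed: int = 42) -> list:
    """Generate n synthetic customer rows on demand; no production copy needed."""
    rng = random.Random(seed)  # fixed seed keeps test runs reproducible
    return [
        {
            "id": i,
            "email": mask_email(f"person{i}@example.com"),
            "balance": round(rng.uniform(0, 10_000), 2),
        }
        for i in range(n)
    ]

for row in synthetic_customers(3):
    print(row["email"])
```

Because both the generator and the mask are deterministic, two test environments provisioned from the same seed see identical, compliant data.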
In the digital era, empowering the workforce with the ability to re-engineer their workflows, processes and activities into more competitive and effective outcomes for the business is essential. In this session we will share our vision for developers, programmers and “citizen developers” refreshing and developing new Domino apps that are the foundation to automating processes that free up workers to pursue higher value activities. We also want to share how customers all over the world helped us shape the future of Notes/Domino and defined the top outcomes to be included in Domino 10.
From Desktop to the Cloud: Why Organizations Are Converting – TechSoup
On-prem vs. cloud? What solution is right for your organization? In this 30-minute webinar, we will focus on the current desktop environment and the factors that will drive increased cloud adoption over the next several years. We will also review why nonprofits need to be thinking about their digital transformation now.
What the App?: A Modernization Strategy for Your Business Applications – John Head
Presented at IBM Connect 2016: It's 2016 – your application portfolio is being reviewed and scrutinized. Email and application platforms are being separated. Users' expectations of their work experiences are higher than ever. But you're invested in your Notes & Domino applications – what do you do? Looking through the lens of IBM ESS solutions, we will answer that question by providing a roadmap and experiences to help you choose the best path. We will deep dive into the five aspects of Application Modernization: User Experience, Social, Cloud, Mobile, & Modern Workflow. See demos of actual application transformations and the impact they have within an organization. Learn how new functionality in the products will make your journey easier.
Databases: The Neglected Technology in DevOps – DevOps.com
Much has been written about software delivery in DevOps, with much less focus on the database. However, DevOps can—and should—play an equally critical role in both software and database development. In this ebook, we examine how DevOps can be used for database development and delivery, factors influencing DevOps’ role in database delivery, and some of the technologies designed to help.
Join us for this lively panel discussion!
This webinar will go over BI for cloud apps and answer the question, "What's different?" It also includes demos of the following topics:
-Replicating Deleted & Archived Rows
-Auto-creation of Database Tables
-“Full Load” vs. “Incremental Load”
-Using Salesforce Bulk API
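As a rough sketch of the "Full Load" vs. "Incremental Load" distinction demoed above (the row shapes and timestamps are illustrative, not taken from the webinar):

```python
from datetime import datetime, timezone

# A toy source table with a last-modified timestamp per row.
source = [
    {"id": 1, "name": "Acme",   "modified": datetime(2024, 1, 5, tzinfo=timezone.utc)},
    {"id": 2, "name": "Globex", "modified": datetime(2024, 3, 9, tzinfo=timezone.utc)},
]

def full_load(rows):
    """Full load: replace the warehouse copy with every source row."""
    return list(rows)

def incremental_load(rows, last_sync):
    """Incremental load: pull only rows modified since the last successful sync."""
    return [r for r in rows if r["modified"] > last_sync]

warehouse = full_load(source)
delta = incremental_load(source, datetime(2024, 2, 1, tzinfo=timezone.utc))
print(len(warehouse), len(delta))
```

The trade-off the demo illustrates: a full load is simple but moves everything on every run, while an incremental load moves only the delta at the cost of tracking a sync watermark.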
DataOps, DevOps and the Developer: Treating Database Code Just Like App Code – DevOps.com
Application developers want a fast DevOps process. However, database development and DataOps processes follow a different workflow, which creates a significant bottleneck in an otherwise streamlined process. But it doesn't have to be that way: when it comes to data and database code, many application development best practices still apply.
Join leaders from RedMonk, Datical and Delphix as they share the key DevOps and DataOps practices every application developer should know to treat database code just like app code.
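One common way "database code just like app code" is put into practice is versioned migration scripts, kept in source control and applied in order. The sketch below is a minimal, hypothetical illustration using SQLite; it is not Datical's or Delphix's actual mechanism.

```python
import sqlite3

# Hypothetical migration scripts, version-controlled alongside the app code.
MIGRATIONS = [
    (1, "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)"),
    (2, "ALTER TABLE customers ADD COLUMN email TEXT"),
]

def migrate(conn: sqlite3.Connection) -> int:
    """Apply any migrations newer than the recorded schema version; return the new version."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    row = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()
    current = row[0] or 0
    for version, sql in MIGRATIONS:
        if version > current:
            conn.execute(sql)
            conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
            current = version
    conn.commit()
    return current

conn = sqlite3.connect(":memory:")
print(migrate(conn))  # applies both migrations on a fresh database
print(migrate(conn))  # a second run is a no-op: the database is already current
```

Because the database records its own version, the same script is safe to run in every environment and in CI, which is exactly the repeatability app code already enjoys.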
The 5S Approach to Performance Tuning by Chuck Ezell – Datavail
If your organization relies on data, optimizing the performance of your database can increase your earnings and savings. Many factors, large and small, can affect performance, so fine-tuning your database is essential. Performance tuning expert and Senior Applications Tuner for Datavail, Chuck Ezell, sheds light on the right questions to ask to get the answers that will help you move forward, using a defined approach referred to as 5S.
This performance tuning white paper addresses each stage of this novel approach, as well as key performance issues: SQL, Space, Sessions, Statistics, and Scheduled Processes.
Best Practices for Managing SaaS Applications – Correlsense
The proliferation of SaaS applications like Salesforce.com is creating a host of new management challenges. For example, how do you measure the performance of applications you don’t host? What real-time data do you have to communicate with business stakeholders? How will you know if SLA commitments are being met?
Join us for a webinar exploring the best practices for managing SaaS applications, including:
*Important ways that the management of SaaS and hosted application management differ
*The unique challenges of supporting enterprise SaaS applications
*Case studies demonstrating new techniques and tools for measuring the performance of hosted applications like Salesforce.com
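One widely used technique for measuring applications you don't host is a synthetic probe: repeatedly time a request from the outside and report simple latency stats against the SLA. The sketch below is a hedged illustration; the request function here is a stand-in, and a real probe would make an HTTPS call to the SaaS endpoint being measured.

```python
import time

def probe(fn, runs: int = 5):
    """Time a request function several times and report simple latency stats."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append((time.perf_counter() - start) * 1000)  # milliseconds
    samples.sort()
    return {"median_ms": samples[len(samples) // 2], "worst_ms": samples[-1]}

# Stand-in for a real HTTPS call to the hosted application.
def fake_request():
    time.sleep(0.01)

stats = probe(fake_request)
print(sorted(stats))
```

The median answers "what do users typically see?", while the worst sample is what an SLA conversation with the vendor usually hinges on.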
Webinar: The 5 Most Critical Things to Understand About Modern Data Integration – SnapLogic
In this webinar, we talk to industry analyst, author, and practitioner David Linthicum, who provides a state-of-the-technology explanation of big data integration.
David also covers five critical, lesser-known data integration requirements, explains how to understand today's requirements, and offers guidance for choosing the right approaches and technology to solve these problems.
To learn more, visit: www.snaplogic.com/big-data
Every day we create roughly 2.5 quintillion bytes of data; 90% of the world's data has been generated in the last two years alone. In these slides, learn all about big data in a simple, accessible way.
How Does the Denodo Platform Accelerate Your Time to Insight? – Denodo
Watch full webinar here: https://bit.ly/3ayILnx
In this demo session, we will illustrate the power of Denodo and delve into how Denodo helps organisations make sense of disparate silos of data. We will demonstrate the Denodo advanced data catalog and our AI/ML features that help organisations democratise and govern their data.
We think it is a source of creativity. Join our interactive session, where you will see firsthand how our small charity (a few staff and hundreds of volunteers) is able to be hyper-productive using Salesforce.com and Google Apps. Our live demonstrations will show how we've been able to triple our productivity with only double the resources. Framework, the organization that brings you the Timeraiser program, is a small team with limited resources. We've made a deliberate decision to invest heavily in our Cloud Computing Strategy - tools like Salesforce.com and Google Apps - to better focus on our mission and collaborate with like-minded organizations. This has led to improved fundraising and volunteer management capabilities.
Attendees Will Walk Away With:
• Project Management – ideas on how your team can collaborate on social media projects
• Knowledge Management – ideas on how your team can stay on top of technology trends
• Fundraising Management – ideas on how your team can collaborate on fundraising
• Board Management – ideas on how to organize information for your board
• Volunteer Management – ideas on how to track volunteer time and participation
Anil Patel
In 2001, Anil and some of his university friends co-founded Timeraiser, a program aimed at engaging skilled and energetic Canadians to get involved in the community. The Timeraiser is part volunteer fair, part silent art auction, part night on the town. To date, the Timeraiser has generated 55,000 volunteer hours and invested $330,000 in the careers of Canadian artists. View Anil's profile at http://ca.linkedin.com/in/anilpatrickpatel
6 ways DevOps helped PrepSportswear move from monolith to microservices – Dynatrace
Like a lot of online businesses today, PrepSportswear is 100% dependent on the availability, scalability, and performance of its digital online services. If the website is down, the business stops. The company knew it had to transform from a retailer with a website into a high-caliber IT company that sells products online.
In these webinar slides, Richard Dominguez, PrepSportswear’s Developer in Operations, shares their journey. They transformed from a team operating a monolithic app using waterfall development methodology on an old, hard to maintain code base, to a modern IT organization applying new practices from Agile development, DevOps and a Service-Oriented Architectural approach.
The impact? PrepSportswear's most successful online holiday shopping season in company history! Join us to:
Learn how to identify if you are running a monolithic application that is dragging you down.
Get tips on hiring the right people to inject a DevOps cultural mindset into your organization.
Understand how to break the monolith into smaller pieces that support key lines of business.
Discover where to automate monitoring into your pipeline and platform.
Identify metrics for individual stakeholders (dev vs. test vs. business).
Go forward, celebrate, learn from, and repeat success!
Richard will be joined by Andreas Grabner, Performance Advocate at Dynatrace, who will explain why monitoring and application and end-user metrics have to be a key part of your own transformation!
Richard Dominguez has 9+ years’ experience as both a System Analyst and Software Developer in Test. He has worked on many high profile projects in Microsoft such as Hyper-V, Windows 7 Client Performance, and Windows Phone Services. Richard now works at PrepSportswear as the company’s DevOps engineer. His responsibilities include site reliability, external synthetic testing, release management and overall site performance.
Andreas Grabner has 15+ years' experience as an architect and developer in the Java and .NET space. In his current role, Andi works as an advocate for high-performing applications in both the development and operations areas. He is a regular expert and contributor in large performance communities, a frequent speaker at technology conferences, and regularly publishes articles and blog posts on blog.dynatrace.com.
[DSC Adria 23] Miro Miljanic: Telco Data Pipelines in the Cloud Architecture a... – DataScienceConferenc1
This talk describes several real-life telco scenarios and our implementations of cloud-based data solutions for them. It includes insights into the pros and cons of cloud for each use case and why we chose the specific tools and architectures.
A Key to Real-time Insights in a Post-COVID World (ASEAN) – Denodo
Watch full webinar here: https://bit.ly/2EpHGyd
Presented at Data Champions, Online Asia 2020
Businesses and individuals around the world are experiencing the impact of a global pandemic. With many workers and potential shoppers still sequestered, COVID-19 is proving to have a momentous impact on the global economy. Both now and in the post-pandemic era, real-time data is even more critical to healthcare practitioners, business owners, government officials, and the public at large, for whom holistic and timely information is essential to making quick decisions. It enables doctors to decide quickly where to focus care, business owners to alter production schedules to meet demand, government agencies to contain the epidemic, and the public to stay informed about prevention.
In this on-demand session, you will learn about the capabilities of data virtualization as a modern data integration technique and how organisations can:
- Rapidly unify information from disparate data sources to make accurate decisions and analyse data in real-time
- Build a single engine for security that provides audit and control by geographies
- Accelerate delivery of insights from your advanced analytics project
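As a toy sketch of the data virtualization idea behind the points above: rows from disparate sources are joined on demand into a unified view, without first copying the data into a warehouse. The sources and fields below are invented for illustration and have nothing to do with Denodo's actual engine.

```python
# Two disparate "sources": a CRM-style row list and a billing-style lookup.
crm = [
    {"customer": "Acme",   "region": "APAC"},
    {"customer": "Globex", "region": "EU"},
]
billing = {"Acme": 1200.0, "Globex": 850.0}

def unified_view():
    """A virtual, integrated view: rows are joined lazily, the data is never copied."""
    for row in crm:
        yield {**row, "revenue": billing.get(row["customer"])}

for row in unified_view():
    print(row["customer"], row["region"], row["revenue"])
```

Because the view is a generator over the live sources, a consumer always sees current data, which is the "real-time" property the session emphasises.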
Successfully convince people with data visualization – Kyle Hailey
video of presentation available at https://www.youtube.com/watch?v=3PKjNnt14mk
from Data by the Bay conference
Climate Impact of Software Testing at Nordic Testing Days – Kari Kakkonen
My slides at Nordic Testing Days 6.6.2024
The talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Sustainability can be added to the quality characteristics and then measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
GraphRAG is All You Need? LLM & Knowledge Graph – Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Epistemic Interaction - tuning interfaces to provide information for AI support – Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Observability Concepts EVERY Developer Should Know (DeveloperWeek Europe) – Paige Cruz
Monitoring and observability aren't traditionally found in software curriculums, and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is part of our current company's observability stack.
While the dev and ops silo continues to crumble, many organizations still relegate monitoring and observability to the purview of ops, infra, and SRE teams. This is a mistake: achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on:
PHP Frameworks: I want to break free (IPC Berlin 2024) – Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
GridMate - End to end testing is a critical piece to ensure quality and avoid...ThomasParaiso2
End to end testing is a critical piece to ensure quality and avoid regressions. In this session, we share our journey building an E2E testing pipeline for GridMate components (LWC and Aura) using Cypress, JSForce, FakerJS…
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
SAP Sapphire 2024 - ASUG301 building better apps with SAP Fiori.pdfPeter Spielvogel
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
DevOps and Testing slides at DASA ConnectKari Kakkonen
My and Rik Marselis slides at 30.5.2024 DASA Connect conference. We discuss about what is testing, then what is agile testing and finally what is Testing in DevOps. Finally we had lovely workshop with the participants trying to find out different ways to think about quality and testing in different parts of the DevOps infinity loop.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor...Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Communications Mining Series - Zero to Hero - Session 1DianaGray10
This session provides introduction to UiPath Communication Mining, importance and platform overview. You will acquire a good understand of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if sometime changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
The Art of the Pitch: WordPress Relationships and Sales
Data is the Constraint
1. Data is the Constraint
Blog: kylehailey.com
• 20 years working w/ Oracle
• Oracle Ace
• Oaktable
• Conferences
• Training days
• Tech journal articles
• Sales meetings
2. Main blogs
• Personal blog : http://kylehailey.com
• Linkedin www.linkedin.com/in/kylehailey
• Twitter @kylehhailey
• Facebook facebook.com/kyle.hailey.73
3. Top Blog Posts
• http://www.kylehailey.com/delphix/
– What is Delphix
– Benefits in Brief
– High Performance Delphix
– Jonathan Lewis Explains Delphix
– Delphix vs Netapp and EM 12c
4. Data is the constraint
Three points
I. Data Tax strains infrastructure
II. Data Tax price is huge
III. Companies unaware of Data Tax
5. I. Data Taxes your business
If you can’t satisfy the business demands then your process is broken.
8. I. Data Tax
– Moving data is hard
– Triple tax
– Data Floods infrastructure
9. I. Data Tax : moving data is hard
– Storage & Systems : capital resources
– Personnel : operation expenditure
– Time : delayed projects
10. I. Data Tax : Typical Architecture
[Diagram: Production, Dev/QA/UAT, Reporting, and Backup each run their own instances, databases, and file systems; every environment is a full physical copy of the data.]
11. I. Data Tax: triple data tax
Copies of Production for:
– Projects: Sandbox, Development, QA, UAT
– Business Intelligence: Analytics, Reporting
– Data protection: Tape Backup, Recovery, Forensics
12. I. Data Tax : Copying data floods infrastructure
13. I. Data Tax : Data Tax floods company infrastructure
• 92% of the cost of the financial services business is "data" (www.wsta.org/resources/industry-articles)
• Most companies have 2-9% IT spending (http://uclue.com/?xq=1133)
• Data management is the largest portion of IT expense
15. Part II. Data Tax is Huge
• Four areas the data tax hits:
1. IT Capital resources
2. IT Operations personnel
3. Application Development ***
4. Business
• How big is the data tax?
– Measure before and after installing Delphix
16. II. Data Tax is huge : 1. IT Capital
• Hardware
– Servers
– Storage
– Network
– Data center floor space, power, cooling
• Example
– Some customers have over 1 Petabyte of duplicate data (1000 TB, i.e. 1,000,000 GB)
17. II. Data Tax is huge : 2. IT Operations
• Involves many people
– DBAs
– Sys Admins
– Storage Admins
– Backup Admins
– Network Admins
• 1000s of hours annually just for DBAs
– not including all the other personnel required to supply the necessary infrastructure
• Data center efforts are costly and difficult
– Consolidation
– Migrations
– Move to cloud
18. II. Data Tax is Huge : 3. App Dev quality and speed
Five examples of application Data Tax impact
• Inefficient QA : higher costs of QA
• QA Delays : greater re-work of code
• Sharing DB Environments : bottlenecks
• Using DB Subsets : more bugs in Prod
• Slow Environment Builds : delays
"If you can't measure it, you can't manage it"
19. II. Data Tax is Huge : 3. App Dev
Inefficient QA : Long Build times
[Chart: build time vs. QA test time; nearly the whole QA cycle is build]
• 96% of QA time was spent building the environment
• $0.04 of every $1.00 went to actual testing vs. setup
20. II. Data Tax is Huge : 3. App Dev
QA Delays: bugs found late = more code re-work
[Diagram: QA environment builds push testing past the end of Sprints 1-3, so a bug coded early is found sprints later]
[Chart: cost to correct a bug (0-70) vs. delay in fixing the bug (1-7); the cost rises exponentially with the delay]
Software Engineering Economics – Barry Boehm (1981)
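Boehm's curve on this slide can be sketched with a toy exponential model; the base cost and growth factor below are illustrative stand-ins, not Boehm's published phase-by-phase multipliers:

```python
# Toy model of Boehm's cost-to-fix curve: the cost of correcting a bug
# grows roughly exponentially with how late it is found. The base cost
# and growth factor are illustrative, not Boehm's published numbers.

def cost_to_fix(phases_delayed, base_cost=1.0, growth=2.0):
    return base_cost * growth ** phases_delayed

for phase, name in enumerate(
        ["requirements", "design", "code", "test", "production"]):
    print(f"{name:>12}: {cost_to_fix(phase):5.1f}x")
```

With a growth factor of 2, a bug that costs 1x to fix at requirements time costs 16x in production, which is the shape (not the exact numbers) of the chart above.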
21. II. Data Tax is Huge : 3. App Dev
Full copies cause bottlenecks
• Old, unrepresentative data
• Frustration waiting
22. II. Data Tax is Huge : 3. App Dev
Subsets cause bugs
23. II. Data Tax is Huge : 3. App Dev
Subsets cause bugs
The Production 'Wall': the classic problem is that queries that run fast on subsets hit the wall in production. Developers are unable to test against all the data.
24. II. Data Tax is Huge : 3. App Dev
Slow Environment Builds: 3-6 Months to Deliver Data
[Workflow: Developer asks for a DB → Manager approves → DBA requests a system → System Admin sets up the machine and requests storage → Storage Admin allocates storage (takes a snapshot) → DBA sets up the DB → Developer gets access]
25. II. Data Tax is Huge : 3. App Dev
Slow Environment Builds
Why are hand-offs so expensive?
26. II. Data Tax is Huge : 3. App Dev
Slow Environment Builds: culture of no
[Cartoon: Developer asks the DBA for an environment; the answer is no]
27. II. Data Tax is Huge : 3. App Dev
Slow Environment Builds
Never enough environments
28. II. Data Tax is Huge : 3. App Dev
What We’ve Seen
Five examples of application Data Tax impact
• Inefficient QA : higher costs
• QA Delays : increased re-work
• Sharing DB : bottlenecks
• Subset DB : bugs
• Slow Environment Builds : delays
29. II. Data Tax is Huge : 4. Business
Ability to capture revenue
• Business Intelligence
– ETLs or data warehouse refreshes
– Old data = less intelligence
– Less intelligence = missed revenue
• Business Applications
– Delays in getting the applications that generate the revenue
30. II. Data Tax is Huge : 4. Business Intelligence
[Chart: BI refresh finish times (8am, noon, 1pm, 10pm) plotted across 2011-2015]
32. II. Data Tax is Huge : 4. Business
Magnitude of business impact
• $27 Billion revenue
• $1 Billion IT spend
• $850M development staff
• $110M IT Ops
• $40M storage
[Bar chart: Revenue, Dev, IT Ops, Storage on a $0-30,000M scale]
33. II. Data Tax is Huge : 4. Business
[Same bar chart: Revenue and Dev are more important; Storage is more obvious]
34. II. Data Tax is Huge : How Big is the Data Tax?
Measured before and after Delphix w/ Fortune 500 customers:
Median App Dev throughput increased by 2x
35. II. Data Tax is Huge : How big is the data tax?
• 10x faster financial close
– Facebook: 21 days down to 2
• 9x faster BI refreshes
– Bank of the West: from 3 weeks per refresh to 3x a week
– Weekly refreshes down to 3 times a day
• 2x faster projects
– Informatica
– KLA-Tencor (5x increase)
– Presbyterian Health
– NY Life
• 20% fewer bugs
– Stubhub
36. II. Data Tax is Huge : Public Customer Quotes
• "Delphix allowed us to shrink our project schedule from 12 months to 6 months.” – BA Scott, NYL VP App Dev
• "It used to take 50-some-odd days to develop an insurance product, … Now we can get a product to the customer in about 23 days.” – Presbyterian Health
• “Can't imagine working without Delphix” – Ramesh Shrinivasan, CA Department of General Services
42. Oracle EM 12c Snap Clone
[Diagram: EM 12c provisions a clone instance from a test master of the source instance, via agents on Linux, a pool, a template, and a zone, on ZFS or NetApp storage; labeled "Oracle Frankenstein"]
Setup:
• Register Netapp or ZFS with storage credentials
• Install agents on a Linux machine to manage the Netapp or ZFS storage
• Register the test master database
• Enable Snap Clone for the test master database
• Set up a zone – set max CPU and memory, and the roles that can see these zones
• Set up a pool – a pool is a set of machines where databases can be provisioned
• Set up a profile – a source database that can be used for thin cloning
• Set up a service template – init.ora values
43. EM 12c: Snap Clone
[Diagram: Production cloned to Development via Netapp Flexclone, managed by Snap Manager for Oracle]
Other technology?
• Prove it
• Bake off
• Customer refs
44. III. Companies unaware of the Data Tax
#1 Biggest Enemy: IT departments believe
– they have the best processes that exist
– they have the latest and greatest technology
– nothing better exists
– it's just the way it is
– data management is a drop in the bucket of overall issues
“The status quo is pre-ordained failure”
45. The Phoenix Project
• IT bottlenecks
• Setting Priorities
• Company Goals
• Defining Metrics
• Fast Iterations
The IT version of “The Goal” by E. Goldratt
• “Any improvement not made at the constraint is an illusion.”
• What is the constraint? The DBAs and environments.
• “One of the most powerful things that IT can do is get environments to development and QA when they need it”
46. III. Companies unaware of the Data Tax
• Ask Questions
– Delphix: We provision environments in minutes for almost no extra storage.
– Customer: We already do that.
– Delphix: How long does it take a developer to get an environment after they ask?
– Customer: 2-3 weeks.
– Delphix: We do it in 2-3 minutes.
No other product does what we do
47. III. Companies unaware of the Data Tax
• How to enlighten companies? Ask for metrics:
– Batch window size for ETL
– How new (old) is their BI data?
– Number of app projects per year
– How long does it take a developer to get a DB copy?
– How long does it take QA to set up an environment?
• How long to rollback?
• How long to refresh?
• How many times do they run a QA cycle?
– How old is the data in QA and DEV?
50. Typical Architecture
[Diagram, repeated from slide 10: Production, Dev/QA/UAT, Reporting, and Backup each run their own instances, databases, and file systems; every environment is a full physical copy of the data.]
52. DevOps With Delphix
1. Efficient QA: low cost, high utilization
2. Quick QA: fast bug fix
3. Every Dev gets a DB: parallelized Dev
4. Full DB: fewer bugs
5. Fast Builds: culture of yes
53. Impact on the bottom line
• IT Capital expense
– 90% less storage
– 30% fewer licenses, servers, floor space, and power
• IT Operations & Personnel
– 1000s of hours down to 10
• Application Development
– 2x project output at higher quality
• Business
– 9x fresher BI data
– 2x faster time to market and higher quality
– Increased revenue
54. Summary
• The data tax IS the big gorilla.
• Delphix Agile data is small & fast
• With Delphix, deliver projects in
– half the time
– with higher quality
– with increased revenue
59. Allocate Any Storage to Delphix
• Allocate storage of any type
• Pure Storage + Delphix: better performance for 1/10 the cost
60. One-time backup of the source database
[Diagram: the production database (instances + file system) is backed up into Delphix once; lists of supported and upcoming sources]
61. DxFS (Delphix) Compresses Data
[Diagram: the production database stored on DxFS]
Data is compressed, typically to 1/3 the size
62. Incremental forever change collection
[Diagram: changes from the production database flow into a time window on the Delphix file system]
• Collected incrementally, forever
• Old data purged
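A minimal sketch of how "incremental forever with purging" might work, assuming (hypothetically) that change sets are kept per day and anything older than the retention window is dropped:

```python
# Hypothetical sketch of incremental-forever collection with a retention
# window: change sets are kept per day, and sets older than the window
# are purged. (A real system would fold purged changes into the base
# image first; omitted here for brevity.)

class TimeWindow:
    def __init__(self, retention_days):
        self.retention_days = retention_days
        self.changes = {}   # day -> {block id: data}

    def collect(self, day, changed_blocks):
        self.changes[day] = changed_blocks
        self.purge(day)

    def purge(self, today):
        for day in list(self.changes):
            if day <= today - self.retention_days:
                del self.changes[day]

    def state_as_of(self, day, base):
        """Materialize the image at `day` from the base plus changes."""
        image = dict(base)
        for d, blocks in sorted(self.changes.items()):
            if d <= day:
                image.update(blocks)
        return image

tw = TimeWindow(retention_days=14)
base = {0: "a", 1: "b"}
tw.collect(1, {1: "b1"})
tw.collect(20, {0: "a20"})     # day 1's change set falls out of the window
print(sorted(tw.changes))      # [20]
```

The key property is that only one initial sync ever happens; afterwards, storage is bounded by the change rate times the retention window, not by the number of full copies.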
82. DevOps With Delphix
1. Efficient QA: low cost, high utilization
2. Quick QA: fast bug fix
3. Every Dev gets a DB: parallelized Dev
4. Full DB: fewer bugs
5. Fast Builds: culture of yes
83. 1. Efficient QA: Lower cost
[Chart: with Delphix, build time shrinks to a sliver and QA testing fills the cycle]
• 1% of QA time spent building the environment
• $0.99 of every $1.00 goes to actual testing vs. setup
90. 1. Forensics: Investigate Production Bugs
[Diagram: Development instances provisioned from the time window]
• Anomaly on Prod: possible code bug at noon yesterday
• Spin up a VDB of Prod as it was during the anomaly
91. 2. Testing : Rewind for patch and QA testing
[Diagram: Prod and Development instances, each with its own time window to rewind to]
92. 2. Testing: A/B
[Diagram: two instances provisioned from the same time window, testing A with Index 1 and B with Index 2]
• Keep tests for comparison
• Production vs Virtual
– invisible index on Prod
– creating the index on the virtual copy
• Flashback vs Virtual
93. 3. Recovery: Surgical recovery of Production
[Diagram: spin a VDB up from the source's time window to just before the drop]
• Problem on Prod: table dropped accidentally
94. 3. Recovery: Surgical or Full Recovery on a VDB
[Diagram: Dev1 VDB provisioned from the source's time window; Dev2 VDB branched from Dev1's own time window]
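Branch-from-a-branch provisioning, as on this slide, can be modeled as a tree of timelines. This is a hypothetical sketch of the idea; the class and method names are invented for illustration, not a Delphix API:

```python
# Hypothetical model of VDB branching: every virtual database points into
# a parent timeline at a chosen time, and its own changes start a new
# timeline from there.

class Timeline:
    def __init__(self, name, parent=None, branch_point=None):
        self.name = name
        self.parent = parent              # timeline we branched from
        self.branch_point = branch_point  # time on the parent timeline
        self.changes = []                 # (time, change) on this timeline

    def provision_vdb(self, name, at_time):
        """Spin up a new virtual database branched at `at_time`."""
        return Timeline(name, parent=self, branch_point=at_time)

    def lineage(self):
        node, chain = self, []
        while node:
            chain.append(node.name)
            node = node.parent
        return list(reversed(chain))

source = Timeline("source")
dev1 = source.provision_vdb("dev1", at_time=100)   # Dev1 VDB from source
dev2 = dev1.provision_vdb("dev2", at_time=150)     # Dev2 branched from Dev1

print(dev2.lineage())   # ['source', 'dev1', 'dev2']
```

Because each branch is just a pointer plus its own change log, "surgical" recovery means provisioning a branch at the moment before the mistake, rather than restoring a physical copy.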
95. 3. Recovery: Virtual to Physical
[Diagram: corruption on the source; spin a VDB up from the time window to before the corruption]
101. ETL and DW Refreshes
[Diagram: Prod feeding the DW & BI instance]
• Data Guard – requires a full refresh if used
• Active Data Guard – read-only; most reports don't work
102. Fast Refreshes
• Collect only changes
• Refresh in minutes
[Diagram: Prod instance feeding BI/DW instances via 24x7 ETL]
Interview: Delphix blew me away. As a DBA I had to spend 50% of my time making copies. After joining Delphix, I banged on Delphix for 2 years. It works. Blog entry: reinforce ideas we've already seen from a different perspective. I work for a company called Delphix. We write software that enables Oracle and SQL Server customers to copy their databases in 2 minutes with almost no storage overhead. We accomplish that by taking one initial copy and sharing the duplicate blocks across all the clones.
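The block-sharing idea in these notes (one initial copy, duplicate blocks shared across all clones) can be sketched as copy-on-write; this is an illustrative model only, not Delphix's actual on-disk format:

```python
# Illustrative copy-on-write block sharing: one physical store, many
# virtual copies. A sketch of the idea only, not Delphix internals.

class BlockStore:
    """Physical block storage shared by every clone."""
    def __init__(self):
        self.blocks = {}   # physical block id -> data
        self.next_id = 0

    def write(self, data):
        pid = self.next_id
        self.blocks[pid] = data
        self.next_id += 1
        return pid

class Clone:
    """A virtual copy: a logical->physical block map (copy-on-write)."""
    def __init__(self, store, block_map):
        self.store = store
        self.map = dict(block_map)   # shares physical blocks, ~zero storage

    def read(self, logical):
        return self.store.blocks[self.map[logical]]

    def write(self, logical, data):
        # Only blocks a clone modifies consume new physical storage.
        self.map[logical] = self.store.write(data)

store = BlockStore()
source_map = {i: store.write(f"block-{i}") for i in range(1000)}

clones = [Clone(store, source_map) for _ in range(5)]  # five "full copies"
clones[0].write(7, "changed")                          # one block diverges

print(len(store.blocks))   # 1001 physical blocks, not ~6000
```

Five "full" copies of a 1000-block source cost one extra block of storage, which is why provisioning can take minutes instead of months.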
Devs want data now. They don't understand DBAs. Databases are bigger and harder to copy. Devs want more copies. Reporting wants more copies. Everyone has storage constraints. If you can't satisfy the business demands, your process is broken.
Moving the data IS the big gorilla. This gorilla of a data tax is hitting your bottom line hard.
There is probably nothing more onerous for a DBA than to hear "can you get me a copy of the production database for my project". RMAN vs Delphix: I was running out of space for RMAN in a live demo! When moving data is too hard, the data in non-production systems such as reporting, development, or QA becomes older, and the older the data, the less actionable intelligence your BI or analytics can give you.
You are paying a data tax. You are paying a data tax moving and copying data, over and over again: business intelligence and analytics; project data for development, testing, QA, integration, UAT, and training; data protection backups for recovery.
We know from our experience that there are some $1B+ data center consolidation price tags. Taking even 30% of the cost out of that, and cutting the timeline, is a strong and powerful way to improve margin. What about really big problems like consolidating data center real estate, or moving to the cloud? If you can non-disruptively collect the data, and easily and repeatedly present it in the target data center, you take huge chunks out of these migration timelines. Moreover, with data being so easy to move on demand, you neutralize the hordes of users who insist that there isn't enough time to do this, or it's too hard, or too risky. Annual time spent copying databases can measure in the 1000s of hours just for DBAs, not including all the other personnel required to supply the necessary infrastructure.
Data gets old because it is not refreshed. Instead of running 5 tests in two weeks (because it takes me 2 days to roll back after each of my 1-hour tests) and paying the cost of bugs slipping into production, what if I could run 15 tests in that same two weeks and have no bugs at all in production?
And they told us that they spend 96% of their QA cycle time building the QA environment, and only 4% actually running the QA suite. This happens for every QA suite, meaning that for every dollar spent on QA there was only 4 cents of actual QA value; 96% of the cost is spent on infrastructure time and overhead.
Because of the time required to set up QA environments, the actual QA test suites lag behind the end of a sprint or code freeze. That means the amount of time between the introduction of a bug in the code and the moment the bug is found increases. And the more time that goes by after the introduction of a bug into the code, the more code is written on top of the bug, increasing the amount of rework required after the bug is finally found. In his seminal book, "Software Engineering Economics", Barry Boehm introduced the computing world to the idea that the longer one delays fixing a bug in the application development lifecycle, the more expensive it is to fix that bug, and these costs rise exponentially the later the bug is addressed in the cycle.
Not sure if you've run into this, but I have personally experienced the following. When I was talking to one group at Ebay, that development group shared a single copy of the production database between the developers on the team. What this sharing meant is that whenever a developer wanted to modify that database, they had to submit their changes to code review, and that code review took 1 to 2 weeks. I don't know about you, but that kind of delay would stifle my motivation, and I have direct experience with the kind of disgruntlement it can cause. When I was last a DBA, all schema changes went through me. It took me about half a day to process schema changes. That delay was too much, so the developers unilaterally decided to go to an EAV (entity-attribute-value) schema, which meant developers could add new fields without consulting me and without stepping on each other's feet. It also meant the SQL code was unreadable and performance was atrocious. Besides creating developer frustration, sharing a database also makes refreshing the data difficult, as it takes a while to refresh the full copy, and even longer to coordinate a time when everyone stops using the copy to make the refresh. All this means the copy rarely gets refreshed and the data gets old and unreliable.
To circumvent the problems of sharing a single copy of production, many shops we talk to create subsets. One company we talked to spends 50% of its time copying databases; they have to subset because there is not enough storage, and the subsetting process constantly needs fixing and modification. Now, what happens when developers use subsets?
Stubhub (Ebay) estimates that 20% of their production bugs arise from testing on subsets instead of full database copies.
Due to the constraints of building clone database environments, one ends up in the "culture of no", where developers stop asking for a copy of a production database because the answer is always "no". If the developers need to debug an anomaly seen on production, or need to write a custom module that requires a copy of production, they know not to even ask, and just give up.
If Walmart in New York sold Lego Batman like hotcakes the morning it came out, wouldn't it be good to know at Walmart California? Week-old data happens when refreshes are too disruptive and limited to weekends.
You might be familiar with this cycle that we've seen in the industry: IT department budgets are being constrained. When IT budgets are constrained, one of the first targets is reducing storage. As storage budgets are reduced, the ability to provision database copies and development environments goes down. As development environments become constrained, projects start to hit delays. As projects are delayed, the applications the business depends on to generate the revenue that pays for IT budgets are delayed, which reduces revenue as the business cannot access new applications, which in turn puts more pressure on the IT budget. It becomes a vicious circle.
From our experience before and after with Fortune 500 companies
How big is the data tax? One way we can measure it is by looking at the improvements in project timelines at companies that have eliminated this data tax by implementing a data virtualization appliance (DVA) and creating an agile data platform (ADP). Agile data is data that is delivered to the exact spot it's needed, just in time, and with much less time, cost, and effort. By comparing productivity rates after implementing an ADP to those before, we can get an idea of the price of the data tax without an ADP. IT experts building mission-critical systems for Fortune 500 companies have seen real project returns averaging 20-50% productivity increases after having implemented an ADP. That's a big data tax to pay without an ADP. The data tax is real, and once you understand how real it is, you realize how many of your key business decisions and strategies are affected by the agility of the data in your applications. It took us 50 days to develop an insurance product... now we can get a product to the customer in 23 days with Delphix.
Internet vs browser. Automate or die: the revolution will be automated. The worst enemy of companies today is thinking that they have the best processes that exist, that their IT organizations are using the latest and greatest technology, and that nothing better exists in the field. This mentality will be the undermining of many companies. (http://www.kylehailey.com/automate-or-die-the-revolution-will-be-automated/) Data IS the constraint. Business skeptics are saying to themselves that data processes are just a rounding error in most of their project timelines, and that they are sure their IT has developed processes to fix that. That's the fundamental mistake. The very large and often hidden data tax lies in all the ways that we've optimized our software, data protection, and decision systems around the expectation that data is simply not agile. The belief that there is no agility problem is part of the problem. (http://www.kylehailey.com/data-is-the-constraint/)
Source syncing: initial backup once only; continual forever change collection; purging of old data. Storage (DxFS): share blocks and snapshots (unlimited, storage-agnostic); compression, typically to 1/3, compressed on block boundaries (the overhead for compression is basically undetectable); share data in memory, super caching.* Self-service automation: virtual database provisioning, rollback, refresh*, branching*, tagging*; mount files over NFS; init.ora, SID, database name, database unique name; security on who can see which source databases, how many clones they can make, and how much storage they can use.
Don't know anyone successfully using this, yet.
If you look at what’s really impeding flow from development to operations to the customer, it’s typically IT operations. Operations can never deliver environments upon demand; you have to wait months or quarters to get a test environment. When that happens, terrible things happen: people actually hoard environments. People invite others onto their teams because they know they have a reputation for having a cluster of test environments, so people end up testing on environments that are years old, which doesn’t actually achieve the goal.

One of the most powerful things organizations can do is enable development and testing to get the environments they need when they need them.

“One of the best predictors of DevOps performance is that IT Operations can make environments available on-demand to Development and Test, so that they can build and test the application in an environment that is synchronized with Production.”

Eliyahu Goldratt
How long does it take a developer to get a copy of a database?
Moving the data IS the big gorilla. Eliminating the data tax is crucial to the success of your company. And, if huge databases can be ready at target data centers in minutes, the rest of the excuses are flimsy. Agile data – virtualized data – uses a small footprint. A truly agile data platform can deliver full size datasets cheaper than subsets. A truly agile data platform can move the time or the location pointer on its data very rapidly, and can store any version that’s needed in a library at an unbelievably low cost. And, a truly agile data platform can massively improve app quality by making it reliable and dead simple to return to a common baseline for one or many databases in a very short amount of time. Applications delivered with agile data can afford a lot more full size virtual copies, eliminating wait time and extra work caused by sharing, as well as side effects. With the cost of data falling so dramatically, business can radically increase their utilization of existing hardware and storage, delivering much more rapidly without any additional cost. An agile data platform presents data so rapidly and reliably that the data becomes commoditized – and servers that sit idle because it would just take too long to rebuild can now switch roles on demand.
In the physical database world, 3 clones take up 3x the storage. In the virtual world, 3 clones take up 1/3 the storage, thanks to block sharing and compression.
The software installs on any x86 hardware and uses any storage. It supports Oracle 9.2–12c, Standard Edition and Enterprise Edition, single instance and RAC, on AIX, SPARC, HP-UX, and Linux. It also supports SQL Server.
EMC, NetApp, Fujitsu, or newer flash storage like Violin, Pure Storage, Fusion-io, etc.
Delphix does a one-time-only copy of the source database onto Delphix.
Physically independent but logically correlated. Cloning multiple source databases at the same time can be a daunting task.
One example from our customers is Informatica, who had a project to integrate 6 databases into one central database. The project was estimated at 12 months, with much of that time coming from trying to orchestrate getting copies of the 6 databases at the same point in time: like herding cats.
Informatica had a 12-month project to integrate 6 databases. After installing Delphix they did it in 6 months: “I delivered this early, I generated more revenue, I freed up money and put it into innovation.” They won an award from Ventana Research for this project.
Developers each get a copy: fast, fresh, full, frequent. Self service. QA branches from Development. Federated cloning made easy. Forensics, A/B testing, recovery (logical and physical).

Development
- Provision and refresh: full, fresh, frequent (many)
- Source control for code, data control for the database
- Data version per release version
- Federated cloning

QA
- Fork copies off to QA
- Fork copies back to Dev
- Instant replay: set up and run destructive tests

Performance
- A/B testing
- Upgrade patching

Recovery
- Backup: 50 days in the size of 1 copy, continuous data protection (use recent slide on backup schedules: full, incr, incr, incr, full)
- Restore: logical recovery on prod, logical recovery on Dev

Debugging
- Debug on a clone instead of prod
- Debug on data from the time of a problem
- Validate physical integrity (test for physical corruption)
Presbyterian went from 10-hour builds to 10-minute builds.

Total investment in the test environment: $2M/year
- 10 QA engineers
- DBA and storage teams dedicated to supporting testing
- App, Oracle server, storage, backups
- Restore load competes with backup jobs

Requirements: fast data refresh, rollback. Data delivery takes 480 minutes of a 500-minute test cycle (4% value): only $0.04 of every $1.00 goes to actual testing vs. setup.
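The "4% value" figure falls straight out of the cycle numbers on the slide:

```python
# Presbyterian's test cycle, per the slide: data delivery consumes
# 480 of every 500 minutes, so only 4% of the cycle is real testing.
CYCLE_MIN = 500
DATA_DELIVERY_MIN = 480

testing_fraction = (CYCLE_MIN - DATA_DELIVERY_MIN) / CYCLE_MIN
print(f"{testing_fraction:.0%} of the cycle is actual testing")  # 4%
print(f"${testing_fraction:.2f} of every $1.00")                 # $0.04
```

Cutting data delivery from hours to minutes flips that ratio, which is the whole argument of the slide.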
For example, StubHub went from 5 copies of production in development to 120, giving each developer their own copy.
StubHub estimated a 20% reduction in bugs that made it to production.
Multiple scripted dumps or RMAN backups are used to move data today. With application awareness, we request only changed blocks, dramatically reducing production load by as much as 80%. We also eliminate the need for DBAs to manage custom scripts, which are expensive to maintain and support over time.
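To make the changed-block idea concrete, here is a minimal sketch, not Delphix's implementation: hash each fixed-size block and ship only the blocks whose hash differs from the previous sync. The 8 KB block size is just Oracle's common default, used for illustration.

```python
# Minimal sketch of change-block collection: compare per-block hashes
# against the previous sync and return only the indexes that changed.
import hashlib

BLOCK_SIZE = 8192  # e.g. Oracle's common default block size

def block_hashes(data: bytes) -> list[str]:
    """Hash each fixed-size block of the file."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]

def changed_blocks(old_hashes: list[str], new_data: bytes) -> list[int]:
    """Indexes of blocks that differ from (or extend past) the old sync."""
    new_hashes = block_hashes(new_data)
    return [i for i, h in enumerate(new_hashes)
            if i >= len(old_hashes) or h != old_hashes[i]]

# Example: flip one byte in the second block of a 4-block file.
old = bytes(BLOCK_SIZE * 4)
new = bytearray(old)
new[BLOCK_SIZE + 1] = 0xFF
diff = changed_blocks(block_hashes(old), bytes(new))
print(diff)  # [1]
```

Only 1 of 4 blocks would move over the wire here; on a real database where most blocks are cold, that is where the large reduction in production load comes from.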
One Last Thing
http://www.dadbm.com/wp-content/uploads/2013/01/12c_pluggable_database_vs_separate_database.png
250 PDBs x 200 GB = 50 TB
EMC sells 1 GB of RAM for roughly $1,000; Dell sells 32 GB for $1,000.
A terabyte of RAM on a Dell costs around $32,000. A terabyte of RAM on a VMAX 40k costs around $1,000,000.
http://www.emc.com/collateral/emcwsca/master-price-list.pdf
These prices appear on pages 897/898 of the list:
- A storage engine for the VMAX 40k with 256 GB RAM is around $393,000
- A storage engine for the VMAX 40k with 48 GB RAM is around $200,000
So the cost of RAM here is $193,000 / 208 GB, roughly $927 per gigabyte. That seems like a good deal for EMC, as Dell sells 32 GB RAM DIMMs for just over $1,000. So a terabyte of RAM on a Dell costs around $32,000, while a terabyte of RAM on a VMAX 40k costs around $1,000,000.
2) Most DBs have a buffer cache that is less than 0.5% (not 5%, 0.5%) of the datafile size.
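The price-list arithmetic above checks out as follows (all dollar figures are the slide's numbers; the per-GB result rounds to $928 rather than the slide's truncated $927):

```python
# Reproducing the slide's RAM price arithmetic from EMC's price list.
vmax_256gb = 393_000   # VMAX 40k engine with 256 GB RAM
vmax_48gb = 200_000    # VMAX 40k engine with 48 GB RAM

per_gb = (vmax_256gb - vmax_48gb) / (256 - 48)   # $193,000 / 208 GB
print(round(per_gb))  # 928 -- the slide rounds this to $927

# Extrapolated to a full terabyte (~$950k, which the slide rounds
# up to "around $1,000,000"):
print(round(per_gb * 1024))

dell_per_gb = 1_000 / 32   # 32 GB Dell DIMM for just over $1,000
print(dell_per_gb * 1024)  # 32000.0 -- ~$32,000 per TB on a Dell
```

The roughly 30x per-gigabyte gap is the point: caching 50 TB of pluggable databases in SAN RAM is wildly more expensive than caching shared virtual copies in commodity server RAM.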