HP underwent a large-scale data center transformation project to consolidate over 85 global data centers into six new next-generation data centers located in three zones across the US. This consolidation aimed to standardize HP's technology environment, retire legacy applications, build state-of-the-art infrastructure, automate monitoring and control, improve business continuity, and significantly reduce IT costs. The new data centers employ technologies like Dynamic Smart Cooling and are designed for high availability, disaster recovery, and rapid service delivery.
Business analytics (BA) refers to the methods and techniques used to measure business performance. BA uses statistical analysis to transform raw data into meaningful insights. There are six major components of a BA solution: data mining, forecasting, predictive analytics, optimization, text mining, and visualization.
BA can be categorized into descriptive, predictive, and prescriptive analytics. Descriptive analytics answers "what happened" by analyzing past data. Predictive analytics predicts future outcomes and answers "what could happen." Prescriptive analytics determines optimal courses of action and answers "what should we do?" Together, these three categories of BA provide businesses with insights from data to improve decision-making.
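The three categories above can be illustrated with a toy sketch. This is a hedged, minimal example using made-up monthly sales figures and a made-up demand model, not a real analytics workflow: the average summarizes the past (descriptive), a naive trend extrapolation projects the next month (predictive), and picking the discount with the best projected margin chooses an action (prescriptive).

```python
# Toy monthly sales data (hypothetical figures).
monthly_sales = [100, 110, 120, 130, 140, 150]

# Descriptive: "what happened" -- summarize past data.
average = sum(monthly_sales) / len(monthly_sales)

# Predictive: "what could happen" -- naive linear trend extrapolation.
trend = monthly_sales[-1] - monthly_sales[-2]   # recent month-over-month change
forecast = monthly_sales[-1] + trend            # projected next month

# Prescriptive: "what should we do" -- evaluate options under a made-up
# demand model (higher discount lifts units but cuts margin) and pick the best.
options = {discount: forecast * (1 + discount / 10) * (1 - discount / 100) * 0.2
           for discount in (0, 5, 10, 15)}
best_discount = max(options, key=options.get)

print(average, forecast, best_discount)
```

Real prescriptive analytics would rest on a validated demand model and an optimization method; the point here is only the progression from summarizing, to forecasting, to recommending an action.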
The Path to Data and Analytics Modernization (Analytics8)
Learn about the business demands driving modernization, the benefits of doing so, and how to get started.
Can your data and analytics solutions handle today’s challenges?
To stay competitive in today’s market, companies must be able to use their data to make better decisions. However, we are living in a world flooded by data, new technologies, and demands from the business for better and more advanced analytics. Most companies do not have the modern technologies and processes in place to keep up with these growing demands. They need to modernize how they collect, analyze, use, and share their data.
In this webinar, we discuss how you can build modern data and analytics solutions that are future-ready, scalable, real-time, high-speed, and agile, and that enable better use of data throughout your company.
We cover:
-The business demands and industry shifts that are impacting the need to modernize
-The benefits of data and analytics modernization
-How to approach data and analytics modernization: the steps you need to take and how to get it right
-The pillars of modern data management
-Tips for migrating from legacy analytics tools to modern, next-gen platforms
-Lessons learned from companies that have gone through the modernization process
Business analytics workshop presentation (Brian Beveridge)
This document outlines an agenda and presentation for a business analytics seminar for credit union executives and board directors. The presentation will define business analytics, explain how it can help credit unions address key issues like margin compression and regulatory compliance, and provide examples of how analytics can be applied to areas like marketing, risk management, and branch performance. Attendees will learn how predictive analytics can help credit unions retain members, optimize pricing, and streamline operations. The presentation will also cover getting started with business analytics projects.
This document discusses different types of data analytics, including web, mobile, retail, social media, and unstructured analytics. It defines business analytics as the integration of disparate internal and external data sources to answer forward-looking business questions tied to key objectives. Big data comes from various sources like web behavior and social media, while little data refers to any data not considered big data. Successful analytics requires addressing business challenges, having a strong data foundation, implementing solutions with goals in mind, generating insights, measuring results, sharing knowledge, and innovating approaches. The future of analytics involves every company having a data strategy and using tools to augment internal data. Predictive analytics tells what will happen, while prescriptive analytics tells how to make it happen.
The Data Driven Enterprise - Roadmap to Big Data & Analytics Success (BigInsights)
The document discusses how data-driven companies are performing better financially and outlines the benefits of big data and analytics. It provides examples of companies using big data and analytics to improve customer experience through personalization, predict maintenance needs, and identify at-risk veterans to prevent suicide. The challenges of big data are also reviewed. Finally, it proposes a seven-step methodology for leveraging big data and analytics to address critical business challenges.
These presentations were created by Tushar B Kute to teach the subject 'Management Information System' for TEIT at the University of Pune.
http://www.tusharkute.com
The document discusses data analytics and provides examples of its applications. It defines analytics as the transformation of data into insights for decision making. There are four main types of analytics: descriptive analytics examines what is happening; diagnostic analytics explains why things happened; predictive analytics forecasts how patterns will play out in the future; and prescriptive analytics determines future actions based on trends. The document also outlines elements of data analytics such as data, processes, skills, and tools. It provides a case study example and discusses how internal audit and fraud detection can utilize analytics.
This document discusses business analytics. It defines business analytics as using data, statistical and quantitative analysis, explanatory and predictive models to gain insights and support decision-making. The document outlines the typical business analytics process, including understanding the business objectives, assessing the situation, collecting and preparing data, developing analytic models, evaluating and reporting results, and deploying the outcomes. It provides examples of how analytics can be used to drive personalized customer services, optimize people management decisions, and conduct real-time sentiment analysis of social media data for an FMCG company. The document concludes with lessons learned, emphasizing the importance of continuous learning, gaining experience through projects and mentoring, and having confidence in one's abilities.
This document summarizes a presentation on clinical information governance at GlaxoSmithKline (GSK). GSK is combining data modelling, master data management, enterprise service bus, data stewardship, and enterprise architecture to simplify managing clinical study information. They have established different levels of data stewardship accountability and are implementing a clinical data stewardship framework. Their goal is to transform how clinical trial data is collected, reported, archived and retrieved to make trials more efficient and enhance patient safety.
Analytics, Business Intelligence, and Data Science - What's the Progression? (DATAVERSITY)
Data analysis can include looking back at historical data, understanding what an organization currently has, and even looking forward to predictions of the future. This presentation will talk about the differences between analytics, business intelligence, and data science, as well as the differences in architecture — and possibly even organization maturity — that make each successful.
Learn more about these topics we will explore including:
Defining analytics, business intelligence, and data science
Differences in architecture
When to use analytics, business intelligence, or data science
Whether there has been an evolution between analytics, business intelligence, and data science
The data governance function exercises authority and control over the management of your mission-critical data assets and guides how all other data management functions are performed. When selling data governance to organizational management, it is useful to concentrate on the specifics that motivate the initiative. This means developing a specific vocabulary and set of narratives to facilitate understanding of your organizational business concepts. This webinar provides you with an understanding of which data governance functions are required and how they fit with other data management disciplines. Understanding these aspects is a necessary prerequisite to eliminating the ambiguity that often surrounds initial discussions and to implementing effective data governance and stewardship programs that manage data in support of organizational strategy.
Find more of our Data-Ed webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
Business intelligence is the process of collecting raw data from various sources, analyzing it to draw meaningful conclusions, and presenting it to drive business decisions. It involves technologies that convert data into useful information to support decision making. Over time, tools like data warehouses, OLAP, and ETL were developed to facilitate analyzing large datasets and generating insights. Business intelligence aims to provide strategic decision support through data exploration, data mining, optimization, and ultimately informing decisions.
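The collect-analyze-present flow described above can be sketched as a minimal ETL (extract, transform, load) pipeline. All names and records here are illustrative stand-ins, not a real BI tool's API: extraction pulls raw rows from a source, transformation cleanses and type-converts them, and loading aggregates them into a summary a reporting layer could consume.

```python
from collections import defaultdict

def extract():
    # Stand-in for pulling raw rows from operational sources (DBs, logs, APIs).
    return [
        {"region": "north", "amount": "120.50"},
        {"region": "south", "amount": "80.00"},
        {"region": "north", "amount": "99.50"},
    ]

def transform(rows):
    # Cleanse and type-convert -- the step ETL tools automate at scale.
    return [{"region": r["region"], "amount": float(r["amount"])} for r in rows]

def load(rows):
    # Aggregate into a warehouse-style summary (region -> total sales).
    summary = defaultdict(float)
    for r in rows:
        summary[r["region"]] += r["amount"]
    return dict(summary)

warehouse = load(transform(extract()))
print(warehouse)  # {'north': 220.0, 'south': 80.0}
```

Production ETL adds scheduling, incremental loads, and error handling, but the three-stage shape is the same.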
This document discusses the use of data science in modern banking. It provides an overview of Raiffeisen Bank, which uses data science for applications like customer profiling, churn prediction, and fraud prevention. It then describes a datathon use case to build predictive models for new customers using external open data to supplement limited internal customer information. Finally, it outlines the daily work and benefits of being a data scientist at Raiffeisen Bank.
Big data in healthcare refers to large, diverse, and complex datasets that are difficult to analyze using traditional methods. The healthcare industry generates huge amounts of data from sources like electronic health records, medical imaging, and fitness trackers. Analyzing this big data can help improve patient outcomes, reduce costs, and advance personalized medicine. However, healthcare also faces challenges like data silos, privacy concerns, and resistance to change. Opportunities include disease prediction and prevention, reducing readmissions and fraud, and optimizing care through remote monitoring. Some organizations are starting to see benefits from big data initiatives focused on areas like evidence-based treatment and integrated health records.
The document discusses IT infrastructure, which includes hardware, software, and services required to operate an enterprise. It describes different levels of infrastructure including public, enterprise, and business unit levels. It also discusses various infrastructure components such as operating systems, enterprise applications, data management, networking, internet platforms, and consulting services. Key trends discussed include grid computing, on-demand computing, edge computing, and the rise of Linux and open-source software.
1. The document discusses the management information systems used at ICICI Bank. It describes how ICICI Bank has evolved over time from being formed in 1955 to becoming a diversified financial services group today.
2. It outlines the key information systems used at different levels of the bank, including transaction processing systems, management information systems, and enterprise information systems that support functions like deposits, loans, payments, and online services.
3. The document also summarizes some of the major software and technologies used at ICICI Bank to power its information systems and enable key operations like customer relationship management, risk management, and remote monitoring of infrastructure.
This is a presentation from a meetup called "Business of Data Science". Data science is being leveraged extensively in banking and financial services, and this presentation provides a brief, fundamental overview of this evergreen field.
This document discusses the scope, growth, and career opportunities in analytics. It defines analytics as the process of analyzing large datasets to discover useful patterns and insights. Analytics helps organizations make better, faster decisions by identifying opportunities for improvement. The analytics market in India is currently worth $375 million and is expected to grow to $1.15 billion. Analytics jobs in India range from 500 to 800 analysts out of every 10,000 employees at a company. Salaries for analytics professionals increase with experience, ranging from 3.2 lakhs for 0-2 years of experience to over 27 lakhs for more than 12 years. The field of analytics is in high demand and is considered the sexiest job of the 21st century.
The document provides an overview of management information systems (MIS). It discusses how MIS are integrated collections of subsystems that support decision-making through routine reports. It outlines the key components of MIS including operations support systems, management support systems, and examples of financial and manufacturing MIS.
This document provides an overview of data science, including its history, definition, applications, challenges, career opportunities, required skills, courses, jobs, and salaries. Data science emerged in the 1960s to help interpret large amounts of gathered data and uses computer science and statistics to gain insights from data in many fields. It allows businesses to understand vast data sources for informed decisions. Common data science jobs include data scientist, data analyst, and data engineer.
The document discusses web-based decision support systems (DSS). It outlines the tasks of conventional DSS and how the internet and web enable new approaches. A web-based DSS delivers decision support tools over the internet while a web-enabled DSS incorporates web technologies. Recent research focuses on architectures, technologies, and applications like a hospital management system and e-commerce risk analysis tool. Benefits include increased availability while challenges involve technological issues adapting DSS for the web and economic questions around new payment models.
This presentation reports on data governance best practices. Based on a definition of fundamental terms and the business rationale for data governance, a set of case studies from leading companies is presented. The content of this presentation is a result of the Competence Center Corporate Data Quality (CC CDQ) at the University of St. Gallen, Switzerland.
Data Marketplace and the Role of Data Virtualization (Denodo)
Watch full webinar here: https://bit.ly/3IS9sQS
A data marketplace is like an online shopping interface specializing in data. Ideally, it should work just like an online store, with minimal latency and maximum responsiveness. However, this does not mean that all of the data in the data marketplace needs to be stored in the same central repository.
In this session, Shadab Hussain, Americas Sales Head, Data Analytics at Wipro, a partner company with Denodo and a co-sponsor of DataFest 2021, talks about the role of data virtualization in enabling full-featured data marketplaces. Such data marketplaces provide real-time, curated access to data, even when the data is stored across many different sources throughout the organization.
You will learn:
- The main features of a data marketplace
- Why organizations need data marketplaces
- Why data marketplaces sometimes fail
- How data virtualization enables the most effective data marketplaces
- How one of Europe’s premier public healthcare system organizations leveraged a data marketplace to improve data consumption and ease of access
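The core data-virtualization idea described above can be sketched in a few lines. This is a hedged illustration, not Denodo's architecture, and every class and field name is made up: a virtual layer presents one logical view and federates each request to the underlying sources at access time, so nothing has to be copied into a central repository.

```python
class CrmSource:
    def query(self, customer_id):
        return {"name": "Acme Ltd"}     # stand-in for a CRM lookup

class BillingSource:
    def query(self, customer_id):
        return {"balance": 1250.0}      # stand-in for a billing-system lookup

class VirtualLayer:
    """One logical view; the data stays in the source systems."""
    def __init__(self, *sources):
        self.sources = sources

    def customer_view(self, customer_id):
        merged = {"customer_id": customer_id}
        for source in self.sources:     # fetch from each source on demand
            merged.update(source.query(customer_id))
        return merged

marketplace = VirtualLayer(CrmSource(), BillingSource())
print(marketplace.customer_view(42))
```

A real virtualization layer adds query pushdown, caching, and security policies, but the consumer-facing contract is the same: one interface, many live sources.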
An attempt at categorizing the thriving big data ecosystem by @mattturck and @shivonZ - comments are welcome (please add your thoughts on mattturck.com)
Business intelligence (BI) refers to techniques used to analyze business data and present it to facilitate decision making. BI technologies provide historical, current, and predictive views of business operations to support better decisions. The major components of BI include applications like reporting, analytics, and dashboards. While BI helps improve productivity, decision making, and results, it also faces disadvantages like data piling, costs, and complexity.
This document discusses data quality and its importance for businesses. It provides a case study of how British Airways improved data quality which increased efficiency and decision making. An insurance case study shows how improving data quality led to better customer understanding and risk assessment. Finally, the document outlines key drivers of data quality including regulatory compliance, business intelligence, and customer-centric models.
Download at http://DavidHubbard.net/powerpoint - This Introduction to Business Intelligence gives an overview of how Business Intelligence fits into business strategy in general. It does not go into the specific technologies of Business Intelligence. It is meant to be used to explain Business Intelligence to those not already familiar with Business Intelligence.
This presentation discusses how technology can be used to teach graphing equations to 9th grade algebra students. It provides examples of blogs, videos, websites and apps that teachers can use to enhance instruction and help students better understand graphing, including Khan Academy videos, Desmos graphing calculator, and interactive websites with lessons, worksheets and activities. The presentation evaluates the reliability of internet sources and properly cites the visuals used.
The Edge of Disaster Recovery - May Events Presentation (John Baumgarten)
Peak 10 provides disaster recovery services including disaster recovery as a service (DRaaS). Their approach involves replicating customer VMs to their Recovery Cloud using Zerto virtual replication appliances with recovery point objectives of seconds and recovery time objectives of minutes. Peak 10 manages the disaster recovery environment including ongoing monitoring, twice annual testing, and support for declaration events. Their DRaaS solution is hypervisor-agnostic, storage-agnostic, and can scale on demand.
This document summarizes a presentation on clinical information governance at GlaxoSmithKline (GSK). GSK is combining data modelling, master data management, enterprise service bus, data stewardship, and enterprise architecture to simplify managing clinical study information. They have established different levels of data stewardship accountability and are implementing a clinical data stewardship framework. Their goal is to transform how clinical trial data is collected, reported, archived and retrieved to make trials more efficient and enhance patient safety.
Analytics, Business Intelligence, and Data Science - What's the Progression?DATAVERSITY
Data analysis can include looking back at historical data, understanding what an organization currently has, and even looking forward to predictions of the future. This presentation will talk about the differences between analytics, business intelligence, and data science, as well as the differences in architecture — and possibly even organization maturity — that make each successful.
Learn more about these topics we will explore including:
Defining analytics, business intelligence, and data science
Differences in architecture
When to use analytics, business intelligence, or data science
Whether there has been an evolution between analytics, business intelligence, and data science
The data governance function exercises authority and control over the management of your mission critical assets and guides how all other data management functions are performed. When selling data governance to organizational management, it is useful to concentrate on the specifics that motivate the initiative. This means developing a specific vocabulary and set of narratives to facilitate understanding of your organizational business concepts. This webinar provides you with an understanding of what data governance functions are required and how they fit with other data management disciplines. Understanding these aspects is a necessary pre-requisite to eliminate the ambiguity that often surrounds initial discussions and implement effective data governance and stewardship programs that manage data in support of organizational strategy.
Find more of our Data-Ed webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
Business intelligence is the process of collecting raw data from various sources, analyzing it to draw meaningful conclusions, and presenting it to drive business decisions. It involves technologies that convert data into useful information to support decision making. Over time, tools like data warehouses, OLAP, and ETL were developed to facilitate analyzing large datasets and generating insights. Business intelligence aims to provide strategic decision support through data exploration, data mining, optimization, and ultimately informing decisions.
This document discusses the use of data science in modern banking. It provides an overview of Raiffeisen Bank, which uses data science for applications like customer profiling, churn prediction, and fraud prevention. It then describes a datathon use case to build predictive models for new customers using external open data to supplement limited internal customer information. Finally, it outlines the daily work and benefits of being a data scientist at Raiffeisen Bank.
Big data in healthcare refers to large, diverse, and complex datasets that are difficult to analyze using traditional methods. The healthcare industry generates huge amounts of data from sources like electronic health records, medical imaging, and fitness trackers. Analyzing this big data can help improve patient outcomes, reduce costs, and advance personalized medicine. However, healthcare also faces challenges like data silos, privacy concerns, and resistance to change. Opportunities include disease prediction and prevention, reducing readmissions and fraud, and optimizing care through remote monitoring. Some organizations are starting to see benefits from big data initiatives focused on areas like evidence-based treatment and integrated health records.
The document discusses IT infrastructure, which includes hardware, software, and services required to operate an enterprise. It describes different levels of infrastructure including public, enterprise, and business unit levels. It also discusses various infrastructure components such as operating systems, enterprise applications, data management, networking, internet platforms, and consulting services. Key trends discussed include grid computing, on-demand computing, edge computing, and the rise of Linux and open-source software.
1. The document discusses the management information systems used at ICICI Bank. It describes how ICICI Bank has evolved over time from being formed in 1955 to becoming a diversified financial services group today.
2. It outlines the key information systems used at different levels of the bank, including transaction processing systems, management information systems, and enterprise information systems that support functions like deposits, loans, payments, and online services.
3. The document also summarizes some of the major software and technologies used at ICICI Bank to power its information systems and enable key operations like customer relationship management, risk management, and remote monitoring of infrastructure.
This is a presentation in a meetup called "Business of Data Science". Data science is being leveraged extensively in the field of Banking and Financial Services and this presentation will give a brief and fundamental highlight to the evergreen field.
This document discusses the scope, growth, and career opportunities in analytics. It defines analytics as the process of analyzing large datasets to discover useful patterns and insights. Analytics helps organizations make better, faster decisions by identifying opportunities for improvement. The analytics market in India is worth $375 million currently and is expected to increase to $1.15 billion. Analytics jobs in India range from 500 to 800 analysts out of every 10,000 employees at a company. Salaries for analytics professionals increase with more years of experience, ranging from 3.2 lakhs for 0-2 years of experience to over 27 lakhs for more than 12 years of experience. The field of analytics is highly in demand and is considered the sexiest job of
The document provides an overview of management information systems (MIS). It discusses how MIS are integrated collections of subsystems that support decision-making through routine reports. It outlines the key components of MIS including operations support systems, management support systems, and examples of financial and manufacturing MIS.
This document provides an overview of data science, including its history, definition, applications, challenges, career opportunities, required skills, courses, jobs, and salaries. Data science emerged in the 1960s to help interpret large amounts of gathered data and uses computer science and statistics to gain insights from data in many fields. It allows businesses to understand vast data sources for informed decisions. Common data science jobs include data scientist, data analyst, and data engineer.
The document discusses web-based decision support systems (DSS). It outlines the tasks of conventional DSS and how the internet and web enable new approaches. A web-based DSS delivers decision support tools over the internet while a web-enabled DSS incorporates web technologies. Recent research focuses on architectures, technologies, and applications like a hospital management system and e-commerce risk analysis tool. Benefits include increased availability while challenges involve technological issues adapting DSS for the web and economic questions around new payment models.
This presentation reports on data governance best practices. Based on a definition of fundamental terms and the business rationale for data governance, a set of case studies from leading companies is presented. The content of this presentation is a result of the Competence Center Corporate Data Quality (CC CDQ) at the University of St. Gallen, Switzerland.
Data Marketplace and the Role of Data Virtualization (Denodo)
Watch full webinar here: https://bit.ly/3IS9sQS
A data marketplace is like an online shopping interface specializing in data. Ideally, it should work just like an online store, with minimal latency and maximum responsiveness. However, this does not mean that all of the data in the data marketplace needs to be stored in the same central repository.
In this session, Shadab Hussain, Americas Sales Head, Data Analytics at Wipro, a partner company with Denodo and a co-sponsor of DataFest 2021, talks about the role of data virtualization in enabling full-featured data marketplaces. Such data marketplaces provide real-time, curated access to data, even when the data is stored across many different sources throughout the organization.
You will learn:
- The main features of a data marketplace
- Why organizations need data marketplaces
- Why data marketplaces sometimes fail
- How data virtualization enables the most effective data marketplaces
- How one of Europe’s premier public healthcare system organizations leveraged a data marketplace to improve data consumption and ease of access
An attempt at categorizing the thriving big data ecosystem by @mattturck and @shivonZ - comments are welcome (please add your thoughts on mattturck.com)
Business intelligence (BI) refers to techniques used to analyze business data and present it to facilitate decision making. BI technologies provide historical, current, and predictive views of business operations to support better decisions. The major components of BI include applications like reporting, analytics, and dashboards. While BI helps improve productivity, decision making, and results, it also faces disadvantages like data piling, costs, and complexity.
This document discusses data quality and its importance for businesses. It provides a case study of how British Airways improved data quality which increased efficiency and decision making. An insurance case study shows how improving data quality led to better customer understanding and risk assessment. Finally, the document outlines key drivers of data quality including regulatory compliance, business intelligence, and customer-centric models.
Download at http://DavidHubbard.net/powerpoint - This Introduction to Business Intelligence gives an overview of how Business Intelligence fits into business strategy in general. It does not go into the specific technologies of Business Intelligence. It is meant to be used to explain Business Intelligence to those not already familiar with Business Intelligence.
This presentation discusses how technology can be used to teach graphing equations to 9th grade algebra students. It provides examples of blogs, videos, websites and apps that teachers can use to enhance instruction and help students better understand graphing, including Khan Academy videos, Desmos graphing calculator, and interactive websites with lessons, worksheets and activities. The presentation evaluates the reliability of internet sources and properly cites the visuals used.
The Edge of Disaster Recovery - May Events Presentation (John Baumgarten)
Peak 10 provides disaster recovery services including disaster recovery as a service (DRaaS). Their approach involves replicating customer VMs to their Recovery Cloud using Zerto virtual replication appliances with recovery point objectives of seconds and recovery time objectives of minutes. Peak 10 manages the disaster recovery environment including ongoing monitoring, twice annual testing, and support for declaration events. Their DRaaS solution is hypervisor-agnostic, storage-agnostic, and can scale on demand.
PAETEC Disaster Recovery & Business Continuity Solutions (Mark Lawrence Peay)
The document discusses PAETEC's disaster recovery and network diversity solutions. It outlines critical BC/DR elements and assessing business risks. It then describes PAETEC's products that provide inherent and customized network diversity, including access diversity solutions, call rerouting solutions, MPLS/internet redundancy, and email scanning backup. Sample customer architectures are provided applying these solutions.
Next Generation Data Centre - IDC Infravision (Damian Hamilton)
A recent presentation made at IDC Infravision in Indonesia, discussing the emerging world of the hybrid DC: one that blends traditional DC components with cloud services.
Includes client scenarios, case studies, and thoughts around Active-Active DC, right-sourcing workloads in a Next Generation DC, and building a green DC facility.
The document discusses how social media can help students develop their careers in three key ways: building connections, gaining industry knowledge, and developing a strong personal brand. It proposes social media training sessions for students to teach networking, personal branding, professionalism, and how to use platforms like LinkedIn, Facebook, blogging and Twitter. The training would be led by the founder of an innovative career guidance startup and cover techniques for advancing careers through social media.
Juniper is introducing new networking products to extend their advantage in two-tier data center fabrics. The new EX4500, EX8200-40XS line card, and MX80 3D router simplify operations, increase application performance, and reduce costs compared to legacy multi-tier networks or competitors' solutions. Juniper's two-tier approach using Virtual Chassis technology creates a network fabric that lowers latency and enables dynamic cloud infrastructure.
Datacenter Transformation - Dion van der Arend (HPDutchWorld)
(1) Datacenters are facing increasing demands that many current facilities cannot meet, requiring transformation through consolidation, virtualization, and improved energy efficiency and availability.
(2) Datacenter designs are evolving from small, isolated IT islands to larger, standardized facilities with improved reliability, energy conservation, and reduced costs. Next-generation designs feature modular pods that can be deployed rapidly and offer high power densities up to 20kW/m2.
(3) As datacenter economics have changed, managing costs such as power and cooling have become priorities, driving the need for more energy-efficient computing and facility solutions.
Datacenter Transformation - Energy and Availability - Dion van der Arend (HPDutchWorld)
(1) Datacenters are facing increasing demands that many current facilities cannot meet, requiring transformation through consolidation, virtualization, and improved energy efficiency and availability.
(2) Datacenter designs are evolving from small, isolated IT islands to larger, standardized facilities with improved reliability through redundant critical systems and failover capabilities.
(3) Next generation datacenter designs focus on high power density, energy efficiency through technologies like containerization, and rapid deployment in multiple locations for business flexibility.
Dynamic IT for SAP - Fujitsu Siemens Computers Offers and Values for SAP Cust... (FSCitalia)
The document discusses a presentation by Paolo Satalic of Fujitsu Siemens Computers on Dynamic IT for SAP. It covers the partnership between SAP and FSC, the need to move from R/3 to enterprise SOA and adaptive computing. FSC offers FlexFrame for SAP and BladeFrame solutions to provide a dynamic IT infrastructure for SAP implementations.
Complimentary report on the current needs of CIOs BMAJCHER
Ahead of the Corporate IT Exchange 2012, we asked participants about the factors and main trends influencing their IT function, and what types of solutions providers could offer to help them deliver on their IT and business priorities. The results are presented in an easy-to-digest visual format.
Mario Derba, Country Manager of HP SW Solutions in Italy, discussed how technology is advancing rapidly and creating an environment where enterprises need to be able to instantly respond to opportunities and competition. Derba outlined how HP software can help enterprises by enabling application transformation, converged infrastructure, enterprise security, information optimization, and hybrid delivery models. Derba also highlighted HP's leadership position in the software market and new innovations that further simplify, automate, and secure IT operations for businesses.
This document summarizes a presentation given by Deltek about their software solutions for project-focused businesses. The key points are:
1) Deltek provides enterprise project management software for industries like government contracting, architecture/engineering, and consulting.
2) They help customers improve project visibility, resource optimization, and new business wins.
3) Deltek hosted a session where government and industry professionals identified the top 5 challenges for earned value practitioners as inconsistent processes, lack of management buy-in, integrating cost and schedule data, producing reliable reports manually, and different approaches across organizations.
4) Deltek's software aims to address these challenges through automated reporting, early warning indicators, and integrated data across key areas.
This document discusses developing an IT strategy in uncertain times and challenges of effective software delivery. It outlines three key challenges: complexity challenges due to more granular functionality and large projects/assets, team challenges due to dispersed teams, and process challenges due to the need for agility and market experimentation. It emphasizes balancing budget planning with strategic planning and gaining persistent commitment to maximizing value delivery over the long-term.
Archstone Consulting recommends targeting a company's IT service delivery model to reduce IT costs more effectively than solely focusing on technology assets. A robust IT service delivery model has four key components: governance, organization, operational processes, and performance management. Archstone's rapid assessment identifies improvement opportunities within 5-7 days through workshops and a maturity model analysis to understand gaps and savings potential. The assessment delivers a comparative spend analysis and recommendations.
The document discusses the emergence of cloud computing and HP's role in pioneering cloud computing technologies and services. It provides an overview of cloud computing concepts, HP's flexible computing services, and the open cloud computing research testbed being developed by HP, Intel, and Yahoo to advance cloud computing research. The testbed will provide a large-scale, global platform for researchers to experiment with data center management and cloud services technologies.
The document discusses ADP Dealer Services' implementation of a master data management (MDM) system using Oracle software. It summarizes their journey so far, which involved overcoming silos between different business units, establishing governance practices, and integrating data from sales, finance, and external sources. Their next steps are to expand MDM to additional parts of the business and further improve data quality through continuous processes and profiling during acquisitions. The presentation emphasizes that MDM is an ongoing program rather than a single project.
This document summarizes key trends and directions for IT executives in 2008 according to a research firm called iReach. It finds that CIO priorities in 2008 will be improving support for business projects and reducing operational costs. Telecom costs will remain a priority and IP telephony/VoIP deployments are expected to double. Enterprise software licenses are expected to grow again and managed services and outsourcing will continue growing substantially over 2007 levels as companies seek improved IT support and cost savings.
IT One Conf - IT Costs Chargeback 2008_04_22 (Laurent Remmy)
The document discusses best practices for managing and allocating IT costs through chargeback models. It recommends identifying all IT costs, dividing them into cost centers, and then allocating costs to business units based on chargeback keys tied to IT resource usage. A practical example is provided that showed how a company improved cost transparency and reduced IT expenditures by 37% by implementing these practices.
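The allocation approach described above (identify all IT costs, divide them into cost centers, then charge business units in proportion to usage-based chargeback keys) can be sketched in a few lines of Python. The cost centers, business units, and usage figures below are hypothetical, for illustration only:

```python
def allocate_costs(cost_centers, usage_by_unit):
    """Allocate each cost center's total to business units in
    proportion to their recorded usage (the 'chargeback key')."""
    charges = {unit: 0.0 for unit in usage_by_unit}
    for center, total_cost in cost_centers.items():
        total_usage = sum(u[center] for u in usage_by_unit.values())
        for unit, usage in usage_by_unit.items():
            charges[unit] += total_cost * usage[center] / total_usage
    return charges

# Hypothetical figures: two cost centers, three business units.
cost_centers = {"storage_tb": 120_000.0, "server_hours": 80_000.0}
usage = {
    "sales":   {"storage_tb": 10, "server_hours": 2_000},
    "finance": {"storage_tb": 30, "server_hours": 1_000},
    "ops":     {"storage_tb": 60, "server_hours": 5_000},
}
charges = allocate_costs(cost_centers, usage)
```

Every unit of every cost center is recovered, so the charges always sum to the total IT spend, which is what gives business units cost transparency.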
Nagios Conference 2011 - Christian Mies - German Health Insurance Company Ref... (Nagios)
Christian Mies' presentation on a reference story for a German Health Insurance Company. The presentation was given during the Nagios World Conference North America held Sept 27-29th, 2011 in Saint Paul, MN. For more information on the conference (including photos and videos), visit: http://go.nagios.com/nwcna
HCL Infosystems provided total outsourcing of IT services to a leading auto ancillary company in India to help modernize their fragmented IT infrastructure and systems. This included standardizing business processes, implementing an ERP system across all units, and providing infrastructure management and application support services. The goals were to improve operational efficiency, provide end-to-end business visibility, and reduce IT costs by 20% while achieving 99.99% service availability. HCL emerged as a strategic partner by aligning the client's IT strategy with their business needs for growth.
Learn how L&T–CASE engineers faster production with SAP and IBM, improving both business efficiency and the availability of its business-critical IT platform and offering better than 99.9 percent uptime. For more information on System x, visit http://ibm.co/Q7m3iQ.
Visit http://bit.ly/KWh5Dx to 'Follow' the official Twitter handle of IBM India Smarter Computing.
Sustainable IT for Energy Management: Approaches, Challenges, and Trends (Edward Curry)
An invited talk to the Galway-Mayo Institute of Technology on the current state of the art in Sustainable IT for energy management, the challenges, and the emerging trends.
Outsourcing: The Next Frontier in Editorial Workflow - Shivaji Sengupta (NXTKey Corporation)
This document summarizes a presentation about outsourcing editorial workflows. It discusses how HOV Services is a large outsourcing company that processes over 10 million transactions per month globally. It then discusses how editorial workflows have evolved from physical to digital processes. It proposes that outsourcing help desks and maintenance can provide benefits like reduced costs, better integration, and a single support strategy. Finally, it provides recommendations on best practices for outsourcing like defining service levels, skills-based routing, and treating the support operation like a call center.
HP is expanding its mission critical converged infrastructure with Project Odyssey. The project aims to modernize mission critical computing by bringing Integrity/HP-UX technology to x86 servers, extending HP's strategy. This will allow customers to do mission critical computing on their terms with a variety of operating systems and applications, providing flexibility and choice. Intel supports the project as continuing innovation in Itanium and Xeon will allow HP and Intel to deliver customer-driven mission critical solutions. Industry analysts and users have praised Project Odyssey as a smart move that promises gains for HP customers.
SAP in-Memory Computing technology enables real-time computing by bringing together online transaction processing applications and online analytical processing applications at a low total cost. It combines hardware and software innovations to replace traditional databases and provide performance improvements of up to 1000x with data compression of up to 10x. Examples of how it can be used include giving shop floor associates instant access to the same data as board members, enabling movie studios to respond immediately to consumer feedback, and allowing utilities companies to offer consumers incentives to reduce energy usage during peak periods.
Virtualization in the NGDC - Marc Janssen (HPDutchWorld)
HP DUTCHWORLD 2008 introduces HP Insight Dynamics - VSE, which allows organizations to treat physical and virtual servers in the same way by using "Logical Servers". Logical Servers are server profiles that contain resource requirements and can be instantiated on physical blades or as virtual machines. HP Insight Dynamics - VSE also provides capacity planning and workload optimization capabilities to reduce costs and energy usage.
PolyServe DB Consolidation Platform - Clemens Esser (HPDutchWorld)
HP's PolyServe platform allows consolidating multiple SQL Server instances onto a single physical server, or across multiple servers, for higher utilization and fault tolerance compared to virtualization. Key benefits include: (1) increasing SQL Server utilization from 5% to over 75%; (2) guaranteeing high availability for all instances; (3) reducing ongoing administration costs through features like one-click updates. PolyServe offers more efficient consolidation and management of SQL Server workloads than VMware by utilizing shared storage and enabling rapid instance failover between physical servers.
Next Generation Datacenter Oracle - Alan Hartwell (HPDutchWorld)
The document discusses next generation data center solutions from Oracle and HP. It highlights the need for businesses to have agile infrastructure that can quickly adapt to changing needs. Oracle and HP are introducing new products like the Exadata Storage Server and HP Oracle Database Machine that promise unprecedented performance, scalability, and availability for data warehousing. These solutions are optimized to handle the exponential growth of data and claim to be at least 10 times faster than conventional data warehouse deployments.
The document discusses HP's StorageWorks solutions for bridging the gap between data explosion and storage infrastructure. Some key points:
1. Data has become critical for businesses and is growing exponentially, posing challenges for storage.
2. HP StorageWorks provides integrated storage solutions including blades, extreme capacity systems, virtualized storage, and data protection/archiving to optimize storage infrastructure.
3. The solutions aim to make infrastructure change-ready, lower costs through features like thin provisioning and data reduction, and provide a trusted partner to businesses.
The document discusses Business Technology Optimization (BTO) software from HP that aims to align IT with business goals while reducing costs. BTO integrates solutions across IT strategy, applications, and operations to automate and standardize processes. This helps deliver measurable business outcomes, improve predictability and accountability of IT, and demonstrate IT's value. HP claims market leadership across the IT value chain with best-in-class products in categories like project management, application security, and asset management.
This document is an agenda for the HP Dutchworld 2008 event. The agenda outlines several presentations that will be given on networking topics such as next generation datacenter networking trends, wireless 802.11n solutions, and demonstrations of datacenter and wireless networking technologies and solutions. The event will also include sessions on unifying wired and wireless networks and HP's roadmap and technology management software demonstration.
Data Center Automation - Erwin Van Kruining (HPDutchWorld)
1) Data center infrastructure is growing exponentially but management costs are spiraling out of control due to the complexity and shortage of qualified talent.
2) HP Business Service Automation provides a comprehensive and integrated suite for automating the entire data center across networks, servers, storage, applications and business services.
3) It enables organizations to optimize operations, improve efficiency, ensure compliance and reduce costs through automated discovery, provisioning, patching, configuration and more.
The document summarizes security risks related to web applications and discusses how applications have become the main target of attacks. It notes that over 85% of scanned sites show vulnerabilities that can expose sensitive data and that costs of data breaches to enterprises can range from $90 to $305 per compromised record. The document advocates that application security needs to be addressed at the development stage rather than trying to bolt on security after applications are built.
Oracle - Next Generation Datacenter - Alan Hartwell (HPDutchWorld)
The document discusses next generation data center solutions from Oracle and HP. It highlights the need for businesses to have agile infrastructure that can quickly adapt to changing needs. Oracle and HP are introducing new products like the Exadata Storage Server and HP Oracle Database Machine that promise unprecedented performance, scalability, and availability for data warehousing. These solutions are optimized to handle the exponential growth of data and claim to be at least 10 times faster than conventional data warehouse deployments.
This talk will cover ScyllaDB Architecture from the cluster-level view and zoom in on data distribution and internal node architecture. In the process, we will learn the secret sauce used to get ScyllaDB's high availability and superior performance. We will also touch on the upcoming changes to ScyllaDB architecture, moving to strongly consistent metadata and tablets.
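As a rough sketch of the cluster-level data distribution the talk covers, the following Python models a token ring: each node owns many tokens on a hash ring, and a key's hash determines which node owns it. The node names, token counts, and MD5-based hashing are illustrative assumptions, not ScyllaDB's actual internals:

```python
import bisect
import hashlib

class TokenRing:
    """Toy hash ring: each node claims several tokens; a key belongs
    to the node owning the first token at or after hash(key)."""

    def __init__(self, nodes, tokens_per_node=8):
        self.ring = []  # sorted list of (token, node) pairs
        for node in nodes:
            for i in range(tokens_per_node):
                self.ring.append((self._hash(f"{node}:{i}"), node))
        self.ring.sort()
        self.tokens = [t for t, _ in self.ring]

    @staticmethod
    def _hash(key):
        # 64-bit token derived from MD5 (illustrative choice only)
        return int.from_bytes(hashlib.md5(key.encode()).digest()[:8], "big")

    def owner(self, key):
        # Wrap around the ring when hash(key) exceeds the last token.
        idx = bisect.bisect_left(self.tokens, self._hash(key)) % len(self.ring)
        return self.ring[idx][1]

ring = TokenRing(["node-a", "node-b", "node-c"])
```

Because each node holds many small token ranges, adding or removing a node moves only the ranges adjacent to its tokens rather than reshuffling all the data.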
Must-Know Postgres Extensions for DBAs and Developers During Migration (Mydbops)
Mydbops Opensource Database Meetup 16
Topic: Must-Know PostgreSQL Extensions for Developers and DBAs During Migration
Speaker: Deepak Mahto, Founder of DataCloudGaze Consulting
Date & Time: 8th June | 10 AM - 1 PM IST
Venue: Bangalore International Centre, Bangalore
Abstract: Discover how PostgreSQL extensions can be your secret weapon! This talk explores how key extensions enhance database capabilities and streamline the migration process for users moving from other relational databases like Oracle.
Key Takeaways:
* Learn about crucial extensions like oracle_fdw, pgtt, and pg_audit that ease migration complexities.
* Gain valuable strategies for implementing these extensions in PostgreSQL to achieve license freedom.
* Discover how these key extensions can empower both developers and DBAs during the migration process.
* Don't miss this chance to gain practical knowledge from an industry expert and stay updated on the latest open-source database trends.
Mydbops Managed Services specializes in taking the pain out of database management while optimizing performance. Since 2015, we have been providing top-notch support and assistance for the top three open-source databases: MySQL, MongoDB, and PostgreSQL.
Our team offers a wide range of services, including assistance, support, consulting, 24/7 operations, and expertise in all relevant technologies. We help organizations improve their database's performance, scalability, efficiency, and availability.
Contact us: info@mydbops.com
Visit: https://www.mydbops.com/
Follow us on LinkedIn: https://in.linkedin.com/company/mydbops
For more details and updates, please follow the links below.
Meetup Page : https://www.meetup.com/mydbops-databa...
Twitter: https://twitter.com/mydbopsofficial
Blogs: https://www.mydbops.com/blog/
Facebook(Meta): https://www.facebook.com/mydbops/
Monitoring and Managing Anomaly Detection on OpenShift (Tosin Akinosho)
Overview
Dive into the world of anomaly detection on edge devices with our comprehensive hands-on tutorial. This SlideShare presentation will guide you through the entire process, from data collection and model training to edge deployment and real-time monitoring. Perfect for those looking to implement robust anomaly detection systems on resource-constrained IoT/edge devices.
Key Topics Covered
1. Introduction to Anomaly Detection
- Understand the fundamentals of anomaly detection and its importance in identifying unusual behavior or failures in systems.
2. Understanding Edge (IoT)
- Learn about edge computing and IoT, and how they enable real-time data processing and decision-making at the source.
3. What is ArgoCD?
- Discover ArgoCD, a declarative, GitOps continuous delivery tool for Kubernetes, and its role in deploying applications on edge devices.
4. Deployment Using ArgoCD for Edge Devices
- Step-by-step guide on deploying anomaly detection models on edge devices using ArgoCD.
5. Introduction to Apache Kafka and S3
- Explore Apache Kafka for real-time data streaming and Amazon S3 for scalable storage solutions.
6. Viewing Kafka Messages in the Data Lake
- Learn how to view and analyze Kafka messages stored in a data lake for better insights.
7. What is Prometheus?
- Get to know Prometheus, an open-source monitoring and alerting toolkit, and its application in monitoring edge devices.
8. Monitoring Application Metrics with Prometheus
- Detailed instructions on setting up Prometheus to monitor the performance and health of your anomaly detection system.
9. What is Camel K?
- Introduction to Camel K, a lightweight integration framework built on Apache Camel, designed for Kubernetes.
10. Configuring Camel K Integrations for Data Pipelines
- Learn how to configure Camel K for seamless data pipeline integrations in your anomaly detection workflow.
11. What is a Jupyter Notebook?
- Overview of Jupyter Notebooks, an open-source web application for creating and sharing documents with live code, equations, visualizations, and narrative text.
12. Jupyter Notebooks with Code Examples
- Hands-on examples and code snippets in Jupyter Notebooks to help you implement and test anomaly detection models.
High-Performance Serverless Java on AWS - GoTo Amsterdam 2024 (Vadym Kazulkin)
Java has been one of the most popular programming languages for many years, but it used to have a hard time in the Serverless community. Java is known for its high cold start times and high memory footprint compared to other programming languages like Node.js and Python. In this talk I'll look at the general best practices and techniques we can use to decrease memory consumption and cold start times for Java Serverless development on AWS, including GraalVM (Native Image) and AWS's own offering SnapStart, based on Firecracker microVM snapshot-and-restore and CRaC (Coordinated Restore at Checkpoint) runtime hooks. I'll also provide a lot of benchmarking on Lambda functions, trying out various deployment package sizes, Lambda memory settings, Java compilation options, and HTTP (a)synchronous clients, and measure their impact on cold and warm start times.
"Scaling RAG Applications to serve millions of users", Kevin Goedecke (Fwdays)
How we managed to grow and scale a RAG application from zero to thousands of users in 7 months. Lessons from technical challenges around managing high load for LLMs, RAGs and Vector databases.
LF Energy Webinar: Carbon Data Specifications: Mechanisms to Improve Data Acc... (DanBrown980551)
This LF Energy webinar took place June 20, 2024. It featured:
-Alex Thornton, LF Energy
-Hallie Cramer, Google
-Daniel Roesler, UtilityAPI
-Henry Richardson, WattTime
In response to the urgency and scale required to effectively address climate change, open source solutions offer significant potential for driving innovation and progress. Currently, there is a growing demand for standardization and interoperability in energy data and modeling. Open source standards and specifications within the energy sector can also alleviate challenges associated with data fragmentation, transparency, and accessibility. At the same time, it is crucial to consider privacy and security concerns throughout the development of open source platforms.
This webinar will delve into the motivations behind establishing LF Energy’s Carbon Data Specification Consortium. It will provide an overview of the draft specifications and the ongoing progress made by the respective working groups.
Three primary specifications will be discussed:
-Discovery and client registration, emphasizing transparent processes and secure and private access
-Customer data, centering around customer tariffs, bills, energy usage, and full consumption disclosure
-Power systems data, focusing on grid data, inclusive of transmission and distribution networks, generation, intergrid power flows, and market settlement data
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
Main news related to the CCS TSI 2023 (2023/1695)Jakub Marek
An English 🇬🇧 translation of the presentation accompanying the speech I gave about the main changes brought by CCS TSI 2023, delivered at the biggest Czech conference on communications and signalling systems on railways, held at the Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). It was attended by around 500 participants and 200 online followers.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
Your One-Stop Shop for Python Success: Top 10 US Python Development Providers (akankshawande)
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-Efficiency (ScyllaDB)
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
Northern Engraving | Nameplate Manufacturing Process - 2024 (Northern Engraving)
Manufacturing custom quality metal nameplates and badges involves several standard operations. Processes include sheet prep, lithography, screening, coating, punch press and inspection. All decoration is completed in the flat sheet with adhesive and tooling operations following. The possibilities for creating unique durable nameplates are endless. How will you create your brand identity? We can help!
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/temporal-event-neural-networks-a-more-efficient-alternative-to-the-transformer-a-presentation-from-brainchip/
Chris Jones, Director of Product Management at BrainChip , presents the “Temporal Event Neural Networks: A More Efficient Alternative to the Transformer” tutorial at the May 2024 Embedded Vision Summit.
The expansion of AI services necessitates enhanced computational capabilities on edge devices. Temporal Event Neural Networks (TENNs), developed by BrainChip, represent a novel and highly efficient state-space network. TENNs demonstrate exceptional proficiency in handling multi-dimensional streaming data, facilitating advancements in object detection, action recognition, speech enhancement and language model/sequence generation. Through the utilization of polynomial-based continuous convolutions, TENNs streamline models, expedite training processes and significantly diminish memory requirements, achieving notable reductions of up to 50x in parameters and 5,000x in energy consumption compared to prevailing methodologies like transformers.
Integration with BrainChip’s Akida neuromorphic hardware IP further enhances TENNs’ capabilities, enabling the realization of highly capable, portable and passively cooled edge devices. This presentation delves into the technical innovations underlying TENNs, presents real-world benchmarks, and elucidates how this cutting-edge approach is positioned to revolutionize edge AI across diverse applications.
AppSec PNW: Android and iOS Application Security with MobSFAjin Abraham
Mobile Security Framework - MobSF is a free and open source automated mobile application security testing environment designed to help security engineers, researchers, developers, and penetration testers to identify security vulnerabilities, malicious behaviours and privacy concerns in mobile applications using static and dynamic analysis. It supports all the popular mobile application binaries and source code formats built for Android and iOS devices. In addition to automated security assessment, it also offers an interactive testing environment to build and execute scenario based test/fuzz cases against the application.
This talk covers:
Using MobSF for static analysis of mobile applications.
Interactive dynamic security assessment of Android and iOS applications.
Solving Mobile app CTF challenges.
Reverse engineering and runtime analysis of Mobile malware.
How to shift left and integrate MobSF/mobsfscan SAST and DAST in your build pipeline.
Skybuffer SAM4U tool for SAP license adoptionTatiana Kojar
Manage and optimize your license adoption and consumption with SAM4U, an SAP free customer software asset management tool.
SAM4U, an SAP complimentary software asset management tool for customers, delivers a detailed and well-structured overview of license inventory and usage with a user-friendly interface. We offer a hosted, cost-effective, and performance-optimized SAM4U setup in the Skybuffer Cloud environment. You retain ownership of the system and data, while we manage the ABAP 7.58 infrastructure, ensuring fixed Total Cost of Ownership (TCO) and exceptional services through the SAP Fiori interface.
In the realm of cybersecurity, offensive security practices act as a critical shield. By simulating real-world attacks in a controlled environment, these techniques expose vulnerabilities before malicious actors can exploit them. This proactive approach allows manufacturers to identify and fix weaknesses, significantly enhancing system security.
This presentation delves into the development of a system designed to mimic Galileo's Open Service signal using software-defined radio (SDR) technology. We'll begin with a foundational overview of both Global Navigation Satellite Systems (GNSS) and the intricacies of digital signal processing.
The presentation culminates in a live demonstration. We'll showcase the manipulation of Galileo's Open Service pilot signal, simulating an attack on various software and hardware systems. This practical demonstration serves to highlight the potential consequences of unaddressed vulnerabilities, emphasizing the importance of offensive security practices in safeguarding critical infrastructure.
2. The HP IT mission
• Provide good information to enable better business decisions
• Significantly reduce the cost of IT while delivering more to the business
• Lower risk to the enterprise with better control of the infrastructure
• Be a showcase for enterprise customers
11 December 2008
3. HP top 5 IT initiatives
• Data Centers
• Portfolio Management
• Enterprise Data Warehouse
• IT Workforce Effectiveness
• World-Class IT
4. HP understands: we’re in the new world of business technology
• IT as a line of business, NOT just a cost center
• IT execs with executive accountability, NOT just IT budgets
• Business initiatives, NOT just IT projects
• Business requirements, NOT just service level agreements
• Optimized infrastructure, NOT IT silos
• Business services, NOT just IT services
5. HP IT 2005: Large scale and scattered
• IT 4+% of revenue
• 100+ HP IT sites in 53 countries
• 1,240+ active applications
• 6,000 IT projects
• <50% of resources’ time dedicated to innovation
• 750+ data marts, 30% managed by IT
• ~19,000 IT professionals, including contingent workforce
• 85+ data centers in 29 countries
• Under-managed network
6. IT cost categories
Facilities Infrastructure (40%)
• Building & land (real estate)
• Utility bills (60% of cost)
• Power & cooling equipment
• Physical security
• $40M capital/yr (maint & capacity)
• REWS allocations
Application & Infrastructure (11%)
• Compute equipment
• Storage equipment
• Communications equipment
• 30-60% is dev/test
• Maintenance contracts & depreciation
People (Direct) (7%)
• Facility & security staff
• Capacity planners
• Installation & configuration
• On-site maintenance
• Direct DC labor: contingents & perms
People (In-Direct) (51%)
• Operations bridge
• Account mgmt
• Enterprise Architecture & PMO
• Application design
• Application development
• Application support
• Other deep tech support
• 3rd party providers
• In-direct labor: contingents & perms
Demand on contents (equipment & apps) drives spending in all other areas
10. Enterprise Architecture Efforts
[Diagram: HP Combined Model, positioning capabilities between Unification and Diversification. Business-unit-unique core processes (e.g. photo storage & sharing, digital media distribution, Halo room management, marketing collateral design & management, Internet print, new & emerging businesses) sit alongside common mandatory processes and common context.]
[Diagram dated 9 August 2007 (originally updated 4/20/00 by G. Robinson): enterprise architecture map of business data subject areas feeding MDM (master data) and the EDW (reference and analytical data), organized around the core processes 1.0 Create products/services through 12.0 Manage human resources. Subject areas include customers, products & services, suppliers, channel partners, inventory, sales orders, supply chain, manufacturing, marketing, finances, facilities, human resources, and reference data such as countries, currencies and languages.]
TAAP: retiring applications through FY08
• Identify planetary applications
• Collect New Dimension attributes to understand application overlaps
• Begin rationalization process across organizations
Post-FY08 retirements
• Complete investigations
• Recommend end-state planetary applications
• Communicate
TAAP viewpoint = target planetary apps vision; go-forward applications map to the proposed solution stack model and the planetary end-state apps, target standards, and DCC.
How should you determine what belongs within a Solution Stack?
What:
− Application capability
− Approved/preferred technologies
How:
− Enterprise services (assets)
− Implementation pattern
− Code examples
Where:
− Corporate guidelines, standards
− Hosting infrastructure
HP Confidential
11. HP data center transformation strategy
• Enable IT to be more nimble and provide better information
• Provide more dependable, simplified operations
• Enable faster delivery of new technologies, services, and information
• Accommodate growth
• Provide for improved business continuity
• Significantly reduce IT costs
12. HP data center transformation includes…
• Technology refresh
• Standardized technology environment
• Retirement of legacy applications
• Next-generation data center build out
− State-of-the-art infrastructure for today and tomorrow
− Automated monitoring and control
• HP Dynamic Smart Cooling
• Real business continuity/disaster recovery
13. HP data center locations
• Consolidating >85 global data centers to six in three U.S. geographical zones (Austin, Houston, Atlanta) chosen for:
− Proximity to major fiber optic backbones
− Access to multiple power grids
− Costs
• Total white space 400,000 sq. ft.
• Within each zone:
− 2 sites within 10-25 mile radius of each other
− Each site designed for high availability, disaster recovery and business continuity
[Map: Zone A = Austin (Sites 1-2), Zone B = Houston (Sites 3-4), Zone C = Atlanta (Sites 5-6)]
14. Data centers – detail locations
Distances: Austin to Houston 150 miles; Houston to Atlanta 717 miles; Austin to Atlanta 829 miles.
• Austin - Site 1: completed 11/06; 125KSF
• Austin - Site 2: completed 06/07; greenfield 50KSF
• Houston - Site 3: completed 06/06; 1,700 server addition (2600 Pinemeadow)
• Houston - Site 4: completed 05/07; greenfield 100KSF
• Atlanta - Site 5: completed 01/07; 50KSF raised floor
• Atlanta - Site 6: completed 07/07; greenfield 50KSF
KSF = 1,000 square feet
15. Data center business continuity and disaster recovery strategy
[Diagram: each application runs across the next-generation data centers in Zone A (Sites 1-2) and Zone B (Sites 3-4). Tier 1, Tier 2 and Tier 3 applications are each deployed as a mix of active and dark instances across the four sites, with dark instances held ready for failover.]
Availability • Business continuity • Reliability • Disaster recovery • Expandability • Agility
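The active/dark arrangement on the slide above can be illustrated with a small failover sketch. This is purely hypothetical code: the site names, zone mapping, and promotion logic are invented for illustration and are not HP's actual continuity tooling.

```python
# Hypothetical sketch of active/dark site failover across two zones.
# Site names and states are invented; they are not HP's configuration.

SITES = {
    "austin-1":  {"zone": "A", "state": "active"},
    "austin-2":  {"zone": "A", "state": "active"},
    "houston-3": {"zone": "B", "state": "dark"},
    "houston-4": {"zone": "B", "state": "dark"},
}

def serving_sites(healthy):
    """Return the sites that should serve traffic: all healthy active
    sites, or, if none remain, healthy dark sites promoted in their place."""
    active = [s for s, cfg in SITES.items()
              if cfg["state"] == "active" and s in healthy]
    if active:
        return active
    # Disaster recovery: light up the dark sites that survived.
    return [s for s, cfg in SITES.items()
            if cfg["state"] == "dark" and s in healthy]

# Normal operation: both active sites serve.
print(serving_sites({"austin-1", "austin-2", "houston-3", "houston-4"}))
# Zone A lost entirely: the dark Zone B sites are promoted.
print(serving_sites({"houston-3", "houston-4"}))
```

The per-tier differences on the slide would then be a matter of how many dark instances each tier pre-provisions, with Tier 1 keeping the most capacity ready.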
16. Building the data center of the future
Today’s data center (traditional: silo’d, dedicated infrastructure, monolithic computing) versus the next-generation data center (shared, automated, virtual, delivered as a service):
• App1/App2/App3 each on a dedicated server with shared storage becomes integrated, modular apps (SOA) on a shared, virtualized server pool with shared storage
Centralized, rigid practices:
• Proprietary focus (OS/architecture dependent)
• Data center is the sum of all projects
• Single vendor, hard-coded solution stack
• Siloed technology and skills
• Built primarily for intranet use
• “Static” production deployment
Technology integration:
• Islands of technologies
• Partial consolidation, dedicated server and/or application stacks
• Internet enabled
• Multi-OS, multi-architecture & multi-vendor environments
• Cost/complexity improved via infrastructure consolidation and technology standardization
Business integration:
• Service-centric IT: infrastructure, apps and IT delivered as a service
• Data center service catalog
• BU & application groups provide requirements, not solutions
• Global with ruthless standardization
• Policy-based automation, dynamic resource (re)allocation
• Modular, virtualized, power- and space-efficient hardware
• Availability and continuity integrated with the data center
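The "policy-based automation, dynamic resource (re)allocation" idea above can be sketched in a few lines. This is a hypothetical illustration, not HP's actual tooling; the pool size, application names, and proportional-allocation policy are invented for the example.

```python
# Minimal sketch of policy-based dynamic resource (re)allocation in a
# shared, virtualized server pool. All names and numbers are invented.

POOL_SIZE = 100  # total servers in the shared pool

def reallocate(demand):
    """Allocate pool servers proportionally to current demand, instead of
    dedicating fixed silos to each application."""
    total = sum(demand.values())
    if total == 0:
        return {app: 0 for app in demand}
    return {app: round(POOL_SIZE * load / total)
            for app, load in demand.items()}

# Morning: App1 dominates; evening: load shifts to App3.
print(reallocate({"App1": 60, "App2": 20, "App3": 20}))
print(reallocate({"App1": 10, "App2": 20, "App3": 70}))
```

The contrast with the "siloed" column is that no application owns servers permanently: the same pool is re-divided whenever demand changes.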
17. HP network 2008 architecture overview
[Diagram: regional HP sites connect over a global MPLS backbone to an inter-data-center beltway linking Austin, Houston and Atlanta. The beltway runs DWDM over HP fiber (RAIL) with any-site-to-any-site routing, MPLS ports at each data center, and VoIP with toll bypass.]
18. HP software broadly deployed
• Global NGDCs
− HP Business Availability Center: monitors HP-UX, Windows & Linux systems; >23,500 nodes; transactions for 1,600 new applications
− HP Asset Center: globally tracks physical & financial views of all client, server, storage & network devices
− HP Configuration Management: configuration management for PC & server software
• HP Development Environment
− HP Quality Center
− Portfolio Management (PPM)
• Neoview
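The availability monitoring described above is based on running synthetic transactions against managed nodes. The sketch below shows the general idea only; the node list, the latency threshold, and the check function are invented, and this is not HP Business Availability Center's actual API.

```python
# Illustrative sketch of synthetic-transaction monitoring: periodically
# run a scripted transaction against each node and record pass/fail.
# All node data and thresholds here are invented placeholders.

def check_transaction(node):
    """Hypothetical scripted business transaction; passes if the node
    responds within a 500 ms latency budget."""
    return node["responds"] and node["latency_ms"] < 500

def availability(nodes):
    """Fraction of nodes whose synthetic transaction succeeded."""
    ok = sum(1 for n in nodes if check_transaction(n))
    return ok / len(nodes)

nodes = [
    {"name": "srv-01", "responds": True,  "latency_ms": 120},
    {"name": "srv-02", "responds": True,  "latency_ms": 900},  # too slow
    {"name": "srv-03", "responds": False, "latency_ms": 0},    # down
]
print(f"availability: {availability(nodes):.0%}")
```

At the scale quoted on the slide (>23,500 nodes), the same loop would simply run against a much larger inventory on a schedule.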
19. Data center transformation – HP environmental benefits
• 2x available power per square foot, from an average of 60W to 120W+
• 60% reduction in annual energy consumption
• 65% reduction in energy costs
• CO2 emissions reduction equal to 900 km2 of forest storing carbon for 1 year
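The percentage claims above can be turned into concrete numbers with simple arithmetic. The baseline figures below (10 GWh/yr consumption, $0.10/kWh) are invented placeholders for illustration; only the 60% energy and 65% cost reductions come from the slide.

```python
# Worked example of the slide's percentage reductions.
# Baseline values are hypothetical; only the percentages are from the slide.

baseline_energy_gwh = 10.0                          # assumed annual consumption
baseline_cost = baseline_energy_gwh * 1e6 * 0.10    # kWh x assumed $/kWh

new_energy_gwh = baseline_energy_gwh * (1 - 0.60)   # 60% energy reduction
new_cost = baseline_cost * (1 - 0.65)               # 65% cost reduction

print(f"Energy: {baseline_energy_gwh:.1f} -> {new_energy_gwh:.1f} GWh/yr")
print(f"Cost:   ${baseline_cost:,.0f} -> ${new_cost:,.0f}")

# Power density doubling: 60 W/sq ft -> 120 W/sq ft
assert 60 * 2 == 120
```

Note that the cost reduction outpacing the energy reduction is consistent: newer facilities can also buy power at better rates, so cost falls faster than consumption.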
20. HP data center transformation: simplified infrastructure delivers more
Less:
• 30% fewer servers
• Decreased storage cost
• 50% lower networking cost
• Fewer sites
More:
• 80% more processing power
• Double the storage (all data replicated)
• Triple the bandwidth
• Faster application rollout
23. What is the Strategic Journey?
[Diagram: operating-model matrix with process integration (low to high) on the vertical axis and process standardization (low to high) on the horizontal axis: Diversified (low integration, low standardization), Co-ordinated (high integration, low standardization), Replicated (low integration, high standardization), Unified (high integration, high standardization).]
Many companies are currently diversified: they benefit from synergies between divisions that are free to pursue their own strategies, whilst not having fully standardised or integrated business processes across business units.
Is infrastructure transformation staying within a diversified operating model, or part of changing it?
Adapted from J. Ross, P. Weill, D. Robertson, Enterprise Architecture as Strategy: Creating a Foundation for Business Execution, HBS Press, 2006. Used with permission.
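The two-by-two operating-model matrix above reduces to a simple lookup on the two axes. The sketch below encodes the four quadrants of the Ross/Weill/Robertson model as the slide presents them; the boolean interface is an invented simplification of what are really continuous axes.

```python
# The operating-model matrix as a lookup: process integration x
# process standardization. The boolean inputs are a simplification.

def operating_model(integration_high, standardization_high):
    """Map the two axes of the matrix to its four quadrants."""
    if integration_high and standardization_high:
        return "Unified"
    if integration_high:
        return "Co-ordinated"
    if standardization_high:
        return "Replicated"
    return "Diversified"

# The slide notes most companies sit in the low/low quadrant today.
print(operating_model(False, False))  # Diversified
print(operating_model(True, True))    # Unified
```

Framed this way, the question the slide asks is whether infrastructure transformation leaves a company in the Diversified cell or moves it along one or both axes.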
24. Type 1# Transformation: Run a best-in-class IT infrastructure
• Shift the glass ceiling by seduction & nudging
Operating Model (Diversified): business processes are not standardised or integrated
Process (Defined): enterprise-wide standards for process definition
Application Services: shared and standardised application services and effective lifecycle management
Virtualized Services: virtualized, shared, SOA-compliant infrastructure; service processes fully defined and documented, with continuous monitoring
Infrastructure: shift the glass ceiling
25. Type 2# Transformation: Change the way business is run
Remove the glass ceiling:
• Top-to-bottom business & IT alignment, plus seduction & nudging
• All of Type 1#
Operating Model (Transformed): business processes are standardised or integrated
Process (Managed): enterprise-wide governance, single repository, management of variance, global owners
Applications (Simplified): global applications rolling out, reducing duplication; global teams, lifecycle management, transparent shared understanding of the landscape
Adaptive Shared Services: adaptive pooled automated infrastructure shared services, quality-based end-to-end service management, real-time supply and demand
26. Choosing the Transformation Type
Type #1 IT Transformation: Standardized & Optimized
• Operating Model (Diversified): business processes are not standardised or integrated
• Process (Defined): enterprise-wide standards for process definition
• Application Services: shared and standardised application services and effective lifecycle management
• Virtualized Services: virtualized, shared, SOA-compliant infrastructure; service processes fully defined and documented, with continuous monitoring
Type #2 Business Transformation: Lean & Mean, Aligned
• Operating Model (Transformed): business processes are standardised or integrated
• Process (Managed): enterprise-wide governance, single repository, management of variance, global owners
• Applications (Simplified): global applications rolling out, reducing duplication; global teams, lifecycle management, transparent shared understanding of the landscape
• Adaptive Shared Services: adaptive pooled automated infrastructure shared services, quality-based end-to-end service management, real-time supply and demand