Access to full webinar: http://goo.gl/dQjxRe
This webinar by Hortonworks, VHA, and Denodo provides information about the functionality and benefits of Hadoop in modern data architectures; how Hadoop, together with data virtualization, simplifies data management and enables faster data discovery; and what data virtualization can offer in big data projects. VHA explains how they deployed data virtualization and Hadoop together and presents their lessons learned and best practices for data lake and data virtualization deployment.
BioStorage Technologies Case Study: How to build an informatics platform usin...Denodo
Rick Hart, Director of Global Technology Solutions at BioStorage Technologies, Inc., presents a case study that will help you understand how BioStorage used data virtualization as a fast, flexible, and secure logical data warehouse to build a transformational and scalable informatics platform. This advanced technology solution supports the identification of the best biological samples for future clinical and translational research studies.
Access the webinar: http://goo.gl/p08pTz
These slides were presented in a webinar by Denodo in collaboration with BioStorage Technologies and Indiana Clinical and Translational Sciences Institute and Regenstrief Institute.
BioStorage Technologies, Inc., Indiana Clinical and Translational Sciences Institute, and Regenstrief Institute (CTSI) have joined Denodo to talk about the important role of technological advancements, such as data virtualization, in advancing biospecimen research.
By watching this webinar, you can gain insight into best practices around the integration of biospecimen and research data as well as technology solutions that provide consolidated views and rapid conversions of this data into valuable business insights. You will also learn how data virtualization can assist with the integration of data residing in heterogeneous repositories and can securely deliver aggregated data in real-time.
BIG Data & Hadoop Applications in HealthcareSkillspeed
Explore the applications of BIG Data & Hadoop in Healthcare via Skillspeed.
BIG Data & Hadoop in Healthcare is a key differentiator, especially in terms of providing superior patient care. They are used to optimize clinical trials, detect diseases, and boost healthcare profitability.
To get more details regarding BIG Data & Hadoop, please visit - www.SkillSpeed.com
Big Data Analytics for Healthcare Decision Support- Operational and ClinicalAdrish Sannyasi
Splunk’s data analytics platform could be utilized to solve many high-impact business problems in healthcare delivery systems: reducing cost, improving patient outcomes and safety, and enhancing the care coordination experience. Healthcare event data and metadata can be analyzed to discover patterns, monitor compliance, and optimize workflows. Furthermore, 80% of healthcare data is unstructured (clinical free text and documentation) or semi-structured, and many new data sources, such as telehealth, mobile health, sensors, and devices, are being integrated into healthcare systems, particularly in the area of chronic disease management. What is needed, then, is analytics software that can harvest, interpret, enrich, normalize, and model diverse structured and unstructured data, together with analytics approaches that embrace the “data turmoil” by relying less on standardized data items and more on the capability to process data in any format.
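The harvest-and-normalize requirement above can be sketched in miniature. This is a hypothetical example, not Splunk's API: two records of the same blood-pressure event, one structured and one semi-structured, are mapped onto a single common schema so downstream analytics need not care about the source format. All field names are invented.

```python
import json

# Hypothetical raw events from two sources: a structured CSV-style record
# and a semi-structured JSON payload from a telehealth device.
csv_row = {"patient_id": "P001", "event": "BP_READING", "value": "120/80"}
device_json = '{"pid": "P001", "type": "bp", "systolic": 120, "diastolic": 80}'

def normalize_csv(row):
    """Map a structured row onto a common event schema."""
    systolic, diastolic = (int(x) for x in row["value"].split("/"))
    return {"patient_id": row["patient_id"], "kind": "blood_pressure",
            "systolic": systolic, "diastolic": diastolic}

def normalize_device(payload):
    """Map a semi-structured device payload onto the same schema."""
    d = json.loads(payload)
    return {"patient_id": d["pid"], "kind": "blood_pressure",
            "systolic": d["systolic"], "diastolic": d["diastolic"]}

# Both sources now share one shape, so downstream analytics can
# treat them uniformly regardless of the original format.
events = [normalize_csv(csv_row), normalize_device(device_json)]
print(events[0] == events[1])  # True
```

Real platforms apply the same idea at scale: per-source adapters emit records in one canonical shape, and everything downstream consumes only that shape.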
The Hive Data Virtualization Introduction - Sanjay Krishnamurti, Chief Archit...The Hive
Talk by Sanjay Krishnamurthi, Chief Architect of Informatica at The Hive Panel Discussion "Data Virtualization: Beyond Traditional Data Integration" on May 28, 2013.
This presentation is a reflection of my vision of how Big Data impacts healthcare and the efforts that Oracle and VX Healthcare Analytics put into making Big Data work in the patient profiling space.
CTO Perspectives: What's Next for Data Management and Healthcare?Health Catalyst
Health Catalyst's Chief Technology Officer, Bryan Hinton, shares his perspective, thoughts, and insights on new and emerging trends for data management in healthcare. Bryan offers a brief presentation on what hospitals and healthcare systems can expect, followed by an extended Q&A.
d-Wise can significantly reduce data integration efforts within pharmaceutical clinical operations through the implementation of SAS's Clinical Data Integration software. The discussion covers the opportunity, the challenges, data quality challenges, integration features, and the benefits of SDTM implementation.
An overview of d-Wise Technologies and their core competencies for building clinical and healthcare systems. Technology expertise in SAS, entimo, and Oracle solutions; significant thought leadership on the implementation of clinical data standards.
Combining Patient Records, Genomic Data and Environmental Data to Enable Tran...Perficient, Inc.
The average academic research organization (ARO) and hospital has many systems that house patient-related information, such as patient records and genomic data. Combining data from a variety of sources in an ongoing manner can enable complex and meaningful querying, reporting and analysis for the purposes of improving patient safety and care, boosting operational efficiency, and supporting personalized medicine initiatives.
In this webinar, Perficient’s Mike Grossman, a director of clinical data warehousing and analytics, and Martin Sizemore, a healthcare strategist, discussed:
-How AROs and hospitals can benefit from a systematic approach to combining data from diverse systems and utilizing a suite of data extraction, reporting, and analytical tools, in order to support a wide variety of needs and requests
-Examples of proposed solutions to real-life challenges AROs and hospitals often encounter
Baptist Health: Solving Healthcare Problems with Big DataMapR Technologies
Editor’s Note: Download the complimentary MapR Guide to Big Data in Healthcare for more information: https://mapr.com/mapr-guide-big-data-healthcare/
There is no better example of the important role that data plays in our lives than in matters of our health and our healthcare. There’s a growing wealth of health-related data out there, and it’s playing an increasing role in improving patient care, population health, and healthcare economics.
Join this webinar to hear how Baptist Health is using big data and advanced analytics to address a myriad of healthcare challenges—from patient to payer—through their consumer-centric approach.
MapR Technologies will cover broader big data healthcare trends and production use cases that demonstrate how to converge data and compute power to deliver data-driven healthcare applications.
How to Load Data More Quickly and Accurately into Oracle's Life Sciences Data...Perficient, Inc.
Sponsors and CROs know the value of having a consolidated and regulatory-compliant data warehouse, such as Oracle’s Life Sciences Data Hub (LSH), as well as the importance of consistently loading data into that warehouse quickly and accurately.
However, as data structures from the source files change over time, it can be very time consuming to modify the data structure in the warehouse itself. Additionally, for the large groups of SAS datasets that are typical for a clinical trial, the out-of-the-box load times can be quite long, as the data is loaded one set at a time.
Perficient has the answer. In this webinar, we discussed and demonstrated an autoloader tool that greatly simplifies the data loading process for LSH. We showed how the autoloader can automatically load files, detect metadata changes, upgrade target structures, and load data, all with no human intervention. In addition, we demonstrated how Perficient’s autoloader tool can load multiple datasets in parallel to minimize load times.
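The parallel-loading idea can be illustrated with a generic sketch. This is not Perficient's tool or the LSH API; `load_dataset` is an invented stand-in that simulates an I/O-bound load, run first serially and then through a thread pool.

```python
import concurrent.futures
import time

# Hypothetical stand-in for loading one SAS dataset into the warehouse;
# a real LSH load would go through its own interfaces, not this function.
def load_dataset(name):
    time.sleep(0.1)  # simulate I/O-bound load time
    return f"{name}: loaded"

datasets = [f"trial_ds_{i}" for i in range(8)]

# Serial loading: one dataset at a time, as in the out-of-the-box flow.
start = time.perf_counter()
serial = [load_dataset(d) for d in datasets]
serial_time = time.perf_counter() - start

# Parallel loading: overlap the I/O waits with a thread pool.
start = time.perf_counter()
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(load_dataset, datasets))
parallel_time = time.perf_counter() - start

print(serial == parallel)           # same results either way
print(parallel_time < serial_time)  # parallelism cuts wall-clock time
```

Because dataset loads are dominated by I/O waits rather than CPU work, overlapping them in a pool shortens total load time roughly in proportion to the worker count.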
This presentation looks at the role of Big Data in healthcare. Healthcare is a big spending area for both the private and public sectors, so it is important to look at ways to improve the delivery of patient care.
Building an Intelligent Biobank to Power Research Decision-MakingDenodo
This presentation is from the workshop "Building an Intelligent Biobank to Power Research Decision-Making" at the ISBER 2015 Annual Meeting, by Lori A. Ball (Chief Operating Officer and President of Integrated Client Solutions at BioStorage Technologies, Inc.), Brian Brunner (Senior Manager, Clinical Practice at LabAnswer), and Suresh Chandrasekaran (Senior Vice President at Denodo).
The workshop covers three topic areas:
- Research sample intelligence: the growing need for Global Data Integration (Biobank Sample and Data Stakeholders).
- Building a research data integration plan and cloud sourcing strategy (data integration).
- How data virtualization works and the value it delivers (a data virtualization introduction, solution portfolio, and current customers in the Life Sciences industry).
The biomedical R&D environment is increasingly dependent on data meta-analysis and bioinformatics to support research advancements. The integration of biorepository sample inventory data with biomarker and clinical research information has become a priority to R&D organizations. Therefore, a flexible IT system for managing sample collections, integrating sample data with clinical data and providing a data virtualization platform will enable the advancement of research studies. This workshop provides an overview of how sample data integration, virtualization and analytics can lead to more streamlined and unified sample intelligence to support global biobanking for future research.
Data-Driven is Passé: Transform Into An Insights-Driven EnterpriseDenodo
Watch the full webinar: http://goo.gl/c5rlCM
Speakers: Holger Kisker, Ph.D., Vice President and Research Director at Forrester Research Inc.
Listen to Holger Kisker, Vice President and Research Director at Forrester Research Inc., describe the three-step plan for organizations to become insights-driven rather than data-driven enterprises. Adopting systems of insight and embedding them into your organization’s systems of engagement, record, and automation allows you to turn data into action. As a final step, data virtualization can help keep all systems in sync, serving as a key enabler for systems of insight.
(HLS305) Transforming Cancer Treatment: Integrating Data to Deliver on the Pr...Amazon Web Services
In the past ten years, the cost of sequencing a human genome has fallen from $3 billion to $1,000, unlocking the ability for clinicians to use genomics in routine care. As the volume of genomic data used in the clinic begins to grow, healthcare providers are facing a number of new IT challenges, such as how to integrate this data with clinical data stored in electronic medical records, and how to make both available in real time to inform clinical decisions. In this session, find out how UCSF Medical Center and Syapse met these challenges head-on and solved them using AWS, all while remaining compliant with privacy and security requirements. Learn how Syapse's precision medicine platform uses Amazon VPC, Dedicated Instances, Amazon EC2, and Amazon EBS to build a high performance, scalable, and HIPAA-compliant data platform that enables UCSF to deliver on the promise of precision medicine by dramatically reducing time and increasing the accuracy and utility of genomic profiling in cancer treatment.
Challenges in Clinical Research: Aridhia's Disruptive Technology Approach to ...Aridhia Informatics Ltd
This webinar with our partner Pivotal aired in July 2016.
The increasing sophistication of modern medicine, a seemingly endless supply of data, and the ability to perform large-scale computation are transforming clinical research. However, utilising data to generate new treatments and therapies has continued to prove complicated. The silo-based information systems built over the last 30 years are simply unable to scale to support today’s use cases.
Aridhia, creators of AnalytiXagility, the ground-breaking research and healthcare data analysis platform, is now enabling its customers to rapidly analyse massive amounts of data in meaningful ways to change how diseases are understood, managed and treated. Powered by Pivotal Greenplum, AnalytiXagility is at the forefront of Advanced Clinical Research Information Systems (ACRIS), one of Gartner’s 10 “Transformational Digital Disruptors in Healthcare by 2025”.
Learn how big data and data science are being applied to clinical research and:
• Why research-oriented healthcare delivery organizations and academic medical centers need an ACRIS
• How improving collaboration and productivity accelerates the discovery of insights and increases competitiveness
• Why robust data security is critical to modernizing engagement between academia, industry and healthcare
• How to reduce research costs while improving commercialization opportunities
• Why enabling transparent analysis and reproducibility of research are key to scientific progress
• Best practices to get started on your digital transformation and Big Data journey
Hadoop and Data Virtualization - A Case Study by VHAHortonworks
VHA (Voluntary Hospitals of America) is the largest member-owned health care company in the US, delivering industry-leading supply chain management and clinical improvement services to its members. At VHA, product, supplier, and member information is siloed across multiple sources. VHA sees value in consolidating the disparate data into a data lake, supported by the Hortonworks Data Platform, to enable business users to discover related data and provide services to their members. Because of their previous success with data virtualization, powered by Denodo, VHA decided to use data virtualization to let business users discover data using familiar SQL, thus abstracting their direct access to Hadoop.
During this webinar, you will learn:
- The role, use, and benefits of Hadoop in the Modern Data Architecture.
- How Hadoop and data virtualization simplified data management and enabled faster data discovery.
- What data virtualization is and how it can simplify big data projects.
- Lessons learned and best practices for deploying a data lake and data virtualization.
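The abstraction VHA describes, in which business users issue familiar SQL against a virtual layer while the physical sources stay hidden, can be sketched with a toy federation. Here sqlite3 stands in for the virtualization engine, and all table and column names are invented; in a real deployment the underlying tables could live in Hadoop or an RDBMS.

```python
import sqlite3

# sqlite3 stands in for the virtualization engine; names are invented.
conn = sqlite3.connect(":memory:")

# Two "silos": product data and supplier data. Here they are plain tables,
# but in a real deployment each could live in a different physical system.
conn.execute("CREATE TABLE products (id INTEGER, name TEXT, supplier_id INTEGER)")
conn.execute("CREATE TABLE suppliers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO products VALUES (?, ?, ?)",
                 [(1, "gauze", 10), (2, "syringe", 20)])
conn.executemany("INSERT INTO suppliers VALUES (?, ?)",
                 [(10, "Acme Medical"), (20, "MedSupply Co")])

# A "virtual view" joining the silos: business users query this with
# familiar SQL, never touching the underlying sources directly.
conn.execute("""CREATE VIEW product_catalog AS
                SELECT p.name AS product, s.name AS supplier
                FROM products p JOIN suppliers s ON p.supplier_id = s.id""")

rows = conn.execute(
    "SELECT product, supplier FROM product_catalog ORDER BY product").fetchall()
print(rows)  # [('gauze', 'Acme Medical'), ('syringe', 'MedSupply Co')]
```

The design point is that the view, not the physical table, is the contract with users: the engine can relocate or re-partition the underlying data without changing any query written against `product_catalog`.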
The Top Seven Quick Wins You Get with a Healthcare Data WarehouseHealth Catalyst
In an industry known for its complex challenges that can take years to overcome, health systems can leverage healthcare data warehouses to generate seven quick wins—reporting and analytics efficiencies that empower healthcare organizations to thrive in a value-based world:
Provides significantly faster access to data.
Improves data-driven decision making.
Enables a data-driven culture.
Provides world-class report automation.
Significantly improves data quality and accuracy.
Provides significantly faster product implementation.
Improves data categorization and organization.
Health systems that leverage healthcare data warehouses position themselves to do more than just survive the transition to value-based care; they empower themselves to achieve and sustain long-term outcomes improvement by enabling data-driven decision making based on high quality data.
Healthcare Interoperability: New Tactics and TechnologyHealth Catalyst
Every provider agrees on the need for healthcare interoperability to achieve clinical data insights at the point of care. The question is how to get there from the myriad technologies and the volumes of data that comprise electronic medical records. It has been difficult to organize efforts among participants that have had little incentive to cooperate. And standards for sending and receiving data have been slow to develop. This is changing, but the key components that are still vital to realizing insights are closed-loop analytics and its accompanying tools, an enterprise data warehouse and analytics applications. This article defines the problems and explores the solutions to optimizing clinical decision making where it’s needed most.
How Northwestern Medicine is Leveraging Epic to Enable Value-Based CarePerficient, Inc.
Value-based care and payment reform are prompting hospitals and healthcare providers to more closely manage population health. Hospitals and health systems rely on technology and data to outline the characteristics of their population and identify high-risk patients in order to manage chronic diseases and deliver enhanced preventative care.
Our webinar covered how Cadence Health, now part of Northwestern Medicine, is leveraging the native capabilities of Epic to manage their population health initiatives and value-based care relationships across the continuum of care.
Our speakers:
-Analyzed how Epic’s Healthy Planet and Cogito platforms can be used to manage value-based care initiatives.
-Examined the three steps for effective population health management: Collect data, analyze data and engage with patients.
-Covered how access to analytics allows physicians at Northwestern Medicine to deliver enhanced preventive care and better manage chronic diseases.
-Discussed Northwestern Medicine’s strategy to integrate data from Epic and other data sources.
How to Choose the Best Healthcare Analytics Software Solution in a Crowded Ma...Health Catalyst
There’s a new trend in the healthcare industry to adopt analytics software solutions to help organizations achieve clinical and financial success. Because of the high demand for analytics, there are many players touting their ability to deliver comprehensive solutions. With so many options available, health systems need to be able to cut through the marketing hype to find tools that provide the best value for their needs. Key solutions include an enterprise data warehouse and analytics software applications (from foundational to discovery to advanced). Other considerations include the organization’s readiness for cultural change, the total cost of ownership required, and the viability of the company providing the technology.
Why Your Healthcare Business Intelligence Strategy Can't WinHealth Catalyst
Business intelligence may hold tremendous promise, but it can’t answer healthcare’s challenges unless it’s built on the solid foundation of a clinical data warehouse. Learn the definition of business intelligence, why a clinical data warehouse is needed for any healthcare BI strategy, the various options in data warehousing, and which one is most effective for hospitals and the industry, and why.
Emerging Standards and the Disruption of HIE 1.0Jitin Asnaani
Emerging standards in health information exchange, driven by the ONC and others, are going to change what health IT customers (hospitals, physicians, labs, etc) are going to pay for. This is an overview of those new standards, and my perspective on the implications for health technology companies, particularly EHR and HIE vendors.
The HEALTHieR Cloud is the world's first Global Electronic Healthcare Repository (GEHR). We believe that full adoption may reduce R&D costs by 90%. Researchers, Hospitals, Providers, and Patients can access the system to view patient information.
Steve Rayner
(16/10/08, DHB CIO Open Forum)
The videos at the end of the presentation are on YouTube:
www.youtube.com/watch?v=WcU4t6zRAKg
www.youtube.com/watch?v=ANHNTi9vsNk
www.youtube.com/watch?v=sgugwvHpK7c
As population health management goes mainstream, providers need robust, integrated software solutions to aggregate and analyze data, coordinate care, engage patients and clinicians, and provide full administrative and financial functionality. Population Health Management is a journey, and the number of approaches to population health are varied.
"Health Information Exchange in Oregon – Where We Are & Where We Are Going"
Moderator: Eric McLaughlin, Project Manager, Cognosante
Abigail Sears, Chief Executive Officer, OCHIN
Sharon Wentz, RN, Business Development Coordinator, CareAccord
Laurie Miller, RHIT, CCS-P, HISP Administrator, Gorge Health Connect
Paula Weldon, Project Manager, Jefferson Health Information Exchange
Pairing HIE Data with an Analytics Platform: Four Key Improvement CategoriesHealth Catalyst
Population health and value-based payment demand data from multiple sources and multiple organizations. Health systems must access information from across the continuum of care to accurately understand their patients’ healthcare needs beyond the acute-care setting (e.g., reports and results from primary care and specialists). While health system EHRs have a wealth of big-picture data about healthcare delivery (e.g., patient satisfaction, cost, and outcomes), HIEs add the clinical data (e.g., records and transactions) to round out the bigger picture of patient care, as well as the data sharing capabilities needed to disseminate the information.
By pairing HIE capability with an advanced analytics platform, a health system can leverage data to improve processes in four important outcomes improvement areas:
- Workflow
- Machine learning
- Professional services
- Data governance
Enterprise Monitoring and Auditing in DenodoDenodo
Watch full webinar here: https://buff.ly/3P3l4oK
Proper monitoring of an enterprise system is critical to understanding its capacity and growth, anticipating potential issues, and even understanding key ROI metrics. This also facilitates the implementation of policies and user access audits which are key to optimizing the resource utilization in an organization. Do you want to learn more about the new Denodo features for monitoring, auditing, and visualizing enterprise monitoring data?
Join us for this session with Vijayalakshmi Mani, Data Engineer at Denodo, to understand how the new features and components help you monitor your Denodo servers and their resource utilization, and how to get the most out of the logs that the Denodo Platform generates, including FinOps information.
Watch on-demand and Learn:
- What is a Denodo Monitor and what’s new in it?
- How to visualize the Denodo Monitor Information and use of Diagnostics & Monitoring Tool
- Introduction to the new Denodo Dashboard
- Demonstration on the Denodo Dashboard
Lunch and Learn ANZ: Mastering Cloud Data Cost Control: A FinOps ApproachDenodo
Watch full webinar here: https://buff.ly/4bYOOgb
With the rise of cloud-first initiatives and pay-per-use systems, forecasting IT costs has become a challenge. It's easy to start small, but it's equally easy to get skyrocketing bills with little warning. FinOps is a discipline that tries to tackle these issues, by providing the framework to understand and optimize cloud costs in a more controlled manner. The Denodo Platform, being a middleware layer in charge of global data delivery, sits in a privileged position not only to help us understand where costs are coming from, but also to take action, manage, and reduce them.
Attend this session to learn:
- The importance of FinOps in a cloud architecture.
- How the Denodo Platform can help you collect and visualize key FinOps metrics to understand where your costs are coming from.
- What actions and controls the Denodo Platform offers to keep costs at bay.
Achieving Self-Service Analytics with a Governed Data Services LayerDenodo
Watch full webinar here: https://buff.ly/3wBhxYb
In an increasingly distributed and complex data landscape, it is becoming ever more difficult to govern and secure data effectively throughout the enterprise. Whether it is securing data across different repositories or monitoring access across different business units, the proliferation of data technologies and repositories, both on-premises and in the cloud, is making the task nearly unattainable. The challenge is only made greater by the ongoing pressure to offer self-service data access to business users.
Watch on-demand and learn:
- How to use a logical data fabric to build an enterprise-wide data access role model.
- Centralise security when data is spread across multiple systems residing both on-premises and in the cloud.
- Control and audit data access across different regions.
What you need to know about Generative AI and Data Management?Denodo
Watch full webinar here: https://buff.ly/3UXy0A2
It should be no surprise that Generative AI will have a profound impact on data management in the years to come. As in other areas of the technology sector, the opportunities presented by GenAI will accelerate our efforts across all aspects of data management, including self-service, automation, data governance, and security. On the other hand, it is also becoming clear that to unleash the true potential of AI assistants powered by GenAI, we need novel implementation strategies and a reimagined data architecture. This presents an exhilarating yet challenging future, demanding innovative thinking and methodologies in data management.
Join us on this webinar to learn about:
- The opportunities and challenges presented by GenAI today.
- Exploiting GenAI to democratize data management.
- How to augment GenAI applications with corporate data and knowledge.
- How to get started.
Mastering Data Compliance in a Dynamic Business LandscapeDenodo
Watch full webinar here: https://buff.ly/48rpLQ3
Join us for an enlightening webinar, "Mastering Data Compliance in a Dynamic Business Landscape," presented by Denodo Technologies and W5 Consulting. This session is tailored for business leaders and decision-makers who are navigating the complexities of data compliance in an ever-evolving business environment.
This webinar will focus on why data compliance is crucial for your business. Discover how to turn compliance into a competitive advantage, enhancing operational efficiency and market trust. We'll also address the risks of non-compliance, including financial penalties and the loss of customer trust, and provide strategies to proactively overcome these challenges.
Key Takeaways:
- How can your business leverage data management practices to stay agile and compliant in a rapidly changing regulatory landscape?
- Keys to balancing data accessibility with security and privacy in today's data-driven environment.
- What are the common pitfalls in achieving compliance with regulations like GDPR, CCPA, and HIPAA, and how can your business avoid them?
We will go beyond the technical aspects and delve into how you can strategically position your organization in the realm of data management and compliance. Learn how to craft a data compliance strategy that aligns with your business goals, enhances operational efficiency, and builds stakeholder trust.
Denodo Partner Connect: Business Value Demo with Denodo Demo LiteDenodo
Watch full webinar here: https://buff.ly/3OCQvGk
In this session, Denodo Sales Engineer, Yik Chuan Tan, will guide you through the art of delivering a compelling demo of the Denodo Platform with Denodo Demo Lite. Watch to uncover the significant functionalities that set Denodo apart and learn how to effectively win over potential customers.
In this session, we will cover:
Understanding the Denodo Platform & Tailoring Your Demo to Prospect Needs: By gaining a comprehensive understanding of the Denodo Platform, its architecture, and how it addresses data management challenges, you can customize your demo to align with the specific needs and pain points of your prospects, including:
- seamless data integration with real-time access
- data security and governance
- self-service data discovery
- advanced analytics and reporting
- performance optimization, scalability, and deployment
Watch this Denodo demo session and acquire the skills and knowledge necessary to captivate your prospects. Whether you're a seasoned technical professional or new to the field, this session will equip you with the skills to deliver compelling demos that lead to successful conversions.
Expert Panel: Overcoming Challenges with Distributed Data to Maximize Busines...Denodo
Watch full webinar here: https://buff.ly/3wdI1il
As organizations compete in new markets and new channels, business data requirements include new data platforms and applications. Migration to the cloud typically adds more distributed data when operations set up their own data platforms. This spreads important data across on-premises and cloud-based data platforms. As a result, data silos proliferate and become difficult to access, integrate, manage, and govern. Many organizations are using cloud data platforms to consolidate data, but distributed environments are unlikely to go away.
Organizations need holistic data strategies for unifying distributed data environments to improve data access and data governance, optimize costs and performance, and take advantage of modern technologies as they arrive. This TDWI Expert Panel will focus on overcoming challenges with distributed data to maximize business value.
Key topics this panel will address include:
- Developing the right strategy for your use cases and workloads in distributed data environments, such as data fabrics, data virtualization, and data mesh
- Deciding whether to consolidate data silos or bridge them with distributed data technologies
- Enabling easier self-service access and analytics across a distributed data environment
- Maximizing the value of data catalogs and other data intelligence technologies for distributed data environments
- Monitoring and data observability for spotting problems and ensuring business satisfaction
Watch full webinar here: https://buff.ly/3UE5K5l
The ability to recognize and flag sensitive information within corporate datasets is essential for compliance with emerging privacy laws, for completing a privacy impact assessment (PIA) or data subject access request (DSAR), and also for cyber-insurance compliance. During this session, we will discuss data privacy laws, the challenges they present, and how they can be applied with modern tools.
Join us for the session driven by Mark Rowan, CEO at Data Sentinel, and Bhavita Jaiswal, SE at Denodo, who will show how a data classification engine augments Data Catalog to support data governance and compliance objectives.
Watch on-demand & Learn:
- Changing landscape of data privacy laws and compliance requirements
- How to create a data classification framework
- How Data Sentinel classifies data and how this can be integrated into Denodo
- Using the enhanced data classifications via consuming tools such as Data Catalog and Power BI
An Introduction to Data Virtualization for Data ProfessionalsDenodo
Watch full webinar here: https://buff.ly/3OETC08
According to the analyst firm Gartner, "by 2022, 60% of enterprises will include data virtualization as a key data delivery method in their integration architecture." Gartner also named Denodo a Leader in the 2020 Magic Quadrant for Data Integration Tools.
In this 1.5-hour session, you will learn how data virtualization is revolutionizing the business and IT approach to accessing, delivering, consuming, managing, and protecting data, regardless of the age of your technology or the format and location of your data. This mature technology bridges the gap between IT and business users and delivers significant savings in cost and time.
**FORMAT
An online seminar lasting 1 hour 30 minutes.
Thanks to the recording, you can complete the exercises at your own pace.
**WHO IS THIS SEMINAR FOR?
IT managers / architects
Data scientists / analysts
CDOs
**CONTENT
The program includes an introduction to the essence of data virtualization, use cases, real customer case studies, and a demonstration of the capabilities of the Denodo Platform:
Integrate and deliver data quickly and easily with Denodo Platform 8.0
The Denodo query optimizer delivers data in real time, on demand, even for very large datasets
Expose data as "data services" for consumption by different users and tools
Data Catalog: Discover and document data with our Data Catalog, a workspace for self-service data access
Data virtualization plays a key role in governing and securing data across your organization
**AGENDA
Introduction to data virtualization
Use cases and customer case studies
Architecture: governance and security
Performance
Demo
Next steps: how to test and deploy the platform yourself
Interactive Q&A session
Data Democratization: A Secret Sauce to Say Goodbye to Data FragmentationDenodo
Watch full webinar here: https://buff.ly/41Zf31D
Despite recent and evolving technological advances, the vast amounts of data that exist in a typical enterprise is not always available to all stakeholders when they need it. In modern enterprises, there are broad sets of users, with varying levels of skill sets, who strive to make data-driven decisions daily but struggle to gain access to the data needed in a timely manner.
Join our webinar to learn how to:
- Unlock the Power of Your Data: Discover how data democratization can transform your organization by giving every user access to the data they need, when they need it.
- Say 'Goodbye' to Data Fragmentation: Learn practical strategies to break down data silos and foster a more collaborative and efficient data environment.
- Realize the Full Potential of Your Data: Hear success stories about industry leaders who have embraced data democratization and witnessed tangible results.
Denodo Partner Connect - Technical Webinar - Ask Me AnythingDenodo
Watch full webinar here: https://buff.ly/48ZpEf1
In this session, we will cover a deeper dive into the Denodo Platform 8.0 Certified Architect Associate (DEN80EDUCAA) exam by answering any questions that have developed since the previous session.
Additionally, we invite partners to bring any general questions related to Denodo, the Denodo Platform, or data management.
Lunch and Learn ANZ: Key Takeaways for 2023!Denodo
Watch full webinar here: https://buff.ly/3SnH5QY
As 2023 comes to an end, organisations' dependency on trusted, accurate, secure, and contextual data only grows more challenging. The perpetual search for new architectures, processes, and organisational team structures to "get the business their data" while reducing operating costs continues unabated, even as the business heavily scrutinises the "value" derived, or to be delivered, from these investments in data. 2023 saw significant new releases from vendors, focusing on the Data Fabric.
In this session we will look at these topics and key takeaways for 2023, including:
- Data management and data integration market highlights for 2023
- Key achievements for Denodo in their journey as a leader in this market
- A few case studies from Australian organisations in how they are delivering strategic business value through Denodo's Data Fabric platform and what they have been doing differently
It’s a Wrap! 2023 – A Groundbreaking Year for AI and The Way ForwardDenodo
Watch full webinar here: https://buff.ly/3S4Y49o
A little over a year ago, we would not have expected the disruptions caused by the rise of Generative AI. If 2023 was a groundbreaking year for AI, what will 2024 bring? More importantly, what can you do now to take advantage of these trends and ensure you are future-proof?
For example:
- Generative AI will become more powerful and user-friendly, enabling novel and realistic content creation and automation.
- Data Architectures will need to adapt to feed these powerful new models.
- Data ecosystems are moving to the cloud, but there is a growing need to maintain control of costs and optimize workloads better.
Join us for a discussion on the most significant trends in the Data & AI space, and how you can prepare to ride this wave!
What Are the Key Success Factors for Best Applying the GDPR to Your...Denodo
Watch full webinar here: https://buff.ly/3O7rd2R
To comply with the GDPR, companies need an overall view of all their data and security controls across the entire infrastructure. Denodo's data virtualization brings together multiple data sources, makes them accessible from a single layer, and offers monitoring capabilities to track changes.
To that end, Square IT Services developed, for one of its prestigious major French clients in the luxury sector, an ergonomic user interface that lets the client consult its customers' personal information, check their eligibility to exercise their right to be forgotten, and deactivate their various notification channels. It also includes an audit feature that traces the history of operations performed, making it possible in particular to retrieve the date on which a person was anonymized.
All of the information surfaced in the application is retrieved from the REST APIs exposed by Denodo.
In this webinar, we will walk through the full feature set of the DPO-Cockpit application around a demo, explaining at each step the central role Denodo plays in simplifying GDPR management while remaining compliant.
Key points covered:
- Client context and GDPR challenges
- Difficulties and challenges encountered
- Options considered and the choice made (Denodo)
- Approach: architecture of the proposed solution
- Tool demo: main features
Lunch and Learn ANZ: Achieving Self-Service Analytics with a Governed Data Se...Denodo
Watch full webinar here: https://buff.ly/48zzN2h
In an increasingly distributed and complex data landscape, it is becoming ever more difficult to govern and secure data effectively throughout the enterprise. Whether it is securing data across different repositories or monitoring access across different business units, the proliferation of data technologies and repositories, both on-premises and in the cloud, is making the task nearly unattainable. The challenge is only made greater by the ongoing pressure to offer self-service data access to business users.
Tune in and learn:
- How to use a logical data fabric to build an enterprise-wide data access role model.
- Centralise security when data is spread across multiple systems residing both on-premises and in the cloud.
- Control and audit data access across different regions.
How to Build Your Data Marketplace with Data Virtualization?Denodo
Watch full webinar here: https://buff.ly/4aAi0cS
Organizations continue to collect mounds of data, spread across different locations and in different formats. The challenge is navigating the vastness and complexity of the modern data ecosystem to find the right data to suit your specific business purpose. Data is an important corporate asset, and it needs to be leveraged but also protected.
By adopting an alternate approach to data management and a logical data architecture, data can be democratized while providing centralized control within a distributed data landscape. The web-based Data Catalog tool acts as a single access point for secure enterprise-wide data access and governance. This corporate data marketplace provides visibility into your data ecosystem and allows data to be shared without compromising data security policies.
Catch this live webinar to understand how this approach can transform how you leverage data across the business:
- Empower the knowledge worker with data and increase productivity
- Promote data accuracy and trust to encourage re-use of important data assets
- Apply consistent security and governance policies across the enterprise data landscape
Webinar #2 - Transforming Challenges into Opportunities for Credit UnionsDenodo
Watch full webinar here: https://buff.ly/3vhzqL5
Join our exclusive webinar series designed to empower credit unions with transformative insights into the untapped potential of data. Explore how data can be a strategic asset, enabling credit unions to overcome challenges and foster substantial growth.
This webinar will delve into how data can serve as a catalyst for addressing key challenges faced by credit unions, propelling them towards a future of enhanced efficiency and growth.
Enabling Data Catalog users with advanced usabilityDenodo
Watch full webinar here: https://buff.ly/48A4Yu1
Data catalogs are increasingly important in any modern data-driven organization. They are essential to manage and make the most of the huge amount of data that any organization uses. As this information is continuously growing in size and complexity, data catalogs are key to providing Data Discovery, Data Governance, and Data Lineage capabilities.
Join us for the session driven by David Fernandez, Senior Technical Account Manager at Denodo, to review the latest features aimed at improving the usability of the Denodo Data Catalog.
Watch on-demand & Learn:
- Enhanced search capabilities using multiple terms.
- How to create workflows to manage internal requests.
- How to leverage the AI capabilities of Data Catalog to generate SQL queries from natural language.
Watch full webinar here: https://buff.ly/3vjrn0s
The purpose of the Denodo Platform 8.0 Certified Architect Associate (DEN80EDUCAA) exam is to provide organizations that use Denodo Platform 8.0 with a means of identifying suitably qualified data architects who understand the role and position of the Denodo Platform within their broader information architecture.
This exam covers the following technical topics and subject areas:
- Denodo Platform functionality, including
- Governance and metadata management
- Security
- Performance optimization
- Caching
- Defining Denodo Platform use scenarios
Along with some sample questions, a Denodo Sales Engineer will help you prepare for exam topics and ace the exam.
Join us now to start your journey toward becoming a Certified Denodo Architect Associate!
GenAI and the Future of Data Management: Myths and RealitiesDenodo
Watch full webinar here: https://buff.ly/3NLMSNM
Generative AI and Large Language Models (LLMs), led by OpenAI's GPT, have been the biggest revolution in computing in recent years. But how do they really affect data management? Will LLMs replace the data management professional? How much is myth, and how much is reality?
In this session we will review:
- What Generative AI is and why it matters for data management
- The present and future of GenAI applications in the data world
- How to prepare your organization for GenAI adoption
Data Centers - Striving Within A Narrow Range - Research Report - MCG - May 2...pchutichetpong
M Capital Group (“MCG”) expects demand to grow and supply to evolve, facilitated by institutional investment rotating out of offices amid the shift to work from home (“WFH”), while the need for data storage keeps expanding as global internet usage grows, with experts predicting 5.3 billion users by 2023. These market factors will be underpinned by technological changes, such as advancing cloud services and edge sites, allowing the industry to expect strong annual growth of 13% over the next 4 years.
Whilst competitive headwinds remain, exemplified by the recent second bankruptcy filing of Sungard, which blames “COVID-19 and other macroeconomic trends including delayed customer spending decisions, insourcing and reductions in IT spending, energy inflation and reduction in demand for certain services”, the industry has seen key adjustments, where MCG believes that engineering cost management and technological innovation will be paramount to success.
MCG reports that more favorable market conditions expected over the next few years, helped by the winding down of pandemic restrictions and a hybrid working environment, will drive market momentum forward. The continuous injection of capital by alternative investment firms, as well as the growing infrastructural investment from cloud service providers and social media companies, whose revenues are expected to grow over 3.6x larger by value in 2026, will likely help propel center provision and innovation. These factors paint a promising picture for the industry players that offset rising input costs and adapt to new technologies.
According to M Capital Group: “Specifically, the long-term cost-saving opportunities available from the rise of remote managing will likely aid value growth for the industry. Through margin optimization and further availability of capital for reinvestment, strong players will maintain their competitive foothold, while weaker players exit the market to balance supply and demand.”
Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...Subhajit Sahu
Abstract — Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components, and processes them in topological order, one level at a time. This enables calculation for ranks in a distributed fashion without per-iteration communication, unlike the standard method where all vertices are processed in each iteration. It however comes with a precondition of the absence of dead ends in the input graph. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. Slowdown on the GPU is likely caused by a large submission of small workloads, and expected to be non-issue when the computation is performed on massive graphs.
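The decomposition described in the abstract can be sketched compactly. The following is a minimal, illustrative Python sketch, not the report's actual implementation: it finds strongly connected components with Tarjan's algorithm, then iterates PageRank one component at a time in topological order, assuming a dict-of-lists graph with no dead ends (the stated precondition). Names such as `tarjan_scc` and `levelwise_pagerank` are invented for the example.

```python
from collections import defaultdict

def tarjan_scc(graph):
    """Strongly connected components, emitted in reverse topological order."""
    index, low, on_stack = {}, {}, set()
    stack, sccs, counter = [], [], [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph[v]:
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:          # v is the root of an SCC
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in graph:
        if v not in index:
            strongconnect(v)
    return sccs

def levelwise_pagerank(graph, d=0.85, tol=1e-10):
    """PageRank computed one SCC at a time, in topological order.

    Precondition (as in the report): the graph has no dead ends.
    """
    n = len(graph)
    out_deg = {v: len(graph[v]) for v in graph}
    incoming = defaultdict(list)
    for v in graph:
        for w in graph[v]:
            incoming[w].append(v)
    rank = {v: 1.0 / n for v in graph}
    # Upstream components are fully converged before a component starts,
    # so each component can be solved independently of the rest.
    for comp in reversed(tarjan_scc(graph)):
        while True:
            new = {v: (1 - d) / n
                      + d * sum(rank[u] / out_deg[u] for u in incoming[v])
                   for v in comp}
            delta = max(abs(new[v] - rank[v]) for v in comp)
            rank.update(new)
            if delta < tol:
                break
    return rank
```

On a small two-component graph such as `{0: [1], 1: [0, 2], 2: [3], 3: [2]}` this converges to the same ranks as the standard monolithic power iteration, which is the correctness property the report's comparison relies on.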
Adjusting primitives for graph : SHORT REPORT / NOTESSubhajit Sahu
Compressed Sparse Row (CSR) is an adjacency-list-based graph representation commonly used by graph algorithms such as PageRank.
Multiply with different modes (map)
1. Performance of sequential execution based vs OpenMP based vector multiply.
2. Comparing various launch configs for CUDA based vector multiply.
Sum with different storage types (reduce)
1. Performance of vector element sum using float vs bfloat16 as the storage type.
Sum with different modes (reduce)
1. Performance of sequential execution based vs OpenMP based vector element sum.
2. Performance of memcpy vs in-place based CUDA based vector element sum.
3. Comparing various launch configs for CUDA based vector element sum (memcpy).
4. Comparing various launch configs for CUDA based vector element sum (in-place).
Sum with in-place strategies of CUDA mode (reduce)
1. Comparing various launch configs for CUDA based vector element sum (in-place).
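The float-versus-bfloat16 comparison above is about speed, but the storage type also changes the numerical result, which is worth keeping in mind when reading such benchmarks. Below is a small, illustrative pure-Python sketch, not code from the notes themselves: `to_bfloat16` simulates bfloat16 storage by truncating a float32 bit pattern, and the helper names are invented for the example.

```python
import struct

def to_bfloat16(x):
    """Truncate x to bfloat16 precision: keep only the top 16 bits
    (sign, 8 exponent bits, 7 mantissa bits) of its float32 encoding."""
    bits = struct.unpack('<I', struct.pack('<f', x))[0]
    return struct.unpack('<f', struct.pack('<I', bits & 0xFFFF0000))[0]

def to_float32(x):
    """Round x to float32 precision via a pack/unpack round trip."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

def sum_with_storage(values, quantize):
    """Element sum where the accumulator is re-quantized to the storage
    type after every addition, mimicking a reduced-precision buffer."""
    acc = 0.0
    for v in values:
        acc = quantize(acc + v)
    return acc

values = [0.001] * 100_000          # exact sum: 100.0
bf16_sum = sum_with_storage(values, to_bfloat16)
f32_sum = sum_with_storage(values, to_float32)
# bf16_sum stalls far below 100: once the accumulator's ulp exceeds
# 0.001, further additions are truncated away entirely.
```

A real bfloat16 reduction would typically keep a wider accumulator precisely to avoid this stall; the sketch only shows why the storage choice in such a benchmark is not purely a performance knob.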
Opendatabay - Open Data Marketplace.pptxOpendatabay
Opendatabay.com unlocks the power of data for everyone. Open Data Marketplace fosters a collaborative hub for data enthusiasts to explore, share, and contribute to a vast collection of datasets.
First ever open hub for data enthusiasts to collaborate and innovate. A platform to explore, share, and contribute to a vast collection of datasets. Through robust quality control and innovative technologies like blockchain verification, opendatabay ensures the authenticity and reliability of datasets, empowering users to make data-driven decisions with confidence. Leverage cutting-edge AI technologies to enhance the data exploration, analysis, and discovery experience.
From intelligent search and recommendations to automated data productisation and quotation, Opendatabay's AI-driven features streamline the data workflow. Finding the data you need shouldn't be complex. Opendatabay simplifies the data acquisition process with an intuitive interface and robust search tools. Effortlessly explore, discover, and access the data you need, allowing you to focus on extracting valuable insights. Opendatabay breaks new ground with dedicated, AI-generated, synthetic datasets.
Leverage these privacy-preserving datasets for training and testing AI models without compromising sensitive information. Opendatabay prioritizes transparency by providing detailed metadata, provenance information, and usage guidelines for each dataset, ensuring users have a comprehensive understanding of the data they're working with. By leveraging a powerful combination of distributed ledger technology and rigorous third-party audits Opendatabay ensures the authenticity and reliability of every dataset. Security is at the core of Opendatabay. Marketplace implements stringent security measures, including encryption, access controls, and regular vulnerability assessments, to safeguard your data and protect your privacy.
12. VHA Inc. Confidential information.
Who is VHA?
Since 1977 – when 30 hospital CEOs established VHA as the nation’s first membership organization for acute care providers – the company has applied knowledge in analytics, contracting, consulting and network development to help members achieve their strategic objectives.
VHA is based in Irving, Texas, and has 11 regional offices. Our unique family of companies brings industry-leading innovation and expertise to help organizations thrive in a dynamic health care environment.
A legacy of innovation
As the first hospital membership organization, we were born of innovation. We introduced the concept of supply networks, drawing on the power of collaboration to achieve greater cost savings for health organizations. We pioneered comparative data analysis and exchange, as well as the industry’s first committed contracting program and private label program, all of which continue to deliver exceptional value today.
In 2013, VHA delivered $2.2 billion in savings and additional value to members.
13. Who is UHC?
At UHC, collaboration drives success. For nearly 30 years, UHC has been a catalyzing force in:
Supporting academic medical centers in their efforts
Fostering new ideas
Building solid relationships that withstand the test of time
Our members have been agents of progress—driving the advancement of patient care, medical knowledge, and fiscal acuity by coming together to candidly discuss their ideas and vision.
UHC continually expands and strengthens services to offer insights and solutions to members. As the leader in providing relevant comparative data and as a single-source provider of information and insights that promote change, UHC has created UHC Intelligence™, a versatile suite of business tools that power performance improvement.
UHC offers transparency.
14.
UHC and VHA have a track record of partnering successfully
History of Collaboration
1998 – Current: Novation, formed in 1998, is a joint venture between UHC and VHA.
1998 – 2011: Provista, a supply chain improvement company focused on the non-acute care market, was formed in 1998 as a joint venture between UHC and VHA. VHA acquired UHC’s minority interest in 2011.
2013 – Current: aptitude, a subsidiary of Novation, is the health care industry's first online direct contracting marketplace.
UHC and VHA have formed the largest contracting services company. Since forming Novation in 1998, UHC and VHA have worked in partnership to grow Novation into the nation’s largest contracting services company, representing more than $50 billion in purchasing volume and delivering more than $1 billion in contract price savings for UHC and VHA members and other affiliate organizations over the past five years.
We have continued to expand on this successful partnership. Our advanced shared analytic capabilities and innovative cost management tools have helped organizations purchasing through Novation save an additional $1.4 billion over the last four years.
15.
UHC and VHA have complementary strengths which, when combined, create enhanced member value.
[Diagram: a powerful network – the nation’s leading academic medical centers and community health care providers – forms the foundation of the combined organization (“NewCo”), whose core capabilities in supply chain management, advisory services, and comprehensive data and analytics deliver targeted solutions and customized insights.]
16.
Our new organization offers superior access to leading practices, networking and knowledge sharing for our members, which include the majority of this country’s preeminent academic medical centers and community-based health systems.
The newly combined organization:
Serves more than 5,200 health system members and affiliates.
Provides services to nearly 30 percent of the nation’s hospitals, including virtually all the academic medical centers and health systems.
Serves more than 118,000 non-acute health care customers.
Includes more than $50 billion in purchasing volume, the largest in the industry.
Provides services to all of the top 10 hospitals on the U.S. News & World Report annual list of America’s Top Hospitals.
Delivers the industry’s most in-depth clinical data combined with the nation’s most robust supply chain data to address cost and quality.
VHA and UHC Are Now the Largest Member-owned Health Care Company
18.
Move from silos to streamlined data processing…
[Diagram: three parallel silos – Clinical, Supply, and Academic – each with its own internal and external data, process, logic, and apps, consolidated into a single shared data, process, and logic layer serving the apps for all three domains.]
19.
Where we want to go…
[Diagram: target architecture spanning Acquisition, Management, and Delivery. Systems of record (Oracle, SQL, other), health care organizations (e.g., “HCO XYZ”), and data aggregators feed through a data gateway into a landing zone; a Hadoop data lake with a discovery zone sits alongside an Exadata data warehouse, EDI, and data marts (DM); a data access layer and business access layer serve applications, reports, dashboards, and queries, with room for future capabilities. Roles progress from data owners and technical support (raw data) through data stewards, data SMEs, data scientists, analysts, and data QA, to advisors, analysts, members, and collaboratives (useful information).]
22.
Built on Hadoop – Hortonworks is the distribution
Business need: move to a modern data architecture
– Disparate data sources into a single data lake
– Flexibility of schema on read (not write)
– Ease of doing analysis on subsets of large data sets
– Capture all types of data (even data that might only have a future purpose)
– Lower cost to store large amounts of data
Data Lake
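The “schema on read (not write)” bullet above can be sketched in a few lines of Python – a minimal illustration of the concept, not VHA’s implementation; the sample feed, column names, and `read_with_schema` helper are all hypothetical:

```python
import csv
import io

# Raw feed landed in the lake exactly as received: no schema is imposed
# at write time (schema on read, not write). The records are invented.
raw_landing = "12345,2013-06-01,SUPPLY,187.50\n12346,2013-06-02,CLINICAL,92.10\n"

def read_with_schema(raw, columns):
    """Apply a schema only at read time; different consumers can project
    different column names over the same raw bytes."""
    reader = csv.reader(io.StringIO(raw))
    return [dict(zip(columns, row)) for row in reader]

# One consumer's projection over the raw data.
spend_view = read_with_schema(raw_landing, ["record_id", "date", "category", "amount"])
print(spend_view[0]["category"])  # prints "SUPPLY"
```

Because nothing is discarded at ingest, a later consumer can reread the same raw data with a richer schema – the property that lets the lake capture data “that might only have a future purpose.”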
23.
Area to discover value from data
Access roles:
– Data scientists and SMEs
– Product managers
– Analysts
– Data stewards
Challenge:
– Business users have been trained to use SQL or CSV exports
– Introduction of Hadoop will require training on Pig and Hive for access
– Risk of slowing adoption and delaying the value derived from the new solution
Data Discovery
Discovery Zone
24.
Utilize data virtualization
“Data virtualization is an umbrella term used to describe any approach to data management that allows an application to retrieve and manipulate data without requiring technical details about the data, such as how it is formatted or where it is physically located.” – Margaret Rouse, TechTarget.com
Our solution…
[Diagram: data virtualization bridging the data lake and the discovery zone.]
25.
Proven platform – Denodo is our DV platform
Successes in our company:
– Salesforce reporting environment (cloud-based plug-in)
– Physician dashboard (disparate data sources)
Data Virtualization
26.
Discovery Zone
Utilize Denodo HDFS, HBase and MapReduce custom wrappers
Abstract data from the lake
– Protects the data source asset
– Enhanced security
Simplified access for data discovery users
– Can use SQL to query
Easy to augment the discovery process
– Can pull in other sources of data to a DV view (Excel, PDF, websites)
Data Virtualization for the Discovery Zone
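The points above – plain SQL access over abstracted sources, plus the ability to fold extra feeds into one view – can be sketched with an in-memory SQL engine. This is a toy illustration of the data-virtualization idea, not the Denodo platform; the source names, sample rows, and the cost-per-use join are all hypothetical:

```python
import csv
import io
import sqlite3

# Two heterogeneous "physical" sources (both invented for illustration):
csv_supply = "item,cost\ncatheter,12.50\ngauze,3.25\n"   # e.g. a CSV export
clinical_records = [{"item": "catheter", "uses": 40},    # e.g. rows wrapped
                    {"item": "gauze", "uses": 310}]      # from HBase/HDFS

# The virtualization layer loads both behind a single SQL interface,
# so analysts keep using SQL instead of learning Pig or Hive.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE supply (item TEXT, cost REAL)")
conn.executemany("INSERT INTO supply VALUES (?, ?)",
                 [(r["item"], float(r["cost"]))
                  for r in csv.DictReader(io.StringIO(csv_supply))])
conn.execute("CREATE TABLE clinical (item TEXT, uses INTEGER)")
conn.executemany("INSERT INTO clinical VALUES (?, ?)",
                 [(r["item"], r["uses"]) for r in clinical_records])

# One virtual view joins both sources; the consumer never sees the physical
# formats or locations, only the view.
conn.execute("""CREATE VIEW item_spend AS
                SELECT s.item, s.cost * c.uses AS total_cost
                FROM supply s JOIN clinical c ON s.item = c.item""")
for item, total in conn.execute("SELECT item, total_cost FROM item_spend ORDER BY item"):
    print(item, total)
```

Adding a new source (another Excel export, a scraped web table) only means loading one more table and extending the view – the consumers’ SQL does not change.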
27.
Architecture
[Diagram: layered stack, top to bottom – discovery zone presentation layer, virtual data views, data lake, systems of record.]
28.
Data lake approach on Hadoop:
– Simplifies data management
– Reduces data costs
– Scalable
– Flexible
Data virtualization:
– Simplified data access
– Less training for business users
– Faster data discovery
– Augmented discovery process (adding new sources)
Recommendation and Benefits