Case study of a massive Hadoop deployment by Cardinal Health that achieved both strong security and substantive analytical utility.
This was distributed publicly in 2014 in Krakow, Poland, and at multiple big data conferences in 2014 and 2015. It is hosted on SlideShare for posterity.
Tips and Techniques for Improving the Performance of Validation Procedures in... (Perficient, Inc.)
Ensuring the validity of patient data in your clinical data management and EDC system is essential, yet without a way to programmatically identify discrepancies and inconsistencies, bad data can inadvertently remain in your system. Through validation procedures, built from a nearly endless assortment of expressions and formulas, Oracle Clinical offers a powerful way to clean and compare patient data.
In this slideshare, Perficient's Dr. Steve Rifkin, a leading expert in Oracle Clinical, demonstrates the structure of validation procedures, as well as provides various tips and techniques for developing procedures that improve the performance of edit checks.
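The idea of an edit check described above can be sketched in plain Python. This is a minimal illustration of programmatically flagging discrepancies, not Oracle Clinical's actual procedure syntax; the field names and rules are hypothetical:

```python
# Hypothetical edit checks over patient visit records, mimicking what a
# validation procedure does in an EDC system. Rules are illustrative only.

def run_edit_checks(records):
    """Return a list of (patient_id, message) discrepancies."""
    discrepancies = []
    for rec in records:
        pid = rec["patient_id"]
        # Range/consistency check: diastolic must be below systolic.
        if rec["diastolic_bp"] >= rec["systolic_bp"]:
            discrepancies.append((pid, "diastolic >= systolic"))
        # Date check: a visit cannot precede enrollment
        # (ISO date strings compare correctly as text).
        if rec["visit_date"] < rec["enrollment_date"]:
            discrepancies.append((pid, "visit before enrollment"))
    return discrepancies

records = [
    {"patient_id": "P001", "systolic_bp": 120, "diastolic_bp": 80,
     "enrollment_date": "2014-01-10", "visit_date": "2014-02-01"},
    {"patient_id": "P002", "systolic_bp": 110, "diastolic_bp": 115,
     "enrollment_date": "2014-01-10", "visit_date": "2014-01-05"},
]

for pid, msg in run_edit_checks(records):
    print(pid, msg)  # flags both problems on P002
```

In a real system, each flagged discrepancy would typically raise a query back to the site rather than just print a message.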
NoSQL Databases for Enterprises - NoSQL Now Conference 2013 (Dave Segleau)
Talk delivered at Dataversity NoSQL Now! Conference in San Jose, August 2013. Describes primary NoSQL functionality and the key features and concerns that Enterprises should consider when choosing a NoSQL technology provider.
The Value of the Modern Data Architecture with Apache Hadoop and Teradata (Hortonworks)
This webinar discusses why Apache Hadoop is most typically the technology underpinning "Big Data," how it fits into a modern data architecture, and how it relates to the databases and data warehouses already in use.
Introducing New AI Ops Innovations in Oracle 19c Autonomous Health Framework ... (Sandesh Rao)
Oracle Autonomous Health Framework (AHF) is Oracle’s Artificial Intelligence Operations platform for autonomous database health management. This session will focus on enhancements to current functionality and new features in 18c and coming in 19c. First successfully introduced in Cluster Health Advisor, and extended to Trace File Analyzer and Hang Manager, Oracle AHF’s applied machine learning technology now enhances additional framework components. You will learn how to utilize these features for determining workload footprint, ongoing monitoring, early detection of anomalies and performance issues, their root causes and corrective actions, prevention of node or database failures, and targeted postmortem analysis enabling quick resolution.
IDERA Live | Maintaining Data Governance During Rapidly Changing Conditions (IDERA Software)
You can watch the replay for this IDERA Live webcast in the IDERA Resource Center: https://www.idera.com/resourcecentral/webcasts/maintaining-data-governance
Everything is changing right now. We see systems evolving to suit our changing world, we have exciting new data platform products, we are moving data platforms to the cloud, and data warehouses and data lakes are becoming more valuable. Not only do we need to make these changes quickly and with minimal risk, but we need to make sure we have considered the implications for our data and the rules that apply to it. We then need to publish what data we make available, where it is, and what rules apply to it. In this session we will see how ER/Studio helps manage and migrate our data, all classified against a business glossary, and allows Data Architects to work within a collaborative ecosystem with other groups and tools.
Speaker: Jamie Knowles is a senior product manager at IDERA, and has been in the field of architecture and modeling for over 20 years. Jamie has been involved with the evolution of enterprise architecture, data modeling, and data governance and seen its challenges and achievements. He has worked in product management and in the field within the banking, finance, and energy industries.
IDERA Live | Why You Need Data Warehouse Automation Now More Than Ever (IDERA Software)
You need to ensure the delivery of data (regardless of its location and presentation) to the people who need it. In organizations where data drives important strategic changes, the effective design, build, and documentation of complex data ecosystems is more critical today than ever before.
Teams that combine the gains provided by data automation and cloud computing see tremendous leaps in agility and productivity. The benefits of such initiatives include:
- Reduced cost and resources used for data projects
- Less time spent by developers on custom data infrastructure and more time dedicated to data delivery
- Standardized procedures and adoption of best-practice templates that democratize data warehouses
Speaker: Stan Geiger manages a skilled team of Product Managers responsible for Idera's multi-platform databases which includes WhereScape. Stan has worked in various industries from fraud detection to healthcare and is a highly experienced data practitioner having built many data warehouse and ETL platforms, BI analytics, and OLTP systems.
You can watch the replay for this IDERA Live webcast in the IDERA Resource Center: https://www.idera.com/resourcecentral/webcasts/modern-query-optimizer
Data changes happen quickly, and the DBA can’t easily monitor query performance 24/7. In recent releases, SQL Server has introduced a number of new features to improve query performance in the event of performance degradation. In this session you will get an overview of the new additions, how they can help your workloads, and how they can improve your execution plans. You will learn how the SQL Server query optimizer has changed to make adaptive decisions at query execution, and how it has addressed some former anti-patterns. This session will focus on SQL Server 2019, but will highlight some changes introduced in SQL Server 2017.
Speaker: Joey D'Antoni is a Senior Consultant and SQL Server MVP with over a decade of experience working in both Fortune 500 and smaller firms. He is a Principal Architect for Denny Cherry and Associates and lives in Malvern, PA. He is a frequent speaker at major tech events and blogs about all things technology. He believes that no single platform is the answer to all technology problems. He holds a BS in Computer Information Systems from Louisiana Tech University and an MBA from North Carolina State University, and is the co-author of the Microsoft white paper "Using Power BI in a Hybrid Environment."
IDERA Live | Have No Fear the DBA is Here: Protecting Data Resources (IDERA Software)
You can watch the replay for this IDERA Live webcast in the IDERA Resource Center: https://www.idera.com/resourcecentral/webcasts/protecting-data-resources-recovery-strategy
The DBA wears many hats: database design and governance, capacity planning, performance tuning and monitoring, troubleshooting, security duties, and sometimes even ETLs for data transformations and cloud migrations. In today’s rapidly changing work environment, a DBA should take into account the applications running on database servers, how critical they are for business operations, and how fast the essentials can be brought back up when a disaster occurs, in order to prevent data loss and save the business time and money. This session is for DBAs who want to learn more about SQL Safe Backup's capabilities for mission-critical backup, restore, and recovery, using its complete "hands-free" automated solutions to meet all of their challenges and exceed their duties during these demanding times.
Speaker: Elan Kol is a senior product manager at IDERA Software and his main focus is on the SQL Server auditing, security, optimization, and DBA productivity product lines. Elan brings over ten years of experience in the financial technology, IT security and game development industries. His passion is building, delivering, managing and optimizing products with great market fit through data-driven and market-backed facts.
I. What can be expected with Meaningful Use
II. Two possible workflows for compliance
III. Three components of Meaningful Use data
IV. What does Meaningful Use mean for radiology?
V. How CARESTREAM RIS can help
VI. Meaningful Use compliance with RIS
Additional Meaningful Use resources:
A. Meaningful Use Podcast Series
i. Keith Dreyer, DO, Ph.D, Massachusetts General Hospital
ii. Steven Fischer, CIO, Center for Diagnostic Imaging
B. Webinar
i. Keith Dreyer, DO, Ph.D, Massachusetts General Hospital
ii. Marjorie Calvetti, Administrative Director, Radiology, Memorial Medical Center
C. Whitepaper: Customizable CARESTREAM RIS Enables US Facilities to Meet Meaningful Use Requirements
For more about Carestream RIS, visit http://www.carestream.com/ris
NZOUG-GroundBreakers-2018 - Troubleshooting and Diagnosing 18c RAC (Sandesh Rao)
Learn about new diagnostic features in the 18c database product, tools for reading trace and log files, how to automatically troubleshoot hangs, how to run best-practice checks on your stack automatically, and how to act on the recommendations. We will cover RAC and ASM basics, plus the newest features of diagnostic tools like Trace File Analyzer Collector, orachk, exachk, OSWatcher, Procwatcher, Hang Manager, Cluster Health Monitor, and Cluster Health Analyzer. You can use Trace File Analyzer Collector to do all your first-failure diagnostic collections and reduce the back and forth with Oracle Support, since 90% of all the files Support needs are included by default. We will also cover analyzing log files using machine learning.
Arthur C. Nielsen, the founder of ACNielsen, said, “The price of light is less than the cost of darkness.” This is becoming even more important in the age of IoT devices and ubiquitous internet connectivity. The amount of data at the fingertips of our companies’ decision makers is colossal. Yet very few business leaders and their direct teams can analyze their data by themselves to uncover insights that will improve their products and services, delight their customers, and grow their business.
With the rise of low-code/no-code tools, cloud infrastructure, and the convergence of AI and BI, the democratization of analytics can accelerate the time to answer a question while improving its relevancy.
In this presentation, we will cover the 12 critical capabilities to succeed in enabling self-service analytics and augmenting data literacy across the enterprise.
Strategic Imperative: The Enterprise Data Model (DATAVERSITY)
With today's increasingly complex data ecosystems, the Enterprise Data Model (EDM) is a strategic imperative that every organization should adopt. An Enterprise Data Model provides context and consistency for all organizational data assets, as well as a classification framework for data governance. Enterprise modeling is also totally consistent with agile workflows, evolving incrementally to keep pace with changing organizational factors. In this session, IDERA’s Ron Huizenga will discuss the increasing importance of the EDM, how it serves as a framework for all enterprise data assets, and provides a foundation for data governance.
Data Done Right: Ensuring Information Integrity (Sharala Axryd)
It’s the ultimate “garbage in, garbage out” quandary. Data can be an organization’s most valuable asset, but only to the degree its quality can be validated and trusted. Without the right guidelines, processes, and solutions in place to control the way applications, systems, databases, messages, and documents are managed, "dirty" data can permeate systems across the enterprise, negatively impacting everything from strategic planning to day-to-day decision making. High-quality data drives a company’s success more efficiently because decisions are based on facts rather than habit or intuition.
To gain a better understanding of this topic, this speaking session will examine:
- what data quality and master data management are
- why they are so crucial for successful business operations and strategies
- how to improve data quality by organizational, procedural and technological means
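The kind of data-quality measurement discussed above can be illustrated with a small Python sketch. This is a minimal, hypothetical profile (the column names and thresholds are invented for illustration), computing the two most common quality metrics: completeness per field and duplicate keys:

```python
# A minimal data-quality profile: completeness and uniqueness metrics
# over a small record set. Column names are illustrative only.

def profile(rows, key_field):
    """Return (completeness per field, duplicate count for key_field)."""
    fields = rows[0].keys()
    n = len(rows)
    # Completeness: fraction of rows where the field is populated.
    completeness = {
        f: sum(1 for r in rows if r.get(f) not in (None, "")) / n
        for f in fields
    }
    # Uniqueness: how many rows share a key with an earlier row.
    keys = [r[key_field] for r in rows]
    duplicates = len(keys) - len(set(keys))
    return completeness, duplicates

customers = [
    {"id": "C1", "email": "a@example.com", "country": "US"},
    {"id": "C2", "email": "", "country": "US"},
    {"id": "C2", "email": "b@example.com", "country": None},
]

completeness, dupes = profile(customers, "id")
print(completeness["email"])  # 2 of 3 rows populated
print(dupes)                  # one duplicate id
```

A master data management process would go further, deciding which of the duplicate records is the "golden" one, but even metrics this simple make quality problems visible and trackable.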
Business Value Metrics for Data Governance (DATAVERSITY)
As data professionals, we recognize and understand the need for data governance, focusing on data quality in particular. We have made progress in this area, as illustrated by the emergence of the Chief Data Officer role in recent years. However, in many organizations, the need for governance is still largely unrecognized, and remains very tough to sell internally. You may need some detailed information and metrics to demonstrate the business value. This session will focus on business justification for establishing a data governance framework, including:
Data classification
Data quality
Business value metrics (KPIs)
Data Architecture - The Foundation for Enterprise Architecture and Governance (DATAVERSITY)
Organizations are faced with an increasingly complex data landscape, finding themselves unable to cope with exponentially increasing data volumes, compounded by additional regulatory requirements with increased fines for non-compliance. Enterprise architecture and data governance are often discussed at length, but frequently with different stakeholder audiences. This can result in complementary and sometimes conflicting initiatives rather than a focused, integrated approach. Data governance requires a solid data architecture foundation in order to support the pillars of enterprise architecture. In this session, IDERA’s Ron Huizenga will discuss a practical, integrated approach to effectively understand, define, and implement a cohesive enterprise architecture and data governance discipline with integrated modeling and metadata management.
IDERA Live | Business Value Metrics for Data Governance (IDERA Software)
You can watch the replay for this IDERA Live webcast, Business Value Metrics for Data Governance, on the IDERA Resource Center, http://ow.ly/imPU50A4rRC
As data professionals, we recognize and understand the need for data governance, focusing on data quality in particular. We have made progress in this area, as illustrated by the emergence of the Chief Data Officer role in recent years. However, in many organizations, the need for governance is still largely unrecognized, and remains very tough to sell internally. You may need some detailed information and metrics to demonstrate the business value. This session will focus on business justification for establishing a data governance framework, including:
-Data classification
-Data quality
-Business value metrics (KPIs)
-Alignment with Business Strategy
Speaker: Ron Huizenga is the Senior Product Manager of Enterprise Architecture and Modeling at IDERA. Ron has over 30 years of business and IT experience across many different industries including manufacturing, retail, healthcare, and transportation. His hands-on consulting experience with large-scale data development engagements provides practical, real-world insights to enterprise data architecture, business architecture, and governance initiatives.
This webinar, featuring Claudia Imhoff, President of Intelligent Solutions and Founder of the Boulder BI Brain Trust (BBBT), Matt Schumpert, Director of Product Management, and Azita Martin, CMO at Datameer, will highlight the latest technology trends in extending BI with big data analytics and the top high-impact use cases.
Attendees will hear about:
-- The extended architecture for today's modern analytics environment
-- The Internet of Things (IoT) and big data
-- The evolution of analytics – from descriptive to prescriptive
-- High impact use cases as a result of the changing analytics world
Fast Data Overview for Data Science Maryland Meetup (C. Scyphers)
An overview of Open Source Fast Data platforms (Spark, Kafka, HBase, Impala, Apex, H20, Druid, Flink, Storm, Samza, ElasticSearch, Lucene, Solr, SMACK, PANCAKE)
Extreme Analytics - What's New With Oracle Exalytics X3-4 & T5-8? (KPI Partners)
http://www.kpipartners.com/watch-extreme-analytics-whats-new-with-oracle-exalytics-x3-4-t5-8
Analytics is all about gaining insights from data for better decision making.
Part 1 - Engineered Systems
Part 2 - Hardware & Software Together
Part 3 - Exalytics Benefits
Part 4 - Customer Results & Pricing
Part 5 - Success Story: Getting Started w/Exalytics
Part 6 - Q&A Session
A recent study by Harvard Business Review found that top-performing organizations use analytics five times more than low performers. However, the vision of delivering fast, interactive, insightful analytics has remained elusive for most organizations.
Most enterprise analytics solutions require dealing with a number of hardware, software, storage and networking vendors, and precious resources are wasted integrating the hardware and software components to deliver a complete analytical solution. A high-performance business intelligence system also requires fast connectivity to data warehouses, operational systems and other data sources.
Oracle Exalytics is an optimized engineered system to provide the highest levels of performance for business intelligence (BI) and enterprise performance management (EPM) applications such as Oracle Business Intelligence, Endeca, and Essbase.
Join team members from Oracle and KPI Partners for this virtual event that examines new releases of the leading engineered system for enterprise analytics: Exalytics X3-4 & T5-8.
IDERA Live | Decode your Organization's Data DNA (IDERA Software)
You can watch the replay for this webcast in the IDERA Resource Center: http://ow.ly/xbaO50A59Ah
Deoxyribonucleic acid (DNA) is the fundamental building block that specifies the structure and function of living things. The information in DNA is stored as a code made up of four chemical bases in which the sequencing determines unique characteristics, similar to the way in which letters of the alphabet appear in a certain order to form words and sentences.
Organizations can also be regarded as organic, with a need to adapt to changes in their environment. Every aspect of an organization also has a corresponding data representation, which can be regarded as its DNA. Without the correct tools and techniques, decoding that data structure can be extremely complex. Data modeling reveals that data in most organizations follows similar patterns. Once we recognize that, we can focus on the data characteristics that make each organization unique.
Establishing a data culture is vital to success, enabling a transformational breakthrough to translate data into knowledge and ultimately, strategic advantage. IDERA’s Ron Huizenga will explain how a business-driven data architecture enables you to leverage your data as a valuable strategic asset.
About Ron: Ron Huizenga is the Senior Product Manager of Enterprise Architecture and Modeling at IDERA. Ron has over 30 years of business and IT experience across many different industries including manufacturing, retail, healthcare, and transportation. His hands-on consulting experience with large-scale data development engagements provides practical, real-world insights to enterprise data architecture, business architecture, and governance initiatives.
Case Study: Cardinal Health Experiences “Black Friday” Every Day (CA Technologies)
Cardinal Health specializes in distribution of pharmaceuticals and medical products, serving more than 100,000 locations worldwide. To ensure the best customer experience possible for its mission critical applications, Cardinal Health takes the approach that every day is “Black Friday” with quick detection and quick resolution. Over the past several years, Cardinal Health has used CA Application Performance Management to move from being reactive to application performance issues to proactively addressing issues before they impact the customer. In this session you will learn some of the steps and approaches they take to improve application performance across more than 100 different applications, including their e-commerce platform.
For more information, please visit http://cainc.to/Nv2VOe
Hadoop is regarded as a key capability for implementing Big Data initiatives in the enterprise, but organizations have yet to realize its full business benefits. In this webinar, Pivotal and guest Forrester Research, Inc. identify the use cases driving Hadoop adoption, and explore what is needed to transform initial investments into results.
Learn about:
Challenges Hadoop introduces, and how the right tools and platforms can help address them
Shifts in the industry with regards to SQL and NoSQL systems and their implications to Big Data analytics
Applying in-memory technologies for data management systems, data analytics, transactional processing and operational databases
Watch the on-demand webinar here:
http://www.pivotal.io/big-data/pivotal-forrester-operationalizing-data-analytics-webinar
Learn how to maximize business value from all of your data here: http://www.pivotal.io/big-data/pivotal-hd
Solving the Data Management Challenge for Healthcare – Delphix
Need a proven blueprint to fast-track application development in your healthcare organization? With triple-digit growth, 3,000+ databases and over a petabyte of data, Molina Healthcare needed a way to accelerate application development and drive digital transformation.
Success meant slashing time to provision new dev and test environments in half, putting self-service data access in the hands of application teams―and doing it all without taking an eye off data security and HIPAA compliance.
What you need to know before migrating to SAP HANA – DataVard
SAP HANA is a super-fast in-memory database and platform that enables new possibilities in analytical reporting and real-time data acquisition and consumption. However, the business case is often hard to prove due to high licence costs.
Based on our experience with SAP HANA migrations, we have collected the most important points for data management in SAP BW and system optimisation before moving all data to SAP HANA. In just a few steps you can enhance the benefits of SAP HANA. This presentation explains how to analyse your BW, why to implement near-line storage, what data to housekeep, and how code optimisation will help you.
2013 (ISC)² Congress: This Curious Thing Called Ethics – Dan Houser
The (ISC)² Ethics Committee helped provide this overview of professionalism, ethics, the (ISC)² Code of Ethics and case studies to help explain the ethics complaint & review process. Co-developed with William H. Murray, Graham Jackon & Mano Paul.
RSA2008: What Vendors Won’t Tell You About Federated Identity – Dan Houser
Federated Identity overview, including the little known traps and issues with implementing federated identity for SSO using SAML. Lessons learned, build vs. buy, support, SLAs, and legal issues. Jointly developed with Bob West.
The Challenges & Risks of New Technology: Privacy Law & Policy – Dan Houser
Guest lecture at Taylor University: a discussion of (then) emergent and existing privacy law, information security challenges to privacy, wiretapping, Bluetooth, war dialing, and some case studies, including Kyllo v. United States and the litmus test provided in the opinion by Hon. Antonin Scalia.
Perimeter Defense in a World Without Walls – Dan Houser
Perimeter Defense when you don't have a perimeter, and how to change the paradigm to protect hosts, and hide from the bad guys. Introduction of the Big Freakin' Haystack project (that, sadly, went nowhere).
Risk Based Planning for Mission Continuity – Dan Houser
Introduction to Continuity Management & Risk Based Continuity Planning. Risk-based approach to provide mission-critical BCP. Models are provided to conduct quantitative & qualitative analytics, prioritize activity and integrate continuity planning into risk management activities.
Security Capability Model - InfoSec Forum VIII – Dan Houser
A security capability model for evaluation of risk based on mapping controls based on attack vectors, using the OSI 7-layer model plus 4 categories outside OSI, as well as the four disciplines of Security Management. This creates a matrix that permits scoring of capabilities by discipline and control layer.
Certifications and Career Development for Security Professionals – Dan Houser
Joint presentation by Kevin Flanagan & Dan Houser, RSA 2008. Overview of career development, professional security/risk certifications, and how to develop and drive your career plan.
Advanced IAM audit considerations for surviving or performing an IAM audit. A high-level overview intended to lead a discussion. Co-developed by Erik Heidt and Dan Houser; presented at RSA 2008.
This paper shines a spotlight on risk management, particularly on the dogma and BS in security "best practice", and uses primary research in password strength and compromise as a case study to blow the lid off password mythology.
Humorous and insightful look at breaking into security conferences by conquering the CFP. Uses Hacking Exposed methodology for targeting, prioritizing and mounting your "attack" on the CFP, and steps for strong execution as a speaker.
Building & Running A Successful Identity Program – Dan Houser
Two-hour presentation on the steps to build a successful identity and access management program, including stakeholder buy-in, strategy, roadmaps, selling I&AM, and the foundational components of I&AM.
Crypto in the Real World: or How to Scare an IT Auditor – Dan Houser
Real world cryptography & theoretical cryptography are not the same. Bad ciphers, weak keys, cleartext keys, bad SSL, TLS, SSH permutations, and snake-oil crypto can undermine all your hard security work. This presentation provides real-world examples of broken crypto, and how to detect bad crypto, in the real world.
The Building Blocks of QuestDB, a Time Series Database – javier ramirez
Talk delivered at Valencia Codes Meetup, June 2024.
Traditionally, databases have treated timestamps just as another data type. However, when performing real-time analytics, timestamps should be first class citizens and we need rich time semantics to get the most out of our data. We also need to deal with ever growing datasets while keeping performant, which is as fun as it sounds.
It is no wonder time-series databases are now more popular than ever before. Join me in this session to learn about the internal architecture and building blocks of QuestDB, an open source time-series database designed for speed. We will also review some of the changes we have made over the past two years to deal with late and unordered data, non-blocking writes, read replicas, and faster batch ingestion.
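One of those challenges can be sketched in miniature. The snippet below is an illustrative toy, not QuestDB's actual implementation: it keeps a series sorted by timestamp while accepting rows that may arrive late or out of order, with an O(1) fast path for the common in-order case.

```python
import bisect

def ingest(series, rows):
    """Merge possibly late/out-of-order (timestamp, value) rows into a
    timestamp-sorted series. In-order rows append at the tail in O(1);
    late rows are placed by binary-search insertion."""
    for ts, value in rows:
        if not series or ts >= series[-1][0]:
            series.append((ts, value))          # fast path: in-order arrival
        else:
            bisect.insort(series, (ts, value))  # slow path: late row
    return series

series = []
ingest(series, [(1, "a"), (3, "c"), (2, "b")])  # (2, "b") arrives late
# series is now sorted by timestamp: [(1, 'a'), (2, 'b'), (3, 'c')]
```

A real engine would batch late rows and merge whole partitions rather than inserting row by row, but the ordering invariant is the same.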
Enhanced Enterprise Intelligence with your personal AI Data Copilot.pdf – GetInData
Recently we have observed the rise of open-source Large Language Models (LLMs) that are community-driven or developed by the AI market leaders, such as Meta (Llama3), Databricks (DBRX) and Snowflake (Arctic). On the other hand, there is a growth in interest in specialized, carefully fine-tuned yet relatively small models that can efficiently assist programmers in day-to-day tasks. Finally, Retrieval-Augmented Generation (RAG) architectures have gained a lot of traction as the preferred approach for LLMs context and prompt augmentation for building conversational SQL data copilots, code copilots and chatbots.
In this presentation, we will show how we built upon these three concepts a robust Data Copilot that can help to democratize access to company data assets and boost performance of everyone working with data platforms.
Why do we need yet another (open-source) Copilot?
How can we build one?
Architecture and evaluation
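The RAG pattern the abstract describes can be sketched very simply: retrieve the schema snippets most relevant to the user's question, then fold them into the LLM prompt. Everything below is invented for illustration (the table names, the crude word-overlap retriever, the prompt shape); a real copilot would use vector embeddings and a proper catalog.

```python
import re

def tokens(text):
    """Lowercase word tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z_]+", text.lower()))

def score(question, doc):
    """Crude relevance: number of shared tokens (stand-in for embedding similarity)."""
    return len(tokens(question) & tokens(doc))

def build_prompt(question, schema_docs, top_k=2):
    """Retrieve the top-k most relevant schema snippets and assemble an
    LLM prompt that grounds SQL generation in that context (RAG)."""
    ranked = sorted(schema_docs, key=lambda d: score(question, d), reverse=True)
    context = "\n".join(ranked[:top_k])
    return f"Schema context:\n{context}\n\nWrite SQL for: {question}"

docs = [
    "table orders: order_id, customer_id, total, created_at",
    "table customers: customer_id, name, country",
    "table events: event_id, payload, ts",
]
prompt = build_prompt("total orders per customer", docs)
```

The prompt ends up containing only the two most relevant table descriptions, which is the whole point of retrieval: the model sees just enough context to answer, not the entire catalog.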
State of Artificial Intelligence Report 2023 – kuntobimo2016
Artificial intelligence (AI) is a multidisciplinary field of science and engineering whose goal is to create intelligent machines.
We believe that AI will be a force multiplier on technological progress in our increasingly digital, data-driven world. This is because everything around us today, ranging from culture to consumer products, is a product of intelligence.
The State of AI Report is now in its sixth year. Consider this report as a compilation of the most interesting things we’ve seen with a goal of triggering an informed conversation about the state of AI and its implication for the future.
We consider the following key dimensions in our report:
Research: Technology breakthroughs and their capabilities.
Industry: Areas of commercial application for AI and its business impact.
Politics: Regulation of AI, its economic implications and the evolving geopolitics of AI.
Safety: Identifying and mitigating catastrophic risks that highly-capable future AI systems could pose to us.
Predictions: What we believe will happen in the next 12 months and a 2022 performance review to keep us honest.
Global Situational Awareness of A.I. and where it's headed – vikram sood
You can see the future first in San Francisco.
Over the past year, the talk of the town has shifted from $10 billion compute clusters to $100 billion clusters to trillion-dollar clusters. Every six months another zero is added to the boardroom plans. Behind the scenes, there’s a fierce scramble to secure every power contract still available for the rest of the decade, every voltage transformer that can possibly be procured. American big business is gearing up to pour trillions of dollars into a long-unseen mobilization of American industrial might. By the end of the decade, American electricity production will have grown tens of percent; from the shale fields of Pennsylvania to the solar farms of Nevada, hundreds of millions of GPUs will hum.
The AGI race has begun. We are building machines that can think and reason. By 2025/26, these machines will outpace college graduates. By the end of the decade, they will be smarter than you or I; we will have superintelligence, in the true sense of the word. Along the way, national security forces not seen in half a century will be unleashed, and before long, The Project will be on. If we’re lucky, we’ll be in an all-out race with the CCP; if we’re unlucky, an all-out war.
Everyone is now talking about AI, but few have the faintest glimmer of what is about to hit them. Nvidia analysts still think 2024 might be close to the peak. Mainstream pundits are stuck on the wilful blindness of “it’s just predicting the next word”. They see only hype and business-as-usual; at most they entertain another internet-scale technological change.
Before long, the world will wake up. But right now, there are perhaps a few hundred people, most of them in San Francisco and the AI labs, that have situational awareness. Through whatever peculiar forces of fate, I have found myself amongst them. A few years ago, these people were derided as crazy—but they trusted the trendlines, which allowed them to correctly predict the AI advances of the past few years. Whether these people are also right about the next few years remains to be seen. But these are very smart people—the smartest people I have ever met—and they are the ones building this technology. Perhaps they will be an odd footnote in history, or perhaps they will go down in history like Szilard and Oppenheimer and Teller. If they are seeing the future even close to correctly, we are in for a wild ride.
Let me tell you what we see.
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
https://www.meetup.com/unstructured-data-meetup-new-york/
This meetup is for people working in unstructured data. Speakers will present on related topics such as vector databases, LLMs, and managing data at scale. The intended audience of this group includes roles like machine learning engineers, data scientists, data engineers, software engineers, and PMs. This meetup was formerly the Milvus Meetup and is sponsored by Zilliz, maintainers of Milvus.
ViewShift: Hassle-free Dynamic Policy Enforcement for Every Data Lake – Walaa Eldin Moustafa
Dynamic policy enforcement is becoming an increasingly important topic in today’s world, where data privacy and compliance are a top priority for companies, individuals, and regulators alike. In these slides, we discuss how LinkedIn implements a powerful dynamic policy enforcement engine, called ViewShift, and integrates it within its data lake. We show the query engine architecture and how catalog implementations can automatically route table resolutions to compliance-enforcing SQL views. Such views have a set of very interesting properties: (1) They are auto-generated from declarative data annotations. (2) They respect user-level consent and preferences. (3) They are context-aware, encoding a different set of transformations for different use cases. (4) They are portable: while the SQL logic is implemented in only one SQL dialect, it is accessible in all engines.
#SQL #Views #Privacy #Compliance #DataLake
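Property (1) above — views auto-generated from declarative annotations — can be illustrated with a toy generator. The column names, the allow/mask/drop annotation vocabulary, and the null-out masking strategy are all invented for this sketch; ViewShift's real annotation language and transformations are richer than this.

```python
def compliance_view(table, columns):
    """Generate a compliance-enforcing view from declarative annotations:
    pass through allowed columns, redact masked ones, omit dropped ones."""
    exprs = []
    for name, policy in columns:
        if policy == "allow":
            exprs.append(name)
        elif policy == "mask":
            exprs.append(f"NULL AS {name}")  # simplest redaction: null out
        # policy == "drop": column is omitted from the view entirely
    select_list = ", ".join(exprs)
    return f"CREATE VIEW {table}_compliant AS SELECT {select_list} FROM {table}"

ddl = compliance_view("profiles", [
    ("member_id", "allow"),
    ("email", "mask"),
    ("ssn", "drop"),
])
# ddl == "CREATE VIEW profiles_compliant AS SELECT member_id, NULL AS email FROM profiles"
```

Because the view is generated rather than hand-written, the same annotations can drive context-aware variants (property 3) and be re-emitted in whichever SQL dialect the engine needs.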
Cardinal Health is a multi-billion dollar healthcare services company. Actually, we like to say we’re the business behind healthcare because we focus on making it more cost-effective so our customers can focus on their patients. We work with pharmacies, hospitals, doctor’s offices, surgery centers and clinical labs- basically anywhere healthcare services are offered.
As a leading provider of products and services in the healthcare supply chain, we have the broadest view of healthcare in the industry:
We have more than 30,000 employees with direct operations around the world
We deliver products and services to 40,000 customers at 60,000 locations daily
86 percent of hospitals in the U.S. use Cardinal Health products and services
We supply pharmaceuticals to fill 25 percent of branded prescriptions in the U.S.
In fact, a third of all distributed pharmaceutical, laboratory and medical products in the U.S. and Puerto Rico flow through the Cardinal Health supply chain.
We are proud to be #21 on the Fortune 500 list
Cardinal Health is committed to using our deep understanding of healthcare to deliver inventive and meaningful solutions that make healthcare more cost-effective.
As a result, our customers have more time to focus on what matters most – their patients.
Our position within healthcare is unique.
We have the broadest perspective of the entire healthcare system by looking across medical and pharmaceutical manufacturers to acute care, ambulatory care and retail providers. This view allows us to understand the increasing complexity of activities across the entire continuum of care.
We also focus on each customer segment and class of trade, giving us a deeper understanding of our customers' needs, issues, and pain points. We are in the physician’s office, the lab, the hospital, the pharmacy, and the retail business.
We improve the total cost of healthcare. We do this not only by efficiently managing a complex supply system, but also by improving quality, helping to reduce errors and effectively aggregating supply and demand. The by-product of this is that we are able to give providers more time to focus on caring for their patients while we focus on the supply chain.
I hope you agree …
Being essential to care is our privilege.
That’s our tagline.
And that’s our promise.
Please let me know what questions we can answer for you.