The document discusses how companies are using big data to improve business operations and outcomes. It outlines four main ways that big data is put to work: getting fast answers to new questions, predicting more and more accurately, creating a centralized data reservoir, and accelerating data-driven actions. Case studies of companies such as Dell, a large bank, and a European bank demonstrate how they have benefited from these big data strategies. The document advocates tightly integrating big data into business analytics in order to realize its full potential.
The document discusses how organizations can leverage big data. It notes that the amount of data being produced is growing dramatically and will continue to do so. It outlines four ways that organizations can benefit from big data: getting fast answers to new questions, creating a centralized data reservoir, predicting outcomes more accurately, and accelerating data-driven actions. It provides examples of companies that have achieved benefits like increased revenue and customer satisfaction through big data analytics. Finally, it argues that Oracle offers an integrated platform for organizations to fully leverage big data within their business analytics.
Contexti / Oracle - Big Data: From Pilot to Production | Contexti
The document discusses challenges in moving big data projects from pilots to production. It highlights that pilots have loose SLAs and focus on a few use cases and demonstrated insights, while production requires enforced SLAs, supporting many use cases and delivering actionable insights. Key challenges in the transition include establishing governance, skills, funding models and integrating insights into operations. The document also provides examples of technology considerations and common operating models for big data analytics.
Big Data Roundtable. Why, how, where, which, and when to start doing Big Data | Raul Goycoolea Seoane
Big Data Roundtable. Why, how, where, which, and when to start doing Big Data. Why Big Data is not just another buzzword: it can be a competitive advantage if it is done right and on time, and, most importantly, before your competition.
Digital Government: Data + Government Isn't Enough | Wrangle Conference 2017 | Cloudera, Inc.
Government agencies are collecting and producing data at an accelerating rate, and constituents want access to this data with decreasing latency. Meeting a digitally savvy polity's desire for data while ensuring that data is open, accessible, and interpretable by all comes with unique challenges. I'll share some of these while walking through how governments are building their own data products using open data as well as empowering civic hackers. I'll also walk through why data science at the government level is fundamentally different than data science in the private sector.
The document discusses opportunities for enriching a data warehouse with Hadoop. It outlines challenges with ETL and analyzing large, diverse datasets. The presentation recommends integrating Hadoop and the data warehouse to create a "data reservoir" to store all potentially valuable data. Case studies show companies using this approach to gain insights from more data, improve analytics performance, and offload ETL processing to Hadoop. The document advocates developing skills and prototypes to prove the business value of big data before fully adopting Hadoop solutions.
Intel Big Data Analysis Peer Research Slideshare 2013 | Intel IT Center
This PowerPoint presentation provides insights into results of a 2013 survey about big data analytics, including a comparison to 2012 big data survey results.
This session describes the roles and skill sets required when building a Data Science team, and starting a data science initiative, including how to develop Data Science capabilities, select suitable organizational models for Data Science teams, and understand the role of executive engagement for enhancing analytical maturity at an organization.
After this session you will be able to:
Objective 1: Understand the knowledge and skills needed for a Data Science team and how to acquire them.
Objective 2: Learn about the different organizational models for forming a Data Science team and how to choose the best one for your organization.
Objective 3: Understand the importance of executive support for Data Science initiatives and the role it plays in their successful deployment.
Accelerate Digital Transformation with an Enterprise Big Data Fabric | Cambridge Semantics
In this webinar by Cambridge Semantics' VP of Solution Engineering, Ben Szekely, you will learn more about how the Enterprise Data Fabric prevails as the bedrock of enterprise digital strategy. Connected and highly available data is the new normal - powering analytics and AI. The data lake itself is commoditized, like raw compute or disk, and becomes an unseen part of the stack. Semantic graph technology is central to Data Fabric initiatives that meaningfully contribute to digital transformation.
We share our vision for digital innovation - a shift to something powerful, expedient and future-proof. The Data Fabric connects enterprise data for unprecedented access in an overlay fashion that does not disrupt current investments. Interconnected and reliable data drives business outcomes by automating scalable AI and ML efforts. Graph technology is the way forward to realize this future.
As we dive deeper into the connected world, there has been an explosion of structured and unstructured data. Additionally, advancements in Apache Hadoop and other Big Data technologies, cloud computing, and machine learning tools all shape how this world will evolve. Over the last ten years, Apache Hadoop has proven to be a popular platform among seasoned developers who need a technology that can power large, complex applications. However, for customers, partners, and application ISVs who build on top of Hadoop, one huge issue remains: interoperability. In this talk, John Mertic takes a closer look at how Apache Hadoop can become more interoperable to accelerate big data implementations.
A Modern Data Strategy for Precision Medicine | Cloudera, Inc.
Genomics is upon us, made possible by big data and the technologies designed to support it. Doctors, who historically used clinical data, and researchers, who historically used genomic data, are now increasingly focused on analyzing the same single data set: introducing the opportunity to share bodies of knowledge, fostering collaborative innovation, and driving toward higher standards of care.
However, this data is enormous: genomic data volumes are expected to reach two to four exabytes per year by 2025, even as the cost of genetic sequencing has fallen 100-fold over the past 10 years.
Cloudera is helping solve the big data problem with its Apache Hadoop-based platform for large-scale data processing, discovery, and analytics; putting precision medicine within reach.
Challenges in Clinical Research: Aridhia Disrupts Technology Approach to Rese... | VMware Tanzu
Join Jeff Kelly, Pivotal’s Big Data Strategist and Chris Roche, Aridhia’s CEO, to learn how Big Data and data science are being applied to clinical research. Learn…
• Why research-oriented healthcare delivery organizations and academic medical centers need an ACRIS
• How improving collaboration and productivity accelerates the discovery of insights and increases competitiveness
• Why robust data security is critical to modernizing engagement between academia, industry and healthcare
• How to reduce research costs while improving commercialization opportunities
• Why enabling transparent analysis and reproducibility of research are key to scientific progress
• Best practices to get started on your digital transformation and Big Data journey
Knowledge Graphs for Transformation: Dynamic Context for the Intelligent Ente... | Neo4j
The document discusses knowledge graphs and their benefits for enterprises. Some key points:
- 2/3 of Neo4j customers have implemented knowledge graphs and 88% of CXOs believe they will significantly improve business outcomes.
- A knowledge graph is an interconnected dataset enriched with meaning to allow reasoning about data and confident decision-making.
- Neo4j offers knowledge graph products like Bloom for visualization, Graph Data Science for analytics, and Workbench for knowledge graph management.
- Knowledge graphs can transform businesses by providing dynamic context, bridging silos, and enabling predictions and innovations.
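The definition above — an interconnected dataset enriched with meaning to allow reasoning — can be made concrete with a toy sketch: a graph stored as subject-predicate-object triples, with a two-hop lookup standing in for "reasoning about data". All entity names here are hypothetical, and note that Neo4j itself uses a property-graph model queried with Cypher rather than raw triples like these.

```python
# Minimal knowledge-graph sketch: facts as (subject, predicate, object)
# triples, with a chained two-hop query as a stand-in for "reasoning".
# Illustrative only; production systems use dedicated graph stores.

triples = {
    ("Acme", "subsidiary_of", "Globex"),
    ("Globex", "based_in", "Germany"),
    ("Acme", "supplies", "WidgetCo"),
}

def objects(subject, predicate):
    """Return all objects linked to `subject` by `predicate`."""
    return {o for (s, p, o) in triples if s == subject and p == predicate}

def parent_country(company):
    """Infer a company's parent's country by chaining two relations."""
    for parent in objects(company, "subsidiary_of"):
        for country in objects(parent, "based_in"):
            return country
    return None

print(parent_country("Acme"))  # -> Germany
```

The point of the sketch is that the answer "Germany" is not stored as a single fact anywhere — it emerges by traversing connections, which is the kind of dynamic context the bullet points describe.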
BDaaS - Big Data as a Service, by Sherya Pal from Saama. The presentation was given at the #doppa17 DevOps++ Global Summit 2017. All copyrights are reserved by the author.
Open Source in the Energy Industry - Creating a New Operational Model for Dat... | DataWorks Summit
Centrica supplies energy to 28 million customers globally. It is developing integrated energy solutions for commercial and industrial customers through its Distributed Energy & Power division. Centrica created Io-Tahoe to provide a new operational model for data management that empowers businesses and IT to innovate using data. Io-Tahoe ingests diverse data sources into Centrica's data lake and uses smart data discovery and metadata management to create a known data model. This allows Centrica to extract more value from data through data science and gain business insights.
1° Sessione Oracle CRUI: Analytics Data Lab, the power of Big Data Investiga... | Jürgen Ambrosi
Data is the new capital: like financial capital, it is a resource that must be managed, collected, and kept secure, but it must also be invested by organizations that want to gain competitive advantage. Data is not a new resource, but only today, for the first time, is it available in abundance together with the technologies needed to maximize its return. Much as electricity remained a laboratory curiosity for a long time, until it was made available to the masses and completely changed the face of modern industry. That is why accelerating change requires an innovative approach to executing Big Data initiatives: an analytics laboratory as a catalyst for innovation (the Data Lab). In this webinar on Oracle technologies, we will use our usual approach of storytelling based on concrete use cases and real-world experiences.
CTO Perspectives: What's Next for Data Management and Healthcare? | Health Catalyst
Health Catalyst's Chief Technology Officer, Bryan Hinton, shares his perspective, thoughts, and insights on new and emerging trends for data management in healthcare. Bryan offers a brief presentation on what hospitals and healthcare systems can expect, followed by an extended Q&A.
Big Data LDN 2017: The New Dominant Companies Are Running on Data | Matt Stubbs
The document discusses solutions for deriving value from data through data integration and analytics. It describes three approaches companies have taken: 1) Building a custom machine learning platform like Uber's Michelangelo. 2) Developing custom integrations for a large multinational corporation with many technologies. 3) Implementing a cloud-first enterprise data stack for a 360-degree view of customers. The cloud-first approach provides benefits like scalability, collaboration, and reduced maintenance costs.
This document discusses how Informatica's Big Data Edition and Vibe Data Stream products can be used for offloading data warehousing to Hadoop. It provides an overview of each product and how they help with challenges of developing and maintaining Hadoop-based data warehouses by improving developer productivity, making skills easier to acquire, and lowering risks. It also includes a demo of how the products integrate various data sources and platforms.
The Future of Data Management: The Enterprise Data Hub | Cloudera, Inc.
The document discusses the enterprise data hub (EDH) as a new approach for data management. The EDH allows organizations to bring applications to data rather than copying data to applications. It provides a full-fidelity active compliance archive, accelerates time to insights through scale, unlocks agility and innovation, consolidates data silos for a 360-degree view, and enables converged analytics. The EDH is implemented using open source, scalable, and cost-effective tools from Cloudera including Hadoop, Impala, and Cloudera Manager.
ING Bank has developed a data lake architecture to centralize and govern all of its data. The data lake will serve as the "memory" of the bank, holding all data relevant for reporting, analytics, and data exchanges. ING formed an international data community to collaborate on Hadoop implementations and identify common patterns for file storage, deep data analytics, and real-time usage. Key challenges included the complexity of Hadoop, difficulty of large-scale collaboration, and ensuring analytic data received proper security protections. Future steps include standardizing building blocks, defining analytical model production, and embedding analytics in governance for privacy compliance.
Enterprise Data Hub: The Next Big Thing in Big Data | Cloudera, Inc.
If you missed Strata + Hadoop World, you missed quite a bit. This year's event was packed with Big Data practitioners across industries who shared their experiences and how they are driving new innovations like never before. But just because you weren't there doesn't mean you have to miss out.
In this session, we'll touch on a few of the key highlights from the show, including:
Key trends in Big Data adoption
The enterprise data hub
How the enterprise data hub is used in practice
Product Keynote: Denodo 8.0 - A Logical Data Fabric for the Intelligent Enter... | Denodo
Watch full webinar here: https://bit.ly/2O9gcBT
Denodo 8 expands data integration and management to data fabric with advanced data virtualization capabilities. What are they? Denodo CTO Alberto Pan will touch upon the key Denodo 8 capabilities.
The document discusses how the filmmakers used genre and narrative elements to target their intended young audience aged 15-25 and attract viewers. To establish the horror genre, they included classic stereotypes like a mysterious little girl and stingers for jump scares. They also opened with the narrative of a young married couple moving into a new home to make the audience emotionally invested in the characters before things go wrong. Feedback was gathered through social media and from other media students to get honest opinions about how successfully these techniques were employed.
The document discusses human language acquisition, humans being social creatures who need to communicate with others. Language is humanity's primary means of communication, learned gradually from the environment from an early age. Language acquisition can be active or passive, and is influenced by various factors such as health, intelligence, social e
Syed Kamran Raza Trimzi is an organized and highly motivated individual with excellent communication skills. He has over 15 years of experience in project management, quality management, and student support services. His career includes roles managing projects at nonprofit organizations in Pakistan and the UK, as well as roles in technical analysis, computer operations, and IT management. He has strong skills in strategic thinking, decision-making, management, and communication.
This document provides a summary of the status of drawings and documents submitted for the project to upgrade the Wadi Bani Auf Primary Substation in Oman. It includes a table listing 31 civil drawings with their planned and actual status mostly marked as approved. Another table lists 15 electrical drawings and their status, with some approved with comments. A third table lists 13 electrical calculation documents and their partial status. Finally, it provides an overall summary of the total planned and actual quantities and remarks.
Looking back at your Preliminary task, what do you feel you have learnt in th... | salesian2014as
The document discusses the progression in filmmaking skills from the preliminary task to the final product. Areas of improvement included camera work, through better composition and use of the rule of thirds. Editing improved through invisible cuts, extended pre-recording, and development of the narrative. Music and mise en scène were not utilized in the preliminary task but added important elements to the final film. Better planning and production scheduling led to stronger time management.
Edudex: one standard and one address for course information | Rino Schreuder
EDU-DEX makes the growing training market transparent
EDU-DEX is the central address for the shared Dutch training-course catalogue. Thousands of courses and programmes from more than 30 training institutes are now brought together, based on a single national data standard. These include AOG, de Baak, Boertien Vergouwen Overduin, GITP, Interlingua, ISBW, NCOI, Nyenrode, Schouten & Nelissen, Tias, and Zuidema. Buyers can now select, at a single address, the programmes that best suit them, and download the corresponding data into their own systems free of charge.
Question 1 media studies evaluation mark final salesian2014as
The document discusses how the media product uses conventions of horror films. It creates the opening two minutes of a horror film set in modern-day Surrey. It uses stock narratives, characters, and settings common to the genre. These include an innocent family, a possessed child antagonist, and ordinary suburban settings. The narrative combines elements of mystery and a normal family. Video cameras are used to make it seem more realistic. Negative space is utilized but wasn't entirely effective at startling viewers so a "sting" was added to draw attention.
The Region of Murcia is an autonomous province in Spain known for being the largest producer of fruits, vegetables and flowers in Europe. Some of its most important monuments include the Cathedral of Santa Maria and the Roman Theater of Cartagena. Typical foods from the region are zarangollo, paparajotes, and Calatrava bread.
The document discusses obtaining audience feedback on a trailer posted to the video sharing site Vimeo. While Vimeo provided vague feedback through likes and downloads, more detailed feedback was gathered through a survey created on SurveyMonkey.com. The survey utilized multiple choice questions, open response boxes, and dropdown menus. Key learnings from the survey responses were that the characters and narratives were not well established in the first draft. To address this, the second half of the trailer was re-filmed taking inspiration from another trailer to better showcase the characters and conflict between communities.
This document outlines the benefits and process for getting a WordPress plugin listed on the WordPress.org repository. Key benefits include getting more exposure for the plugin, encouraging quality code through reviews, and allowing for version control and updates. The process involves developing a decent plugin, creating a readme file that passes validation, submitting for review, and uploading the code to the subversion repository. Additional recommendations are making the plugin translation ready, adding a banner and screenshots, and providing support.
This document discusses the conventions and themes commonly found in gangster-crime genre media. It identifies stock narratives around corrupt police, revenge, crime families, and star-crossed lovers. Common stock characters are mentioned like mafia bosses, recruits, rivals, and hoodlums. Settings often include cities, nightclubs, mansions, and casinos. Iconography may include drugs, crime, money, suits, fast cars, and guns. The document then evaluates how the described media product did or did not use, develop, or challenge these conventions.
This document appears to be a scanned receipt from a company called intsig.com. It lists a date of January 15th, 2023 and includes items purchased, prices, subtotals, tax amounts and a total due of $58.23. The document serves as a record of a transaction that occurred on this date involving this company.
Illuminaon Consulting is a management consulting firm that provides strategic advisory services to help businesses address their most important challenges and opportunities. They work with clients across all major industries to develop comprehensive strategies, optimize operations, and transform organizations. Their team of experienced consultants leverage data and analytics to deliver practical solutions and ensure clients achieve meaningful, sustainable results.
The document discusses how organizations can leverage big data. It notes that the amount of data being produced is rapidly increasing and will continue to do so with more smart devices. The document outlines how organizations can use big data to improve existing processes, create new opportunities, run their business more effectively by organizing data for specific uses, and change their business by exploring raw data to discover new applications. It provides examples of companies in various industries that have been able to gain competitive advantages by leveraging big data in these ways.
The document discusses how big data and analytics can transform businesses. It notes that the volume of data is growing exponentially due to increases in smartphones, sensors, and other data producing devices. It also discusses how businesses can leverage big data by capturing massive data volumes, analyzing the data, and having a unified and secure platform. The document advocates that businesses implement the four pillars of data management: mobility, in-memory technologies, cloud computing, and big data in order to reduce the gap between data production and usage.
The document discusses big data and business analytics. It notes that more data was created in the last two years than in all prior history, and that data volume is estimated to grow 50 times by 2020. It highlights the challenges of volume, velocity, and variety of data and the importance of analyzing data to run and change businesses. The document promotes Oracle's comprehensive big data solutions, including Hadoop, NoSQL databases, and analytics applications.
The document discusses Oracle's fast data solutions for helping organizations remove event-to-action latency and maximize the value of high-velocity data. It describes how fast data solutions can filter, move, transform, analyze and act on data in real-time to drive better business outcomes. Oracle provides a portfolio of products for fast data including Oracle Event Processing, Oracle Coherence, Oracle Data Integrator and Oracle Real-Time Decisions that work together to capture, filter, enrich, load and analyze streaming data and trigger automated decisions.
Getting to Know and Understand Your Customer through Monitoring, Analytics, and Big Data (Mundo Contact)
"…I am your consumer": getting to know and understand your customer through monitoring, analytics, and big data.
Simón Torres, Oracle Pre-Sales Consultant, CX.
Expand a Data Warehouse with Hadoop and Big Data (jdijcks)
After investing years in the data warehouse, are you now supposed to start over? Nope. This session discusses how to leverage Hadoop and big data technologies to augment the data warehouse with new data, new capabilities and new business models.
A6: Harnessing the Power of Big Data and Business Analytics to Transform Business (Dr. Wilfred Lin, Ph.D.)
The document discusses harnessing the power of big data and business analytics to transform business. It provides an overview of Oracle's business analytics products and solutions, including Oracle BI, Endeca, Exalytics, cloud offerings, and industry-specific applications. Key capabilities covered include capturing all types of data, combining data for deeper insights, predictive analytics, and accelerating insights to actions. Examples of strategic recommendations for big data are also provided.
Paul Sonderegger, Oracle: MassTLC Big Data Summit Keynote (MassTLC)
The document discusses strategies for businesses in a big data world. It explains that digitization and datafication have led to more data being created about thoughts, things, and activities. It emphasizes that strategy is about creating unique value in a unique way. The document outlines different ways that businesses can use data to either run their existing business or change and transform their business. It stresses the importance of building a big data strategy that focuses on data market share, proprietary data assets, and using data to generate more data.
3: Reach New Heights of Operational Effectiveness While Simplifying IT with Oracle (Dr. Wilfred Lin, Ph.D.)
This document discusses how Oracle Business Analytics and Oracle Exalytics can help organizations optimize processes, simplify operations, and innovate through business analytics. It highlights key capabilities of Oracle's business intelligence and analytics platforms, such as providing a single view of data, advanced in-memory analytics, pre-built analytics applications, and the ability to gain insights from both structured and unstructured data in real-time. The platforms are presented as ways to improve business performance, manage risk through a common analytics framework, and lower costs through simplified IT architectures.
TDWI Austin: Simplifying Big Data Delivery to Drive New Insights (Sal Marcus)
Khader Mohiuddin, a Big Data Solution Architect at Oracle, presented on simplifying big data delivery and driving new insights. He discussed opportunities and challenges with big data, including using customer data to improve experiences and manage risk. Mohiuddin also outlined Oracle's vision for analyzing all data types and described Oracle's big data platform and engineered systems for high-performance data acquisition, organization, analysis, and visualization. Case studies were presented on customers achieving new revenue, optimizing operations, and managing risk through big data analytics on Oracle's platform.
Extreme Analytics: What's New With Oracle Exalytics X3-4 & T5-8? (KPI Partners)
http://www.kpipartners.com/watch-extreme-analytics-whats-new-with-oracle-exalytics-x3-4-t5-8 … Analytics is all about gaining insights from data for better decision making.
Part 1 - Engineered Systems
Part 2 - Hardware & Software Together
Part 3 - Exalytics Benefits
Part 4 - Customer Results & Pricing
Part 5 - Success Story: Getting Started w/Exalytics
Part 6 - Q&A Session
A recent study by Harvard Business Review found that top-performing organizations use analytics five times more than low performers. However, the vision of delivering fast, interactive, insightful analytics has remained elusive for most organizations.
Most enterprise analytics solutions require dealing with a number of hardware, software, storage and networking vendors, and precious resources are wasted integrating the hardware and software components to deliver a complete analytical solution. A high-performance business intelligence system also requires fast connectivity to data warehouses, operational systems and other data sources.
Oracle Exalytics is an optimized engineered system to provide the highest levels of performance for business intelligence (BI) and enterprise performance management (EPM) applications such as Oracle Business Intelligence, Endeca, and Essbase.
Join team members from Oracle and KPI Partners for this virtual event that examines new releases of the leading engineered system for enterprise analytics: Exalytics X3-4 & T5-8.
This document discusses Oracle's data integration and governance solutions for big data. It describes how Oracle uses data integration to load and transform data from various sources into a data reservoir. It also emphasizes the importance of data governance when managing big data and describes Oracle's metadata management, data profiling, and data cleansing tools to help govern data in the reservoir.
This document is a presentation on Big Data by Oleksiy Razborshchuk from Oracle Canada. The presentation covers Big Data concepts, Oracle's Big Data solution including its differentiators compared to DIY Hadoop clusters, and use cases and implementation examples. The agenda includes discussing Big Data, Oracle's solution, and use cases. Key points covered are the value of Oracle's Big Data Appliance which provides faster time to value and lower costs compared to building your own Hadoop cluster, and how Oracle provides an integrated Big Data environment and analytics platform. Examples of Big Data solutions for financial services are also presented.
The document discusses Oracle Database 12c and its capabilities for cloud computing, database as a service, and big data. It highlights features like Oracle Multitenant that allow for more efficient consolidation on clouds and simpler provisioning of database as a service. It also describes Oracle's approach to integrating Hadoop and Oracle Database for big data and analytics.
The document outlines Oracle's approach to helping organizations move to the cloud through six journeys. It discusses Oracle's cloud offerings including Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) and how they can be used to create new applications, move and improve existing workloads, and upgrade existing on-premises applications to Software as a Service (SaaS). It also describes Oracle's cloud deployment models of public cloud, cloud at customer, and bringing the cloud on-premises. Case studies are provided on how various organizations have benefited from optimizing their on-premises environments and upgrading to Oracle SaaS applications in the cloud.
Big Data Tools: A Deep Dive into Essential Tools (FredReynolds2)
Today, practically every firm uses big data to gain a competitive advantage in the market. With this in mind, freely available big data tools for analysis and processing are a cost-effective and beneficial choice for enterprises. Hadoop is the sector's leading open-source project and the engine of the big data wave. And that is not the end of the story: numerous other businesses pursue Hadoop's free and open-source path.
Insights into Real-world Data Management Challenges (DataWorks Summit)
Oracle began with the belief that the foundation of IT was managing information. The Oracle Cloud Platform for Big Data is a natural extension of our belief in the power of data. Oracle's Integrated Cloud is one cloud for the entire business, meeting everyone's needs. It is about connecting people to information through tools that help you combine and aggregate data from any source.
This session will explore how organizations can transition to the cloud by delivering fully managed and elastic Hadoop and real-time streaming cloud services to build robust offerings that provide measurable value to the business. We will explore key data management trends and dive deeper into pain points we are hearing about from our customer base.
A7: Getting Value from Big Data: How to Get There Quickly and Leverage Your Current Infrastructure (Dr. Wilfred Lin, Ph.D.)
The document discusses how organizations can get value from big data quickly by leveraging their current infrastructure. It outlines Oracle's big data reference architecture and services for strategy, implementation, and optimization. Case studies show how Land O' Lakes optimized sales performance and a consumer goods company gained insights into shopper behavior to increase revenue.
The document discusses Oracle's cloud-based data lake and analytics platform. It provides an overview of the key technologies and services available, including Spark, Kafka, Hive, object storage, notebooks and data visualization tools. It then outlines a scenario for setting up storage and big data services in Oracle Cloud to create a new data lake for batch, real-time and external data sources. The goal is to provide an agile and scalable environment for data scientists, developers and business users.
The Cloudera Impala project is pioneering the next generation of Hadoop capabilities: the convergence of interactive SQL queries with the capacity, scalability, and flexibility of a Hadoop cluster. In this webinar, join Cloudera and MicroStrategy to learn how Impala works, how it is uniquely architected to provide an interactive SQL experience native to Hadoop, and how you can leverage the power of MicroStrategy 9.3.1 to easily tap into more data and make new discoveries.
Hello, welcome, etc. An apology for the human heart, truth, beauty? Those days are gone, for all the reasons McAfee said. Every industry is going to go through the Moneyball experience. Movie scene: a great face and a pretty girlfriend mean confidence in a player. But it's not true. Data helps you overcome biases. What is the Moneyball moment for you? What aspects of your business will be transformed by data? And how do you get there first? It's hard to think of an industry that won't be affected by big data. Big data is the electricity of the 21st century.
In 1752, Benjamin Franklin conducted the now-famous experiment of attaching a key to his kite and flying it during a thunderstorm. The sparks that jumped from the key to his hand confirmed that lightning was, indeed, electric in nature. With his knuckles close to the key, the sparks jumped; he captured the charge in a Leyden jar, proving that lightning was the same phenomenon he could produce in his lab. The problem he was trying to solve was lightning strikes: he installed the first lightning rods that summer.
His discovery, and the work of many other inventors in the field of electricity (e.g., Nikola Tesla, Thomas Edison, Alessandro Volta), went on to change the world in ways previously unimaginable. The power provided by electricity opened the floodgates of innovation in business, government, and private life. It took a while to make electricity useful: factories had to be redesigned, houses wired for light, and so on. The impact was huge.
Today, we are on the cusp of the same magnitude of transformation thanks to datafication. Datafication is the capture and use of information in more and more daily activities, and we are seeing it happen everywhere. Big data is at the same early stage electricity once was, with a huge wave of innovation coming from capturing data. Datafication is just as big a deal.
Consider the datafication of things: sensors collecting information from cars, medical devices, stop lights, and factory equipment. Our daily activities and processes have been experiencing datafication for years. You no longer have to go to the bank to deposit a check; you can use your phone. Running on the treadmill at the gym? Chances are good you might be wearing a heart-rate monitor or a Nike FuelBand to track your physical activity. Even the natural world is being datafied through satellite imagery and climate-sensing devices. The sheer amount of data captured in our day-to-day activities is astounding. In just one minute, Facebook users share over half a million pieces of content (Source: http://www.visualnews.com/2012/06/19/how-much-data-created-every-minute/). And it's not just people generating data: "a single jet engine can generate 10TB of data in 30 minutes. With more than 25,000 airline flights per day, the daily volume of just this single data source runs into the Petabytes." (Source: http://www.oracle.com/us/products/database/big-data-for-enterprise-519135.pdf) Engines often aren't owned outright; airlines buy the thrust, so the manufacturer needs good data on what each engine delivers.
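A back-of-the-envelope check of the jet-engine quote above. The simplifying assumptions are mine, not from the source: one engine's worth of data per flight, and only the 30 minutes of data the quote mentions.

```python
# Rough arithmetic behind "the daily volume ... runs into the Petabytes".
# Assumptions (illustrative): one engine per flight, 30 minutes of data each.
TB_PER_ENGINE_PER_30_MIN = 10
FLIGHTS_PER_DAY = 25_000

daily_tb = TB_PER_ENGINE_PER_30_MIN * FLIGHTS_PER_DAY  # 250,000 TB
daily_pb = daily_tb / 1_000                            # decimal petabytes

print(f"~{daily_pb:,.0f} PB per day")  # ~250 PB per day
```

Even under these deliberately conservative assumptions, a single data source lands in the hundreds of petabytes per day, which is why the quote says "Petabytes" without hesitation.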
All this new data holds tremendous power and potential to change the way our organizations do business. Whether it is capturing the thoughts and opinions of customers to create better marketing campaigns, using sensors to manage buildings and capital equipment and improve maintenance and service costs, or streamlining processes based on new data insights, the possibilities are endless. Oracle invested in Proteus: pills that are swallowed and powered by stomach acid, which help reduce hospital readmissions. Huge datafication of everything.
While we are in the early stages of datafication, all signs point to continued growth. Smart devices are predicted to grow from 1.3B in 2013 to 12.5B in 2020, and data generated from "things" is growing 22-fold over five years, from 2011 to 2016 (Source: IDC 2011, Cisco, Cloudera, and Machina Research; http://blog.iobridge.com/2012/02/cisco-reports-mobile-internet-of-things-traffic-to-grow/). You don't often run into things that grow that big, that quickly. What's not to like?
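To put those growth figures on a common footing, the implied compound annual growth rates can be worked out in a couple of lines. The arithmetic is mine; the 1.3B/12.5B and 22x figures come from the sources cited above.

```python
# Implied compound annual growth rates (CAGR) from the figures above.
# 1.3B smart devices in 2013 -> 12.5B in 2020 is a seven-year span.
devices_cagr = (12.5 / 1.3) ** (1 / 7) - 1   # ~0.38, i.e. ~38% per year

# Data from "things" growing 22x over the five years 2011-2016.
things_cagr = 22 ** (1 / 5) - 1              # ~0.86, i.e. ~86% per year

print(f"devices: ~{devices_cagr:.0%}/yr, data from things: ~{things_cagr:.0%}/yr")
```

Both rates are far above typical IT budget growth, which is the gap the next section turns to: production of data is outrunning our ability to use it.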
However, while we are creating and collecting mountains of data, our ability to produce it has outstripped our ability to use it. According to a study we conducted with The Economist Intelligence Unit, only 12% of executives feel they understand the impact data will have on their organizations over the next three years (Source: http://www.oracle.com/webapps/dialogue/ns/dlgwelcome.jsp?p_ext=Y&p_dlg_id=13367869&src=7634271&Act=143). A great example of this is the airlines. Airlines were among the first data innovators, going all the way back to the 1960s when they began using SABRE: the first online ticketing system and one of the first big enterprise applications. Today, data is still very important to airlines, which publish an average of half a million fares every day and update them four times per day. Most have large analytics teams, with dozens of operations research analysts. But even with a large team dedicated to analytics, they throw away their fleet operational data every day, because it is so big there is nowhere to put it and analyze it. As a result, they don't have access to the potential insights this data holds (Source: a presentation by Jim Diamond, Managing Director of Operations & Research at American Airlines, given at the Evanta CIO event in Dallas, TX, 6/7/13). The same is true for many businesses: the information they need to improve products and services already exists; they're just not quite sure how to use it.
Electricity followed a similar two-path trajectory. Benjamin Franklin conducted his experiment in 1752, but it took a century for his discovery to move from a scientific phenomenon to something with practical implications. When we think of how electricity changed the world, we often think of the major innovations that replaced previous methods of generating energy: the incandescent light bulb replaced dangerous kerosene and gas lamps, the electric locomotive replaced the steam engine, and electrical power transformed factories and production facilities, ushering in the Second Industrial Revolution. But electricity also changed the world by providing a platform for products that never existed prior to its discovery. Microwaves, toaster ovens, and dishwashers are just a few examples; today it is difficult to imagine life without these modern conveniences. PS: Don't worry, this has happened before. Electricity took a similar path, and big data is moving along it a lot faster. But this raises a new question: how do you make big data useful? Every company is asking this question right now. For large organizations like yours, the question is slightly different: how do you bring big data into the enterprise to make it useful? You already have hundreds of millions, if not billions, of dollars invested in analytical technologies. To understand how big data fits, we need to look at its effect on the enterprise analytical environment. You don't have 150 years with big data, and it is not a greenfield: what effect does big data have on what already exists?
Until recently, most companies have extracted value from data by carefully selecting and standardizing the data collected, based on pre-determined relationships. We call this the "Run the Business" approach to big data because it is primarily about keeping existing systems and processes functioning properly. PS: You already have analytics you use to run the business: data warehouses, reports, and dashboards. You carefully select and standardize the data you need to solve specific problems, like running a marketing campaign or billing customers. Relational, analytical environments run the business.
This enables companies to solve very specific problems, like automating customer bill pay, running supply-chain systems efficiently, or confidently closing fiscal periods with accurate numbers, and is a powerful way to reduce the time, cost, and effort of standardizing and controlling processes. This is the world of the relational database that modern companies and economies run on today. PS: The data coursing through this environment will increase in volume and velocity; it will get bigger and arrive faster than it does today. You'll need more processing power to handle this, but that's pretty straightforward.
Now there’s something new: streams of data that are all different and not easy to represent relationally. With the magnitude of data now available, both proprietary and third-party, it is not always clear which information might be useful or what you might learn from it; much of it is captured by other organizations. The opportunity to learn from the data before organizing it into a model is where you can change the business. By examining the data in a non-relational environment and letting it tell YOU what you can learn from it, companies can form and test more hypotheses more quickly, yielding new insights about their business, suppliers, and customers that they would have missed otherwise. The increase in variety, volume, and velocity from the datafication of everything is what opens up this possibility.
We call this the “Change the Business” approach because the new ideas uncovered by learning from the data often lead companies to change, or pivot, their processes and systems to achieve better results.
The critical difference between the run-the-business and change-the-business environments boils down to one thing: to run the business, you organize data to make it do something specific; to change the business, you take data as-is and figure out what it can do for you. Relational technologies excel at the first, non-relational technologies at the second.
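The distinction can be sketched in a few lines of Python: a minimal illustration of schema-on-write versus schema-on-read, not any specific product. The table and record fields here are invented.

```python
# Schema-on-write ("run the business"): define the structure first,
# then force every record to fit it before it is stored.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.execute("INSERT INTO orders VALUES (1, 19.99)")  # must match the schema

# Schema-on-read ("change the business"): keep records as-is and
# discover structure while exploring them.
raw_records = [
    {"order_id": 1, "amount": 19.99, "note": "gift wrap"},
    {"tweet": "love this product!", "user": "@abc"},  # different shape, still kept
]

# Explore: which fields actually occur in the data we collected?
fields = sorted({k for record in raw_records for k in record})
print(fields)  # field names emerge from the data, not from a predefined model
```

The point of the sketch is that the relational table rejects anything that doesn't match its columns, while the raw collection accepts everything and lets the questions come later.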
These two approaches are more powerful together than either alone; the real problem is bringing the two together. Your rivals will be running the same experiments, and you need to get there first. Big data is the electrification of the 21st century: it is about powering your business AND providing a platform for innovation. But to bring a non-relational environment into the corporate fold, it has to have the same basic capabilities as the relational environment the company already counts on. It has to acquire, manage, and analyze whatever data happens to be in it, just like the traditional relational environment does.
To bring big data into the enterprise analytical environment, you’re going to need all the standard enterprise capabilities for data acquisition, management, and analysis across relational, non-relational, and streaming environments (and streams can feed both). For acquisition, this means relational databases as well as NoSQL databases, plus small-footprint embedded Java in devices for real-time capture. For management, you’ll still need your relational data warehouses, complemented by Hadoop clusters plus real-time caching and event processing. For analysis, you’ll still use BI reports and dashboards, complemented by non-relational discovery plus real-time recommendations, alerts, and predictive analytics. These technologies will be deployed on-premise as well as in private and public clouds, depending on your needs. Not all data will speed up to real time, but some of every kind will. The real magic of big data at work is having these relational, non-relational, and streaming environments seamlessly integrated. What if you could do that? What if you could have acquisition, management, and analysis for any kind of data, for any purpose you could imagine? What would that mean to you?
Only Oracle creates products for every aspect of this unified architecture, rather than relying on custom solutions, so we can help with all of that work: we have one of everything, and all of it can be deployed in the cloud as well. But what would buying into this architecture actually mean for you?
An integrated big data solution will enable you to: get fast answers to new questions; predict more, and more accurately; create a reservoir of data for potential reuse; and accelerate data-driven action. Let’s look at a few companies realizing the benefits of big data at work.
Whether you’re analyzing data in a run-the-business or change-the-business environment, it’s common for new insights to lead to new questions, which in turn require a deeper dive into the data for answers. This process may sound never-ending, but it has the potential to surface new marketing opportunities and help you discover information for solving lurking product issues. This is the heart and soul of learning from data before it’s organized (for example, tracking cholera in Haiti).
The problem: Delphi Electronics and Safety, a division of leading global auto parts supplier Delphi Automotive, had a data analysis challenge. They needed to determine whether the performance of certain parts was meeting contractual levels and whether improvements were needed. Seems straightforward, right? Not so much. Delphi receives huge amounts of warranty data generated by its customers. Every month, the automotive manufacturers (OEMs) deliver performance data, including verbatim text descriptions of issues, covering Delphi’s 340,000 active parts in service in millions of vehicles worldwide. That’s data from over a dozen different OEM systems, each with its own distinct format, as well as data from Delphi’s own parts databases, manufacturing systems, and industry sources. Additionally, Delphi had to adhere to strict time guidelines for responding to part performance issues, including a complete analysis to support each response, or be financially penalized; they were contractually required to pay a claim unless they could prove they weren’t responsible. The real challenge was the diversity of the issues: every investigation was different, and warranty engineers needed to quickly combine and explore a variety of customer data sets depending on the issue under investigation. They were spending more time manipulating data than getting answers from it.

The solution: in the first month alone, engineers discovered the root cause of three field performance issues that could have cost them a great deal of money. Since then, Endeca has paid for itself many times over. More importantly, warranty analysts could now spend more time investigating issues and less time manipulating data. The shift in this work was so great that it sparked a new idea.
They realized they could have a warranty strategy for each of the 20,000 individual parts they manufacture and ship at a rate of 7 million pieces per month, an unprecedented innovation. This is the power of Big Data at Work: discoveries made in a change-the-business environment are then injected into run-the-business processes and applications to perform at a higher level. There had been too much data to meet the 30-day response window; Endeca Information Discovery let them pour the data together without modeling it first, and it paid for itself within 30 days by proving Delphi was not responsible for a claim. But the bigger prize was the new, much more granular warranty plan covering 20,000 parts, and the same information now feeds into design to improve products. Fast answers to new questions means more problems solved, at a more granular level, faster.
Using Endeca Information Discovery, Delphi combined these sources without having to build a model first. By indexing all the necessary data, with its diverse structures and the free text with no structure at all, Delphi combined the data without having to predetermine a model to hold it. This is extremely important when you can’t know ahead of time what questions you’ll want to ask. With the diverse data types brought together, warranty engineers could quickly explore the data using Endeca’s easy-to-use visual analysis. This is the fastest on-ramp to big data: data from your existing BI platform can be indexed into Endeca Information Discovery, so the investment you’ve already made comes into the discovery environment, and there is now also a native connector to Hive and HDFS data.
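The idea of indexing diverse records without a predetermined model can be illustrated with a toy facet index. This is a sketch of the general technique, not Endeca's actual implementation; all field names and values are invented.

```python
from collections import defaultdict

# Toy facet index: map every (field, value) pair to the records containing it,
# without requiring the records to share a schema.
records = [
    {"part": "A123", "oem": "OEM-1", "issue": "corrosion"},
    {"part": "A123", "oem": "OEM-2", "issue": "vibration"},
    {"part": "B456", "oem": "OEM-1"},  # no issue text: still indexed
]

index = defaultdict(set)
for i, rec in enumerate(records):
    for field, value in rec.items():
        index[(field, value)].add(i)

# Faceted query: records mentioning part A123 that were reported by OEM-1.
hits = index[("part", "A123")] & index[("oem", "OEM-1")]
print([records[i] for i in hits])
```

Because facets are derived from whatever fields each record happens to carry, new sources can be added to the index without redesigning a schema first.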
Big data has the potential to help organizations better predict customer actions and forecast the future. And the more precise your organization can be at predicting the future (or a potential future), the better you can adjust, reallocate resources, and reduce uncertainty. One application is real-time pricing that meets customer demand, taking into account the unique individual, their needs, and their specific price sensitivity. It can also help you prevent negative outcomes, like fraud. Capturing additional details about customers may require you to upgrade your current data warehouse capabilities, but like adding insulation to an old house to improve energy efficiency, the investment pays off. As Nate Silver’s book The Signal and the Noise (on why so many predictions fail, but some don’t) argues, the key is using the data appropriately.
Dell is a great example of a company using predictive analytics to improve the customer experience with targeted cross-sell and upsell offers. Dell brings together data from its website and social media channels, as well as offline customer data, into a big data farm. This wide variety of data then drives predictive analytics for promotions on the website, in the call center, in email campaigns, and even in on-demand print materials (where the data determines which offers get printed, connecting big data to the physical world). Since deploying the system, Dell has realized $132M in incremental revenue for FY12, along with a 10% increase in revenue and a 20% increase in profit margin per call at their call centers. This, too, is the power of Big Data at Work: predictive analytics driven by machine-learning algorithms chewing on masses of diverse and changing data. All of this data is made available to Real-Time Decisions, which recommends offers, tracks what happens, and feeds the outcomes back to improve future recommendations. This new non-relational technology is now integral to Dell’s cross-channel customer experience; change-the-business analytics have become the way Dell runs its business, and making many small decisions correctly adds up over time.
Dell currently uses Oracle’s Real-Time Decisions in 15 countries and 30 languages across their call centers (for technical service and sales), email correspondence, and social media to offer personalized, targeted product and service recommendations across multiple channels. Each of these channels makes self-learning decisions to optimize the next best action. Real-Time Decisions is the non-relational predictive analytics technology making these predictions: it learns from customer responses and adjusts its algorithms to continuously improve, and it also adapts its algorithms to use new data sources Dell believes will be valuable.
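A self-learning next-best-action loop of this general kind can be sketched as a simple epsilon-greedy policy. This is an illustrative stand-in, not how Real-Time Decisions works internally; the offer names and acceptance rates are invented.

```python
import random

random.seed(0)  # deterministic for the illustration

offers = ["warranty", "upgrade", "accessory"]
shown = {o: 0 for o in offers}
accepted = {o: 0 for o in offers}

def next_best_action(epsilon=0.1):
    """Mostly exploit the best-performing offer so far; occasionally explore."""
    if random.random() < epsilon or all(n == 0 for n in shown.values()):
        return random.choice(offers)
    return max(offers, key=lambda o: accepted[o] / shown[o] if shown[o] else 0.0)

def record_outcome(offer, was_accepted):
    """Feed the channel's response back into the model."""
    shown[offer] += 1
    if was_accepted:
        accepted[offer] += 1

# Simulated customer responses: "upgrade" converts best, and the loop learns
# that over many small decisions (these acceptance rates are made up).
true_rate = {"warranty": 0.05, "upgrade": 0.30, "accessory": 0.10}
for _ in range(2000):
    offer = next_best_action()
    record_outcome(offer, random.random() < true_rate[offer])

most_shown = max(offers, key=lambda o: shown[o])
print(most_shown, shown)
```

The design choice the sketch highlights is the feedback loop: every response updates the statistics the next decision is drawn from, which is what "self-learning" means in this context.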
Information is unlike almost anything else in that it is not used up when you use it; you can reuse it, and even repurpose it, indefinitely. This matters because all of you are sitting on data assets with huge potential value. According to McKinsey, the vast majority of American companies store more data than the US Library of Congress, but most of it is locked away in separate buckets. What if you could pour it all together into a great big reservoir, ready to be tapped at any moment? That would be great, but how can you do it cost-effectively when you don’t know what value the reservoir will produce? That is the option value of data: unlike electricity, you can use and reuse it. As Kenneth Cukier and Viktor Mayer-Schönberger have argued, much of big data’s value is created by secondary uses of data (cf. the Passur example).
We worked with a large, full-service bank faced with this exact problem. The bank had to comply with regulations requiring more data to support stress testing, but they could pull only 10-15% of the necessary data from their source systems, and doing so took 16 different nightly extracts and resulted in multiple data marts. Managers also suspected new requirements would be added to the stress tests, which would start the whole process over again. So they needed to evolve their information architecture, supporting their run-the-business relational warehouse with a change-the-business data reservoir: all of the data loaded into Hadoop on the Big Data Appliance. The bank is now reaping the benefits of lower costs thanks to fewer data marts, fewer duplicate data stores, and fewer extracts. They can work with all of their data, not just the 10-15% they could access before; the big data solution added the missing 85%. And because they are not Hadoop experts, they appreciated the speed, time to value, and overall TCO of an appliance.
They used a combination of Oracle’s Big Data Appliance running Cloudera’s distribution of Hadoop and Exadata running Oracle Database, seamlessly integrated using Oracle’s Big Data Connectors and the extreme network performance of InfiniBand. For those of you not familiar with Hadoop, it’s a non-relational way of storing and processing data. The bank filled the Big Data Appliance with data from legacy mainframes, operational databases, enterprise applications, and more. They created a great big reservoir of diverse data that’s ready to be tapped and siphoned into the enterprise warehouse at a moment’s notice. Not only is the bank prepared for the inevitable changes to the stress tests, it’s now also prepared to do customer, product, and process analyses that would have been cost-prohibitive before; the data is accessible for repurposing.
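The reservoir pattern itself, storing raw extracts as-is and projecting only the fields a new requirement needs into the warehouse, can be sketched with standard-library stand-ins: JSON lines for the reservoir and SQLite for the warehouse. This is a toy illustration, not the Big Data Connectors API, and the field names are invented.

```python
import json
import sqlite3

# Reservoir: heterogeneous extracts from several source systems, stored as-is
# (one JSON document per record), with no upfront model.
reservoir = [
    json.dumps({"src": "mainframe", "acct": "001", "balance": 1200.0}),
    json.dumps({"src": "crm", "acct": "001", "segment": "retail"}),
    json.dumps({"src": "mainframe", "acct": "002", "balance": -50.0}),
]

# Siphon: when a new requirement arrives (say, a stress test needs balances),
# project just the fields it needs into the relational warehouse.
wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE balances (acct TEXT, balance REAL)")
for line in reservoir:
    doc = json.loads(line)
    if doc.get("src") == "mainframe":
        wh.execute("INSERT INTO balances VALUES (?, ?)", (doc["acct"], doc["balance"]))

rows = wh.execute("SELECT acct, balance FROM balances ORDER BY acct").fetchall()
print(rows)
```

Because the reservoir keeps every record in its original form, a future requirement that needs different fields (say, customer segments) can be served by a new projection without re-extracting anything from the source systems.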
One of the most critical day-to-day necessities of big data is being able to operate at high speed. Some data arrives quickly, and you want to act on it just as quickly. In a constantly changing business environment, real-time analytics can reduce fraud and help your workforce produce fast, accurate reporting based on the most up-to-date information.
La Caixa is another example of a bank that needed a big-data-at-work solution. The bank was looking to monetize its relationships with existing customers by delivering a location-based service to customers who have joined its shopping club in search of deals. When a customer uses an ATM, the bank knows where they are and can use this information to deliver a relevant, targeted advertisement or message. But to make the message relevant and targeted, the bank has to know the customer well. So they built a model that incorporates internal data along with external social media, purchase records, and location information: where the customer has purchased items, interacted with the bank, and used ATMs.
From this mix of relational and non-relational data, they build a model of each customer’s potential interests. When the customer uses an ATM, they take the location, what the model tells them, and a list of partner merchants with offers and discounts, and automatically send the offer that seems to be the best fit to the customer’s mobile phone. They monitor the success and failure of the offers to improve their knowledge of the customer and increase future success rates.
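Best-fit offer selection of this kind can be sketched as a simple scoring function over the interest model and the merchants' distance from the ATM. This is a toy illustration; the categories, merchant names, and weighting scheme are all invented.

```python
# Interest weights for one customer, as a stand-in for the output of the
# bank's interest model (values invented for illustration).
interests = {"dining": 0.8, "electronics": 0.2, "travel": 0.5}

# Partner merchants with offers, positioned relative to the ATM just used.
partner_offers = [
    {"merchant": "CafeRoma", "category": "dining", "distance_km": 0.2},
    {"merchant": "TechWorld", "category": "electronics", "distance_km": 0.1},
    {"merchant": "FlyAway", "category": "travel", "distance_km": 3.0},
]

def score(offer):
    # Interest match, discounted by how far the merchant is from the ATM.
    return interests.get(offer["category"], 0.0) / (1.0 + offer["distance_km"])

best = max(partner_offers, key=score)
print(best["merchant"])  # the single offer to push to the customer's phone
```

Monitoring acceptances and rejections would then feed back into the interest weights, which is the improvement loop the slide describes.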
An integrated big data solution will enable you to: get fast answers to new questions; predict more, and more accurately; create a reservoir of data for potential reuse; and accelerate data-driven action. This is what we have been talking about: running the business and changing the business together. Here’s why we think we can do this work with you.
With products created for every place in the big data picture, Oracle’s portfolio provides customers with: the highest-performance real-time data collection; market-leading data management solutions; the only enterprise self-service information discovery; the broadest range of analytics for every need; the best performance through engineered systems; no-compromise deployment options (cloud, on-premise, or hybrid); unparalleled security and privacy features; and the lowest total cost of ownership to procure, deploy, and maintain. That means tight integration with what you already have, discovery in weeks rather than months, and faster response to changing requirements. Predictive analytics cover every environment, with Real-Time Decisions complemented by Oracle Advanced Analytics (see the Turkcell fraud example). Best price/performance is documented in the ESG paper and Exadata benchmarks, and the environments are connected fast (for example, the Hadoop connectors, InfiniBand, and Oracle Data Integrator for Hadoop). Because we make products for all of these tiers, we offer the fastest path.
Today, more than 250 years after electricity was discovered, people continue to use it as a platform for innovation; the electric car and the smart grid are just a couple of examples. We are in the early days of big data, and already we are experiencing the tremendous value and power it has to change our world. To return to where we started: electricity took a long time. At the 1893 World’s Fair, 200,000 bulbs lit the grounds while houses were still lit by gas lamps; inside the fair this was still “magic,” and outside it the world was dark. It’s the same with data. The world is waiting for your ideas and what you can do with your data, and we look forward to working with you on it.
It’s time to think about building a big data strategy that will give you a competitive advantage for years to come. Thank you!
So what is the next step for your organization? Maybe you need a change-the-business big data environment, or perhaps you’re looking to add processing power to your current run-the-business data warehouse. Wherever you find yourself, let me give you three ideas to consider as you build your big data strategy.
Look beyond the value of data captured from the daily activities of your organization to the extended value chain. Everything every player in your industry does produces data. What share of the available data does your company have access to? You might need to consider buying external data, or extending your capabilities to capture more proprietary data. Think of this as an investment in a powerful, revenue-generating source of energy for your organization.
Even an investment in buying or capturing data that is also available to your competition can give you a competitive advantage: by mixing publicly available data with your proprietary data, you make the entire collection proprietary.
And using proprietary data to generate more data can give you greater data market share. This is a staple internet strategy: Google uses data from search-result click-throughs to refine results for the next person who searches for the same term; Amazon uses data about customers’ past purchases to create product bundles and recommendations for other shoppers. Using data to make data opens up a first-mover advantage and is effective even in the most traditional industries. For example, a shipping company can use package-level sensor data to cut the cost of handling perishable goods, opening the service up to a new market. By capturing the data exhaust from new customers using the service, the company can refine it to better meet customer needs. This cycle of data capture and use creates a competitive advantage that is very difficult, if not impossible, for rivals to catch.