This document discusses building intelligent data lakes and the challenges of data-driven digital transformation. It outlines goals around engaging customers, optimizing operations, transforming products, and empowering employees. It then discusses the generational market disruption underway and challenges around data volume/velocity, new users, new data types, and data in the cloud. Key capabilities of modern data lake architectures are presented to address these challenges. The document recommends building a data catalog, using an abstraction layer, and choosing a tightly integrated platform. It provides an example customer, BICS, and their roadmap to migrate data storage/processing from Teradata to a hybrid platform.
Big Data Analytics with Hadoop: Customer Stories – Yellowfin
Why watch?
Looking to analyze your growing data assets to unlock real business benefits today? But are you sick of all the Big Data hype and hoopla?
Watch this on-demand Webinar from Actian and Yellowfin – Big Data Analytics with Hadoop – to discover how we’re making Big Data Analytics fast and easy:
Learn how a telecommunications provider has already transformed its business using Big Data Analytics with Hadoop.
Hold on as we go from data in Hadoop to predictive analytics in just 40 minutes.
Learn how to combine Hadoop with the most advanced Big Data technologies, and the world’s easiest BI solution, to quickly generate real business value from Big Data Analytics.
What will you learn?
Discover how Actian’s market-leading Big Data Analytics technologies, combined with Yellowfin’s consumer-oriented platform for reporting and analytics, make generating value from Big Data Analytics faster and easier than you thought possible.
Join us as we demonstrate how to:
• Connect to, prepare and optimize Big Data in Hadoop for reporting and analytics.
• Perform predictive analytics on streaming Big Data: Learn how to empower all your analytics stakeholders to move from historical reports to predictive analytics and gain a sustainable competitive advantage.
• Communicate insights attained from Big Data: Optimize the value of your Big Data insights by learning how to effectively communicate analytical information to defined user groups and types.
This Webinar is ideal if…
• You want to act on more data and data types in shorter timeframes
• You want to understand the steps involved in achieving Big Data success – both front and back end
• You want to see how market leaders are leveraging Big Data to become data-driven organizations today
Looking to analyze and exploit Big Data assets stored in Hadoop? Then this Webinar is a must.
Denodo DataFest 2016: Centralizing Data Security with Data Virtualization – Denodo
Watch the full session: Denodo DataFest 2016 sessions: https://goo.gl/8S17m5
Security can be a key concern when data is spread across multiple systems residing both on premises and in the cloud. Asurion has leveraged data virtualization as a single engine for security control over its data sources. This also helps facilitate the transition to a modern cloud-based data architecture.
In this presentation, the Enterprise Architect at Asurion, Larry Dawson presents:
• The challenges associated with centralizing security across on-premise and cloud data sources
• How to build a single engine for security that provides audit and control by geographies
• How to build modern cloud-based data architectures using data virtualization
This session also includes a panel discussion with:
• Larry Dawson, Enterprise Architect at Asurion
• Kent Weare, Senior Enterprise Architect & Integration Lead at TransAlta
• Ken Martin, Global Center of Excellence Lead for Business Analytics Services at HCL America
• Rich Walker, VP of Sales at Denodo (as moderator)
This session is part of the Denodo DataFest 2016 event. You can also watch more Denodo DataFest sessions on demand here: https://goo.gl/VXb6M6
Denodo DataFest 2016: Enterprise View of Data with Semantic Data Layer – Denodo
Watch the full session: Denodo DataFest 2016 sessions: https://goo.gl/kPmzWU
Gaining an enterprise view of data across different independent lines of business is difficult when the operations, systems, and data are inherently siloed. VSP Global is a conglomerate operating different businesses across eyewear insurance, manufacturing, and retail. They are integrating these silos using a semantic data layer.
In this presentation, the Enterprise Data Architect at VSP Global, Tim Fredricks will present:
• The challenges associated with data siloed across different LOBs
• How to build a semantic data layer using data virtualization
• Centralizing business rules in the data virtualization layer
This session also includes a panel discussion with:
• Tim Fredricks, Enterprise Data Architect at VSP Global
• Rick Hart, Director of Global Technology Solutions at BioStorage Technologies
• Jeff Veis, VP Big Data Platform Marketing at HPE
• Mike Litzkow, Sales Director at Denodo (as moderator)
This session is part of the Denodo DataFest 2016 event. You can also watch more Denodo DataFest sessions on demand here: https://goo.gl/VXb6M6
Modernizing Architecture for a Complete Data Strategy – Cloudera, Inc.
Data is the future of business. Either take advantage of it or be surpassed by those who do.
In this webinar, Ovum's Tony Baer discusses the importance of building a modern data strategy that ensures your journey with Apache Hadoop and big data is a successful one. Together, we'll walk through how to build a plan for long-term success while realizing short-term gains, including:
How to pinpoint the business goals that matter most
How to assess your strengths and weaknesses to meet those goals
How to build a thoughtful approach that ensures your initiatives succeed
"Hadoop: What we've learned in 5 years", Martin Oberhuber, Senior Data Scientist at ThinkBig – Dataconomy Media
"Hadoop 2015: What we’ve learned in 5 years", Martin Oberhuber, Senior Data Scientist at ThinkBig
YouTube Link: https://www.youtube.com/watch?v=odOTsGgfzm8
Watch more from Data Natives 2015 here: http://bit.ly/1OVkK2J
Visit the conference website to learn more: www.datanatives.io
Follow Data Natives:
https://www.facebook.com/DataNatives
https://twitter.com/DataNativesConf
Stay Connected to Data Natives by Email: Subscribe to our newsletter to get the news first about Data Natives 2016: http://bit.ly/1WMJAqS
Keynote from Big Data World Show Singapore, April 2015.
• How is data driving change?
• Where are the opportunities, across industries?
• What is required to gain value from data?
• How can you get started today?
Advanced Analytics and Machine Learning with Data Virtualization (India) – Denodo
Watch full webinar here: https://bit.ly/3dMN503
Advanced data science techniques, like machine learning, have proven to be extremely useful tools for deriving valuable insights from existing data. Platforms like Spark and rich libraries for R, Python, and Scala put advanced techniques at data scientists’ fingertips. However, these data scientists spend most of their time looking for the right data and massaging it into a usable format. Data virtualization offers a new alternative that addresses these issues in a more efficient and agile way.
Watch this session to learn how companies can use data virtualization to:
- Create a logical architecture that makes all enterprise data available for advanced analytics exercises
- Accelerate data acquisition and preparation, giving data scientists a powerful tool to complement their practice
- Integrate popular tools from the data science ecosystem: Spark, Python, Zeppelin, Jupyter, etc.
This presentation, given by Think Big senior data scientist Eliano Marques at the Data Natives conference in Berlin, Germany (November 2015), details how to go from experimentation to productionization for a predictive maintenance use case.
Presentation by Ivan Schotsmans (DV Community) at the Data Vault Modelling an... – Patrick Van Renterghem
The start of GDPR implementations in Europe was, for most organizations, also the start of rethinking their data warehouse strategy. The experience of past implementations gave a better view of the dos and don'ts. One of the important lessons learned concerned the approach to information quality: it is not something you handle on top of your data warehouse. To be successful, information quality must go hand in hand with your data warehouse implementation.
Presentation by Luc Delanglez (DataLumen) at the Data Vault Modelling and Dat... – Patrick Van Renterghem
During this session, Luc Delanglez provides practical insights to get you started the right way on traceability, roles and responsibilities, data stewardship, data lineage and impact analysis, critical functional components, business glossary, data catalog, data quality, master/reference data management, and compliance and privacy, all of which are very important aspects of data governance.
Accelerating Self-Service Analytics with Denodo and Tableau (Singapore) – Denodo
Watch full webinar here: https://bit.ly/3kL160o
Presented at Tableau Public Sector Day 2020, Singapore
Enterprise organizations are shifting to self-service analytics, as business users need real-time access to holistic, consistent views of data, regardless of its location, source, or type, to arrive at critical decisions.
Data Virtualization and Data Visualization work together through a universal semantic layer. Learn how they enable self-service data discovery and improve performance of your reports and dashboards.
Watch this on-demand session to learn more about:
- Combined use of Denodo and Tableau to achieve the best self-service BI experience
- How data virtualization enables self-service analytics
- Use cases and lessons from customer successes
- Features of the Denodo Tableau Native Connector
Is your big data journey stalling? Take the Leap with Capgemini and Cloudera – Cloudera, Inc.
Transitioning to a Big Data architecture is a big step, and the complexity of moving existing analytical services onto modern platforms like Cloudera can seem overwhelming.
Data Virtualization: How to Make Your Data Architecture More Agile (German) – Denodo
Watch full webinar here: https://bit.ly/3idAnbf
Today, high-quality data is needed quickly and in integrated form, increasingly across different clouds as well.
As a logical data layer, data virtualization can work wonders here and dramatically accelerate the modernization of a data architecture.
In our free webinar, we interview Otto Neuer, an expert from Denodo, who expands on the ideas only touched on here. He will give us insight into how data architectures are changing and what, from his perspective, the next phase of business intelligence looks like.
What you will take away:
- The challenges and limitations of traditional data architectures
- How modern architectures can remove these limitations
- The role data virtualization plays in modern data architectures
- What the next phase of business intelligence is
On September 23, 2020, Otto Neuer of Denodo, together with our partner QuinScape GmbH, will share insights into how data architectures are changing and what, from his perspective, the next phase of business intelligence looks like.
Interested? Then register right away – seats for the event are limited.
Next Generation Data Center - IT Transformation – Damian Hamilton
Computerworld CIO Event in Hong Kong sponsored by Dimension Data, EMC & Cisco.
Insights into Dimension Data's DC strategy and recent Client engagements
Real-life use cases from across Europe (Walid Aoudi - Cognizant)
This presentation shares the experiences of several Cognizant Big Data clients in continental Europe and the UK. The main focus is on use cases, presented through the business drivers behind these projects. Key highlights of the big data architectures and solution approaches will be presented. Finally, the business outcomes, in terms of the ROI delivered by the implemented solutions, will be discussed.
How Manufacturers Are Evolving Toward Industry 4.0 with Data Virtualization (Spanish) – Denodo
Watch full webinar here: https://bit.ly/3cbpipB
One of the sectors in which digital transformation is having the most disruptive effect is manufacturing. Leaders in the manufacturing sector are betting on Big Data, cloud computing, artificial intelligence, and the Internet of Things (IoT), among other technologies, and are also anticipating the arrival of 5G, in order to:
- Automate processes efficiently, enabling greater production in less time
- Create added value in manufactured products
- Connect the factory floor with the point of sale
- Drive real-time analysis of data coming from different production lines
However, to reach these objectives and carry out this technological revolution, also known as Industry 4.0, manufacturers face a series of non-negligible challenges. The industrial sector generates more data than any other in the world, and in the digital era, the velocity, variety, and exponential volume of data can overwhelm traditional IT architectures. In addition, most manufacturers contend with data silos, which make processing the data slow and costly. They therefore need a reliable IT platform that can integrate, centralize, and analyze data from different sources and in different formats in an agile and secure way, putting information at the service of the business.
The experts from Enki and Denodo offer this online seminar to help you discover what data virtualization is, and why industry leaders are betting on this innovative technology to optimize their IT strategy and achieve significant ROI thanks to faster, simpler, and unified access to industrial data.
Accelerate Cloud Migrations and Architecture with Data Virtualization – Denodo
Watch full webinar here: https://bit.ly/3N46zxX
Cloud migration brings organizations scalability and flexibility, and often reduced cost. But even after moving to the cloud, organizational data is more often than not still siloed, hard to access, and lacking centralized governance. That leads to delays, and often missed opportunities, in creating value from enterprise data. Join Amit Mody, Senior Manager at Accenture, in this keynote session to learn why current physical data architectures are a hindrance to value creation from data, what a logical data fabric powered by data virtualization is, and how a logical data fabric can unlock the value-creation potential of enterprises.
Capgemini Leap Data Transformation Framework with Cloudera – Capgemini
https://www.capgemini.com/insights-data/data/leap-data-transformation-framework
The complexity of moving existing analytical services onto modern platforms like Cloudera can seem overwhelming. Capgemini’s Leap Data Transformation Framework helps clients by industrializing the entire process of bringing existing BI assets and capabilities to next-generation big data management platforms.
During this webinar, you will learn:
• The key drivers for industrializing your transformation to big data at all stages of the lifecycle – estimation, design, implementation, and testing
• How one of our largest clients cut its transition to a modern data architecture by over 30%
• How an end-to-end, fact-based transformation framework can deliver IT rationalization on top of big data architectures
Rethink Your 2021 Data Management Strategy with Data Virtualization (ASEAN) – Denodo
Watch full webinar here: https://bit.ly/2O2r3NP
In the last several decades, BI has evolved from large, monolithic implementations controlled by IT to orchestrated sets of smaller, more agile capabilities that include visual-based data discovery and governance. These new capabilities provide more democratic access to analytics, increasingly controlled by business users. However, given the rapid advances in emerging technologies such as cloud and big data systems, and fast-changing business requirements, creating a future-proof data management strategy is an incredibly complex task.
Catch this on-demand session to understand:
- BI program modernization challenges
- What is data virtualization and why is its adoption growing so quickly?
- How data virtualization works and how it compares to alternative approaches to data integration
- How modern data virtualization can significantly increase agility while reducing costs
MongoDB World 2019: Data Digital Decoupling – MongoDB
Why data decoupling? Learn how enterprises are decoupling big, monolithic legacy data platforms into smaller components, gaining the freedom to run anywhere and the multi-cloud agility their business needs.
Check out this presentation from Pentaho and ESRG to learn why product managers should understand Big Data and hear about real-life products that have been elevated with these innovative technologies.
Learn more in the brief that inspired the presentation, Product Innovation with Big Data: http://www.pentaho.com/resources/whitepaper/product-innovation-big-data
Structure 2014 - Disrupting the data center - Intel sponsor workshop – Gigaom
Presentation from Gigaom's Structure 2014 conference, June 21-22 in San Francisco
Intel sponsor workshop: Disrupting the data center
#gigaomlive
More at http://events.gigaom.com/structure-2014/
Strategic IT Transformation Programme Delivers Next-Generation Agile IT Infrastructure – Cognizant
The strategic use of the new technology enables Standard Life’s IT group to focus on generating business innovation and directly supporting business stakeholders in areas such as customer engagement while Cognizant manages the provisioning of Infrastructure as a Service.
Watch full webinar here: https://bit.ly/3dhbZTK
Data virtualization started to evolve as the most agile, real-time enterprise data fabric; it is now proving to go beyond its initial promise and is becoming one of the most important enterprise big data fabrics.
Watch this session to learn:
- What data virtualization really is.
- How it differs from other enterprise data integration technologies.
- Why data virtualization is finding enterprise-wide deployment inside some of the largest organizations.
How Financial Institutions Are Leveraging Data Virtualization to Overcome the... – Denodo
Watch full webinar here: https://bit.ly/2KkJ08B
Financial institutions need to implement new strategies and services that will drive them securely to their digital objectives over their entire infrastructure.
- How to securely move legacy systems and data to new technologies such as Big Data and the Cloud?
- How to break down silos and ensure global, centralized, secure, and agile access to meaningful data?
- How to facilitate data sharing while applying strict and coherent governance and security rules?
- How to avoid downtime and guarantee the success of IT initiatives while optimizing costs and resources?
- How to produce and maintain efficient reports and financial aggregations for holding companies and CxO managers?
We are pleased to invite you to this online session to discover how data virtualization can answer these questions and contribute to the digital transformation of financial institutions.
WHAT IS IT ABOUT?
This virtual event will be organized in two parts. First, we will hold a conference focusing on the impact of digital transformation in the financial sector, along with the general concepts of data virtualization and how it has supported the new business goals of financial companies in terms of IT modernization, risk management, governance, and security. Then, we will conduct a hands-on session with a guided live demo to help you discover the main features and benefits of the Denodo Platform for data virtualization.
Connecta Event: Big Query and Data Analysis with Google Cloud Platform – ConnectaDigital
Advanced data analysis and "big data" have climbed the trend lists in recent years and are now among the most highly prioritized areas in the development of new services and products for leading companies in the digital landscape.
The information that accumulates in systems as customer interactions are digitized has proven to be worth its weight in gold. It holds everything we need to know to make our business more effective.
Since the summer of 2013, Connecta has had an established partnership with Google to help our customers with the transition to cloud services, including for advanced data analysis. To get ready to help our customers, we have spent several years building up both knowledge of and experience with Google's various cloud products, such as "Big Query".
Big Query is a cloud-based analysis tool and part of Google Cloud Platform. Big Query makes it possible to run fast queries against enormous datasets in just seconds. Big Query and Google Cloud Platform offer ready-made solutions for setting up and maintaining an infrastructure that makes all of this possible with simple means.
At Connecta Digital Consulting's third event of the spring, we introduced our customers and partners to the concepts of data analysis and Big Query.
The event covered the following points:
- Big Data and Business Intelligence (BI)
- "The Google Big Data tools" – success factors and how to get started
- Google Cloud Platform and how to carry out a successful cloud initiative
We presented cases and shared important lessons learned from our work with Google and our customers.
Centric - Jaap house prices, GTST, The Bold, IKEA, and IENS. Just a few applicati... – BigDataExpo
This presentation makes clear how you can apply machine learning in daily life. Think of buying a house, watching Goede Tijden Slechte Tijden, shopping at IKEA, and visiting restaurants.
In this session we'll dive into the journey that Google chose to take in order to focus on AI: what was the mindset, what were the challenges, and what is the direction for the future.
Pacmed - Machine Learning in health care: opportunities and challenges in practice – BigDataExpo
The potential of personalized medicine based on machine learning is huge, but big challenges must be overcome to implement this technology in practice. Hidde will discuss both sides of the story, including a case study from the intensive care unit.
De Toekomst Verkenner is an award-winning innovation from PGGM that is rapidly being developed into a platform.
In his presentation, Mladen Sančanin will explain how PGGM used real-time data and algorithms to build this platform, and how PGGM supports innovations from its 'Big Data Lab'.
In half an hour, many experiences are shared about setting up innovation projects that make use of data, and about establishing a data lab in a corporate environment.
Universiteit Utrecht & gghdc - What are the health effects of environment a...BigDataExpo
The GGHDC investigates the health effects of environment and lifestyle in relation to people's daily lives. The research centre is built around a shared data infrastructure of Utrecht University and the University Medical Center Utrecht (UMCU).
Rob van Kranenburg - Can we imagine a social credit system such as in...BigDataExpo
IoT, Big Data and AI create a new situation for decision-making by policymakers. Yet little is shifting in our democratic system, while our data is in the hands of GAFA, China and other new forms of governance still emerging in the digital transition. We, in Europe, are standing still.
OrangeNXT - High accuracy mapping from videos for efficient fiber optic cable...BigDataExpo
Construction companies such as BAM Infra Telecom rely on accurate, up-to-date maps. Google Maps isn’t enough, but doing on-site surveys is expensive and time-consuming. However, driving through and recording 360° video from a car is cheap and easy. Using machine learning, we turn videos into highly accurate maps.
Dynniq & GoDataDriven - Shaping the future of traffic with IoT and AIBigDataExpo
Dynniq is a high-tech, innovative company offering smart mobility solutions and services internationally. We will present advanced IoT use cases Dynniq is working on, and share how GoDataDriven helps set up an AI capability. We will share our learnings, and show what makes data science in the mobility domain unique.
Teleperformance - Smart personalized service through the use of Data Science BigDataExpo
At Teleperformance we help clients add value to the customer journey. We use Data Science in our omnichannel customer interactions to predict the customer's needs, so that we can give the best answer.
FunXtion - Interactive Digital Fitness with Data AnalyticsBigDataExpo
Digital is the new Personal. FunXtion Interactive is an interactive training experience for both inside and outside the gym. FunXtion is revolutionary in the fitness industry and fully data-driven by design. FunXtion shows how they use real-time data for decision support, process automation, personalization and product innovation.
fashionTrade - We used to call that Big DataBigDataExpo
Big Data was the umbrella term for everything you were not yet doing but Google or Amazon had already invented. By now we do all those things, so product recommendations are simply called product recommendations again, fraud prevention is fraud prevention, and speech recognition is still speech recognition; not Big Data. No matter, because now there is AI. This keynote explains whether that is any different, and why.
BigData Republic - Industrializing data science: a view from the trenchesBigDataExpo
What does it take to bring machine learning algorithms to production and start delivering business value? How can teams of data scientists and engineers effectively collaborate on a single product, integrate with existing IT systems and keep business stakeholders involved? Using real-life examples, we discuss the challenges and best practices.
Bicos - Hear how a top sportswear company produced cutting-edge data infrastr...BigDataExpo
Industry expert Dave Vanhoudt will set out his vision for the future of data infrastructure. Dave will highlight the key role automation must play in any data infrastructure strategy today, drawing on his current role with Medtronic, and past experiences at AB Inbev, Baxter, BMW and Nike.
Endrse - Next-level online collaborations between personalities and brands with ...BigDataExpo
Digitally, almost everything is measurable. But it is often a challenge to analyse the impact of collaborations between influencers (top athletes) and companies. Start-up Endrse uses AI to analyse social media content so that influencer and brand content match each other better. That is how you make an impact on the audience!
Bovag - Refine-IT - Process optimization in the automotive sectorBigDataExpo
Developments in the automotive sector are moving fast: electric cars, the rapid growth of private lease, over-the-air connectivity, services on demand and advanced driver assistance are just a small selection. These are examples of (big) data developments that strongly influence automotive retail. The transition to a new revenue model challenges the sector to collaborate and to pursue data-driven process optimization.
Wilco Schellevis, director of Refine-IT, and Renate Weggemans, manager of strategy and policy at BOVAG Autodealers, walk you through the Dely-App case: a fine piece of collaboration and data-driven process optimization in automotive retail, captured in a single app.
Schiphol - Optimal flow of passengers at Schiphol thanks to smart data...BigDataExpo
Schiphol is Europe's best-connected airport and handles up to 235,000 passengers on peak days. Guiding them smoothly through the processes requires a reliable forecast of how busy it will be. Schiphol shows how it develops data applications to predict passenger numbers as accurately as possible and to organize its processes accordingly.
Veco - Big Data in the Supply Chain: How Process Mining can help reduce co...BigDataExpo
Veco is the market leader in designing and manufacturing precision parts by means of electroforming. This presentation explains how Veco successfully used Process Mining in production to reduce lead times and create new business. It also explains what Process Mining is.
Rabobank - There is something about DataBigDataExpo
Technological possibilities and the GDPR, a continuous clash? And where do we stand on the ethical (re)use of data? In this session, learn from Rabobank's Big Data journey and gain insight into organizational choices, and the data lab technology vision and data strategy as enablers and accelerators of digital innovation and transformation.
VU Amsterdam - Big data and data-driven value creation: is there anything left to ch...BigDataExpo
In his presentation, Frans Feldberg addresses the 'why, what and how' of big data and data-driven business model innovation. How has the world changed in recent years where data is concerned? Why are big data, business analytics and artificial intelligence important digital innovations that rank high on many management agendas, and why do organizations invest substantially in big data and data science? How can organizations create value with data, both by improving their existing business model and by developing new data-driven business models? These are the questions his presentation will answer.
Booking.com - Data science and experimentation at Booking.com: a data-driven ...BigDataExpo
At Booking.com we have experienced what a data-driven organisation means for creating business impact, and what it looks like when experimentation is part of your company culture.
During this session we will share our experiences and learnings on how data science and experimentation go hand in hand.
Data Centers - Striving Within A Narrow Range - Research Report - MCG - May 2...pchutichetpong
M Capital Group (“MCG”) expects demand, and the evolving shape of supply, to be driven by institutional investment rotating out of offices and into work-from-home (“WFH”) assets, alongside the ever-expanding need for data storage as global internet usage grows, with experts predicting 5.3 billion users by 2023. These market factors will be underpinned by technological changes, such as the progress of cloud services and edge sites, with the industry expected to see strong annual growth of 13% over the next 4 years.
Whilst competitive headwinds remain, exemplified by the recent second bankruptcy filing of Sungard, which blames “COVID-19 and other macroeconomic trends including delayed customer spending decisions, insourcing and reductions in IT spending, energy inflation and reduction in demand for certain services”, the industry has made key adjustments, and MCG believes that engineering cost management and technological innovation will be paramount to success.
MCG reports that the more favorable market conditions expected over the next few years, helped by the winding down of pandemic restrictions and a hybrid working environment, will drive market momentum forward. The continuous injection of capital by alternative investment firms, as well as growing infrastructure investment from cloud service providers and social media companies, whose revenues are expected to be over 3.6x larger by value in 2026, will likely help propel data center provision and innovation. These factors paint a promising picture for the industry players that can offset rising input costs and adapt to new technologies.
According to M Capital Group: “Specifically, the long-term cost-saving opportunities available from the rise of remote managing will likely aid value growth for the industry. Through margin optimization and further availability of capital for reinvestment, strong players will maintain their competitive foothold, while weaker players exit the market to balance supply and demand.”
Techniques to optimize the PageRank algorithm usually fall into two categories. One is to reduce the work per iteration, and the other is to reduce the number of iterations. These goals are often at odds with one another. Skipping computation on vertices that have already converged can save iteration time. Skipping in-identical vertices, i.e. vertices with the same in-links, avoids duplicate computation and can thus also reduce iteration time. Road networks often contain chains which can be short-circuited before the PageRank computation to improve performance; the final ranks of chain nodes are easy to calculate, which can reduce both the iteration time and the number of iterations. If a graph has no dangling nodes, the PageRank of each strongly connected component can be computed in topological order. This can reduce the iteration time and the number of iterations, and also enables multi-iteration concurrency in the PageRank computation. The combination of all of the above methods is the STICD algorithm [sticd]. For dynamic graphs, unchanged components whose ranks are unaffected can be skipped altogether.
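One of the techniques above, skipping computation on vertices that have already converged, can be sketched in a few lines. This is a minimal illustration: the graph representation, damping factor, and tolerance are our own illustrative choices, not taken from the STICD paper.

```python
def pagerank_skip_converged(graph, damping=0.85, tol=1e-10, max_iter=100):
    """Power-iteration PageRank over an adjacency dict {u: [v, ...]}.

    A vertex whose rank changed by less than `tol` in the previous
    iteration is marked converged and is no longer updated.
    Assumes every vertex appears as a key and has at least one
    out-link (no dangling nodes).
    """
    verts = list(graph)
    n = len(verts)
    # Build reverse adjacency (in-links) and out-degrees once.
    inlinks = {v: [] for v in verts}
    for u, outs in graph.items():
        for v in outs:
            inlinks[v].append(u)
    outdeg = {u: len(outs) for u, outs in graph.items()}

    rank = {v: 1.0 / n for v in verts}
    converged = set()
    for _ in range(max_iter):
        new_rank = dict(rank)
        for v in verts:
            if v in converged:
                continue  # skip work for already-converged vertices
            new_rank[v] = (1 - damping) / n + damping * sum(
                rank[u] / outdeg[u] for u in inlinks[v])
        for v in verts:
            if abs(new_rank[v] - rank[v]) < tol:
                converged.add(v)
        rank = new_rank
        if len(converged) == n:
            break
    return rank

# In a 3-cycle every vertex keeps the uniform rank 1/3.
ranks = pagerank_skip_converged({1: [2], 2: [3], 3: [1]})
```

On real workloads the saving comes from the fact that most vertices converge in early iterations, so the per-iteration loop shrinks over time.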
Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...Subhajit Sahu
Abstract — Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components and processes them in topological order, one level at a time. This enables ranks to be calculated in a distributed fashion without per-iteration communication, unlike the standard method where all vertices are processed in each iteration. It does, however, come with the precondition that the input graph has no dead ends. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. The slowdown on the GPU is likely caused by a large submission of small workloads, and is expected to be a non-issue when the computation is performed on massive graphs.
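The decomposition step that Levelwise PageRank relies on, splitting the graph into strongly connected components and ordering them topologically, can be sketched with Kosaraju's algorithm. This is a textbook method shown for illustration, not necessarily the report's implementation; it discovers components in topological order of the condensation (sources first).

```python
from collections import defaultdict

def scc_topological(graph):
    """Kosaraju's algorithm over an adjacency dict {u: [v, ...]}.

    Returns the strongly connected components of `graph` in
    topological order of the condensation (source components first).
    Assumes every vertex appears as a key.
    """
    # Pass 1: DFS on the graph, recording vertices by finish time.
    visited, order = set(), []
    def dfs1(u):
        visited.add(u)
        for v in graph[u]:
            if v not in visited:
                dfs1(v)
        order.append(u)
    for u in graph:
        if u not in visited:
            dfs1(u)
    # Pass 2: DFS on the transpose, in decreasing finish order.
    transpose = defaultdict(list)
    for u, outs in graph.items():
        for v in outs:
            transpose[v].append(u)
    assigned, comps = set(), []
    def dfs2(u, comp):
        assigned.add(u)
        comp.append(u)
        for v in transpose[u]:
            if v not in assigned:
                dfs2(v, comp)
    for u in reversed(order):
        if u not in assigned:
            comp = []
            dfs2(u, comp)
            comps.append(comp)
    return comps

# Two components, with edges flowing from {1, 2} into {3, 4}:
comps = scc_topological({1: [2], 2: [1, 3], 3: [4], 4: [3]})
```

Each component in `comps` can then be ranked once all of its predecessor components are done, which is what makes the levelwise scheme possible.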
Learn more:
Visit us at stand 102
Thank you!
Rick Mutsaers
+31 622 414240
rmutsaers@informatica.com
Editor's Notes
So what is changing in the world of analytics?
First of all, we see new questions being asked to be able to control and grow the business.
For example: what interaction did we have with our customer that persuaded him or her to buy our product?
Or in healthcare: what did we do to make the patient better? Or in general: do we understand the reasons why employees leave our company, so we can prevent this in future?
But also: what can we do to lower cost and optimize our business performance?
The infrastructure choices to help with these questions have dramatically changed over the past decade.
And this new era we call Data 3.0, where data is used as a strategic asset to fuel this disruption, comes with new challenges: data volumes, new types of data, the fact that data consumers need smarter applications (meaning AI and machine learning), and a whole new group of data consumers.
Finally, we see a big shift toward cloud. So how do you integrate data that's not in your datacenter, and more specifically data that's spread across many cloud applications?
Looking at these challenges, what do we then need in terms of capabilities...
But don't make the mistake of starting to use a plethora of non-integrated tools to tackle these problems; rather, think of an integrated platform that provides all the capabilities you need.
Now the new adage is 'Data is the new gold'. I would then pose the statement 'Metadata is the diamond in the rough'. Let me explain.
There are different types of metadata we need to collect to face these challenges.
Using all this metadata, create a data catalog that business users can query to understand what data there is, who owns it, what it means, what quality it has, etc.
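At its core, a data catalog like the one described in this note is searchable metadata. The toy sketch below illustrates the idea; the entry fields and sample datasets are hypothetical, not a real Informatica catalog schema.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str             # dataset name
    owner: str            # who is responsible for the data
    description: str      # what the data means, in business terms
    quality_score: float  # e.g. share of rows passing validation rules
    tags: list = field(default_factory=list)

class DataCatalog:
    """A minimal catalog that business users can search by keyword."""
    def __init__(self):
        self.entries = []

    def register(self, entry):
        self.entries.append(entry)

    def search(self, keyword):
        kw = keyword.lower()
        return [e for e in self.entries
                if kw in e.name.lower()
                or kw in e.description.lower()
                or any(kw in t.lower() for t in e.tags)]

catalog = DataCatalog()
catalog.register(CatalogEntry(
    name="roaming_events", owner="network-ops",
    description="Raw roaming activity records per subscriber",
    quality_score=0.97, tags=["telecom", "raw"]))
catalog.register(CatalogEntry(
    name="churn_features", owner="data-science",
    description="Aggregated customer behavior features",
    quality_score=0.92, tags=["ml", "curated"]))

hits = catalog.search("roaming")
```

A real catalog would populate such entries automatically by scanning sources and harvesting technical, business, and operational metadata rather than registering them by hand.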
This can also be used to create data integration patterns based on an abstracted view.
The benefits of using a tightly integrated platform are numerous...
Now let's look at a customer example that has gone through this digital revolution.
BICS is a network provider for roaming services.
They started a project to get a better view of roaming activity in order to better predict network usage, disruptions and customer behavior.
To do this, they first implemented a Hadoop platform to offload data from expensive Teradata into cheaper HDFS, and then moved some of their existing data integration logic (built using Informatica's PowerCenter data integration suite) to Hadoop for improved performance and scalability. They have now moved most of the batch processing to Hadoop, leveraging the technology benefits of that platform. They started with MapReduce and have since switched to Spark, without recoding.
The next phase will be to also start processing data in real-time/streaming mode to get even lower latencies for predictive maintenance and a quicker response to disruptions.