Many organizations are struggling to understand Big Data, what it is, and how to best harness it. Generated by mobile devices, social media, click streams, machines, applications, and more, data is exploding at an exponential rate from sources that are increasingly complex and varied.
How do you manage and leverage both structured and unstructured data? How do you use advanced analytics to gain new insights, find anomalies, correlations, and answers that can transform the business?
Learn how enterprises are implementing Hadoop to get the answers to these questions and more.
The Briefcase Cluster – Enabling Big Data Everywhere MapR Technologies
The briefcase cluster is a mobile MapR cluster to collect big data in remote places like offshore platforms or airplanes. It can also serve as a private cluster for individuals looking to bring data from different IoT devices together in one privately controlled cluster.
Dickey's Barbecue Pit Heats Up Analytics with Amazon Web Services – Precisely
Heading to the cloud for Big Data analytics? Or, want to take your current projects to the next level?
When Dickey's Barbecue Pit needed an easy and cost-effective way to quickly understand — and act upon — rapidly growing business data across more than 550 stores, they turned to an advanced analytics solution hosted on Amazon Web Services. The results have been game-changing — driving insights to make real-time decisions on everything from staffing to inventory to marketing.
In this webcast, CIO Laura Rea Dickey presents a first-hand account of the company's journey to the cloud, including:
The business challenges that led to the new solution
How they picked the right platform and partner
Their success plan and results
And lessons learned along the way!
Use Cases from Batch to Streaming, MapReduce to Spark, Mainframe to Cloud: To... – Precisely
So you've built your Hadoop cluster. How do you get data from hundreds of database tables, streaming Kafka sources, and data shared by 20-year-old COBOL programs all in there and working together quickly, efficiently, and securely? With many customers asking this same question, Hortonworks recently expanded its partnership with Syncsort to provide optimized ETL onboarding for Hadoop. During this talk, we'll discuss how a next-generation ETL tool, built on contributions to the open source community and natively integrated with Hadoop, can drive lasting value for your organization:
- Seamlessly onboard data from all your enterprise sources, batch and streaming, into Hadoop for fast and easy analytics.
- Stay agile and simplify your environment with a "design once, deploy anywhere" approach that minimizes disruption and risk in the face of a rapidly evolving big data ecosystem.
- Secure, govern, and manage your data with full integration with Apache Ambari, Apache Ranger, and more.
These benefits come to life in real customer case studies. Learn how a national insurance company and a global hotel chain are using Hortonworks HDP and Syncsort DMX-h to get bigger insights from their enterprise data securely, efficiently, and cost-effectively, without spending hundreds of man-hours.
Reducing the Total Cost of Ownership of Big Data – Impetus White Paper – Impetus Technologies
For Impetus’ White Papers archive, visit: http://www.impetus.com/whitepaper
The paper discusses the challenges that relate to the cost of Big Data solutions and looks at the technology options available to overcome these problems.
NetApp IT sought to adopt game changers like hyperscale computing, flash storage, and SaaS as part of its cloud-first strategy. After taking a hard look at its data centers, NetApp IT established a strategic roadmap to downsize data centers where possible to lower costs, enhance services, and become more dynamic.
NetApp IT Efficiencies Gained with Flash, NetApp ONTAP, OnCommand Insight, Al... – NetApp
During an Insight Las Vegas 2017 breakout presentation, NetApp IT Senior Manager of Customer-1 Pridhvi Appineni talks about IT's business results of running a global enterprise on NetApp technology. From being cloud ready to data compliant to prepared for a disaster, NetApp technology is at the heart of our stable, reliable IT data management environment.
Accelerate your business in a data-driven world – NetApp
Did you know that with the NetApp all-flash solution for Oracle Database, you can speed time to market, accelerate return on investment, and deliver a consistent customer experience?
Learn more at: https://www.netapp.com/us/solutions/applications/oracle-database.aspx
Converged Everything, Converged Infrastructure delivering business value and ... – NetAppUK
Converged Infrastructure solutions for Cloud create business value for many customers worldwide by shortening and simplifying the path to infrastructure adoption and time to productivity. In this session, hear Alan Watson, NetApp Alliances Business Development Manager; Julian Datta, Microsoft Private Cloud Channel Development Manager; and Andrew Gunyon, Cisco Data Centre Sales Manager, discuss the value of tightly integrated software and converged infrastructure. Hear them share their experiences, along with the latest developments in FlexPod Converged Infrastructure.
Benefits of Transferring Real-Time Data to Hadoop at Scale – Hortonworks
Today’s Big Data teams demand solutions designed for Big Data that are optimized, secure, and adaptable to changing workload requirements. Working together, Hortonworks, IBM, and Attunity have designed an integrated solution that transfers large volumes of data to a platform that can handle rapid ingest, processing and analysis of data of all types from all sources, at scale.
https://hortonworks.com/webinar/benefits-transferring-real-time-data-hadoop-scale-ibm-hortonworks-attunity/
Hitachi Data Systems Hadoop Solution. Customers are seeing exponential growth of unstructured data, from their social media websites to operational sources. Their enterprise data warehouses are not designed to handle such high volumes and varieties of data. Hadoop, a software platform that scales to process massive volumes of unstructured and semi-structured data by distributing the workload across clusters of servers, gives customers a new option to tackle data growth and deploy big data analysis to better understand their business. Hitachi Data Systems is launching its latest Hadoop reference architecture, pre-tested with the Cloudera Hadoop distribution to provide a faster time to market for customers deploying Hadoop applications. HDS, Cloudera, and Hitachi Consulting will present together and explain how to get there. Attend this WebTech and learn how to:
- Solve big data problems with Hadoop.
- Deploy Hadoop in your data warehouse environment to better manage your unstructured and structured data.
- Implement Hadoop using the HDS Hadoop reference architecture.
For more information on the Hitachi Data Systems Hadoop Solution, please read our blog: http://blogs.hds.com/hdsblog/2012/07/a-series-on-hadoop-architecture.html
Introducing Cloudera DataFlow (CDF) 2.13.19 – Cloudera, Inc.
Watch this webinar to understand how Hortonworks DataFlow (HDF) has evolved into the new Cloudera DataFlow (CDF). Learn about key capabilities that CDF delivers, such as:
- Powerful data ingestion powered by Apache NiFi
- Edge data collection by Apache MiNiFi
- IoT-scale streaming data processing with Apache Kafka
- Enterprise services to offer unified security and governance from edge to enterprise
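The edge-to-enterprise flow these capabilities describe (edge agents collecting events, a shared topic, a streaming job consuming it) can be sketched with a toy producer/consumer. This is only an illustration under stated assumptions: the queue stands in for a Kafka topic, the sensor names and the alert threshold are hypothetical, and a real CDF deployment would use NiFi/MiNiFi and an actual Kafka cluster.

```python
import queue
import threading

# A minimal, Kafka-like topic simulated with a thread-safe queue.
topic = queue.Queue()
SENTINEL = None  # marks end-of-stream in this toy example

def edge_collector(readings):
    """Stands in for a MiNiFi-style edge agent pushing events to a topic."""
    for r in readings:
        topic.put(r)
    topic.put(SENTINEL)

def stream_processor(results):
    """Stands in for a streaming job consuming the topic."""
    while True:
        event = topic.get()
        if event is SENTINEL:
            break
        # Simple enrichment: flag out-of-range sensor values (threshold is made up).
        results.append({**event, "alert": event["value"] > 100})

results = []
producer = threading.Thread(target=edge_collector,
                            args=([{"sensor": "s1", "value": 42},
                                   {"sensor": "s2", "value": 150}],))
consumer = threading.Thread(target=stream_processor, args=(results,))
producer.start(); consumer.start()
producer.join(); consumer.join()
print(results)
```

The same shape (collect, transport over a topic, enrich downstream) is what the NiFi/MiNiFi/Kafka stack provides with durability and scale.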
An Introduction to the MapR Converged Data Platform – MapR Technologies
Listen to the webinar on-demand: http://info.mapr.com/WB_Partner_CDP_Intro_EMEA_DG_17.05.31_RegistrationPage.html
In this 90-minute webinar, we discuss:
- The MapR Converged Data Platform and its components
- Use cases for the Converged Data Platform
- MapR Converged Partner Program
- How to get started with MapR
- Becoming a partner
Data Warehouse Modernization: Accelerating Time-To-Action MapR Technologies
Data warehouses have been the standard tool for analyzing data created by business operations. In recent years, increasing data volumes, new types of data formats, and emerging analytics technologies such as machine learning have given rise to modern data lakes. Connecting application databases, data warehouses, and data lakes using real-time data pipelines can significantly improve the time to action for business decisions. More: http://info.mapr.com/WB_MapR-StreamSets-Data-Warehouse-Modernization_Global_DG_17.08.16_RegistrationPage.html
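The database-to-analytics pipeline described above can be illustrated with a minimal sketch using Python's built-in sqlite3. The table names and the full-recompute refresh are hypothetical stand-ins; a real pipeline (e.g. with streaming tools) would ship increments in real time rather than recomputing.

```python
import sqlite3

# Operational table (orders) feeding an analytics-side aggregate table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL);
    CREATE TABLE sales_by_region (region TEXT PRIMARY KEY, total REAL);
    INSERT INTO orders (region, amount) VALUES
        ('east', 10.0), ('east', 15.0), ('west', 7.5);
""")

def refresh_aggregates(conn):
    """Recompute the analytics aggregate from the operational table.
    A real-time pipeline would stream changes instead of recomputing."""
    conn.execute("DELETE FROM sales_by_region")
    conn.execute("""
        INSERT INTO sales_by_region (region, total)
        SELECT region, SUM(amount) FROM orders GROUP BY region
    """)

refresh_aggregates(conn)
rows = dict(conn.execute("SELECT region, total FROM sales_by_region"))
print(rows)
```

Shrinking the lag between the operational write and the analytics-side refresh is exactly the "time to action" the abstract is about.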
Postgres Vision 2018: How to Consume your Database Platform On-premises – EDB
The usual model for a database platform on-premises is to run it the way IT is usually operated: siloed, and capital- and labor-intensive. In the cloud, consumption means that you pay for what you use, with less heavy lifting to operate the platform. Presented at Postgres Vision 2018, this session covers how HPE can deliver EDB Postgres in the data center or on the edge in a consumption model that is pay-per-use, elastic IT, operated for you, migrated, and integrated.
At Postgres Vision 2018, Lauren Nelson, Principal Analyst, Forrester, provided a look into the practical considerations that are influencing modern cloud strategies, including existing skill sets and technology limitations, the assortment of current and future cloud workloads, and the economics and realities of today's technology options.
Postgres Vision 2018: Making Modern an Old Legacy System – EDB
A New England insurance company had aging hardware, a database that was out of support, an older operating system, rising costs, and no disaster recovery plan. Craig Bogovich of NTT Data tackled this massive website backend, used by the company's insureds, providers, and partners, architected a complete overhaul, and ultimately deployed it into the cloud. Presented at Postgres Vision 2018, this presentation shows how the project unfolded and the strategies and methods used to modernize this legacy system with open source software and cloud technology.
Learn how flash enables e-commerce businesses to handle 14,000 transactions per second, allows oil & gas companies to shorten data analysis time from days to hours, and allows banks to double their speed to market, among other data points.
Postgres Vision 2018: The Changing Role of the DBA in the Cloud – EDB
Not that long ago, DBAs were the gateway to all things database-related for enterprises. With the advent of the cloud, automation, and DevOps, the DBA's role and responsibilities are rapidly evolving. In this presentation delivered at Postgres Vision 2018, Ken Rugg, Chief Product & Strategy Officer at EDB, explored the 10 most significant ways the role of the DBA has changed and the new, higher-value skills a DBA will need to be ready for epic change.
Slides for talk presented at Boulder Java User's Group on 9/10/2013, updated and improved for presentation at DOSUG, 3/4/2014
Code is available at https://github.com/jmctee/hadoopTools
Integrating Hadoop into your enterprise IT environment – MapR Technologies
http://bit.ly/1M8gzAM – As the old saying goes, "it's not what you do, but how you do it" that makes all the difference. The benefits of Hadoop are well-documented as mainstream adoption continues to grow. However, as with any new technology, integrating Hadoop with your existing data management infrastructure is crucial for getting the maximum value from its capabilities.
Join us for a special roundtable webcast on July 10th to learn how to do it the right way. Gain a deeper understanding of the fundamentals of Hadoop and its growing ecosystem, the key considerations for modifying your current data management practices and the types of Big Data applications you'll be able to build.
Fast and Furious: From POC to an Enterprise Big Data Stack in 2014 – MapR Technologies
View this webinar presentation from CenturyLink Technology Solutions (formerly Savvis) and MapR as we deconstruct and demystify "the enterprise big data stack." We provide you with a more holistic view of the landscape, explore use cases to show how you can derive business value from it, and share best practices for navigating the fragmented big data environment.
This presentation was given by MapR CMO Jack Norris at the Gartner BI and Analytics Summit in Las Vegas on April 2, 2014.
Hadoop revolutionizes how data is stored, processed, and analyzed. Hadoop represents a new data and compute stack that provides huge operational advantages and is being used to change how organizations compete. This session will provide an overview of how customers are using Hadoop today, with details on initial uses and a glimpse of how this new platform is providing organizations 10X performance at 1/10 the cost.
Getting started with Hadoop on the Cloud with Bluemix – Nicolas Morales
Silicon Valley Code Camp -- October 11, 2014.
Session: Getting started with Hadoop on the Cloud.
Hadoop and the cloud are an almost perfect marriage. Hadoop is a distributed computing framework that leverages a cluster built on commodity hardware. The cloud simplifies provisioning of machines and software, so getting started with Hadoop on the cloud makes it simple to provision your environment quickly and actually get started using Hadoop. IBM Bluemix has democratized Hadoop for the masses! This session will provide a brief introduction to what Hadoop is and how the cloud works, and will then focus on how to get started via a series of demos. We will conclude with a discussion of the tutorials and public datasets - all of the tools needed to get you started quickly.
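As a refresher on what Hadoop actually distributes, here is a toy word count showing the map, shuffle, and reduce phases in a single process. This is illustrative only; a real MapReduce job runs these phases across a cluster, and the input lines here are made up.

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    # Map: emit a (word, 1) pair for each word in a line.
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: combine the grouped values per key.
    return {key: sum(values) for key, values in groups.items()}

lines = ["Hadoop on the cloud", "the cloud simplifies Hadoop"]
counts = reduce_phase(shuffle(chain.from_iterable(map(map_phase, lines))))
print(counts)
```

Because each phase only touches local slices of data, the framework can split the map and reduce work across commodity machines, which is the property the session builds on.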
Learn more about BigInsights for Hadoop: https://developer.ibm.com/hadoop/
MapR on Azure: Getting Value from Big Data in the Cloud – MapR Technologies
Public cloud adoption is exploding and big data technologies are rapidly becoming an important driver of this growth. According to Wikibon, big data public cloud revenue will grow from 4.4% in 2016 to 24% of all big data spend by 2026. Digital transformation initiatives are now a priority for most organizations, with data and advanced analytics at the heart of enabling this change. This is key to driving competitive advantage in every industry.
There is nothing better than a real-world customer use case to help you understand how to get value from big data in the cloud and apply the learnings to your business. Join Microsoft, MapR, and Sullexis on November 10th to:
Hear from Sullexis on the business use case and technical implementation details of one of their oil & gas customers
Understand the integration points of the MapR Platform with other Azure services and why they matter
Know how to deploy the MapR Platform on the Azure cloud and get started easily
You will also get to hear about customer use cases of the MapR Converged Data Platform on Azure in other verticals such as real estate and retail.
Speakers
Rafael Godinho
Technical Evangelist
Microsoft Azure
Tim Morgan
Managing Director
Sullexis
The CSC Big Data Analytics Insights service enables clients who do not have an analytics capability to implement the business, data, and technology changes needed to gain business benefit from an initial set of analytics, based on a roadmap of changes created by CSC or provided from a compatible set of inputs.
CSC Analytic Insights Implementation has four phases:
Stage 1: Analytic Engagement
Stage 2: Analytic Discovery
Stage 3: Implementation Planning
Stage 4: Embedding Analysis
We're introducing MapR Streams, a reliable, global event streaming system that connects data producers and data consumers across shared topics of information. With the integration of MapR Streams comes the industry's first and only converged data platform that integrates file, database, event streaming, and analytics to accelerate data-driven applications and address emerging IoT needs.
Are you ready to accelerate your business with the power of a truly global platform for integrating data-in-motion with data-at-rest?
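The producers-and-consumers-over-shared-topics model can be sketched in a few lines. The Topic class below is a toy stand-in under stated assumptions (MapR Streams provides this durably, globally, and at scale); the key idea shown is per-consumer read offsets, so independent consumers progress through the same event log at their own pace.

```python
from collections import defaultdict

class Topic:
    """Toy in-memory event log with per-consumer read offsets."""
    def __init__(self):
        self.events = []
        self.offsets = defaultdict(int)  # consumer name -> next index to read

    def publish(self, event):
        self.events.append(event)

    def poll(self, consumer):
        # Return everything this consumer hasn't seen yet, then advance it.
        start = self.offsets[consumer]
        batch = self.events[start:]
        self.offsets[consumer] = len(self.events)
        return batch

# Hypothetical IoT readings; device names are made up.
sensors = Topic()
sensors.publish({"device": "pump-1", "temp": 71})
sensors.publish({"device": "pump-2", "temp": 98})

alerts = sensors.poll("alerting")        # first consumer sees both events
sensors.publish({"device": "pump-1", "temp": 102})
analytics = sensors.poll("analytics")    # second consumer reads from the start
late = sensors.poll("alerting")          # first consumer gets only the new event
print(len(alerts), len(analytics), len(late))
```

Decoupling producers from consumers this way is what lets data-in-motion and data-at-rest workloads share the same event stream.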
When it comes to the cloud, Gartner may have said it best:
“By 2020, a corporate ‘no-cloud’ policy will be as rare as a corporate ‘no-internet’ policy is today.”
If your organization is still skeptical of the cloud, now is the time to take a closer look. Faster implementation timelines and reduced maintenance costs are just two reasons why the cloud is becoming the standard across all industries.
In our webinar, we dispelled common concerns and explored the benefits of operating in the cloud. We also provided real-world examples of companies that have taken the leap and discovered just how much better business works in the cloud.
Building a Modern Analytic Database with Cloudera 5.8 – Cloudera, Inc.
Analytic workloads and the ability to determine "what happened" are some of the most common use cases across enterprises today, helping you understand and adapt based on changing trends. However, most businesses today can see only a piece of the story. Analytics are limited by the amount of data that can be stored and ultimately accessed; it's time-intensive to bring in new datasets or fit unstructured data into rigid schemas; and user access is constrained to a select few who must already know the questions they're trying to answer.
It’s no surprise that big data is disrupting this modus operandi for analytics. A modern, Hadoop-based platform is designed to help businesses break free of these analytic limitations, providing a new kind of adaptive, high-performance analytic database. The recent release of Cloudera 5.8 continues to advance Cloudera Enterprise as the foundation for these analytic workloads.
Join Justin Erickson, Senior Director of Product Management at Cloudera, and Andy Frey, Chief Technology Officer at Marketing Associates, as they discuss:
-What technology is needed to build a modern analytic database with Hadoop
-What’s new with Cloudera 5.8
-How to align your teams around agile analytics
-Real world success from Marketing Associates
-What’s next for Cloudera Enterprise’s Analytic Database
3 Benefits of Multi-Temperature Data Management for Data Analytics – MapR Technologies
SAP® HANA and SAP® IQ are popular platforms for various analytical and transactional use cases. If you’re an SAP customer, you’ve experienced the benefits of deploying these solutions. However, as data volumes grow, you’re likely asking yourself: How do I scale storage to support these applications? How can I have one platform for various applications and use cases?
Best Practices for Monitoring Cloud Networks – ThousandEyes
Slides from the webinar of Wednesday, March 7th, 2018, presented by Ian Waters and Tim Hale from ThousandEyes, on how to adopt a Cloud Readiness Lifecycle methodology for monitoring cloud networks.
Turning Data into Business Value with a Modern Data Platform – Cloudera, Inc.
3 Things to Learn About:
-Real-time analytics and data in motion
-Self-service access for SQL analysts and data scientists alike
-Public cloud and hybrid infrastructure
How Data-Driven Approaches are Changing Your Data Management Strategies
Introducing data-driven strategies into your business model alters the way your organization manages and provides information to your customers, partners, and employees. Gone are the days of "waterfall" implementation strategies moving relational data to applications within a data center. Now, data-driven business models require agile implementation of applications based on information from all across an organization (on-premises, cloud, and mobile), including information from outside corporate walls: partners, third-party vendors, and customers. Data management strategies need to be ready to meet these challenges, or your new and disruptive business models will fail at the most critical time: when your customers want to access them.
ML Workshop 2: Machine Learning Model Comparison & Evaluation – MapR Technologies
How Rendezvous Architecture Improves Evaluation in the Real World
In this installment of our machine learning logistics webinar series, we build on the key requirements for effective management of machine learning logistics presented in the Overview webinar and in the Part 1 Workshop. Here we focus on model-to-model comparison and evaluation, the use of decoy models, and more. Listen here: http://info.mapr.com/machine-learning-workshop2.html?_ga=2.35695522.324200644.1511891424-416597139.1465233415
Self-Service Data Science for Leveraging ML & AI on All of Your Data – MapR Technologies
MapR has launched the MapR Data Science Refinery which leverages a scalable data science notebook with native platform access, superior out-of-the-box security, and access to global event streaming and a multi-model NoSQL database.
Enabling Real-Time Business with Change Data Capture – MapR Technologies
Machine learning (ML) and artificial intelligence (AI) enable intelligent processes that can autonomously make decisions in real-time. The real challenge for effective ML and AI is getting all relevant data to a converged data platform in real-time, where it can be processed using modern technologies and integrated into any downstream systems.
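Getting all relevant data to a platform in real time is what change data capture (CDC) addresses. Below is a minimal polling-based sketch using Python's built-in sqlite3, with a version column to detect changed rows; this is an illustration only (real CDC tools typically read the database's transaction log rather than polling, and the table and column names are hypothetical).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL, version INTEGER)")
conn.execute("INSERT INTO accounts VALUES (1, 100.0, 1), (2, 50.0, 1)")

last_seen = 0  # highest version already delivered downstream

def capture_changes(conn):
    """Return rows changed since the last call and advance the cursor."""
    global last_seen
    rows = conn.execute(
        "SELECT id, balance, version FROM accounts WHERE version > ?",
        (last_seen,)).fetchall()
    if rows:
        last_seen = max(r[2] for r in rows)
    return rows

first = capture_changes(conn)   # initial load: both rows
conn.execute("UPDATE accounts SET balance = 75.0, version = 2 WHERE id = 2")
second = capture_changes(conn)  # incremental: only the updated row
print(first, second)
```

Each captured batch would then be published to a stream so downstream ML/AI processes see changes as they happen, rather than waiting for a nightly batch.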
Machine Learning for Chickens, Autonomous Driving and a 3-year-old Who Won’t ... – MapR Technologies
Big data technologies are being applied to a wide variety of use cases. We will review tangible examples of machine learning, discuss an autonomous driving project and illustrate the role of MapR in next generation initiatives. More: http://info.mapr.com/WB_Machine-Learning-for-Chickens_Global_DG_17.11.02_RegistrationPage.html
ML Workshop 1: A New Architecture for Machine Learning Logistics – MapR Technologies
Having heard the high-level rationale for the rendezvous architecture in the introduction to this series, we will now dig in deeper to talk about how and why the pieces fit together. In terms of components, we will cover why streams work, why they need to be persistent, performant and pervasive in a microservices design and how they provide isolation between components. From there, we will talk about some of the details of the implementation of a rendezvous architecture including discussion of when the architecture is applicable, key components of message content and how failures and upgrades are handled. We will touch on the monitoring requirements for a rendezvous system but will save the analysis of the recorded data for later. Listen to the webinar on demand: https://mapr.com/resources/webinars/machine-learning-workshop-1/
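The selection step at the heart of the architecture can be sketched in a few lines. In this illustrative Python sketch (the names and message shapes are assumptions, not MapR APIs), every model publishes a score for the same request to a results stream, and the rendezvous returns the answer from the most-preferred model that responded, which is also how failover to a secondary model falls out naturally:

```python
# Minimal sketch of the rendezvous idea: every model scores every request via a
# shared stream, and a rendezvous step picks which model's answer to return.

def rendezvous(results, preference):
    """Return the answer from the most-preferred model that responded."""
    by_model = {r["model"]: r["score"] for r in results}
    for model in preference:
        if model in by_model:      # primary first, then fallbacks
            return model, by_model[model]
    raise RuntimeError("no model responded")

# All models consume the same request from the input stream and publish
# their scores to a results stream; the rendezvous reads them all.
request = {"features": [0.3, 0.7]}
results_stream = [
    {"model": "candidate-v2", "score": 0.91},
    {"model": "production-v1", "score": 0.88},
    {"model": "decoy", "score": None},     # decoy only archives inputs
]
chosen, score = rendezvous(results_stream, ["production-v1", "candidate-v2"])
print(chosen, score)  # production-v1 0.88
```

If the production model fails to respond, the same loop silently falls back to the candidate, illustrating how failures and upgrades are handled by the same mechanism.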
Machine Learning Success: The Key to Easier Model Management – MapR Technologies
Join Ellen Friedman, co-author (with Ted Dunning) of a new short O’Reilly book Machine Learning Logistics: Model Management in the Real World, to look at what you can do to have effective model management, including the role of stream-first architecture, containers, a microservices approach and a DataOps style of work. Ellen will provide a basic explanation of a new architecture that not only leverages stream transport but also makes use of canary models and decoy models for accurate model evaluation and for efficient and rapid deployment of new models in production.
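Two of those ideas can be sketched concretely. In this hedged illustration (the threshold and data shapes are made up, not taken from the book), a decoy "model" simply archives the raw inputs it sees for later replay, while a canary model's scores provide a stable baseline against which a new model's drift can be measured:

```python
# Sketch of decoy and canary models for model evaluation. The decoy records
# exact production inputs for replay; the canary gives a baseline to compare
# new models against. The 0.1 drift threshold is an arbitrary assumption.

def decoy(archive, request):
    archive.append(request)        # record the exact input seen in production
    return None                    # a decoy never answers

def drift(new_scores, canary_scores, threshold=0.1):
    """Flag a new model whose scores diverge too far from the canary's."""
    gaps = [abs(n - c) for n, c in zip(new_scores, canary_scores)]
    return sum(gaps) / len(gaps) > threshold

archive = []
for req in [{"x": 1}, {"x": 2}]:
    decoy(archive, req)

print(len(archive))                         # 2 inputs archived for replay
print(drift([0.9, 0.2], [0.88, 0.21]))      # False: close to the canary
print(drift([0.4, 0.9], [0.88, 0.21]))      # True: something changed
```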
Live Tutorial – Streaming Real-Time Events Using Apache APIs – MapR Technologies
For this talk we will explore the power of streaming real time events in the context of the IoT and smart cities.
http://info.mapr.com/WB_Streaming-Real-Time-Events_Global_DG_17.08.02_RegistrationPage.html
Bringing Structure, Scalability, and Services to Cloud-Scale Storage – MapR Technologies
Deploying storage with a forklift is so 1990s, right? Today’s applications and infrastructure demand systems and services that scale. Customers require performance and capacity to fit the use case and workloads, not the other way around. Architects need multi-temperature, multi-location, highly available, and compliance friendly platforms that grow with the generational shift in data growth and utility.
Churn prediction is big business. It minimizes customer defection by predicting which customers are likely to cancel a service. Though originally used within the telecommunications industry, it has become common practice for banks, ISPs, insurance firms, and other verticals. More: http://info.mapr.com/WB_PredictingChurn_Global_DG_17.06.15_RegistrationPage.html
The prediction process is data-driven and often uses advanced machine learning techniques. In this webinar, we'll look at customer data, do some preliminary analysis, and generate churn prediction models – all with Spark machine learning (ML) and a Zeppelin notebook.
The goal of Spark’s ML library is to make machine learning scalable and easy. Zeppelin with Spark provides a web-based notebook that enables interactive machine learning and visualization.
In this tutorial, we'll do the following:
Review classification and decision trees
Use Spark DataFrames with Spark ML pipelines
Predict customer churn with Apache Spark ML decision trees
Use Zeppelin to run Spark commands and visualize the results
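The tutorial itself uses Spark ML's DecisionTreeClassifier inside a Zeppelin notebook. As a dependency-free illustration of the underlying idea, here is a one-level decision tree (a "stump") that learns a single threshold on one feature, such as the number of customer-service calls, to predict churn. The data and feature are invented for illustration:

```python
# A decision tree repeatedly splits the data on the feature/threshold that best
# separates the classes. This stump performs one such split, on one feature.

def fit_stump(xs, ys):
    """Find the threshold on a numeric feature that best separates churners."""
    best = (None, -1)
    for t in sorted(set(xs)):
        # predict churn=1 when the feature value exceeds the threshold
        correct = sum((x > t) == y for x, y in zip(xs, ys))
        if correct > best[1]:
            best = (t, correct)
    return best[0]

# service calls per customer, and whether they churned (1) or stayed (0)
calls = [1, 2, 2, 3, 5, 6, 7, 8]
churn = [0, 0, 0, 0, 1, 1, 1, 1]
threshold = fit_stump(calls, churn)
print(threshold)            # 3: customers with >3 calls are predicted to churn
```

Spark ML generalizes this to many features and many levels, and distributes the split search across the cluster; the pipeline and DataFrame APIs wrap that into the fit/transform workflow the tutorial walks through.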
How to Leverage the Cloud for Business Solutions | Strata Data Conference Lon... – MapR Technologies
IT budgets are shrinking, and the move to next-generation technologies is upon us. The cloud is an option for nearly every company, but just because it is an option doesn’t mean it is always the right solution for every problem.
Most cloud providers would prefer that every customer be tightly coupled with their proprietary services and APIs to create lock-in with that cloud provider. The savvy customer will leverage the cloud as infrastructure and stay loosely bound to a cloud provider. This creates an opportunity for the customer to execute a multicloud strategy or even a hybrid on-premises and cloud solution.
Jim Scott explores different use cases that may be best run in the cloud versus on-premises, points out opportunities to optimize cost and operational benefits, and explains how to get the data moved between locations. Along the way, Jim discusses security, backups, event streaming, databases, replication, and snapshots across a variety of use cases that run most businesses today.
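One common way to stay loosely bound, in the spirit of the talk, is to code against a narrow storage interface and confine provider specifics to adapters. The classes below are an illustrative sketch, not a real cloud SDK:

```python
# Sketch of provider-agnostic storage: application code depends only on a small
# interface, so provider adapters (S3, GCS, on-prem, ...) can be swapped freely.

class ObjectStore:
    def put(self, key, data): raise NotImplementedError
    def get(self, key): raise NotImplementedError

class InMemoryStore(ObjectStore):
    """Stand-in for any provider-specific adapter."""
    def __init__(self):
        self._blobs = {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]

def archive_events(store: ObjectStore, events):
    # Application logic sees only the interface; moving between clouds, or
    # running hybrid on-prem/cloud, never touches this function.
    for i, e in enumerate(events):
        store.put(f"events/{i}", e)

store = InMemoryStore()
archive_events(store, ["login", "purchase"])
print(store.get("events/1"))  # purchase
```

The cost of this looseness is forgoing some provider-native conveniences, which is exactly the trade-off between lock-in and multicloud portability the talk weighs.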
Is your organization at the analytics crossroads? Have you made strides collecting and sharing massive amounts of data from electronic health records, insurance claims, and health information exchanges but found these efforts made little impact on efficiency, patient outcomes, or costs?
Changes in how business is done combined with multiple technology drivers make geo-distributed data increasingly important for enterprises. These changes are causing serious disruption across a wide range of industries, including healthcare, manufacturing, automotive, telecommunications, and entertainment. Technical challenges arise with these disruptions, but the good news is there are now innovative solutions to address these problems. http://info.mapr.com/WB_Geo-distributed-Big-Data-and-Analytics_Global_DG_17.05.16_RegistrationPage.html
MapR announced a few new releases in 2017, and we want to go over those exciting new products and features that are available now. We’d like to invite our customers and partners to this webinar in which members of the MapR product team will share details about the latest updates.
Cisco & MapR bring 3 Superpowers to SAP HANA Deployments – MapR Technologies
SAP HANA is an increasingly popular platform for various analytical and transactional use cases with its in-memory architecture. If you’re an SAP customer you’ve experienced the benefits.
However, the underlying storage for SAP HANA is painfully expensive. This slows down your ability to grow your SAP HANA footprint and serve up more applications.
You’re not the only one still loading your data into data warehouses and building marts or cubes out of it. But today’s data requires a much more accessible environment that delivers real-time results. Prepare for this transformation: your data platform and storage choices are about to undergo a once-in-30-years re-platforming.
With the MapR Converged Data Platform (CDP) and Cisco Unified Compute System (UCS), you can optimize today’s infrastructure and grow to take advantage of what’s next. Uncover the range of possibilities from re-platforming by intimately understanding your options for density, performance, functionality and more.
Open Source Innovations in the MapR Ecosystem Pack 2.0 – MapR Technologies
Over the summer, we introduced the MapR Ecosystem Pack (MEP) which is a natural evolution of our existing software update program that decouples open source ecosystem updates from core platform updates. MEP gives our customers quick access to the latest open source innovations while also ensuring cross-project compatibility in any given MEP version.
How Spark is Enabling the New Wave of Converged Cloud Applications – MapR Technologies
Apache Spark has become the de-facto compute engine of choice for data engineers, developers, and data scientists because of its ability to run multiple analytic workloads with a single, general-purpose compute engine.
But is Spark alone sufficient for developing cloud-based big data applications? What are the other required components for supporting big data cloud processing? How can you accelerate the development of applications which extend across Spark and other frameworks such as Kafka, Hadoop, NoSQL databases, and more?
Neuro-symbolic is not enough, we need neuro-*semantic* – Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply applying machine learning to just any symbolic structure is not sufficient to really harvest the gains of NeSy. Those gains come only when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this is illustrated with link prediction over knowledge graphs, but the argument is general.
DevOps and Testing slides at DASA Connect – Kari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference on 30 May 2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps looks like. We closed with a lovely workshop in which participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
Key Trends Shaping the Future of Infrastructure.pdf – Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
Essentials of Automations: Optimizing FME Workflows with Parameters – Safe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 – Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
UiPath Test Automation using UiPath Test Suite series, part 3 – DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation introduction
UI automation sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Generating a custom Ruby SDK for your web service or Rails API using Smithy – g2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
State of ICS and IoT Cyber Threat Landscape Report 2024 preview – Prayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities, which are spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on countries – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
The Art of the Pitch: WordPress Relationships and Sales – Laura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
GraphRAG is All You need? LLM & Knowledge Graph – Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Accelerate your Kubernetes clusters with Varnish Caching – Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
UiPath Test Automation using UiPath Test Suite series, part 4 – DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality – Inflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Thank you for your time today. We’ll walk through a brief presentation to give you an overview of MapR. What we’ll cover can be summarized in three points.
Hadoop is the leading Big Data platform technology, with the power to transform a customer’s business
MapR gives you the most technologically advanced distribution for Hadoop
MapR has the products, services, and partner network to ensure success in production and beyond.
Hadoop is making CIOs rethink their data architecture. It represents a fundamental shift in the economics of data storage, processing, and analytics, and it is opening up entirely new business opportunities. Let’s talk about three key trends we are seeing, as well as three realities, or implications, for your business and your “readiness” to harness the power of big data and Hadoop.
The first trend is that industry leaders have shown how to use big data to compete and win in their markets. It’s no longer a nice-to-have – you need big data to compete.
Google pioneered MapReduce processing on commodity hardware and used it to catapult themselves into the position of leading search engine, even though they were 19th to market
Yahoo! leveraged these ideas to create Hadoop to keep up with Google, and many mainstream companies have followed with new data-driven applications such as “people you may know” (started by LinkedIn and now used by Facebook, Twitter, and every social application), product recommendation engines, contextual and personalized music services (Beats), measuring digital media effectiveness (comScore), serving more relevant, targeted ads (Comcast, Rubicon Project), fraud and risk detection, healthcare efficacy, and more
What makes the difference? A lot of attention is given to data science and developing sophisticated new algorithms, but in many cases just having more data beats better algorithms. (make point on collecting more consumer interaction as well as transaction data, as an example).
In addition, competitive advantage is decided by very small percentages. Just a 1% improvement in fraud detection can mean hundreds of millions of dollars in savings. A half-percent lift in advertising effectiveness means millions in new product sales and profitability. The same applies to customer churn, disease diagnosis, and more.
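As a quick sanity check on that arithmetic (the baseline exposure figure below is an assumption for illustration, not a number from the deck):

```python
# Back-of-the-envelope arithmetic behind the "very small percentages" claim.
annual_fraud_exposure = 30_000_000_000          # assume $30B of annual fraud exposure
one_percent_improvement = annual_fraud_exposure // 100
print(one_percent_improvement)                  # 300000000 -> hundreds of millions
```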
A second trend in enterprise architecture has been big data overwhelming the existing workload-specific systems in production. (list of requirements for each of these on the side in text)
People started with mainframes or operational systems which run ERP, finance, CRM and other mission-critical applications. They require… (pick out attributes you want to stress on the left)
You also have data warehouses, marts, data mining, and other analytical systems which pull data from these operational and other systems to provide insights to the business for decision making.
The amount and variety of data has been overloading these systems. As you try to ingest new types of data, you reach a point where these systems are no longer cost-effective to scale to terabytes or petabytes of data.
Hadoop has become the de facto big data platform that allows organizations to keep up with big data and feed data-driven applications and processes.
This chart shows the percentage growth of jobs from Indeed.com.
Compared to other popular technologies such as MongoDB and Cassandra, Hadoop is not only the fastest-growing big data technology, it’s one of the fastest-growing technologies, period.
Hadoop has the most robust ecosystem and momentum and is the big data platform of choice for industry-leading, data-driven companies
(Also of interest is that Indeed.com (which is a subsidiary of a Japanese-owned company) is a customer of MapR – they harness and analyze all of the job trends data using MapR)
The first reality is that as people put Hadoop into production to relieve the pressure on other systems in their enterprise architecture, it needs to be reliable. Hadoop needs to be held to the same enterprise standards as your Oracle, SAP, Teradata, NetApp storage, or any other enterprise system.
Many organizations are putting Hadoop into their data center to provide (list of use cases underneath) … it can do all of this and more, but
For Hadoop to act as a system of record, it must provide the same guarantees for SLAs, performance, data protection, and more
Most importantly, Hadoop has the potential for both analytics AND operations. It can be used to optimize the data warehouse, providing batch data refining or storage. But Hadoop can also handle many operational analytics or database operations and jobs when done right.
In a recent article by Tom Davenport (http://www.cmswire.com/cms/big-data/5-things-to-lessen-your-anxiety-about-big-data-024382.php) – he says
“Big data’s biggest wins come from making many small decisions vs. one that’s huge. The majority of big data driven decisions will be recurring, made at speed (in milliseconds), and at scale; actions will be taken automatically (vs. reviewed and approved by an individual). Examples include ad platforms making many constant adjustments, fraud detection on millions of transactions that are based on individual patterns, fleet management and routing taking into account current conditions….
This requires a Hadoop platform that can go beyond batch and support streaming writes, so data can be continuously written to the system while analysis is being conducted. It requires high performance to meet business needs, and real-time operation: the ability to perform online database operations to react to the business situation and impact the business as it happens, not report on it a week, month, or quarter later.
To do this requires THE RIGHT ARCHITECTURE
96% of US internet traffic
Formerly used 2 other distros
Went to MapR to meet very high SLAs and performance requirements
Push messaging. Starbucks or ESPN applications, and others.
MapR is the only software that they pay for. Have HBase committers on staff.
Took 8 application clusters and consolidated them into 1 MapR cluster; now have 1 cluster with 8 sub-clusters running on different sets of nodes. Data placement control enables this.
Went from 12 CDH servers down to 6, just for HBase tables. (They won’t use M7 since they are HBase committers.)
Verizon Teradata example
Less than 10% of CDRs analyzed
More relevant local example Experian
Solutionary is a Managed Security Services provider with services that include network intrusion detection
----- Meeting Notes (3/27/14 11:12) -----
Zions Bank
Video - Phishing Attack
MapR is the Hadoop technology leader with over 500 paying customers and the largest production deployments in the world.
People like to think of Yahoo, Facebook, and LinkedIn as big Hadoop users, and they are, but you would expect this because of their deep engineering heritage. Mainstream organizations that want to leverage Hadoop without hiring armies of engineers turn to MapR. We have the largest retailer, the largest financial services deployment, and the largest media, healthcare, and government agency deployments.
Through a combination of Apache Hadoop community participation and a differentiated data platform, MapR lets organizations do more with Hadoop in both operational and analytical use cases that are expensive or impossible with other Hadoop distributions.
Again,
Hadoop is the leading Big Data platform technology, with the power to transform a customer’s business
MapR gives you the most technologically advanced distribution for Hadoop
MapR has the products, services, and partner network to ensure success in production and beyond.