The Briefing Room with Dr. Robin Bloor and NuoDB
Live Webcast on March 25, 2014
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=ac6cb15c0aaaa6d044784969e4187696
Enterprise organizations are already deeply embedded in the cloud, whether it’s via Salesforce.com for customer relationship management or Marketo for marketing and lead generation. But frequently the most significant impediment to moving the crown jewels of corporate data to the cloud is the database. A cloud database must be secure, flexible enough to solve a variety of problems, easy to automate and administer, and able to run in multiple cloud data centers simultaneously. Plus, it should be consistently resilient in the face of failure, not to mention cost-effective, just like the cloud itself.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor explain how cloud deployments are the inevitable next step for information management. He will be joined by Jim Starkey, co-founder of NuoDB, who will discuss the common reasons enterprises shy away from leveraging a database in the cloud, as well as how a next-generation DBMS, purpose-built for the cloud, can create strategic organizational advantage.
Visit InsideAnalysis.com for more information.
The starting point for this project was a MapReduce application that processed log files produced by the support portal. The application ran on Hadoop with Ruby Wukong. At the start of the project it was underperforming and did not scale well, which made the case for redesigning it using Spark with Scala and Java.
An initial review of the Ruby code revealed that it used disk I/O excessively to communicate between MapReduce jobs: each job was implemented as a separate script that passed large data volumes through. Spark is more efficient at managing intermediate data between MapReduce jobs – not only does it keep that data in memory whenever possible, it often eliminates the need for intermediate data altogether. However, that alone did not bring us much improvement, since there were additional bottlenecks at the data aggregation stages.
The application involved a global data-ordering step, followed by several localized aggregation steps. The initial global sort required a significant, inefficient data shuffle. Spark allowed us to partition the data and convert the single global sort into many local sorts, each running on a single node and not exchanging any data with other nodes. As a result, several data processing steps started to fit into node memory, which brought about a tenfold performance improvement.
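The partition-then-sort-locally idea can be sketched in plain Python (a toy illustration, not the project's actual Scala code; the record shape and function names are invented):

```python
from collections import defaultdict

def partitioned_sort(records, key, num_partitions=4):
    """Hash-partition records by key, then sort each partition locally.

    Instead of one global sort that shuffles all data, every record is
    routed to a partition based on its key, so each partition can be
    sorted independently with no further data exchange.
    """
    partitions = defaultdict(list)
    for rec in records:
        partitions[hash(key(rec)) % num_partitions].append(rec)
    # Each local sort touches only its own partition's data; in Spark this
    # corresponds to repartitioning by key so every node sorts its own slice.
    return {p: sorted(rows) for p, rows in partitions.items()}

# Hypothetical log records: (session_id, timestamp).
logs = [("a", 3), ("b", 1), ("a", 1), ("b", 2), ("a", 2)]
by_partition = partitioned_sort(logs, key=lambda r: r[0], num_partitions=2)
```

Spark exposes this pattern directly (for example, repartitioning by key followed by a per-partition sort), which is what eliminated the global shuffle described above.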
DataStax C*ollege Credit: What and Why NoSQL? – DataStax
In the first of our bi-weekly C*ollege Credit series, Aaron Morton, DataStax MVP and Apache Cassandra committer, and Robin Schumacher, VP of product management at DataStax, will take a look back at the history of NoSQL databases and provide a foundation of knowledge for people looking to get started with NoSQL, or just wanting to learn more about this growing trend. You will learn how to determine whether NoSQL is right for your application, and how to pick a NoSQL database. This webinar is C* 101 level.
Analytics, Big Data and Nonvolatile Memory Architectures – Why you Should Car... – StampedeCon
This session will begin with an overview of current non-volatile memory (NVM, aka persistent memory) architectures and their place in the memory and storage hierarchy, both near- and far-processor. A discussion of their significant impact on computing analytic workloads now and in the near future will ensue, including use cases and the concept of very large persistent memory surfaces as applied to both analytic computation and storage for big data workflows. The presentation will end with ‘why you should care’ about such technologies, which inevitably will completely change the way we think about solving data-intensive problems.
How to Succeed in Hadoop: comScore’s Deceptively Simple Secrets to Deploying ... – MapR Technologies
Get an insider's view into one of the most talked-about Hadoop deployments in the world!
As more enterprises realize the value of big data, Hadoop is moving from lab curiosity to genuine competitive advantage. But how can you confidently deploy it in a production environment?
In this joint webinar with Syncsort, learn firsthand from industry thought leader Mike Brown, CTO of comScore, how to offload critical data and optimize your enterprise data architecture with Hadoop to increase performance while lowering costs.
Exploring microservices in a Microsoft landscape – Alex Thissen
Presentation for Dutch Microsoft TechDays 2015 with Marcel de Vries:
During this session we will take a look at how to realize a Microservices architecture (MSA) using the latest Microsoft technologies available. We will discuss some fundamental theories behind MSA and show how this can actually be realized with Microsoft technologies such as Azure Service Fabric. This session is a real must-see for any developer who wants to stay ahead of the curve in modern architectures.
Big Data Education Webcast: Introducing DMX and DMX-h Release 8 – Precisely
Check out this webcast, where our Big Data product experts take you on a tour of the coolest features, complete with product demos. Tune in to learn how you can:
Future-proof your applications. Deploy the same data flows on or off of Hadoop, on premise or in the cloud, with no application changes
Save users from the underlying complexities of Hadoop with our new Intelligence Execution Layer
Ingest data directly into Big Data formats such as Avro & Parquet – in one step & without staging
Load Apache Spark engines with mainframe data via a new, Cloudera-certified Spark mainframe connector
Turn raw data into powerful insights in just one click with our new connectors for QlikView and Tableau
Data Lake and the rise of the microservices – Bigstep
By looking at structured and unstructured data together, Data Lakes enable companies to understand correlations between existing data and new external data - such as social media - in ways traditional Business Intelligence tools cannot.
To do this, you need to find the most efficient way to store and access structured or unstructured petabyte-sized data across your entire infrastructure.
In this meetup we’ll answer the following questions:
1. Why would someone use a Data Lake?
2. Is it hard to build a Data Lake?
3. What are the main features that a Data Lake should bring in?
4. What’s the role of the microservices in the big data world?
Learn how to build a global, multi-region MySQL cloud back-end capable of serving hundreds of millions of online multiplayer game accounts. In this webinar, you will find solutions to the typical business challenges of serving a geographically distributed audience - like Riot Games - with low latency, fast response times, rapid-failover automated high availability, simple administration, system visibility, and stability.
AGENDA
This webinar has three parts, and lasts about 30 minutes.
- Customer Use Case:
- Customer Profile
- Business Challenge
- The Solution Architecture
- Significant Benefits
- Continuent Benefits
- Q & A
DATA LAKE AND THE RISE OF THE MICROSERVICES - ALEX BORDEI – Big Data Week
Alex Bordei is a developer turned Product Manager. He has been developing infrastructure products for over nine years. Before becoming Bigstep’s Product Manager, he was one of the core developers for Hostway Corporation’s provisioning platform. He then focused on defining and developing products for Hostway’s EMEA market and was one of the pioneers of virtualization in the company. After successfully launching two public clouds based on VMware software, he created the first prototype of Bigstep’s Full Metal Cloud in 2011. He now focuses on guaranteeing that the Full Metal Cloud is the highest-performance cloud in the world for big data applications.
Azure + DataStax Enterprise (DSE) Powers Office365 Per User Store – DataStax Academy
We will present our Office 365 use case scenarios, why we chose Cassandra + Spark, and walk through the architecture we chose for running DSE on Azure.
The presentation will feature demos on how you too can build similar applications.
Manage Microservices & Fast Data Systems on One Platform w/ DC/OS – Mesosphere Inc.
The application landscape inside our data center is changing: along with the trend toward microservices and containers, new distributed data processing frameworks such as Kafka or Cassandra are being released on a weekly basis. These changes have implications for the ways we think about infrastructure. With the growing need for computing power and the rise of distributed applications comes the need for a reliable and simple-to-use cluster manager and programming abstraction.
In this presentation, Mesosphere explains how to use DC/OS to manage microservices and fast data systems on a single platform. We will look at how container orchestration, including resource management and service management, can be streamlined to process fast data in a matter of seconds, allowing for predictive user interfaces, product recommendations, and billing charge back, among other modern app components.
Supporting Hadoop in containers takes much more than the very primitive support Docker provides using the Storage Plugin. A production-scale Hadoop deployment inside containers needs to honor affinity/anti-affinity, fault-domain and data-locality policies. Kubernetes alone, with primitives such as StatefulSets and PersistentVolumeClaims, is not sufficient to support a complex, data-heavy application such as Hadoop. One needs to think about this problem more holistically across the container, networking and storage stacks. Also, constructs around deployment, scaling, upgrades, etc. in traditional orchestration platforms are designed for applications that have adopted a microservices philosophy, which doesn't fit most Big Data applications across the ingest, store, process, serve and visualization stages of the pipeline. Come to this technical session to learn how to run and manage the lifecycle of containerized Hadoop and other applications in the data analytics pipeline efficiently and effectively, far beyond simple container orchestration. #BigData, #NoSQL, #Hortonworks, #Cloudera, #Kafka, #Tensorflow, #Cassandra, #MongoDB, #Kudu, #Hive, #HBase. PARTHA SEETALA, CTO, Robin Systems.
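As one concrete illustration of the anti-affinity policies mentioned above, a Kubernetes pod anti-affinity rule can keep replicas of a stateful workload off the same physical host – though, as the session argues, this covers only one of the concerns. A minimal sketch, with a hypothetical HDFS DataNode StatefulSet (names and image are invented):

```yaml
# Sketch only: spread hypothetical DataNode replicas across worker nodes.
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: hdfs-datanode            # hypothetical name
spec:
  serviceName: hdfs-datanode
  replicas: 3
  selector:
    matchLabels:
      app: hdfs-datanode
  template:
    metadata:
      labels:
        app: hdfs-datanode
    spec:
      affinity:
        podAntiAffinity:
          requiredDuringSchedulingIgnoredDuringExecution:
            - labelSelector:
                matchLabels:
                  app: hdfs-datanode
              topologyKey: kubernetes.io/hostname   # no two DataNodes per host
      containers:
        - name: datanode
          image: example/hdfs-datanode:latest       # placeholder image
  volumeClaimTemplates:
    - metadata:
        name: data
      spec:
        accessModes: ["ReadWriteOnce"]
        resources:
          requests:
            storage: 100Gi
```

Fault-domain spreading could use a zone-level `topologyKey` instead, but data locality and pipeline-wide lifecycle management still fall outside what this primitive expresses.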
Rolling presentation during Couchbase Day, including:
Introduction to NoSQL
Why NoSQL?
Introduction to Couchbase
Couchbase Architecture
Single Node Operations
Cluster Operations
HA and DR
Availability and XDCR
Backup/Restore
Security
Developing with Couchbase
Couchbase SDKs
Couchbase Indexing
Couchbase GSI and Views
Indexing and Query
Couchbase Mobile
30 Minutes to the Analytics Platform with Infrastructure as Code – Guido Schmutz
Analytical platforms for PoCs and evaluation can be built in the cloud in an hour – with ready-made setup scripts. But if you put the services together freely, it gets more difficult. The open-source platform-in-a-box "Platys" (https://github.com/TrivadisPF/platys) shows that it can be easier for test and PoC environments. In addition to possible uses and examples, we explain the services and "just briefly" set up a data lake with a database, event broker, stream processing, blob store, SQL access and a data science notebook.
Microsoft: Building a Massively Scalable System with DataStax and Microsoft's... – DataStax Academy
We face the challenge of how to reliably store massive quantities of data that remain available even in the face of infrastructure failures. We have similar challenges on the application side. The most successful cloud architectures break applications down into microservices. How then do we deploy, upgrade and manage those microservices at scale? This session will illustrate how to tackle these challenges by taking advantage of both Cassandra and Microsoft's next-generation PaaS infrastructure, Azure Service Fabric.
Building A Diverse Geo-Architecture For Cloud Native Applications In One Day – VMware Tanzu
Presenter: Ben Laplanche, Product Manager, Pivotal Cloud Foundry
Companies turn to PaaS and Cloud Native Applications to gain agility and speed. To provide customer value, a fault tolerant infrastructure is essential. But what happens if an entire data center, region, or even country should go offline? Cassandra holds the key to keeping application state in sync through replication, whilst Pivotal Cloud Foundry provides easy deployment to multiple IaaS providers. It also comes complete with a managed service offering for DataStax Enterprise. This talk will discuss how this setup can be deployed in one day, including demonstrations and a walkthrough of the key concepts, approaches, and considerations.
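The cross-data-center replication the talk relies on is configured at the keyspace level in Cassandra. A minimal CQL sketch (the keyspace and data center names are invented for illustration):

```sql
-- Sketch: replicate a keyspace across two data centers so application
-- state survives the loss of an entire region (names are hypothetical).
CREATE KEYSPACE app_state
  WITH replication = {
    'class': 'NetworkTopologyStrategy',
    'us_east': 3,   -- three replicas in one data center
    'eu_west': 3    -- three replicas in the other
  };
```

With per-data-center replica counts like these, Cassandra keeps writes flowing to both regions, which is what lets the application fail over when a whole data center goes offline.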
Review Oracle OpenWorld 2015 - Overview, Main themes, Announcements and Future – Lucas Jellema
This presentation (part of the annual AMIS Oracle OpenWorld Review session) discusses the main themes of this year's conference and introduces the all-encompassing cloud strategy. It highlights some major changes at Oracle Corporation, and lists the major announcements, the hot terminology and the product roadmaps.
The Cloud Imperative – What, Why, When and How – Inside Analysis
TechWise Episode III, Featuring Dr. Robin Bloor and Gilbert Cutsem
Live Webcast on September 24, 2014
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=61c9c40def9ecba2b0dabf03b6075f3a
Regardless of where you stand in the enterprise, Cloud Computing has arrived. From the analytics that drive dynamic change, to the operational systems that keep the business humming; from the predictive models that improve results, to the database systems that underpin the most advanced infrastructure in history -- the Cloud now challenges the status quo in every corner of the enterprise.
Register for this episode of TechWise to hear veteran Analysts Dr. Robin Bloor of The Bloor Group, and Gilbert Cutsem, as they explain how today’s cutting edge Cloud solutions can deliver enterprise caliber software like never before. They’ll discuss best practices for moving to the Cloud, and offer insights for enabling intelligent hybrid architectures that can connect data, systems and business processes.
Visit InsideAnalysis.com for more information.
Think Big - How to Design a Big Data Information Architecture – Inside Analysis
Exploratory Webcast for the Big Data Information Architecture Research Project
Live Webcast Jan. 22, 2014
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=32304b307fc5359a2f97b173166ea07b
Big Data is everywhere -- that's for sure. But the big question for today's savvy enterprise is where, exactly, should it fit within the Information Architecture? Making that decision correctly can save a lot of money while adding significant value to any number of enterprise operations. Business processes can be improved with critical new data sets; marketing can excel at hitting the right targets quickly; sales can hit home runs by having a much deeper understanding of key prospects; and senior executives can see the big picture more clearly than ever before.
Register for this Exploratory Webcast to hear veteran Analyst Dr. Robin Bloor outline the current landscape of Big Data, and offer guidance for today's organizations to determine how, when and where to deploy this powerful if unwieldy information asset. This event will kick off The Bloor Group's Interactive Research Report for 2014 which will focus on illuminating optimal Big Data Information Architectures. The series will include a dozen interviews with today's Big Data visionaries, plus three interactive Webcasts and a detailed findings report.
Visit InsideAnalysis.com for more information.
Down to Business: Taking Action Quickly with Linked Data Services – Inside Analysis
The Briefing Room with Krish Krishnan and Denodo
Live Webcast on May 28, 2013
Rapid time-to-insight makes analysts happy, but rapid time-to-action is what executives want most. Being able to respond quickly to market changes, new opportunities or customer requests is increasingly a must-have in today's competitive landscape. The key ingredient for this kind of organizational flexibility? Data! Companies that can quickly pull together a variety of data sources have a significant advantage over those that cannot.
Register for this episode of The Briefing Room to hear Analyst Krish Krishnan of Sixth Sense explain how linked data services can provide the necessary foundation for an agile enterprise. He'll be briefed by Suresh Chandrasekaran of Denodo Technologies who will showcase his company's mature data virtualization platform. He'll demonstrate how a point-and-click interface can be used to quickly assemble a wide range of data sets, thus enabling companies to build business solutions that address very specific enterprise needs.
Visit: http://www.insideanalysis.com
Continuous Intelligence: Staying Ahead with Streaming Analytics – Inside Analysis
The Briefing Room with Mark Madsen and SQLstream
Live Webcast Mar. 12, 2013
The battle cry of “time to insight” continues to change the way organizations seek insights from their data, big and small. One increasingly popular strategy focuses on analyzing data streams, ranging from social media to machine-generated data captured in logs. By focusing on the kind of continuous intelligence that can flow from such analysis, organizations can stay ahead of their competitors by seizing new opportunities and often avoiding problematic disruptions.
Register for this episode of The Briefing Room to hear veteran Analyst Mark Madsen of Third Nature, as he explains how a new wave of technologies offers a compelling alternative to the traditional means for generating insights. He’ll be briefed by Damian Black of SQLstream, who will explain how his company’s platform was designed to enable real-time analysis of multiple data streams. He’ll discuss how streaming analytics can remove the gap between traditional business intelligence and operational systems.
Visit: http://www.insideanalysis.com
No Time-Outs: How to Empower Round-the-Clock Analytics | Inside Analysis
The Briefing Room with Rick Sherman and Actian
Slides from the Live Webcast on Aug. 28, 2012
The appetite for high-powered analytics is greater than ever these days, with increasing numbers of business users clamoring for insights. At the same time, source systems are proliferating, and the nature of questions being asked is getting more complex. Indeed, the entire landscape of analytics is changing in fundamental ways. How can your organization stay ahead of the curve?
Register for this episode of The Briefing Room to learn from veteran Analyst Rick Sherman how a variety of technologies can change the manner in which analytics are done. He'll be briefed by Fred Gallagher of Actian, who will explain how his company's Vectorwise technology leverages vector processing to expedite even the most complex queries when compared to traditional columnar or relational databases.
For more information visit: http://www.insideanalysis.com
Thinking Outside the Cube: How In-Memory Bolsters Analytics | Inside Analysis
The Briefing Room with Mark Madsen and IBM
Live Webcast on Aug. 27, 2013
Visit: www.insideanalysis.com
What's old is often new again, especially in the world of information management. The innovation of OLAP cubes years ago transformed business intelligence by empowering analysts with significantly faster number-crunching capabilities. Today, with data volumes exploding, a new kind of cube is offering similar value, thanks in large part to in-memory analytics.
Register for this episode of The Briefing Room to learn from veteran Analyst and practitioner Mark Madsen of Third Nature, who will explain how this new wave of in-memory technology can give analysts a needed boost for dealing with the rising tide of data volumes and types. He'll be briefed by Chris McPherson of IBM Business Analytics, who will tout IBM Cognos Dynamic Cubes, which were specifically designed to let business users maintain the speed and agility they need for their analytical solutions.
A Tighter Weave – How YARN Changes the Data Quality Game | Inside Analysis
Hot Technologies with David Loshin, David Raab and RedPoint Global
Live Webcast on August 20, 2014
Watch the archive:
https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=cc1ff3fd6d8642b3cc2d3866358387b1
The game-changing power of Hadoop is no longer questioned in the world of data management. But the bigger story these days is YARN, sometimes called Hadoop 2.0. This innovation extends the power of Hadoop to the entire spectrum of enterprise applications, and has a uniquely compelling story for data quality. In fact, solutions built around YARN now promise to revolutionize this critical field, by weaving together the myriad data quality practices into a comprehensive, end-to-end platform.
Register for this episode of Hot Technologies to hear veteran Analysts David Loshin and David Raab as they explain how YARN has opened up a new world of possibilities for enterprise data quality. They’ll be briefed by George Corugedo of RedPoint Global who will demonstrate how his company uses YARN as the backbone for a next-generation data quality platform. He’ll show how long-standing best practices can be stitched together quickly, and can be augmented by the latest advances in machine learning and predictive analytics.
Visit InsideAnalysis.com for more information.
Enabling Flexible Governance for All Data Sources | Inside Analysis
The Briefing Room with Robin Bloor and Birst
Live Webcast on Feb. 5, 2013
All the effort that goes into data governance can quickly be lost if effective guard rails aren't in place. However, end users invariably need additional data sets in order to get a complete picture of what's happening. All too often, some or all of those additional data sources have not yet run the gauntlet of governance. Striking a balance between core and contextual data can help ensure that your business stays on top of opportunities without straying from the path.
Check out this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor, who will explain the nuances of integrating governed and ungoverned data in ways that business users can easily leverage. He'll be briefed by Brad Peters of Birst who will demonstrate how managed data mashups can provide the kind of flexibility and agility that can lead to valuable insights. He'll explain how Birst's architecture can significantly lighten the load on IT without sacrificing data integrity, security or governance.
Visit: http://www.insideanalysis.com
Raising the Bar: Innovative Healthcare Program Fosters Collaboration, Education | Inside Analysis
Big Data, Big Discoveries WebSeries
Live Webcast Feb. 27, 2013
Improving patient care remains a top priority for America's healthcare organizations, but for a group of providers in New Orleans, that bar wasn't high enough. Last year, led by the Louisiana Public Health Institute, several community healthcare providers embarked on a redesign of comprehensive care delivery, called the Crescent City Beacon Community, to improve the overall health of the greater New Orleans community. Though still in its early stages, the project has already received national recognition for excellence, and is being considered as a model for other major urban centers throughout the country. The program has focused on quality improvement for chronic care management in primary practices, enabling transitions of care using health information technology, and promoting consumer engagement through mobile phones.
Check out the slides from this free Webcast to hear LPHI's director, Dr. Anjum Khurshid, explain the component parts of this program, the foundation of which is the Greater New Orleans Health Information Exchange. Inspired by the Affordable Care Act, HIEs are intended to foster collaboration among and between various health care institutions by providing access to electronic medical records. Care coordination systems based on HIEs can greatly improve patient care, while also lowering costs, in part by reducing preventable emergency room visits and fragmentation of the healthcare system. Another innovative component of the program involves an interactive social media campaign designed to educate the New Orleans community about the risks of diabetes.
Sponsored by the BioDistrict New Orleans, this webcast is the first in a series designed to showcase the exemplary projects taking place within the district.
Visit: http://www.insideanalysis.com
The Briefing Room with Robin Bloor and Tableau Software
Live Webcast Sept. 17, 2013
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?AT=pb&SP=EC&rID=7461107&rKey=61f984b8947229b9
In the modern world of news delivery, many stories cannot be told with just words and pictures. Increasingly, top-tier news providers use interactive visualizations of data in order to tell compelling stories. The result is a more engaging experience for the user, plus added insights for the news provider.
Register for this episode of The Briefing Room to see several of the most creative and powerful examples of data visualization in the news. Chief Analyst Robin Bloor of The Bloor Group will then discuss the visualization building process with Ben Jones of Tableau Software, who will answer questions about best practices for creating educational and visually stimulating graphics.
Bridging the Gap: Analyzing Data in and Below the Cloud | Inside Analysis
The Briefing Room with Dean Abbott and Tableau Software
Live Webcast July 23, 2013
http://www.insideanalysis.com
Today’s desire for analytics extends well beyond the traditional domain of Business Intelligence. That’s partly because business users are realizing the value of mixing and matching all kinds of data, from all kinds of sources. One emerging market driver is Cloud-based data, and the desire companies have to analyze this data cohesively with their on-premise data sets.
Register for this episode of The Briefing Room to learn from Analyst Dean Abbott, who will explain how the ability to access data in the cloud can play a critical role for generating business value from analytics. He’ll be briefed by Ellie Fields of Tableau Software who will tout Tableau’s latest release, which includes native connectors to cloud-based applications like Salesforce.com, Amazon Redshift, Google Analytics and BigQuery. She’ll also demonstrate how Tableau can combine cloud data with other data sources, including spreadsheets, databases, cubes and even Big Data.
Agents for Agility - The Just-in-Time Enterprise Has Arrived | Inside Analysis
Hot Technologies with Krish Krishnan, Robin Bloor and EnterpriseWeb
Live Webcast Aug. 21, 2013
The demand for agility continues to motivate today's data-driven organizations. Competitors all over the globe are vying for faster time-to-insight, or even time-to-action. But there are other issues like governance and data quality that typically slow down key processes. Almost invariably, legacy systems that perform critical business processes are late to the party, resulting in enterprise inertia. However, a new wave of innovation is solving that problem by incorporating a late-binding approach for both analytics and operations.
Register for this episode of Hot Technologies to hear Analysts Krish Krishnan of Sixth Sense, and Dr. Robin Bloor of The Bloor Group, as they outline their competing visions for the architecture of a real-time enterprise. They'll be briefed by Dave Duggal of EnterpriseWeb, who will tout his company's platform for delivering robust enterprise functionality at the speed of the network. He'll discuss how EnterpriseWeb leverages the best ideas of service orientation, combined with intelligent agents that act as virtual hubs for the sharing of data, analytics, and mission-critical business processes.
All Grown Up: Maturation of Analytics in the Cloud | Inside Analysis
The Briefing Room with Wayne Eckerson and Birst
Live Webcast on Nov. 6, 2012
The desire for analytics today extends far beyond the traditional domain of Business Intelligence. The challenge is that operational systems come in countless shapes and sizes. Furthermore, each application treats data somewhat differently. But there are patterns of data flow and transformation that pervade all such systems. And there's one big place where all these data types and use cases have come together architecturally: the Cloud.
Watch this episode of the Briefing Room to hear veteran Analyst Wayne Eckerson explain how Cloud computing is ushering in a new era of analytics and intelligence. He'll be briefed by Brad Peters of Birst who will tout his company's purpose-built analytics platform. He'll discuss how the Birst engine processes and delivers raw data from disparate systems, offering the deployment flexibility of Software-as-a-Service, together with the capabilities of enterprise-class BI.
Slides from the Live Webcast on Jan. 18, 2012
The purpose of this event is to allow the Analysts, Robin Bloor and Mark Madsen, to offer their theories on where the database market stands today: What’s new? What’s standard? What is the trajectory of this changing market? Each Analyst will present for 10-15 minutes, then will engage in a dialogue with Host Eric Kavanagh and all attendees.
For more information visit: http://www.databaserevolution.com
Watch this and the entire series at: http://www.youtube.com/playlist?list=PLE1A2D56295866394
Hadoop and the Data Warehouse: Point/Counter Point | Inside Analysis
Robin Bloor and Teradata
Live Webcast on April 22, 2014
Watch the archive:
https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=2e69345c0a6a4e5a8de6fc72652e3bc6
Can you replace the data warehouse with Hadoop? Is Hadoop an ideal ETL subsystem? And what is the real magic of Hadoop? Everyone is looking to capitalize on the insights that lie in the vast pools of big data. Generating the value of that data relies heavily on several factors, especially choosing the right solution for the right context. With so many options out there, how do organizations best integrate these new big data solutions with the existing data warehouse environment?
Register for this episode of The Briefing Room to hear veteran analyst Dr. Robin Bloor as he explains where Hadoop fits into the information ecosystem. He’ll be briefed by Dan Graham of Teradata, who will offer perspective on how Hadoop can play a critical role in the analytic architecture. Bloor and Graham will interactively discuss big data in the big picture of the data center and will also seek to dispel several common misconceptions about Hadoop.
Visit InsideAnalysis.com for more information.
At the Tipping Point: Considerations for Cloud BI in a Multi-platform BI Enterprise | Inside Analysis
The Briefing Room with Wayne Eckerson and Birst
Live Webcast on March 4, 2014
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=573944bdf3e01bfb977fa9f3d1d623c2
Cloud BI, once a novelty, is now going mainstream. More and more organizations are adopting Cloud BI and folding it into their BI strategies. Learn what considerations are necessary to determine whether Cloud BI should be part of your multi-platform BI architecture.
Register for this episode of The Briefing Room to learn from veteran Analyst Wayne Eckerson as he explains what to consider when choosing cloud-based solutions. He will be briefed by Brad Peters of Birst, who will tout his company’s enterprise-caliber cloud BI platform. Peters will describe how the coexistence of cloud applications and traditional environments can provide the kind of agility and flexibility that can lead to faster time to value.
Visit InsideAnalysis.com for more information.
Watch a replay of the webinar: https://www.youtube.com/watch?v=BtzPgLBy56w
451 Research and NuoDB outline the key database criteria for cloud applications. Explore how applications deployed in the cloud require a combination of standard functionality, such as ANSI SQL, and new capabilities specifically required to take full advantage of cloud economics, such as elastic scalability and continuous availability.
Leapfrog into Serverless - a Deloitte-Amtrak Case Study | Serverless Conference | Gary Arora
This talk was delivered at the Serverless Conference in New York City in 2017. Deloitte and Amtrak built a Serverless Cloud-Native solution on AWS for a real-time operational data store and near real-time reporting data mart that modernized Amtrak's legacy systems and applications. With Serverless solutions, we are able to leapfrog over several rungs of computing evolution.
Gary Arora is a Cloud Solutions Architect at Deloitte Consulting, specializing in Azure & AWS.
Slides: Enterprise Architecture vs. Data Architecture | DATAVERSITY
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key interrelationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how data architecture is a key component of an overall enterprise architecture for enhanced business value and success.
Cloud computing revolutionized application design, and changed the way people think about infrastructure. The rise of cloud computing coincided with a new generation of applications and services that required scale. New architecture and design had to take into account low latency network connectivity, geographic distribution, large real-time data stores, the ability to meet demand (while not knowing exactly how much demand to handle), and so much more. We refer to this as Internet Scale.
Yet most discussion of scale and cloud revolves around compute as virtualized instances, which have defined configurations and constrained options. Delivering on the promise of Internet Scale involves substantial upfront design, and a comprehensive understanding of the entire architecture - from the underlying hardware, to the operating system, the application stack, services, and deployment. And, it involves choice - choices you should make based on your requirements. Join us for a discussion on the many facets of Internet Scale, and how it can apply to your applications and services.
(ARC309) Getting to Microservices: Cloud Architecture Patterns | Amazon Web Services
Gilt, a billion dollar e-commerce company, implemented a sophisticated microservices architecture on AWS to handle millions of customers visiting their site at noon every day. The microservices architecture pattern enables independent service scaling, faster deployments, better fault isolation, and graceful degradation. In this session, Derek Chiles, AWS solutions architect, will review best practices and recommended architectures for deploying microservices on AWS. Adrian Trenaman, SVP of engineering at Gilt, will share Gilt's experiences and lessons learned during their evolution from a single monolithic Rails application in a traditional data center to more than 300 Scala/Java microservices deployed in the cloud.
Modernizing your Application Architecture with Microservices | Confluent
Organizations are quickly adopting microservice architectures to achieve better customer service and improve user experience while limiting downtime and data loss. However, transitioning from a monolithic architecture based on stateful databases to truly stateless microservices can be challenging and requires the right set of solutions.
In this webinar, learn from field experts as they discuss how to convert the data locked in traditional databases into event streams using HVR and Apache Kafka®. They will show you how to implement these solutions through a real-world demo use case of microservice adoption.
You will learn:
-How log-based change data capture (CDC) converts database tables into event streams
-How Kafka serves as the central nervous system for microservices
-How the transition to microservices can be realized without throwing away your legacy infrastructure
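The first two bullets above can be sketched in miniature. The following Python snippet is an illustrative, hedged example (field names such as `op`, `before` and `after` are assumptions, not HVR's or Kafka's actual wire format) of how a log-based CDC record for a table row is reshaped into an event that a Kafka producer could then publish to a topic:

```python
import json

def cdc_to_event(log_entry):
    """Convert a log-based CDC record (as a change-capture tool might emit
    from the database transaction log) into a Kafka-style event payload.
    Field names here are illustrative, not any vendor's actual schema."""
    op = log_entry["op"]  # 'insert', 'update', or 'delete'
    event = {
        "table": log_entry["table"],
        "op": op,
        "key": log_entry["key"],
        # carry the after-image for inserts/updates; a delete carries only the key
        "value": log_entry.get("after") if op != "delete" else None,
    }
    return json.dumps(event)

# A change captured from the transaction log of an (assumed) 'orders' table...
change = {"table": "orders", "op": "update", "key": 42,
          "before": {"status": "new"}, "after": {"status": "shipped"}}

# ...becomes a serialized event a Kafka producer could publish to an 'orders' topic.
print(cdc_to_event(change))
```

In a real deployment the serialized event would be handed to a Kafka producer, and each downstream microservice would consume the topic independently, which is what makes Kafka the "central nervous system" described above.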
Lessons from Building Large-Scale, Multi-Cloud, SaaS Software at Databricks | Databricks
The cloud has become one of the most attractive ways for enterprises to purchase software, but it requires building products in a very different way from traditional software.
What is the Oracle PaaS Cloud for Developers (Oracle Cloud Day, The Netherlands) | Lucas Jellema
The promise of the cloud is substantial, and Oracle's public cloud promise goes beyond the generic one. This presentation describes the promise of the Oracle Public Cloud specifically for developers: the current state of the PaaS platform, the actual and upcoming services, and what they could mean to a developer. From same-platform-different-location services (DBaaS, JCS) to the cloud-native stack (ICS, MCS) and services for Citizen Developers, the presentation touches upon virtually all services relevant to developers. It concludes with, first, the steps enterprises can take to move to the cloud and, second, the steps individual developers could and perhaps should take in order to conquer the clouds.
Smart companies know that business intelligence surfaces insights. With complex analytics, data mining and everything in between, it takes many moving parts to serve up the big picture. The key is to provide full-stack visibility into the entire BI environment, ensuring solid service and system performance.
Learn more at http://www.insideanalysis.com
Agile, Automated, Aware: How to Model for Success | Inside Analysis
The Briefing Room with David Loshin and Embarcadero
Live Webcast October 27, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=eea9877b71c653c499c809c5693eae8fe
Data management teams face some tough challenges these days. Organizations need business-driven visibility that enables understanding and awareness of enterprise data assets – without worrying about definitions and change management. But with information architectures evolving into a hybrid mix of data objects and data services built over relational databases as well as big data stores, serving up accurately defined, reusable data can become a complex issue.
Register for this episode of The Briefing Room to learn from veteran Analyst David Loshin as he explains the importance of agile, automated workflows in today’s enterprise. He’ll be briefed by Ron Huizenga of Embarcadero, who will discuss how his company’s ER/Studio suite approaches data modeling and management from a modern architecture standpoint. He will explain that unifying the way information is represented can not only eliminate the need for costly workarounds, but also foster collaboration between data architects, developers and business users.
Visit InsideAnalysis.com for more information.
First in Class: Optimizing the Data Lake for Tighter Integration | Inside Analysis
The Briefing Room with Dr. Robin Bloor and Teradata RainStor
Live Webcast October 13, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=012bb2c290097165911872b1f241531d
Hadoop data lakes are emerging as peers to corporate data warehouses. However, successful data management solutions require a fusion of all relevant data, new and old, which has proven challenging for many companies. With a data lake that’s been optimized for fast queries, solid governance and lifecycle management, users can take data management to a whole new level.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he discusses the relevance of data lakes in today’s information landscape. He’ll be briefed by Mark Cusack of Teradata, who will explain how his company’s archiving solution has developed into a storage point for raw data. He’ll show how the proven compression, scalability and governance of Teradata RainStor combined with Hadoop can enable an optimized data lake that serves as both a reservoir for historical data and as a “system of record” for the enterprise.
Visit InsideAnalysis.com for more information.
Fit For Purpose: Preventing a Big Data Letdown | Inside Analysis
The Briefing Room with Dr. Robin Bloor and RedPoint Global
Live Webcast October 6, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=9982ad3a2603345984895f279e849d35
Gartner recently placed Big Data in its “trough of disillusionment,” reflective of many leaders’ struggle to prove the value of Hadoop within their organization. While the promise of enhanced data integration and enrichment is obvious, measurable results have remained elusive. This episode of The Briefing Room will outline how to successfully tie Big Data to existing business applications, preventing your next Hadoop project from being another “Big Data letdown.”
Register today to learn from veteran Analyst Dr. Robin Bloor as he discusses the importance of converging enterprise data integration with intelligence and scalability. He’ll be briefed by George Corugedo of RedPoint Global, who will provide concrete examples of how the convergence of scalable cloud platforms, ever-expanding data sources and intelligent execution can turn the Big Data hype into demonstrable business value.
Visit InsideAnalysis.com for more information.
To Serve and Protect: Making Sense of Hadoop Security | Inside Analysis
The Briefing Room with Dr. Robin Bloor and HP Security Voltage
Live Webcast September 22, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=45ece7082b1d7c2cc8179bc7a1a69ea5
Hadoop is rapidly becoming a development platform and dominant server environment, and organizations are keen to take advantage of its massively scalable – and relatively inexpensive – resources. It is not, however, without its limitations, and it often requires a contingent of complementary components in order to behave within an information architecture. One area often overlooked is security, a factor that, if not considered from the onset, can insert great risk when putting sensitive data in Hadoop.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he discusses how security was never a design point for Hadoop and what organizations can do about it. He’ll be briefed by Sudeep Venkatesh of HP Security Voltage, who will explain the intricacies surrounding a secure Hadoop implementation. He will show how techniques like format-preserving and partial-field encryption can allow for analytics over protected data, with zero performance impact.
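As a rough illustration of the partial-field idea mentioned above, the sketch below masks all but the last four digits of a card number while preserving the field's length and separator layout, so downstream parsers and analytics keep working. This is deliberately simplified masking, not HP Security Voltage's actual product behavior: true format-preserving encryption is keyed and reversible.

```python
def mask_partial(card_number: str) -> str:
    """Partial-field masking: protect all but the last four digits while
    preserving the field's length and format. A simplified stand-in for
    true format-preserving encryption, which is keyed and reversible."""
    total = sum(c.isdigit() for c in card_number)
    out, seen = [], 0
    for c in card_number:
        if c.isdigit():
            seen += 1
            # keep only the trailing four digits in the clear
            out.append(c if seen > total - 4 else "X")
        else:
            out.append(c)  # keep separators so the format survives
    return "".join(out)

print(mask_partial("4111-2222-3333-4444"))  # format preserved: XXXX-XXXX-XXXX-4444
```

Because the masked value has the same length and structure as the original, it can flow through existing schemas, joins and aggregations without any of the schema changes that opaque encryption would force.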
Visit InsideAnalysis.com for more information.
The Hadoop Guarantee: Keeping Analytics Running On Time | Inside Analysis
The Briefing Room with Dr. Robin Bloor and Pepperdata
Live Webcast September 15, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=32f198185d9d0c4cf32c27bdd1498b2a
Industry researchers agree: the importance of Hadoop will continue to grow as more companies recognize the range of benefits they can reap, from lower-cost storage to better business insights. At the same time, advances in the Hadoop ecosystem are addressing many of the key concerns that have hampered adoption, including performance and reliability. As a result, Hadoop is fast becoming a first-class citizen in the world of enterprise computing.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor explain how the Hadoop ecosystem is evolving into a mature foundation for managing enterprise data. He’ll be briefed by Sean Suchter of Pepperdata, who will explain how his company’s software brings predictability and reliability to Hadoop through dynamic, policy-based controls and monitoring. He’ll show how to guarantee service-level agreements by slowing down low-priority tasks as needed. He’ll also discuss the holy grail of Hadoop: how to enable mixed workloads.
Visit InsideAnalysis.com for more information.
Special Edition with Dr. Robin Bloor
Live Webcast September 9, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=e8b9ac35d8e4ffa3452562c1d4286a975
Do the math: algebra will transform information management. Just as the relational database revolutionized the information landscape, so will a just-released, complete algebra of data overhaul the industry itself. So says Dr. Robin Bloor in his new book, the Algebra of Data, which he’ll outline in this special one-hour webcast.
Once organizations learn how to express their data sets algebraically, the benefits will be significant and far-reaching. Data quality problems will slowly subside; queries will run orders of magnitude faster; integration challenges will fade; and countless tedious jobs in the data management space will bid their farewell. But first, software companies must evolve, and that will take time.
Visit InsideAnalysis.com for more information.
The Role of Data Wrangling in Driving Hadoop Adoption | Inside Analysis
The Briefing Room with Mark Madsen and Trifacta
Live Webcast September 1, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=eb655874d04ba7d560be87a9d906dd2fd
Like all enterprise software solutions, Hadoop must deliver business value in order to be a success. Much of the innovation around the big data industry these days therefore addresses usability. While there will always be a technical side to the Hadoop equation, the need for user-friendly tools to manage the data will continue to focus on business users. That’s why self-service data preparation or "data wrangling" is a serious and growing trend, one which promises to move Hadoop beyond the early adopter phase and more into the mainstream of business.
Register for this episode of The Briefing Room to hear veteran Analyst Mark Madsen of Third Nature explain why business users will play an increasingly important role in the evolution of big data. He’ll be briefed by Trifacta's Will Davis and Alon Bartur, who will demonstrate how Trifacta's solution empowers business users to “wrangle" data of all shapes and sizes faster and easier than ever before. They’ll discuss why a new approach to accessing and preparing diverse data is required and how it can accelerate and broaden the use of big data within organizations.
Visit InsideAnalysis.com for more information.
Ahead of the Stream: How to Future-Proof Real-Time Analytics | Inside Analysis
Business seems to move faster by the day, with the most cutting-edge companies taking advantage of real-time data streams for heavy-duty analytics. But with so much innovation happening in so many places, how can companies stay ahead of the game? One answer is to future-proof your analytics architecture with an abstraction layer that translates your business use case or workflow onto any of several leading technologies, addressing the growing number of use cases in this dynamic field.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor, as he explains how a data flow architecture can harness a wide range of streaming solutions. He'll be briefed by Anand Venugopal of Impetus Technologies, who will showcase his company's StreamAnalytix platform, which was designed from the ground up to leverage multiple major streaming engines available today, including Apache Spark, Apache Storm and others. He'll demonstrate how StreamAnalytix provides enterprise-class performance while incorporating best-of-breed open-source components.
View the archive at: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=925d1e9b639b78c6cf76a1bbbf485b2b
All Together Now: Connected Analytics for the Internet of Everything | Inside Analysis
The Briefing Room with Mark Madsen and Cisco
Live Webcast August 18, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=0eff120f8b2879b582b77f4ff207ee54
Today's digital enterprises are seeing an explosion of data at the edge. The Internet of Everything is fast approaching a critical mass that will demand a sea change in how companies process data. This new world of information is widely distributed, streaming, and overall becoming too big to move. Experts predict that within two to three years, the bulk of analytic processing will take place on the fringes of information architectures. As a result, forward-thinking companies are dramatically shifting their analytic strategies.
Register for this episode of The Briefing Room to hear veteran Analyst Mark Madsen of Third Nature explain how a new era of information architectures is now unfolding, paving the way to much more responsive and agile business models. He'll be briefed by Kim Macpherson of the Cisco Data and Analytics Business Unit, who will explain how her company's platform is uniquely suited for this new, federated analytic paradigm. She'll demonstrate how edge analytics can help companies address opportunities quickly and effectively.
Visit InsideAnalysis.com for more information.
Goodbye, Bottlenecks: How Scale-Out and In-Memory Solve ETL | Inside Analysis
The Briefing Room with Dr. Robin Bloor and Splice Machine
Live Webcast August 11, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=e1b33c9d45b178e13784b4a971a4c1349
The ETL process was born out of necessity, and for decades it has been the glue between data sources and target applications. But as data growth soars and increased competition demands real-time data, standard ETL has become brittle and often unmanageable. Scaling up resources can do the trick, but it’s very costly and only a matter of time before the processes hit another bottleneck. When outmoded ETL stands in the way of real-time analytics, it might be time to consider a completely new approach.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he explains how modern, data-driven architectures must adopt an equally capable data integration strategy. He’ll be briefed by Rich Reimer of Splice Machine, who will discuss how his company solves ETL performance issues and enables real-time analytics and reports on big data. He will show that by leveraging the scale-out power of Hadoop and the in-memory speed of Spark, users can bring both analytical and operational systems together, eventually performing transformations only when needed.
Visit InsideAnalysis.com for more information.
The Biggest Picture: Situational Awareness on a Global Level
The Briefing Room with Dr. Robin Bloor and Modus Operandi
Live Webcast July 28, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=efc4082d9b0b0adfcd753a7435d2d6a1b
The analytic bottlenecks of yesterday need not apply today. The boundaries are also falling thanks in large part to the abundance of third-party data. The most data-driven companies these days are finding creative ways to dynamically incorporate data from within and beyond the firewall, thus building highly accurate, multidimensional views of their business, customer, competition or other subject areas.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor as he explains the magnitude of change that's occurring in the world of data, why it's happening now, and how you can take advantage. He'll be briefed by Mike Gilger and Boris Pelakh of Modus Operandi, who will showcase their company's enterprise analytics platform, which combines a range of battle-tested functionality to deliver dynamic situational awareness that can leverage a comprehensive array of data sets. They'll explain how the platform's reasoner benefits from a highly scalable rules engine, and a flexible modeling capability that can optimize data storage virtually on the fly.
Visit InsideAnalysis.com for more information.
Structurally Sound: How to Tame Your Architecture
The Briefing Room with Krish Krishnan and Teradata
Live Webcast July 21, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=602b2a8413e8719d39465f4d6291d505
Technology changes all the time, but the basic needs of the business are the same: BI and analytics. With new types of data, various analytics engines and multiple systems, giving business users seamless access to enterprise data can be a rather daunting process. One solution is to provide a complete fabric that spans the organization, touching all data points and masking the complexity behind disparate sources.
Register for this episode of The Briefing Room to learn from veteran Analyst Krish Krishnan as he explores how and why architectures have changed over the years. He’ll be briefed by Imad Birouty of Teradata, who will discuss his company’s QueryGrid, an analytics solution designed to provide access to data across all systems. He will show how QueryGrid essentially creates a logical data warehouse and enables users to leverage SQL over multiple data types.
Visit InsideAnalysis.com for more information.
SQL In Hadoop: Big Data Innovation Without the Risk
The Briefing Room with Dr. Robin Bloor and Actian
Live Webcast July 14, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=bbd4395ea2f8c60a03cfefc68c7aa823
Innovation often implies risk, which is why businesses have many issues to weigh when considering change. Yet the remarkable growth of data is driving many traditional systems into the ground, forcing information workers to take a critical look at their existing tools. Technologies like Hadoop offer economical solutions to big data management, but to truly take advantage of its capabilities, organizations must modernize their infrastructure.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he explains how and why organizations should improve legacy systems. He’ll be briefed by Todd Untrecht of Actian, who will tout his company’s Actian Vortex, a SQL-in-Hadoop solution. He will show how integrating a SQL engine directly in the Hadoop cluster can lead to faster analytics and greater control, while still maintaining existing investments.
Visit InsideAnalysis.com for more information.
The Briefing Room with Dr. Robin Bloor and SYSTAP
Live Webcast June 30, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=0ff3889293f6c090483295fd7362c5a4
There's a reason why the biggest Web companies these days leverage graph technology: it is incredibly powerful for revealing a wide range of insights. Unlike other analytical databases, graph can very quickly identify the kinds of patterns that lead to better business decisions. Though relatively nascent in existing data centers, graph databases are proving to be well-suited for all kinds of business use cases, from clustering and hypothesis generation to failure detection and cyber analytics.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he discusses how semantic technology fits in the spectrum of database and discovery solutions. He’ll be briefed by Brad Bebee of SYSTAP, who will showcase his company’s Blazegraph products and Mapgraph technology. He will explain how SYSTAP’s approach overcomes the challenge of scalability, and how graph technology’s powerful data management capabilities can deliver better enterprise performance and analytics using GPUs and other approaches.
Visit InsideAnalysis.com for more information.
A Revolutionary Approach to Modernizing the Data Warehouse
Hot Technologies with Rick Sherman, Dr. Robin Bloor and Snowflake Computing
Live Webcast June 25, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=e6e6de6cdfa8926e7a9d52e099a1a08e2
Enterprise software tends to advance in one of two ways: evolutionary and revolutionary. Evolutionary advances happen through incremental improvements made to an existing code base over a long period of time. Revolutionary advances happen when a new solution is designed from scratch, breaking cleanly from legacy approaches to take advantage of technology innovations that can span from hardware to software and methodologies.
Register for this episode of Hot Technologies to hear veteran analysts Rick Sherman of Athena IT Solutions and Dr. Robin Bloor along with Bob Muglia, CEO of Snowflake Computing, explain how a confluence of advances in the data world have opened up new doors for revolutionary advances in data warehousing. They will discuss new technology innovations and how they can be used to create data warehouses with the power, flexibility, and resiliency that modern enterprises need without the complexities and latencies inherent to traditional approaches.
Visit InsideAnalysis.com for more information.
The Maturity Model: Taking the Growing Pains Out of Hadoop
The Briefing Room with Rick van der Lans and Think Big, a Teradata Company
Live Webcast on June 16, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=197f8106531874cc5c14081ca214eaff
Hadoop is arguably one of the most disruptive technologies of the last decade. Once lauded solely for its ability to transform the speed of batch processing, it has marched steadily forward and promulgated an array of performance-enhancing accessories, notably Spark and YARN. Hadoop has evolved into much more than a file system and batch processor, and it now promises to stand as the data management and analytics backbone for enterprises.
Register for this episode of The Briefing Room to learn from veteran Analyst Rick van der Lans, as he discusses the emerging roles of Hadoop within the analytics ecosystem. He’ll be briefed by Ron Bodkin of Think Big, a Teradata Company, who will explore Hadoop’s maturity spectrum, from typical entry use cases all the way up the value chain. He’ll show how enterprises that already use Hadoop in production are finding new ways to exploit its power and build creative, dynamic analytics environments.
Visit InsideAnalysis.com for more information.
Rethinking Data Availability and Governance in a Mobile World
The Briefing Room with Malcolm Chisholm and Druva
Live Webcast on June 9, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=baf82d3835c5dfa63202dcbe322a3ad7
The emergence of the mobile workforce has left an indelible mark on the enterprise; every employee is now mobile, and business data continues to be dispatched to the far reaches of the enterprise. While this has added enormous opportunity for increased productivity, it has also muddied the waters when it comes to controlling and protecting valuable data assets. As companies quickly evolve to address the new set of challenges posed by this shift in data usage, IT must ensure that all data, no matter where it’s generated or stored, is available and governed just as if it were still safely behind the corporate firewall.
Register for this episode of The Briefing Room to hear veteran Analyst Malcolm Chisholm as he explains the myriad challenges that mobile data introduces when addressing regulations and compliance needs, requiring new approaches to data governance. He’ll be briefed by Dave Packer of Druva, who will outline his company’s converged data protection strategy, which brings data center class capabilities to backup, availability and governance for the mobile workforce. He will share strategies to meet regional data residency, data recovery, legal hold and eDiscovery requirements and more.
Visit InsideAnalysis.com for more information.
The Crown Jewels: Is Enterprise Data Ready for the Cloud?
1. Grab some coffee and enjoy the pre-show banter before the top of the hour!
2. The Crown Jewels: Is Enterprise Data Ready for the Cloud?
The Briefing Room
3. Twitter Tag: #briefr
The Briefing Room
Welcome
Host:
Eric Kavanagh
eric.kavanagh@bloorgroup.com
@eric_kavanagh
! Reveal the essential characteristics of enterprise software, good and bad
! Provide a forum for detailed analysis of today’s innovative technologies
! Give vendors a chance to explain their product to savvy analysts
! Allow audience members to pose serious questions... and get answers!
Twitter Tag: #briefr
The Briefing Room
Mission
5. Twitter Tag: #briefr
The Briefing Room
Topics
This Month: CLOUD
April: BIG DATA
May: DATABASE
2014 Editorial Calendar at
www.insideanalysis.com/webcasts/the-briefing-room
6.
7. Twitter Tag: #briefr
The Briefing Room
Analyst: Robin Bloor
Robin Bloor is Chief Analyst at The Bloor Group
robin.bloor@bloorgroup.com
@robinbloor
8. Twitter Tag: #briefr
The Briefing Room
NuoDB
! NuoDB is a NewSQL distributed database solution
! It is architected to scale elastically on the cloud
! NuoDB leverages a peer-to-peer distributed architecture, and it is ACID compliant and continuously available
9. Twitter Tag: #briefr
The Briefing Room
Guest: Jim Starkey
Jim Starkey invented the NuoDB Emergent Architecture and developed the initial implementation of the product. Jim’s career as an entrepreneur, architect, and innovator spans more than three decades of database history, from the Datacomputer project on the fledgling ARPAnet to his most recent startup, NuoDB, Inc. Throughout that period, he has been responsible for many database innovations, from the date data type to the BLOB to multi-version concurrency control (MVCC). Starkey has extensive experience in proprietary and open source software. He joined Digital Equipment Corporation in 1975, where he created the Datatrieve family of products, the DEC Standard Relational Interface architecture, and the first of the Rdb products, Rdb/ELN. Starkey founded Interbase Software in 1984 and Netfrastructure, Inc. in 2000.
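Since the bio credits Starkey with multi-version concurrency control, a minimal sketch may help readers unfamiliar with the idea. This is a toy illustration in plain Python, not NuoDB's implementation: writers append new versions instead of overwriting, so a reader holding a snapshot never blocks on, or is disturbed by, later writers.

```python
# Toy MVCC store: each key maps to a list of (txn_id, value) versions.
# Readers see the newest version at or before their snapshot transaction.
class MVCCStore:
    def __init__(self):
        self.versions = {}   # key -> list of (txn_id, value), append-only
        self.next_txn = 1

    def begin(self):
        """Hand out a monotonically increasing transaction id."""
        txn = self.next_txn
        self.next_txn += 1
        return txn

    def write(self, txn, key, value):
        # Append a new version; earlier versions stay readable.
        self.versions.setdefault(key, []).append((txn, value))

    def read(self, snapshot_txn, key):
        """Newest version committed at or before the snapshot, else None."""
        visible = [v for t, v in self.versions.get(key, []) if t <= snapshot_txn]
        return visible[-1] if visible else None

store = MVCCStore()
t1 = store.begin()
store.write(t1, "x", "old")
snap = store.begin()          # reader takes its snapshot here
t3 = store.begin()
store.write(t3, "x", "new")   # a later write does not disturb the snapshot
# store.read(snap, "x") == "old"; store.read(t3, "x") == "new"
```

Real MVCC engines add commit visibility rules and garbage collection of dead versions; the sketch only shows the core snapshot idea.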
11. Magic Quadrant 2013
NuoDB
! Next-generation distributed database
! Designed for cloud, datacenter, and on-premise deployment
! Unique ability to deploy an active-active database in multiple locations
! Deep database DNA in management team and world-class investors
! Headquartered in Cambridge, MA
12. Dassault Systèmes
Dassault Systèmes:
! 2nd largest independent software vendor (ISV) in Europe
! Leader in 3D design software, 3D Digital Mock Up and Product Lifecycle Management (PLM) solutions
! 170,000 customers and 10M on-premise users
! Customers include Boeing, Ford Motor Company, Guess apparel, NASA, Airbus, Fujitsu, Coca Cola and thousands of others
! NuoDB is an integral part of their cloud-based 3DEXPERIENCE strategy
! Investor in NuoDB
“NuoDB delivers a lot of the features required to address the market needs in terms of usages in the new world of experiences.”
“This investment demonstrates our strong interest and belief in NuoDB’s strategy and technologies for next-generation cloud based services.”
Dominique Florack, Senior Executive VP, Products-R&D, Dassault Systèmes
13. Conventional Applications vs. Cloud-Style Applications
Conventional applications:
Ø Rigid & inflexible
Ø Dedicated servers
Ø Scale-up / no scale-down
Ø Low utilization
Ø High administrator/application ratio
Ø Multiple single points of failure
Ø Maintenance downtime
Ø High capex
Ø Single datacenter
Cloud-style applications:
Ø Web servers scale out ✓
Ø App servers scale out ✓
Ø DBMS servers don’t scale out ✗
Ø Storage servers scale out ✓
We need a distributed database system…
14. Can an RDBMS do this?
[Chart: transactions per second (TPS) increasing over time]
(Without giving up SQL or ACID transactions)
15. Jim Starkey
“Elastically Scalable Transactions represent the biggest breakthrough in database technology in 25 years”
16. Breakthrough Capabilities
Elastic Scale-out
• NuoDB scales to over 100 server machines
• Scalability is instant and elastic
• Scales out and scales in
• TPS numbers exceed 10M TPS on $100K of hardware
• Also scales on AWS, GCE etc. Public demo of 32 nodes with Google
• Now showing linear scalability on TPC-C-type workloads (DBT-2)
• Scalability demonstrated with heavier-duty customer applications (e.g. Axway, Dassault Systèmes)
Continuous Availability
• Self-healing
• No single point of failure
• Fully distributed control
• Arbitrarily redundant
• Online backup
• Online schema evolution
• Rolling upgrades
Multi-Tenancy
• HP Moonshot launch – 45 micro servers in a 4U rack-mount box
• NuoDB ran 72,000 databases on a single Moonshot box
• Uses proprietary “Database Hibernation” and “Database Bursting” technologies
• Zero-admin UI
• Demo showed the potential of “Software Defined Database”
• Moonshot is the foundation of the HP relationship
Geo-Distribution
• Active/Active
• ACID semantics
• Transactional consistency
• N-way redundant
• Local user latency
• Async WAN comms
No-knobs Admin
• Auto-admin
• Rules-driven
• Auto-optimizing
• Auto-backup
21. Twitter Tag: #briefr
The Briefing Room
Perceptions & Questions
Analyst:
Robin Bloor
22.
23. The Quest of Many Database Engineers
True database distribution has always been a Holy Grail
HERE’S WHY…
24. What is a Database?
A database is software that presides over a heap of data that:
IMPLEMENTS a data model
MANAGES multiple concurrent requests for data
IMPLEMENTS a security model
IS ACID compliant (?)
IS resilient
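The ACID point above is easy to make concrete with Python's standard-library sqlite3 module (a toy example unrelated to NuoDB; the table and invariant are invented for illustration): a transaction either applies in full or rolls back entirely, keeping the data consistent.

```python
import sqlite3

# In-memory database with a simple invariant: balances never go negative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 100)])
conn.commit()

def transfer(conn, src, dst, amount):
    """Move funds atomically: either both updates apply, or neither does."""
    try:
        with conn:  # opens a transaction; commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, src))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, dst))
            (bal,) = conn.execute("SELECT balance FROM accounts WHERE name = ?",
                                  (src,)).fetchone()
            if bal < 0:
                raise ValueError("insufficient funds")  # triggers rollback
    except ValueError:
        pass  # the rollback already restored both rows

transfer(conn, "alice", "bob", 30)   # succeeds
transfer(conn, "alice", "bob", 500)  # fails; rolled back atomically
balances = dict(conn.execute("SELECT name, balance FROM accounts"))
# balances == {"alice": 70, "bob": 130}
```

The failed transfer leaves no trace: neither the debit nor the credit survives, which is exactly the atomicity half of ACID.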
26. Databases Have to Distribute
Databases always scaled out somewhat…
u Usually it is best to scale up (on a single node) before scaling out
u The first scale-out step is onto a well-engineered cluster
u Then onto a more loosely bound grid
u At some point the scale-out sharding approach will run into bottlenecks, depending on workload
u This will occur sooner with OLTP workloads
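The sharding bottleneck in the last two bullets can be sketched with a toy hash-partitioned store (purely illustrative; the shard count and keys are invented): single-key operations touch one shard and scale out cleanly, but any transaction that spans shards needs cross-shard coordination, which is where OLTP workloads hit the wall first.

```python
# Toy hash-sharded key-value store. Single-key ops scale out naturally;
# a transfer touching two shards is the cross-shard case that, in a real
# system, requires a coordinator (e.g. two-phase commit) and becomes the
# OLTP bottleneck.

NUM_SHARDS = 4
shards = [dict() for _ in range(NUM_SHARDS)]

def shard_for(key: str) -> dict:
    """Route a key to its shard by hashing."""
    return shards[hash(key) % NUM_SHARDS]

def put(key, value):
    shard_for(key)[key] = value          # touches exactly one shard

def get(key):
    return shard_for(key)[key]

def transfer(src, dst, amount):
    a, b = shard_for(src), shard_for(dst)
    if a is b:
        # Same shard: a purely local transaction suffices.
        a[src] -= amount
        a[dst] += amount
    else:
        # Different shards: here we just mutate both, but a real DBMS must
        # coordinate the two writes so they commit or abort together.
        a[src] -= amount
        b[dst] += amount

put("alice", 100)
put("bob", 100)
transfer("alice", "bob", 25)
# get("alice") == 75, get("bob") == 125
```

The more often transactions cross shard boundaries, the more the coordinator serializes work, which is why the slide says OLTP workloads hit the bottleneck sooner than analytic ones.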
27. Approaches to Distribution…
PRIOR ATTEMPTS AT DISTRIBUTION:
Simple replication (master-slave)
Multi-master replication (= peer replication)
Note that geo-distribution is just distribution with bigger latency issues
If I understand it correctly, NuoDB implements multi-master replication
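The two prior approaches on this slide can be contrasted in a few lines of toy Python (illustrative only, not how any real DBMS is built): master-slave ships one node's write log to passive replicas, while multi-master lets every node accept writes and must therefore reconcile concurrent updates, here with a crude last-writer-wins rule keyed on a logical timestamp.

```python
# Master-slave: all writes go through one node; replicas replay its log.
class Master:
    def __init__(self, replicas):
        self.data, self.log, self.replicas = {}, [], replicas
    def write(self, key, value):
        self.data[key] = value
        self.log.append((key, value))
        for r in self.replicas:            # ship the log entry downstream
            r.apply(key, value)

class Replica:
    def __init__(self):
        self.data = {}
    def apply(self, key, value):           # replicas never accept writes directly
        self.data[key] = value

# Multi-master (peer replication): every node accepts writes, so concurrent
# writes to the same key must be reconciled -- last-writer-wins here.
class Peer:
    def __init__(self):
        self.data = {}                     # key -> (timestamp, value)
    def write(self, key, value, ts, peers=()):
        self.apply(key, value, ts)
        for p in peers:                    # propagate to the other masters
            p.apply(key, value, ts)
    def apply(self, key, value, ts):
        if key not in self.data or ts > self.data[key][0]:
            self.data[key] = (ts, value)   # newer timestamp wins

r = Replica(); m = Master([r])
m.write("x", 1)                            # r.data == {"x": 1}

p1, p2 = Peer(), Peer()
p1.write("x", "from-p1", ts=1, peers=[p2])
p2.write("x", "from-p2", ts=2, peers=[p1])
# both peers converge on (2, "from-p2")
```

The trade-off the slide hints at: master-slave is simple but funnels all writes through one node, while multi-master distributes writes at the cost of conflict resolution, and geo-distribution makes both propagation paths slower.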
28. u You depict NuoDB as requiring zero admin. What parameters can the user set?
u 100 server nodes – what (roughly) is the latency penalty?
u What is the latency penalty for geo-distribution, roughly speaking?
u How well does NuoDB manage large query workloads?
29. u Can you explain the recovery possibilities available with NuoDB?
u What can you tell us about Dassault Systèmes’ use of NuoDB?
u Why is NuoDB suited to cloud operation?
31. This Month: CLOUD
April: BIG DATA
May: DATABASE
www.insideanalysis.com/webcasts/the-briefing-room
Twitter Tag: #briefr
The Briefing Room
Upcoming Topics
2014 Editorial Calendar at www.insideanalysis.com
32. Twitter Tag: #briefr
THANK YOU for your ATTENTION!
Images borrowed from the Internet:
Slide 23: http://www.film-intel.com/2012/01/why-americans-like-monty-python-and.html
The Briefing Room