The Briefing Room with Wayne Eckerson and Birst
Live Webcast on Nov. 6, 2012
The desire for analytics today extends far beyond the traditional domain of Business Intelligence. The challenge is that operational systems come in countless shapes and sizes. Furthermore, each application treats data somewhat differently. But there are patterns of data flow and transformation that pervade all such systems. And there's one big place where all these data types and use cases have come together architecturally: the Cloud.
Watch this episode of the Briefing Room to hear veteran Analyst Wayne Eckerson explain how Cloud computing is ushering in a new era of analytics and intelligence. He'll be briefed by Brad Peters of Birst who will tout his company's purpose-built analytics platform. He'll discuss how the Birst engine processes and delivers raw data from disparate systems, offering the deployment flexibility of Software-as-a-Service, together with the capabilities of enterprise-class BI.
Self-Service Access and Exploration of Big Data - Inside Analysis
The Briefing Room with Robin Bloor and Cirro
Live Webcast on Dec. 11, 2012
As the information landscape expands with all kinds of Big Data, businesses are searching for ways to unite their traditional analytics with this new source of insight. One ambitious approach involves federating access to multiple data sources, even across various operating systems. The idea is to take analytic processing to the data, then intelligently assemble the results for a business user. Could this be the long-awaited alternative to data virtualization?
Check out this episode of The Briefing Room to hear veteran Analyst Robin Bloor explain how federated access to data sources can pave the way for a truly integrated data fabric. Bloor will be briefed by Mark Theissen of Cirro, who will tout his company's patent-pending Data Hub, which simplifies data access by federating queries across multiple sources of structured, semi-structured, and unstructured data. He'll discuss Cirro's cost-based optimizer, smart caching, dynamic query plan re-optimization, normalization of cost estimates, and a metadata repository for unstructured data sources.
Visit: http://www.insideanalysis.com
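As a rough illustration of the federation idea described above (and emphatically not Cirro's actual engine, which is proprietary), the sketch below pushes a predicate down to each source in order of a crude cost estimate, then assembles the results. The cost model, source layout and field names are all invented:

```python
# Toy "federated query": route work to sources cheapest-first based on a
# crude cost estimate, filter at each source, and merge the results.

def estimate_cost(source):
    """Invented cost model: cost grows with row count and per-row latency."""
    return source["rows"] * source["latency_ms"]

def federated_query(sources, predicate):
    """Push the predicate down to each source, then assemble the results.
    Sources are scanned cheapest-first, mimicking cost-based ordering."""
    ordered = sorted(sources, key=estimate_cost)
    results = []
    for src in ordered:
        # "Take the processing to the data": filter at the source.
        results.extend(row for row in src["data"] if predicate(row))
    return results

warehouse = {"rows": 3, "latency_ms": 1,
             "data": [{"id": 1, "amt": 50}, {"id": 2, "amt": 900}, {"id": 3, "amt": 20}]}
hadoop = {"rows": 2, "latency_ms": 10,
          "data": [{"id": 4, "amt": 700}, {"id": 5, "amt": 5}]}

big_sales = federated_query([warehouse, hadoop], lambda r: r["amt"] > 100)
print(big_sales)  # rows 2 and 4
```

A real federation layer would also re-optimize plans mid-flight and cache intermediate results; this sketch shows only the cost-ordered routing step.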
Technically Speaking: How Self-Service Analytics Fosters Collaboration - Inside Analysis
The Briefing Room with Wayne Eckerson and Tableau Software
Achieving self-service analytics requires tight collaboration between business users and their technical counterparts. Data sources, interfaces, business rules and governance guard rails must be designed to accommodate particular topical domains and levels of expertise. The process of creating such customized solutions tends to result in very productive collaboration, not just across the IT-business boundary, but also among business users and IT professionals alike.
Check out this episode of The Briefing Room to hear veteran Analyst Wayne Eckerson explain how self-service solutions should be designed and implemented. He'll be briefed by Ellie Fields of Tableau Software who will discuss how to create, adjust and share visualizations that can help you understand and communicate your information effectively.
http://www.insideanalysis.com
Enabling Flexible Governance for All Data Sources - Inside Analysis
The Briefing Room with Robin Bloor and Birst
Live Webcast on Feb. 5, 2013
All the effort that goes into data governance can quickly be lost if effective guard rails aren't in place. However, end users invariably need additional data sets in order to get a complete picture of what's happening. All too often, some or all of those additional data sources have not yet run the gauntlet of governance. Striking a balance between core and contextual data can help ensure that your business stays on top of opportunities without straying from the path.
Check out this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor, who will explain the nuances of integrating governed and ungoverned data in ways that business users can easily leverage. He'll be briefed by Brad Peters of Birst who will demonstrate how managed data mashups can provide the kind of flexibility and agility that can lead to valuable insights. He'll explain how Birst's architecture can significantly lighten the load on IT without sacrificing data integrity, security or governance.
Visit: http://www.insideanalysis.com
Hadoop as Data Refinery - Steve Loughran - JAX London
Apache Hadoop is often described as a "Big Data Platform," but what does that mean? One way to better understand Hadoop is to talk about how it is used. This talk discusses a common use case: Hadoop as a "Data Refinery." The concept works much like a traditional oil refinery, except with data: large quantities of "crude data" are pulled in over pipelines, some of it is refined into useful business intelligence, and other pieces are refined into slightly less crude data that stays in the cluster until needed later. This metaphor proves useful when considering how Hadoop could be adopted in an organisation that already has data warehousing and business intelligence systems, and when contemplating how to hook up a Hadoop cluster to the sources of data inside and outside that organisation. A key point to remember is that storing data in Hadoop is no more an end in itself than storing data in a database is: the goal is extracting information from that data. Using Hadoop as a front-end "data refinery" means that it can integrate with existing Business Intelligence systems while providing the platform for new applications.
The talk explores the notion of "Hadoop as a Data Refinery" within an organisation, whether or not it already has a Business Intelligence system, and looks at 'agile data' as a benefit of using Hadoop as the store for historical, unstructured and very-large-scale datasets.
The final slides look at the challenge of an organisation becoming "data driven".
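The refinery metaphor can be made concrete in a few lines of plain Python. The log format and refinement rules below are invented for illustration and have nothing to do with real Hadoop APIs; the point is the tiering of output, with some crude input refined fully and some kept in a semi-refined state for later use:

```python
# Toy "data refinery": crude records come in, some are refined into
# ready-to-use summaries, the rest are lightly cleaned and retained
# for later reprocessing; unusable dross is dropped.

crude_logs = [
    "2012-11-06 purchase 19.99",
    "2012-11-06 pageview /home",
    "corrupted~~record",
    "2012-11-07 purchase 5.00",
]

semi_refined = []   # cleaned but unaggregated, retained in the "cluster"
totals = {}         # fully refined: purchase totals per day

for line in crude_logs:
    parts = line.split()
    if len(parts) == 3 and parts[1] == "purchase":
        day = parts[0]
        totals[day] = totals.get(day, 0.0) + float(parts[2])
    elif len(parts) == 3:
        semi_refined.append(parts)  # keep for later, e.g. clickstream analysis
    # anything else is dross and is dropped

refined = sorted(totals.items())
print(refined)        # [('2012-11-06', 19.99), ('2012-11-07', 5.0)]
print(semi_refined)   # [['2012-11-06', 'pageview', '/home']]
```

In a real cluster the "refined" tier would typically feed a warehouse or BI tool, while the semi-refined tier stays in HDFS for ad hoc queries.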
The Perfect Storm: The Impact of Analytics, Big Data and the Cloud - Inside Analysis
The Briefing Room with Barry Devlin and NuoDB
Live Webcast on Oct. 23, 2012
Three major factors in enterprise computing are combining to rewrite how data is stored, accessed and managed: 1) the demand for analytics, which now spreads across hundreds, even thousands, of users; 2) the pervasiveness of Big Data in all its forms and sizes; and 3) the rise of the commodity data center, aka Cloud computing. The convergence of these forces calls for a new data foundation, one that can handle the scalability and workload issues that face today's information managers.
Check out this episode of The Briefing Room to learn from veteran Analyst Barry Devlin, one of the very first architects of data warehousing, who will explain how today's information architectures require a radically different approach. He'll be briefed by Barry Morris, Founder and CEO of NuoDB, who will tout his company's product, described as a peer-to-peer messaging system that acts as a database. It behaves just like a traditional relational database, but was designed with a completely distributed and scalable architecture.
http://www.insideanalysis.com
Agile Data Rationalization for Operational Intelligence - Inside Analysis
The Briefing Room with Eric Kavanagh and Phasic Systems
Live Webcast Mar. 26, 2013
The complexity of today's information architectures creates a wide range of challenges for executives trying to get a strategic view of their current operations. The data and context locked in operational systems often get diluted during the normalization processes of data warehousing and other types of analytic solutions. And the ultimate goal of seeing the big picture gets derailed by a basic inability to reconcile disparate organizational views of key information assets and rules.
Register for this episode of The Briefing Room to learn from Bloor Group CEO Eric Kavanagh, who will explain how a tightly controlled methodology can be combined with modern NoSQL technology to resolve both process and system complexities, thus enabling a much richer, more interconnected information landscape. Kavanagh will be briefed by Geoffrey Malafsky of Phasic Systems, who will share his company's tested methodology for capturing and managing the business and process logic that runs today's data-driven organizations. He'll demonstrate how a “don't say no” approach to entity definitions can dissolve previously intractable disagreements, opening the door to clear, verifiable operational intelligence.
Visit: http://www.insideanalysis.com
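One way to picture the "don't say no" idea is a store that keeps every department's definition of an entity side by side instead of forcing a single winner. The sketch below is purely illustrative (it is not Phasic Systems' product, and all names are invented):

```python
# Toy "don't say no" entity store: each department's definition of an
# entity is recorded rather than rejected, so disagreements stay visible
# and can be reconciled later instead of blocking progress.

entity_store = {}

def define(entity, department, definition):
    """Record a department's definition without overwriting anyone else's."""
    entity_store.setdefault(entity, {})[department] = definition

define("customer", "sales", {"key": "account_id", "includes_prospects": True})
define("customer", "finance", {"key": "billing_id", "includes_prospects": False})

# Both views coexist; the shared record makes the disagreement explicit.
views = entity_store["customer"]
print(sorted(views))                          # ['finance', 'sales']
print(views["sales"]["includes_prospects"])   # True
```

The design choice here is to treat conflicting definitions as data to be managed, not errors to be rejected, which is the spirit of the approach the briefing describes.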
Introduction to Hortonworks Data Platform for Windows - Hortonworks
According to IDC, Windows Server runs on more than 50% of the servers in the enterprise data center. Hortonworks has worked closely with Microsoft to port Apache Hadoop to Windows, enabling organizations to take advantage of this emerging Big Data technology. Join us in this informative webinar to hear about the new Hortonworks Data Platform for Windows.
In less than an hour, you’ll learn:
-Key capabilities available in Hortonworks Data Platform for Windows
-How HDP for Windows integrates with Microsoft tools
-Key workloads and use cases for driving Hadoop today
Big data insights with Red Hat JBoss Data Virtualization - Kenneth Peeples
You’re hearing a lot about big data these days. And big data and the technologies that store and process it, like Hadoop, aren’t just new data silos. You might be looking to integrate big data with existing enterprise information systems to gain better understanding of your business. You want to take informed action.
During this session, we’ll demonstrate how Red Hat JBoss Data Virtualization can integrate with Hadoop through Hive and provide users easy access to data. You’ll learn how Red Hat JBoss Data Virtualization:
Can help you integrate your existing and growing data infrastructure.
Integrates big data with your existing enterprise data infrastructure.
Lets non-technical users access big data result sets.
We’ll also provide typical use cases and examples, along with a demonstration of integrating Hadoop sentiment analysis with sales data.
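To make the virtualization idea concrete, the sketch below uses an in-memory SQLite database to stand in for the virtual layer, joining "Hadoop" sentiment scores with sales data behind a single SQL view. This is a conceptual illustration only: the real product federates the live sources rather than copying rows, and all table and column names here are invented:

```python
# Conceptual data-virtualization sketch: one SQL view hides where each
# column physically lives. SQLite stands in for the virtual layer.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Pretend these rows were fetched from Hive and from an ERP system.
cur.execute("CREATE TABLE sentiment (product TEXT, score REAL)")
cur.execute("CREATE TABLE sales (product TEXT, units INTEGER)")
cur.executemany("INSERT INTO sentiment VALUES (?, ?)",
                [("widget", 0.8), ("gadget", -0.4)])
cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [("widget", 120), ("gadget", 45)])

# A single virtual view joins the "big data" result set with sales data,
# so a non-technical user only ever queries product_insight.
cur.execute("""CREATE VIEW product_insight AS
               SELECT s.product, s.units, m.score
               FROM sales s JOIN sentiment m ON s.product = m.product""")

rows = cur.execute(
    "SELECT product, units, score FROM product_insight ORDER BY units DESC"
).fetchall()
print(rows)  # [('widget', 120, 0.8), ('gadget', 45, -0.4)]
```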
Join Objectivity, Inc.’s VP of Product Management, Brian Clark, for a discussion of the latest trends in Big Data Analytics: defining what Big Data is and understanding how to maximize your existing architectures by utilizing NOSQL technologies to improve functionality and provide real-time results. There will be a focus on relationship analytics, as well as an introduction to NOSQL data stores and object and graph databases, such as the architectures behind Objectivity/DB and InfiniteGraph.
Is your organization contemplating a migration to Office 365? Whether you are planning to move SharePoint or Exchange or whether you are planning to implement OneDrive for Business, Skype for Business, or Power BI, this session will cover many aspects of how to plan for a migration to Office 365.
Specifically, Doug Hemminger will walk us through how to:
• Inventory your current environment and make key deployment decisions about what to migrate and how to migrate it.
• Fix potential deployment blockers, including how to clean up Active Directory and how to get your network ready for Office 365.
• Set up Office 365 services to work for your organization, including enabling and disabling the appropriate features and services.
• Roll out Office 365 to your users, including assigning the appropriate licenses and communicating key concepts.
This session will be a mix of presentation and demonstration. By the end of the session, you should have a good idea of how to plan your migration to Office 365.
D365 Finance & Operations - Data & Analytics (see newer release of this docum... - Gina Pabalan
This very comprehensive white paper provides a detailed and clear overview of Microsoft's D365 Finance & Operations solutions to support Data & Analytics.
There is a newer version of this available - search SlideShare for the new version of this deck.
Big Data Analytics - Is Your Elephant Enterprise Ready? - Hortonworks
Hadoop’s cost effective scalability and flexibility to analyze all data types is driving organizations everywhere to embrace big data analytics. From proof of concept to deployment across the enterprise, join Datameer and Hortonworks as we answer the ‘now what?’ when rolling out your Hadoop big data analytics project. This webinar will address critical project components such as data security, data privacy, high availability, user training and use case development.
Given at Oracle Open World 2011: Not to be confused with Oracle Database Vault (a commercial database security product), Data Vault Modeling is a specific data modeling technique for designing highly flexible, scalable, and adaptable data structures for enterprise data warehouse repositories. It has been in use globally for over 10 years now but is not widely known. The purpose of this presentation is to provide an overview of the features of a Data Vault modeled EDW that distinguish it from the more traditional third normal form (3NF) or dimensional (i.e., star schema) modeling approaches used in most shops today. Topics will include dealing with evolving data requirements in an EDW (i.e., model agility), partitioning of data elements based on rate of change (and how that affects load speed and storage requirements), and where it fits in a typical Oracle EDW architecture. See more content like this by following my blog http://kentgraziano.com or follow me on twitter @kentgraziano.
Infochimps #1 Big Data Platform for the Cloud - Brian Krpec
The Infochimps Platform is the simplest, fastest, and most flexible way to implement proven big data infrastructure in the cloud. Scalably and affordably ingest data from wherever you need — your in-house systems, external data feeds, data from the web, or our Data Marketplace. Make it useful with in-stream data decoration and augmentation. Store and analyze it in the best place for your application. Hadoop, NoSQL, real-time analytics — how do you tie it all together? The Infochimps Platform takes the mystery and difficulty out of big data and seamlessly integrates it with your existing environment, so you can focus on gaining business insights from your data fast.
Introduction to Microsoft’s Master Data Services (MDS) - James Serra
Master Data Services is bundled with SQL Server 2012 to help resolve many of the Master Data Management issues that companies are faced with when integrating data. In this session, James will show an overview of Master Data Services 2012, including the out of the box Web UI, the highly developed Excel Add-in, and how to get started with loading MDS with your data.
Hadoop World 2011: Big Data Architecture: Integrating Hadoop with Other Enter... - Cloudera, Inc.
Recent research has pointed out the complementary nature of Hadoop and other data management solutions and the importance of leveraging existing systems, SQL, engineering, and operational skills, as well as incorporating novel uses of MapReduce to improve analytic processing. Come to this session to learn how companies optimize the use of Hadoop with other enterprise systems to improve overall analytical throughput and build new data-driven products. This session covers: ways to achieve high-performance integration between Hadoop and relational-based systems; Hadoop+NoSQL vs Hadoop+SQL architectures; high-speed, massively parallel data transfer to analytical platforms that can aggregate web log data with granular fact data; and strategies for freeing up capacity for more explorative, iterative analytics and ad hoc queries.
SnapLogic provides a Data Integration platform that takes integration to another level by combining the power of dynamic programming languages with standard Web interfaces to solve today's most pressing problems in application integration. SnapLogic has an intuitive visual designer that runs in your browser and connects to a highly scalable, web-based integration server that you can run on premises or in the cloud.
SQL Server Data Mining - Taking your Application Design to the Next Level - Mark Ginnebaugh
Presentation to the Silicon Valley SQL Server User Group on July 21, 2009. Microsoft MVP Peter Myers talks about Microsoft SQL Server 2005 and 2008 Data Mining and demonstrates how to develop data mining models that can be embedded into applications.
You Will Learn:
•What business problems SQL Server 2008 data mining can solve
•SQL Server’s data mining capabilities
•The data mining development cycle
•How to create, train, test and query mining models
•How to embed data mining into reports and applications
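The create/train/test/query cycle the session covers can be sketched in miniature. The example below uses a hand-rolled one-nearest-neighbour "model" in plain Python rather than SQL Server's actual mining algorithms or DMX syntax, purely to make the cycle concrete; the churn scenario and all values are invented:

```python
# Toy mining-model development cycle: create, train, test, query.

def train(cases):
    """'Processing' this toy model just means storing the labelled cases."""
    return list(cases)

def predict(model, point):
    """Query the model: return the label of the closest training case."""
    return min(model, key=lambda c: abs(c[0] - point))[1]

# Create + train: (monthly_spend, outcome) cases.
model = train([(10, "churn"), (15, "churn"), (80, "stay"), (95, "stay")])

# Test: measure accuracy on hold-out cases the model has not seen.
holdout = [(12, "churn"), (90, "stay")]
accuracy = sum(predict(model, x) == y for x, y in holdout) / len(holdout)

# Query: score a brand-new case.
print(predict(model, 85), accuracy)  # stay 1.0
```

In SQL Server the same four steps map to creating a mining structure, processing it against training data, testing with lift/accuracy tools, and querying with DMX prediction joins.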
BI Leadership Forum
Wayne Eckerson and Eric Colson
Live Webcast on Sept. 24, 2012
In a collegial, fast-paced culture, change is constant and speed is paramount. Managing data in such a hyper-paced environment requires creative, out-of-the-box thinking. To meet business needs, developers and analysts iterate quickly, fail fast, and coalesce their designs after the fact to deliver maximum value. “We keep things fluid and rely on good judgment rather than rules to get things done,” says Colson. Tune into this Webcast to discover how to empower your developers and analysts to build effective solutions at the speed of business.
Discussion Points:
Can one developer really build an entire BI application?
What is the role of specialists, if any?
How do you evolve your data warehouse models quickly?
What types of rules and principles guide your development activities?
What is the relationship of your statisticians to your BI developers?
Visit http://www.bileadership.com
Manufacturers have an abundance of data, whether from connected sensors, plant systems, manufacturing systems, claims systems, or external industry and government sources. They face mounting challenges, from continually improving product quality and reducing warranty and recall costs to leveraging their supply chains efficiently. Giving the manufacturer a complete view of product and customer information means integrating plant-floor data and as-built product configurations with sensor data from customer use. That integrated view makes it possible to analyze warranty claims efficiently, reduce detection-to-correction time, detect fraud, and even act proactively on emerging issues, and it requires a capable enterprise data hub that can handle large volumes of both structured and unstructured information. Learn how an enterprise data hub built on Hadoop provides the tools to support analysis at every level of the manufacturing organization.
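A minimal sketch of the kind of cross-source analysis such a data hub enables: joining as-built configurations, field sensor readings and warranty claims to flag claims that coincide with a sensor anomaly and trace the part back to its supplier. Every name, value and threshold below is invented for illustration:

```python
# Toy cross-source join: sensor data + warranty claims + as-built configs.
# In a real data hub these would be large Hadoop datasets, not dicts.

sensor = {"VIN123": {"max_temp": 131}, "VIN456": {"max_temp": 90}}
claims = [{"vin": "VIN123", "component": "pump"},
          {"vin": "VIN456", "component": "seal"}]
as_built = {"VIN123": {"pump": "supplier-A"}, "VIN456": {"seal": "supplier-B"}}

# Flag claims where the vehicle also logged an overheating event (>120),
# and trace the component back to its supplier for proactive follow-up.
suspects = []
for claim in claims:
    vin = claim["vin"]
    if sensor.get(vin, {}).get("max_temp", 0) > 120:
        supplier = as_built[vin].get(claim["component"], "unknown")
        suspects.append((vin, claim["component"], supplier))

print(suspects)  # [('VIN123', 'pump', 'supplier-A')]
```

Shrinking detection-to-correction time comes from exactly this kind of join: the claim alone says little, but the claim plus sensor history plus build configuration points at a root cause.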
Agile Data Rationalization for Operational IntelligenceInside Analysis
The Briefing Room with Eric Kavanagh and Phasic Systems
Live Webcast Mar. 26, 2013
The complexity of today's information architectures creates a wide range of challenges for executives trying to get a strategic view of their current operations. The data and context locked in operational systems often get diluted during the normalization processes of data warehousing and other types of analytic solutions. And the ultimate goal of seeing the big picture gets derailed by a basic inability to reconcile disparate organizational views of key information assets and rules.
Register for this episode of The Briefing Room to learn from Bloor Group CEO Eric Kavanagh, who will explain how a tightly controlled methodology can be combined with modern NoSQL technology to resolve both process and system complexities, thus enabling a much richer, more interconnected information landscape. Kavanagh will be briefed by Geoffrey Malafsky of Phasic Systems who will share his company's tested methodology for capturing and managing the business and process logic that run today's data-driven organizations. He'll demonstrate how a “don't say no” approach to entity definitions can dissolve previously intractable disagreements, opening the door to clear, verifiable operational intelligence.
Visit: http://www.insideanalysis.com
Introduction to Hortonworks Data Platform for WindowsHortonworks
According to IDC, Windows Servers run more than 50% of the servers in the Enterprise Data Center. Hortonworks has worked closely with Microsoft to port Apache Hadoop to Windows to enable organizations to take advantage of this emerging Big Data technology. Join us in this informative webinar to hear about the new Hortonworks Data Platform for Windows.
In less than an hour, you’ll learn:
-Key capabilities available in Hortonworks Data Platform for Windows
-How HDP for Windows integrates with Microsoft tools
-Key workloads and use cases for driving Hadoop today
Big data insights with Red Hat JBoss Data VirtualizationKenneth Peeples
You’re hearing a lot about big data these days. And big data and the technologies that store and process it, like Hadoop, aren’t just new data silos. You might be looking to integrate big data with existing enterprise information systems to gain better understanding of your business. You want to take informed action.
During this session, we’ll demonstrate how Red Hat JBoss Data Virtualization can integrate with Hadoop through Hive and provide users easy access to data. You’ll learn how Red Hat JBoss Data Virtualization:
Can help you integrate your existing and growing data infrastructure.
Integrates big data with your existing enterprise data infrastructure.
Lets non-technical users access big data result sets.
We’ll also provide typical uses cases and examples and a demonstration of the integration of Hadoop sentiment analysis with sales data.
Join Objectivity, Inc.’s VP of Product Management, Brian Clark, in a discussion of the latest trends in Big Data Analytics, defining what is Big Data and understanding how to maximize your existing architectures by utilizing NOSQL technologies to improve functionality and provide real-time results. There will be a focus on relationship analytics as well as an introduction to NOSQL data stores, object and graph databases, such as the architecture behind Objectivity/DB and InfiniteGraph.
Is your organization contemplating a migration to Office 365? Whether you are planning to move SharePoint or Exchange or whether you are planning to implement OneDrive for Business, Skype for Business, or Power BI, this session will cover many aspects of how to plan for a migration to Office 365.
Specifically, Doug Hemminger will walk us through how to:
• Inventory your current environment and make key deployment decisions about what to migrate and how to migrate it.
• Fix potential deployment blockers including how to cleanup active directory and how to get your network ready for Office 365.
• Set up Office 365 services to work for your organization, including enabling and disabling the appropriate features and services.
• Roll out Office 365 out to your users including assigning the appropriate licenses and communicating key concepts
This session will be a mix of presentation and demonstration. By the end of the session, you should have a good idea of how to plan your migration to Office 365.
D365 Finance & Operations - Data & Analytics (see newer release of this docum...Gina Pabalan
This very comprehensive white paper provides a detailed and clear overview of Microsoft's D365 Finance & Operations solutions to support Data & Analytics.
There is a newer version of this available - search SlideShare for the new version of this deck.
Big Data Analytics - Is Your Elephant Enterprise Ready?Hortonworks
Hadoop’s cost effective scalability and flexibility to analyze all data types is driving organizations everywhere to embrace big data analytics. From proof of concept to deployment across the enterprise, join Datameer and Hortonworks as we answer the ‘now what?’ when rolling out your Hadoop big data analytics project. This webinar will address critical project components such as data security, data privacy, high availability, user training and use case development.
Given at Oracle Open World 2011: Not to be confused with Oracle Database Vault (a commercial db security product), Data Vault Modeling is a specific data modeling technique for designing highly flexible, scalable, and adaptable data structures for enterprise data warehouse repositories. It has been in use globally for over 10 years now but is not widely known. The purpose of this presentation is to provide an overview of the features of a Data Vault modeled EDW that distinguish it from the more traditional third normal form (3NF) or dimensional (i.e., star schema) modeling approaches used in most shops today. Topics will include dealing with evolving data requirements in an EDW (i.e., model agility), partitioning of data elements based on rate of change (and how that affects load speed and storage requirements), and where it fits in a typical Oracle EDW architecture. See more content like this by following my blog http://kentgraziano.com or follow me on twitter @kentgraziano.
Infochimps #1 Big Data Platform for the CloudBrian Krpec
The Infochimps Platform is the simplest, fastest, and most flexible way to implement proven big data infrastructure in the cloud. Scalably and affordably ingest data from wherever you need — your in-house systems, external data feeds, data from the web, or our Data Marketplace. Make it useful with in-stream data decoration and augmentation. Store and analyze it in the best place for your application. Hadoop, NoSQL, real-time analytics — how do you tie it all together? The Infochimps Platform takes the mystery and difficulty out of big data and seamlessly integrates it with your existing environment, so you can focus on gaining business insights from your data fast.
Introduction to Microsoft’s Master Data Services (MDS)James Serra
Master Data Services is bundled with SQL Server 2012 to help resolve many of the Master Data Management issues that companies are faced with when integrating data. In this session, James will show an overview of Master Data Services 2012, including the out of the box Web UI, the highly developed Excel Add-in, and how to get started with loading MDS with your data.
Hadoop World 2011: Big Data Architecture: Integrating Hadoop with Other Enter... - Cloudera, Inc.
Recent research has pointed out the complementary nature of Hadoop and other data management solutions and the importance of leveraging existing systems, SQL, engineering, and operational skills, as well as incorporating novel uses of MapReduce to improve analytic processing. Come to this session to learn how companies optimize the use of Hadoop with other enterprise systems to improve overall analytical throughput and build new data-driven products. This session covers: ways to achieve high-performance integration between Hadoop and relational-based systems; Hadoop+NoSQL vs Hadoop+SQL architectures; high-speed, massively parallel data transfer to analytical platforms that can aggregate web log data with granular fact data; and strategies for freeing up capacity for more explorative, iterative analytics and ad hoc queries.
SnapLogic provides a data integration platform that takes integration to another level by combining the power of dynamic programming languages with standard Web interfaces to solve today's most pressing problems in application integration. SnapLogic has an intuitive visual designer that runs in your browser and connects to a highly scalable, web-based integration server that you can run on premises or in the cloud.
SQL Server Data Mining - Taking your Application Design to the Next Level - Mark Ginnebaugh
Presentation to the Silicon Valley SQL Server User Group on July 21, 2009. Microsoft MVP Peter Myers talks about Microsoft SQL Server 2005 and 2008 Data Mining and demonstrates how to develop data mining models that can be embedded into applications.
You Will Learn:
•What business problems SQL Server 2008 data mining can solve
•SQL Server’s data mining capabilities
•The data mining development cycle
•How to create, train, test and query mining models
•How to embed data mining into reports and applications
BI Leadership Forum
Wayne Eckerson and Eric Colson
Live Webcast on Sept. 24, 2012
In a collegial, fast-paced culture where change is constant and speed is paramount, managing data requires creative, out-of-the-box thinking. To meet business needs, developers and analysts iterate quickly, fail fast, and coalesce their designs after the fact to deliver maximum value. “We keep things fluid and rely on good judgment rather than rules to get things done,” says Colson. Tune into this Webcast to discover how to empower your developers and analysts to build effective solutions at the speed of business.
Discussion Points:
Can one developer really build an entire BI application?
What is the role of specialists, if any?
How do you evolve your data warehouse models quickly?
What types of rules and principles guide your development activities?
What is the relationship of your statisticians to your BI developers?
Visit http://www.bileadership.com
Manufacturers have an abundance of data, whether from connected sensors, plant systems, manufacturing systems, claims systems, or external industry and government sources. They face mounting challenges, from continually improving product quality and reducing warranty and recall costs to leveraging their supply chains efficiently. Giving the manufacturer a complete view of product and customer information, integrating manufacturing and plant floor data, as-built product configurations, and sensor data from customer use, makes it possible to analyze warranty claim information efficiently, reduce detection-to-correction time, detect fraud, and even get ahead of issues. Doing so requires a capable enterprise data hub that integrates large volumes of both structured and unstructured information. Learn how an enterprise data hub built on Hadoop provides the tools to support analysis at every level of the manufacturing organization.
A presentation from TDWI's 2009 Executive Summit in San Diego. This presentation is by Wayne Eckerson, TDWI's Director of Research. For more information on TDWI, please visit http://www.tdwi.org
In this session, Wayne Eckerson delivers an overview of his new book "Secrets of Analytical Leaders: Insights from Information Insiders." Imagine spending a day with top analytical leaders and asking any question you want. In this book, Wayne Eckerson illustrates analytical best practices by weaving his perspective with commentary from seven directors of analytics who unveil their secrets of success. With an innovative flair, Eckerson tackles a complex subject with clarity and insight.
A Strategic View of Enterprise Reporting and Analytics: The Data Funnel - Inside Analysis
The Briefing Room with Colin White and Jaspersoft
Slides from the Live Webcast on June 12, 2012
As the corporate appetite for analytics and reporting grows, companies must find a way to secure a strategic view of their information architecture. End users with varying degrees of expertise need a wide range of data and reports delivered in a timely fashion. As the audience for analytics expands, that puts pressure on IT infrastructure and staff. And now with the promise of Hadoop and MapReduce, the organization's desire for business insight becomes even more significant.
In this episode of The Briefing Room, veteran Analyst Colin White of BI Research will explain the value of being strategic with enterprise reporting. White will be briefed by Karl Van den Bergh of Jaspersoft, who will tout his company's “data funnel” concept, which is designed to strategically manage an organization's information architecture. By aligning information assets along this funnel, IT can effectively address the spectrum of analytical needs – from simple reporting to complex, ad hoc analysis – without over-taxing personnel and system resources.
The Big Picture: Big Data for the New Wave of Analytics - Inside Analysis
The Briefing Room with Neil Raden and MarkLogic
Live Webcast on Oct. 2, 2012
Understanding context is a critical success factor for any decision-maker. Getting a clear view of the big picture can help guide all kinds of important decisions. That's why many organizations are focused on weaving together structured and "unstructured" data, to create a strategic view of enterprise issues and opportunities. The answers are usually found somewhere in between a SQL query and a Google-style search.
Check out this episode of The Briefing Room to learn from veteran Analyst Neil Raden of Hired Brains, who will explain how a new breed of analytical applications can generate a wide range of targeted insights. He'll be briefed by Steve Guttman of MarkLogic, who will tout his company's Enterprise NoSQL database, which combines the durability of traditional relational databases, with the versatility of modern Big Data engines. He'll also discuss real-world examples of new applications for various industries.
Best Practices For Building and Operating A Managed Data Lake - StampedeCon 2016 - StampedeCon
This session will detail best practices for architecting, building, operating and managing an Analytics Data Lake platform. Key topics will include:
1) Defining next-generation Data Lake architectures. The de facto standard has been commodity DAS servers with HDFS, but there are now multiple solutions aimed at separating compute and storage, virtualizing or containerizing Hadoop applications, and utilizing Hadoop-compatible or embedded HDFS filesystems. This portion will explore the options available, and the pros and cons of each.
2) Data Ingest. There are many ways to load data into a Data Lake, including standardized Apache tools (Sqoop, Flume, Kafka, Storm, Spark, NiFi), standard file and object protocols (SFTP, NFS, REST, WebHDFS), and proprietary tools (e.g., Zaloni Bedrock, DataTorrent). This section will explore these options in the context of best fit to workflows; it will also look at key gaps and challenges, particularly in the areas of data formats and integration with metadata/cataloging tools.
3) Metadata & Cataloguing. One of the biggest inhibitors of successful Data Lake deployments is Data Governance, particularly in the areas of indexing, cataloguing and metadata management. It is nearly impossible to run analytics on top of a Data Lake and get meaningful & timely results without solving these problems. This portion will explore both emerging open standards (Apache Atlas, HCatalog) and proprietary tools (Cloudera Navigator, Zaloni Bedrock/Mica, Informatica Metadata Manager), and balance the pros, cons and gaps of each.
4) Security & Access Controls. Solving these challenges is key for adoption in regulation-driven industries like Healthcare & Financial Services. There are multiple Apache projects and proprietary tools to address this, but the challenge is making security and access controls consistent across the entire application and infrastructure stack and over the data lifecycle, and being able to audit this in the face of legal challenges. This portion will explore available options and best practices.
5) Provisioning & Workflow Management. The real promise of the Data Lake lies in integrating analytics workflows and tools on converged infrastructure, with shared data, and building "as a service" architectures that enable self-service data exploration and analytics for end users. This is an emerging and immature area, but this session will explore some potential concepts, tools and options to achieve this.
This will be a moderately technical session, with the above topics being illustrated by real world examples. Attendees should have basic familiarity with Hadoop and the associated Apache projects.
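The cataloguing problem called out in item 3 above comes down to keeping a registry of every data set's schema, location and lineage, so that analytics can be resolved against governed data. A minimal illustrative sketch of such a registry follows; this is a toy stand-in, not Apache Atlas or HCatalog, and every name and path in it is hypothetical:

```python
class Catalog:
    """A toy metadata catalogue: registers data sets with schema, location and lineage."""
    def __init__(self):
        self._entries = {}

    def register(self, name, schema, location, derived_from=None):
        self._entries[name] = {
            "schema": schema,
            "location": location,
            "derived_from": derived_from or [],
        }

    def lineage(self, name):
        """Walk 'derived_from' links back to the raw sources of a data set."""
        sources = []
        for parent in self._entries[name]["derived_from"]:
            sources.extend(self.lineage(parent) or [parent])
        return sources

catalog = Catalog()
catalog.register("raw_clicks", {"ts": "long", "url": "string"},
                 "hdfs:///lake/raw/clicks")
catalog.register("daily_clicks", {"day": "date", "n": "long"},
                 "hdfs:///lake/curated/daily_clicks", derived_from=["raw_clicks"])
print(catalog.lineage("daily_clicks"))  # ['raw_clicks']
```

Even this small example shows why governance is hard to retrofit: lineage only exists if every derived data set was registered with its parents at creation time.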
All Together Now: A Recipe for Successful Data Governance - Inside Analysis
The Briefing Room with David Loshin and Phasic Systems
Slides from the Live Webcast on July 10, 2012
Getting disparate groups of professionals to agree on business terminology can take forever, especially when big dollars or major issues are at stake. Many data governance programs languish indefinitely because of simple hang-ups. But a new approach has recently achieved monumental results for the United States Navy. The detailed process has since been codified and combined with a NoSQL technology that enables even the most complex data models and definitions to be distilled into simple, functional data flows.
Check out this episode of The Briefing Room to hear Analyst David Loshin of Knowledge Integrity explain why effective Data Governance requires cooperation. Loshin will be briefed by Geoffrey Malafsky of Phasic Systems who will tout his company's proprietary protocol for extracting, defining and managing critical information assets and processes. He'll explain how their approach allows everyone to be "correct" in their definitions, without causing data quality or performance issues in associated information systems. And he'll explain how their Corporate NoSQL engine enables real-time harmonization of definitions and dimensions.
Visit us at: http://www.insideanalysis.com
Bridging the Gap: Analyzing Data in and Below the Cloud - Inside Analysis
The Briefing Room with Dean Abbott and Tableau Software
Live Webcast July 23, 2013
http://www.insideanalysis.com
Today’s desire for analytics extends well beyond the traditional domain of Business Intelligence. That’s partly because business users are realizing the value of mixing and matching all kinds of data, from all kinds of sources. One emerging market driver is Cloud-based data, and the desire companies have to analyze this data cohesively with their on-premise data sets.
Register for this episode of The Briefing Room to learn from Analyst Dean Abbott, who will explain how the ability to access data in the cloud can play a critical role for generating business value from analytics. He’ll be briefed by Ellie Fields of Tableau Software who will tout Tableau’s latest release, which includes native connectors to cloud-based applications like Salesforce.com, Amazon Redshift, Google Analytics and BigQuery. She’ll also demonstrate how Tableau can combine cloud data with other data sources, including spreadsheets, databases, cubes and even Big Data.
It is a fascinating, explosive time for enterprise analytics.
It is from the position of analytics leadership that the mission will be executed and company leadership will emerge. In this information economy, the data professional sits squarely on the performance of the company and has an obligation to demonstrate the possibilities and originate the architecture, data, and projects that will deliver analytics. After all, no matter what business you’re in, you’re in the business of analytics.
The coming years will be full of big changes in enterprise analytics and Data Architecture. William will kick off the fourth year of the Advanced Analytics series with a discussion of the trends winning organizations should build into their plans, expectations, vision, and awareness now.
Accelerate Big Data Application Development with Cascading and HDP, Hortonwor... - Hortonworks
Accelerate Big Data Application Development with Cascading and HDP, webinar hosted by Hortonworks and Concurrent. Visit Hortonworks.com/webinars to access the recording.
Join Cloudian, Hortonworks and 451 Research for a panel-style Q&A discussion about the latest trends and technology innovations in Big Data and Analytics. Matt Aslett, Data Platforms and Analytics Research Director at 451 Research, John Kreisa, Vice President of Strategic Marketing at Hortonworks, and Paul Turner, Chief Marketing Officer at Cloudian, will answer your toughest questions about data storage, data analytics, log data, sensor data and the Internet of Things. Bring your questions or just come and listen!
Business Discovery and QlikView 11 keynote presentation slides, presented by John Callan, Senior Director of Product Marketing at QlikTech, at the Business Discovery London event on 22nd November 2011.
Denodo DataFest 2016: Big Data Virtualization in the Cloud - Denodo
Watch the full session: Denodo DataFest 2016 sessions: https://goo.gl/kahTgf
Many firms are adopting a “cloud first” strategy and are migrating their on-premises technologies to the cloud. Logitech is one of them. They have adopted the AWS platform and big data on the cloud for all of their analytical needs, including Amazon Redshift and S3.
In this presentation, the Principal of Big Data and Analytics team at Logitech, Avinash Deshpande will present:
• The business rationale for migrating to the cloud
• How data virtualization enables the migration
• Running data virtualization itself in the cloud
This session also includes a panel discussion with:
• Avinash Deshpande, Principal – Big Data and Analytics at Logitech
• Kurt Jackson, Platform Lead at Autodesk
• Dan Young, Chief Data Architect at Indiana University
• Paul Moxon, Head of Product Management at Denodo (as moderator)
This session is part of the Denodo DataFest 2016 event. You can also watch more Denodo DataFest sessions on demand here: https://goo.gl/VXb6M6
Actionable Data: Mastering the Hybrid Analytics Mix - Perficient, Inc.
With an increase in the adoption of cloud applications, most organizations today are in some form of hybrid state (i.e. using a combination of on-premise and cloud applications to run their business). Regardless of where the data resides, you need a complete view of the company spanning across different parts of the business, combining insightful data across both onsite and public cloud instances.
In this webinar, we looked at multiple approaches that organizations have successfully used to consolidate data from multiple cloud and on-premise applications and to perform seamless analytics across these varied data sources.
The Anywhere Enterprise – How a Flexible Foundation Opens Doors - Inside Analysis
The Briefing Room with Dr. Robin Bloor and InfiniDB
Live Webcast on August 12, 2014
Watch the archive:
https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=1e562c3a4b9e9cb9a054f0ec216d578b
Today’s organizations need all kinds of data, from a wide and growing array of sources. Marshaling all that data into one location can be difficult, even unrealistic. Increasingly, innovative companies are taking a much more distributed approach to storing and processing data. The end result is an information architecture that supports a broader range of business activities, and reduces dependence on costly data movement.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor, as he explains how a distributed approach to data management can open doors to new business opportunities. He’ll be briefed by Jim Tommaney of InfiniDB who will explain how his company’s database has the flexibility to run on-prem, in the cloud, with cluster files systems or even Hadoop’s HDFS. He’ll also show how InfiniDB can serve as a conduit to companies looking to transform their information architecture to better satisfy changing market demands.
Visit InsideAnalysis.com for more information.
The Great Lakes: How to Approach a Big Data Implementation - Inside Analysis
The Briefing Room with Dr. Robin Bloor and Think Big, a Teradata Company
Live Webcast April 7, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=4114b87441ab7b2b4c52f6b24776e5a1
The more things change in Big Data, the more they stay the same. Indeed, there are many similarities between a Hadoop-based Data Lake and today’s modern Data Warehouse. Regardless of platform, information workers must still be able to turn their assets into action quickly, without taking a hit on governance or downstream performance.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor as he explains the challenges facing organizations that embark on Big Data projects. He’ll be briefed by Rick Stellwagen of Think Big, a Teradata Company, who will outline his company’s approach to handling Big Data implementations. Rick will discuss the role of the data lake, and how timely query response is critical for reporting and analysis.
Visit InsideAnalysis.com for more information.
Smart companies know that business intelligence surfaces insights. With complex analytics, data mining and everything in between, it takes many moving parts to serve up the big picture. The key is to provide full-stack visibility into the entire BI environment, ensuring solid service and system performance.
Learn more at http://www.insideanalysis.com
Agile, Automated, Aware: How to Model for Success - Inside Analysis
The Briefing Room with David Loshin and Embarcadero
Live Webcast October 27, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=eea9877b71c653c499c809c5693eae8fe
Data management teams face some tough challenges these days. Organizations need business-driven visibility that enables understanding and awareness of enterprise data assets – without worrying about definitions and change management. But with information architectures evolving into a hybrid mix of data objects and data services built over relational databases as well as big data stores, serving up accurately defined, reusable data can become a complex issue.
Register for this episode of The Briefing Room to learn from veteran Analyst David Loshin as he explains the importance of agile, automated workflows in today’s enterprise. He’ll be briefed by Ron Huizenga of Embarcadero, who will discuss how his company’s ER/Studio suite approaches data modeling and management from a modern architecture standpoint. He will explain that unifying the way information is represented can not only eliminate the need for costly workarounds, but also foster collaboration between data architects, developers and business users.
Visit InsideAnalysis.com for more information.
First in Class: Optimizing the Data Lake for Tighter Integration - Inside Analysis
The Briefing Room with Dr. Robin Bloor and Teradata RainStor
Live Webcast October 13, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=012bb2c290097165911872b1f241531d
Hadoop data lakes are emerging as peers to corporate data warehouses. However, successful data management solutions require a fusion of all relevant data, new and old, which has proven challenging for many companies. With a data lake that’s been optimized for fast queries, solid governance and lifecycle management, users can take data management to a whole new level.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he discusses the relevance of data lakes in today’s information landscape. He’ll be briefed by Mark Cusack of Teradata, who will explain how his company’s archiving solution has developed into a storage point for raw data. He’ll show how the proven compression, scalability and governance of Teradata RainStor combined with Hadoop can enable an optimized data lake that serves as both reservoir for historical data and as a "system of record” for the enterprise.
Visit InsideAnalysis.com for more information.
Fit For Purpose: Preventing a Big Data Letdown - Inside Analysis
The Briefing Room with Dr. Robin Bloor and RedPoint Global
Live Webcast October 6, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=9982ad3a2603345984895f279e849d35
Gartner recently placed Big Data in its “trough of disillusionment,” reflective of many leaders’ struggle to prove the value of Hadoop within their organization. While the promise of enhanced data integration and enrichment is obvious, measurable results have remained elusive. This episode of The Briefing Room will outline how to successfully tie Big Data to existing business applications, preventing your next Hadoop project from being another “Big Data letdown.”
Register today to learn from veteran Analyst Dr. Robin Bloor as he discusses the importance of converging enterprise data integration with intelligence and scalability. He’ll be briefed by George Corugedo of RedPoint Global, who will provide concrete examples of how the convergence of scalable cloud platforms, ever-expanding data sources and intelligent execution can turn the Big Data hype into demonstrable business value.
Visit InsideAnalysis.com for more information.
To Serve and Protect: Making Sense of Hadoop Security - Inside Analysis
The Briefing Room with Dr. Robin Bloor and HP Security Voltage
Live Webcast September 22, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=45ece7082b1d7c2cc8179bc7a1a69ea5
Hadoop is rapidly becoming a development platform and dominant server environment, and organizations are keen to take advantage of its massively scalable – and relatively inexpensive – resources. It is not, however, without its limitations, and it often requires a contingent of complementary components in order to behave within an information architecture. One area often overlooked is security, a factor that, if not considered from the outset, can introduce great risk when putting sensitive data in Hadoop.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he discusses how security was never a design point for Hadoop and what organizations can do about it. He’ll be briefed by Sudeep Venkatesh of HP Security Voltage, who will explain the intricacies surrounding a secure Hadoop implementation. He will show how techniques like format-preserving and partial-field encryption can allow for analytics over protected data, with zero performance impact.
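The partial-field, format-preserving idea mentioned above, protecting most of a value while keeping its format and a usable fragment in the clear, can be illustrated with a toy sketch. This is not HP Security Voltage's algorithm; production format-preserving encryption uses a real cipher (e.g., NIST FF1), and the key here is purely for demonstration:

```python
import hmac, hashlib

KEY = b"demo-key-not-for-production"  # hypothetical key, illustration only

def protect_card(pan, keep_last=4):
    """Toy partial-field protection: replace the leading digits of a card number
    with pseudorandom digits derived from a keyed hash, preserving format (all
    digits, same length) and leaving the last 4 digits clear for analytics.
    Real FPE would use a proper cipher such as NIST FF1, not a hash."""
    head, tail = pan[:-keep_last], pan[-keep_last:]
    digest = hmac.new(KEY, pan.encode(), hashlib.sha256).hexdigest()
    fake_head = "".join(str(int(c, 16) % 10) for c in digest[:len(head)])
    return fake_head + tail

token = protect_card("4111111111111111")
print(len(token), token[-4:])  # 16 1111 -- length and last four digits preserved
```

Because the transform is deterministic and format-preserving, protected values can still be joined, grouped and matched in downstream analytics, which is the property the briefing highlights.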
Visit InsideAnalysis.com for more information.
The Hadoop Guarantee: Keeping Analytics Running On Time - Inside Analysis
The Briefing Room with Dr. Robin Bloor and Pepperdata
Live Webcast September 15, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=32f198185d9d0c4cf32c27bdd1498b2a
Industry researchers agree: the importance of Hadoop will continue to grow as more companies recognize the range of benefits they can reap, from lower-cost storage to better business insights. At the same time, advances in the Hadoop ecosystem are addressing many of the key concerns that have hampered adoption, including performance and reliability. As a result, Hadoop is fast becoming a first-class citizen in the world of enterprise computing.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor explain how the Hadoop ecosystem is evolving into a mature foundation for managing enterprise data. He’ll be briefed by Sean Suchter of Pepperdata, who will explain how his company’s software brings predictability and reliability to Hadoop through dynamic, policy-based controls and monitoring. He’ll show how to guarantee service-level agreements by slowing down low-priority tasks as needed. He’ll also discuss the holy grail of Hadoop: how to enable mixed workloads.
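The idea of guaranteeing service levels by slowing down low-priority work can be sketched in a few lines: grant capacity to jobs in priority order, and let low-priority jobs absorb the squeeze. This is a toy illustration of the policy concept only, not Pepperdata's software, and the job names and numbers are hypothetical:

```python
def allocate_capacity(tasks, total_slots):
    """Toy policy-based allocator: high-priority jobs get their requested slots
    first; low-priority jobs are throttled to whatever capacity remains."""
    tasks = sorted(tasks, key=lambda t: t["priority"], reverse=True)
    allocation, remaining = {}, total_slots
    for t in tasks:
        grant = min(t["requested"], remaining)
        allocation[t["name"]] = grant
        remaining -= grant
    return allocation

jobs = [{"name": "etl", "priority": 1, "requested": 60},
        {"name": "sla_report", "priority": 9, "requested": 50}]
print(allocate_capacity(jobs, 80))  # {'sla_report': 50, 'etl': 30}
```

The SLA-bound report gets its full request; the batch ETL job is throttled rather than killed, which is the essence of the mixed-workload approach described above.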
Visit InsideAnalysis.com for more information.
Special Edition with Dr. Robin Bloor
Live Webcast September 9, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=e8b9ac35d8e4ffa3452562c1d4286a975
Do the math: algebra will transform information management. Just as the relational database revolutionized the information landscape, so will a just-released, complete algebra of data overhaul the industry itself. So says Dr. Robin Bloor in his new book, the Algebra of Data, which he’ll outline in this special one-hour webcast.
Once organizations learn how to express their data sets algebraically, the benefits will be significant and far-reaching. Data quality problems will slowly subside; queries will run orders of magnitude faster; integration challenges will fade; and countless tedious jobs in the data management space will bid their farewell. But first, software companies must evolve, and that will take time.
Visit InsideAnalysis.com for more information.
The Role of Data Wrangling in Driving Hadoop Adoption - Inside Analysis
The Briefing Room with Mark Madsen and Trifacta
Live Webcast September 1, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=eb655874d04ba7d560be87a9d906dd2fd
Like all enterprise software solutions, Hadoop must deliver business value in order to be a success. Much of the innovation around the big data industry these days therefore addresses usability. While there will always be a technical side to the Hadoop equation, the need for user-friendly data management tools will keep the focus on business users. That’s why self-service data preparation, or "data wrangling," is a serious and growing trend, one which promises to move Hadoop beyond the early adopter phase and into the mainstream of business.
Register for this episode of The Briefing Room to hear veteran Analyst Mark Madsen of Third Nature explain why business users will play an increasingly important role in the evolution of big data. He’ll be briefed by Trifacta's Will Davis and Alon Bartur, who will demonstrate how Trifacta's solution empowers business users to “wrangle" data of all shapes and sizes faster and easier than ever before. They’ll discuss why a new approach to accessing and preparing diverse data is required and how it can accelerate and broaden the use of big data within organizations.
Visit InsideAnalysis.com for more information.
Ahead of the Stream: How to Future-Proof Real-Time Analytics - Inside Analysis
Business seems to move faster by the day, with the most cutting-edge companies taking advantage of real-time data streams for heavy-duty analytics. But with so much innovation happening in so many places, how can companies stay ahead of the game? One answer is to future-proof your analytics architecture with an abstraction layer that can map your business use case or workflow onto any of several leading technologies, addressing the growing range of demands in this dynamic field.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor, as he explains how a data flow architecture can harness a wide range of streaming solutions. He'll be briefed by Anand Venugopal of Impetus Technologies, who will showcase his company's StreamAnalytix platform, which was designed from the ground up to leverage multiple major streaming engines available today, including Apache Spark, Apache Storm and others. He'll demonstrate how StreamAnalytix provides enterprise-class performance while incorporating best-of-breed open-source components.
View the archive at: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=925d1e9b639b78c6cf76a1bbbf485b2b
All Together Now: Connected Analytics for the Internet of Everything - Inside Analysis
The Briefing Room with Mark Madsen and Cisco
Live Webcast August 18, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=0eff120f8b2879b582b77f4ff207ee54
Today's digital enterprises are seeing an explosion of data at the edge. The Internet of Everything is fast approaching a critical mass that will demand a sea change in how companies process data. This new world of information is widely distributed, streaming, and overall becoming too big to move. Experts predict that within two to three years, the bulk of analytic processing will take place on the fringes of information architectures. As a result, forward-thinking companies are dramatically shifting their analytic strategies.
Register for this episode of The Briefing Room to hear veteran Analyst Mark Madsen of Third Nature explain how a new era of information architectures is now unfolding, paving the way to much more responsive and agile business models. He'll be briefed by Kim Macpherson of the Cisco Data and Analytics Business Unit, who will explain how her company's platform is uniquely suited for this new, federated analytic paradigm. She'll demonstrate how edge analytics can help companies address opportunities quickly and effectively.
Visit InsideAnalysis.com for more information.
Goodbye, Bottlenecks: How Scale-Out and In-Memory Solve ETL - Inside Analysis
The Briefing Room with Dr. Robin Bloor and Splice Machine
Live Webcast August 11, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=e1b33c9d45b178e13784b4a971a4c1349
The ETL process was born out of necessity, and for decades it has been the glue between data sources and target applications. But as data growth soars and increased competition demands real-time data, standard ETL has become brittle and often unmanageable. Scaling up resources can do the trick, but it’s very costly and only a matter of time before the processes hit another bottleneck. When outmoded ETL stands in the way of real-time analytics, it might be time to consider a completely new approach.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he explains how modern, data-driven architectures must adopt an equally capable data integration strategy. He’ll be briefed by Rich Reimer of Splice Machine, who will discuss how his company solves ETL performance issues and enables real-time analytics and reports on big data. He will show that by leveraging the scale-out power of Hadoop and the in-memory speed of Spark, users can bring both analytical and operational systems together, eventually performing transformations only when needed.
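The "transformations only when needed" idea above is essentially deferred, ELT-style execution: raw rows land untouched, transformations are composed but not run, and work happens only when a query materializes results. A minimal illustrative Python sketch of the pattern (not Splice Machine's engine; all names are hypothetical):

```python
class LazyTable:
    """Raw rows are loaded untouched; transforms are recorded, not run,
    until a query materializes results (deferred, ELT-style execution)."""
    def __init__(self, rows):
        self._rows = rows
        self._transforms = []

    def transform(self, fn):
        self._transforms.append(fn)  # record the step, do not execute it
        return self

    def query(self, predicate):
        # Transformations run only now, at query time.
        out = []
        for row in self._rows:
            for fn in self._transforms:
                row = fn(row)
            if predicate(row):
                out.append(row)
        return out

raw = [{"amt": "12.50"}, {"amt": "99.99"}]
table = LazyTable(raw).transform(lambda r: {"amt": float(r["amt"])})
print(table.query(lambda r: r["amt"] > 50))  # [{'amt': 99.99}]
```

Deferring the transform means ingest stays fast and raw data stays available; the cost of transformation is paid only by queries that actually need it.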
Visit InsideAnalysis.com for more information.
The Biggest Picture: Situational Awareness on a Global Level - Inside Analysis
The Briefing Room with Dr. Robin Bloor and Modus Operandi
Live Webcast July 28, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=efc4082d9b0b0adfcd753a7435d2d6a1b
The analytic bottlenecks of yesterday need not apply today. The boundaries are also falling thanks in large part to the abundance of third-party data. The most data-driven companies these days are finding creative ways to dynamically incorporate data from within and beyond the firewall, thus building highly accurate, multidimensional views of their business, customer, competition or other subject areas.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor as he explains the magnitude of change that's occurring in the world of data, why it's happening now, and how you can take advantage. He'll be briefed by Mike Gilger and Boris Pelakh, who will showcase their company's enterprise analytics platform, which combines a range of battle-tested functionality to deliver dynamic situational awareness that can leverage a comprehensive array of data sets. They'll explain how the platform's reasoner benefits from a highly scalable rules engine, and a flexible modeling capability that can optimize data storage virtually on the fly.
Visit InsideAnalysis.com for more information.
Structurally Sound: How to Tame Your Architecture - Inside Analysis
The Briefing Room with Krish Krishnan and Teradata
Live Webcast July 21, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=602b2a8413e8719d39465f4d6291d505
Technology changes all the time, but the basic needs of the business are the same: BI and analytics. With new types of data, various analytics engines and multiple systems, giving business users seamless access to enterprise data can be a rather daunting process. One solution is to provide a complete fabric that spans the organization, touching all data points and masking the complexity behind disparate sources.
Register for this episode of The Briefing Room to learn from veteran Analyst Krish Krishnan as he explores how and why architectures have changed over the years. He’ll be briefed by Imad Birouty of Teradata, who will discuss his company’s QueryGrid, an analytics solution designed to provide access to data across all systems. He will show how QueryGrid essentially creates a logical data warehouse and enables users to leverage SQL over multiple data types.
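The logical-data-warehouse idea described here can be sketched generically (plain Python, not Teradata QueryGrid code; the stores, keys, and function names are all made up): one logical query fans out to two "systems", pushes the aggregation down to the remote side, and joins the results on a shared key.

```python
# Toy federation sketch: a relational store and a "Hadoop" log store,
# queried together as if they were one warehouse.

warehouse = {  # rows living in a relational system, keyed by customer id
    1: {"customer": "Acme", "segment": "enterprise"},
    2: {"customer": "Bolt", "segment": "smb"},
}

hadoop_logs = [  # semi-structured events living in another system
    {"customer_id": 1, "clicks": 42},
    {"customer_id": 2, "clicks": 7},
    {"customer_id": 1, "clicks": 13},
]

def federated_clicks_by_segment():
    # Push the aggregation to the "Hadoop" side first...
    totals = {}
    for event in hadoop_logs:
        cid = event["customer_id"]
        totals[cid] = totals.get(cid, 0) + event["clicks"]
    # ...then join the small result set against the warehouse dimension.
    return {warehouse[cid]["segment"]: n for cid, n in totals.items()}

print(federated_clicks_by_segment())
```

Pushing work to where the data lives and moving only the reduced result is the core trick any federation layer relies on.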
Visit InsideAnalysis.com for more information.
SQL In Hadoop: Big Data Innovation Without the Risk - Inside Analysis
The Briefing Room with Dr. Robin Bloor and Actian
Live Webcast July 14, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=bbd4395ea2f8c60a03cfefc68c7aa823
Innovation often implies risk, which is why businesses have many issues to weigh when considering change. Yet the remarkable growth of data is driving many traditional systems into the ground, forcing information workers to take a critical look at their existing tools. Technologies like Hadoop offer economical solutions to big data management, but to truly take advantage of its capabilities, organizations must modernize their infrastructure.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he explains how and why organizations should improve legacy systems. He’ll be briefed by Todd Untrecht of Actian, who will tout his company’s Actian Vortex, a SQL-in-Hadoop solution. He will show how integrating a SQL engine directly in the Hadoop cluster can lead to faster analytics and greater control, while still maintaining existing investments.
Visit InsideAnalysis.com for more information.
The Briefing Room with Dr. Robin Bloor and SYSTAP
Live Webcast June 30, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=0ff3889293f6c090483295fd7362c5a4
There's a reason why the biggest Web companies these days leverage graph technology: it is incredibly powerful for revealing a wide range of insights. Unlike other analytical databases, a graph database can very quickly identify the kinds of patterns that lead to better business decisions. Though relatively nascent in existing data centers, graph databases are proving to be well-suited for all kinds of business use cases, from clustering and hypothesis generation to failure detection and cyber analytics.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he discusses how semantic technology fits in the spectrum of database and discovery solutions. He’ll be briefed by Brad Bebee of SYSTAP, who will showcase his company’s Blazegraph products and Mapgraph technology. He will explain how SYSTAP’s approach overcomes the challenge of scalability, and how graph technology’s powerful data management capabilities can deliver better enterprise performance and analytics using GPUs and other approaches.
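The kind of pattern matching graph databases excel at can be illustrated with a toy example (plain Python over an adjacency map, not Blazegraph or SPARQL): find pairs of nodes that share at least two neighbors, a simple link pattern of the sort used in fraud and cyber analytics.

```python
# Toy graph pattern match: which pairs of nodes share 2+ neighbors?
from collections import defaultdict

edges = [("a", "x"), ("a", "y"), ("b", "x"), ("b", "y"), ("c", "z")]

# Build an undirected adjacency map.
neighbors = defaultdict(set)
for u, v in edges:
    neighbors[u].add(v)
    neighbors[v].add(u)

def linked_pairs(min_shared=2):
    # Compare neighbor sets pairwise; set intersection does the matching.
    nodes = sorted(neighbors)
    return [(u, v) for i, u in enumerate(nodes) for v in nodes[i + 1:]
            if len(neighbors[u] & neighbors[v]) >= min_shared]

print(linked_pairs())  # [('a', 'b'), ('x', 'y')]
```

A graph engine makes this kind of traversal-and-intersect query fast at scale; the sketch only shows the shape of the question being asked.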
Visit InsideAnalysis.com for more information.
A Revolutionary Approach to Modernizing the Data Warehouse - Inside Analysis
Hot Technologies with Rick Sherman, Dr. Robin Bloor and Snowflake Computing
Live Webcast June 25, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=e6e6de6cdfa8926e7a9d52e099a1a08e2
Enterprise software tends to advance in one of two ways: evolutionary and revolutionary. Evolutionary advances happen through incremental improvements made to an existing code base over a long period of time. Revolutionary advances happen when a new solution is designed from scratch, breaking cleanly from legacy approaches to take advantage of technology innovations that can span from hardware to software and methodologies.
Register for this episode of Hot Technologies to hear veteran analysts Rick Sherman of Athena IT Solutions and Dr. Robin Bloor along with Bob Muglia, CEO of Snowflake Computing, explain how a confluence of advances in the data world have opened up new doors for revolutionary advances in data warehousing. They will discuss new technology innovations and how they can be used to create data warehouses with the power, flexibility, and resiliency that modern enterprises need without the complexities and latencies inherent to traditional approaches.
Visit InsideAnalysis.com for more information.
The Maturity Model: Taking the Growing Pains Out of Hadoop - Inside Analysis
The Briefing Room with Rick van der Lans and Think Big, a Teradata Company
Live Webcast on June 16, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=197f8106531874cc5c14081ca214eaff
Hadoop is arguably one of the most disruptive technologies of the last decade. Once lauded solely for its ability to transform the speed of batch processing, it has marched steadily forward and promulgated an array of performance-enhancing accessories, notably Spark and YARN. Hadoop has evolved into much more than a file system and batch processor, and it now promises to stand as the data management and analytics backbone for enterprises.
Register for this episode of The Briefing Room to learn from veteran Analyst Rick van der Lans, as he discusses the emerging roles of Hadoop within the analytics ecosystem. He’ll be briefed by Ron Bodkin of Think Big, a Teradata Company, who will explore Hadoop’s maturity spectrum, from typical entry use cases all the way up the value chain. He’ll show how enterprises that already use Hadoop in production are finding new ways to exploit its power and build creative, dynamic analytics environments.
Visit InsideAnalysis.com for more information.
Rethinking Data Availability and Governance in a Mobile World - Inside Analysis
The Briefing Room with Malcolm Chisholm and Druva
Live Webcast on June 9, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=baf82d3835c5dfa63202dcbe322a3ad7
The emergence of the mobile workforce has left an indelible mark on the enterprise; every employee is now mobile, and business data continues to be dispatched to the far reaches of the enterprise. While this has added enormous opportunity for increased productivity, it has also muddied the waters when it comes to controlling and protecting valuable data assets. As companies quickly evolve to address the new set of challenges posed by this shift in data usage, IT must ensure that all data, no matter where it’s generated or stored, is available and governed just as if it were still safely behind the corporate firewall.
Register for this episode of The Briefing Room to hear veteran Analyst Malcolm Chisholm as he explains the myriad challenges that mobile data introduces when addressing regulations and compliance needs, requiring new approaches to data governance. He’ll be briefed by Dave Packer of Druva, who will outline his company’s converged data protection strategy, which brings data center class capabilities to backup, availability and governance for the mobile workforce. He will share strategies to meet regional data residency, data recovery, legal hold and eDiscovery requirements and more.
Visit InsideAnalysis.com for more information.
DevOps and Testing slides at DASA Connect - Kari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference on 30 May 2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We closed with a lovely workshop in which the participants explored different ways to think about quality and testing in the different parts of the DevOps infinity loop.
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality - Inflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Connector Corner: Automate dynamic content and events by pushing a button - DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... - UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
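The active-learning idea behind "train smarter, not harder" is to route only the model's least confident predictions to a human labeler. A generic sketch of that selection step (plain Python with made-up confidence scores; this illustrates uncertainty sampling in general, not UiPath's API):

```python
# Uncertainty sampling: label the documents the model is least sure about.
# The confidence scores below are invented for illustration.

unlabeled = {
    "doc1": 0.97,   # model confidence per document
    "doc2": 0.51,
    "doc3": 0.88,
    "doc4": 0.55,
}

def pick_for_labeling(scores, batch_size=2):
    # Least confident first; confident predictions pass straight through.
    return sorted(scores, key=lambda d: scores[d])[:batch_size]

print(pick_for_labeling(unlabeled))  # ['doc2', 'doc4']
```

Each labeled batch retrains the model, so human effort concentrates exactly where it moves accuracy the most.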
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -... - DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Elevating Tactical DDD Patterns Through Object Calisthenics - Dorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Transcript: Selling digital books in 2024: Insights from industry leaders - T... - BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
JMeter webinar - integration with InfluxDB and Grafana - RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
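For context on the integration: a JMeter backend listener ships each sample to InfluxDB using InfluxDB's line protocol (measurement, tags, fields, timestamp). A minimal sketch of that wire format, with illustrative field names rather than JMeter's exact schema:

```python
# Build one InfluxDB line-protocol record, the format a JMeter backend
# listener writes and Grafana later queries. Field names are illustrative.

def to_line_protocol(measurement, tags, fields, ts_ns):
    # line protocol: measurement,tag1=v1 field1=v1,field2=v2 timestamp
    tag_part = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_part = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_part} {field_part} {ts_ns}"

line = to_line_protocol(
    "jmeter", {"transaction": "login"}, {"avg": 231, "count": 10},
    1429185600000000000)
print(line)  # jmeter,transaction=login avg=231,count=10 1429185600000000000
```

Grafana then reads these points from InfluxDB and renders the response-time and throughput dashboards shown in the webinar.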
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... - Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. Fostering a culture of innovation, however, takes real work: vision, leadership and a willingness to take measured risks. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at every stage.
3.
- Reveal the essential characteristics of enterprise software, good and bad
- Provide a forum for detailed analysis of today's innovative technologies
- Give vendors a chance to explain their product to savvy analysts
- Allow audience members to pose serious questions... and get answers!
Twitter Tag: #briefr The Briefing Room
4. December: Innovators
January: Big Data
February: Analytics
March: Discovery
5.
- Cloud computing has come a long way and can now offer an array of hosted services: software (SaaS), platform (PaaS), infrastructure (IaaS), as well as other services and resources (storage, security, API, desktop, etc.).
- Cloud services can be deployed as public, private or hybrid, according to needs.
- From an IT perspective, cloud computing can provide the agility to add capabilities and scale on the fly with virtually no capital or human investment.
- From a business perspective, the cloud can offer benefits such as unified data access, faster time-to-value and self-service analytics.
6. Wayne Eckerson has been a thought leader in the data warehousing, business intelligence and performance management fields since 1995. He has conducted numerous in-depth research studies and is the author of the best-selling book "Performance Dashboards: Measuring, Monitoring, and Managing Your Business." He is a noted keynote speaker and blogger, and he consults and conducts workshops on business analytics, performance dashboards, and business intelligence, among other topics. For many years, Wayne served as director of education and research at The Data Warehousing Institute (TDWI). Wayne is also a principal consultant at BI Leader Consulting, a founder of BI Leadership Forum, and director of research at TechTarget. He can be reached at weckerson@bileader.com.
7.
- Birst offers a SaaS-based, multi-tenant BI platform. It can also be deployed on-premise.
- The Birst solution is capable of unifying siloed technologies, automating data management and providing agile enterprise-class analytics.
- The Birst approach enables business users to manage and add new data sources, create custom dashboards and collaborate across the organization, without dependency on IT.
8. Brad Peters is the CEO and co-founder of Birst. Brad has spent the last 10 years building analytics products and solutions. Prior to Birst, he helped found and later led the Analytics product line at Siebel Systems, which forms the basis of Oracle's current OBIEE product family. Recognizing the limitations of enterprise analytics offerings and the revolutionary power of Cloud technologies, Brad founded Birst in 2005. Brad started his career as an investment banker for Morgan Stanley in the New York M&A practice. He regularly blogs for Forbes.com, where he writes about Cloud and business software related issues. Brad received a BS and MS in electrical engineering and computer science from UC Berkeley, where he was a National Science Foundation Fellow, and an MBA from Harvard Business School.
9. ALL GROWN UP: MATURATION OF ANALYTICS IN THE CLOUD
Brad Peters, CEO and Co-Founder
November 6, 2012
12. THE END-TO-END ANALYTICS PROBLEM
- Connect to Source: Connect securely; Extract data (full, incremental); Consolidate sources; Cleanse data
- Denormalize Data: Produce "aggregatable" data; Create/flatten hierarchies for roll-ups; Snapshots; Slowly changing attributes
- Create Dimensional Model: Identify what can be aggregated; Manage changes and history
- Create Business Model: Semantic layer; Allows business users to create queries without knowing SQL or the underlying physical structure
- Distribute Applications: Publish pre-digested data (reports); Create interactive analysis (dashboards); Ad hoc/visualization; Embed in apps
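Two of the stages on this slide, flattening a hierarchy for roll-ups and then serving an aggregate query off the flattened rows, can be sketched in a few lines (toy Python, not Birst code; the data and names are invented):

```python
# "Create/flatten hierarchies for roll-ups": attach the roll-up key to
# every fact row so aggregate queries need no join at query time.

parents = {"sf": "us-west", "la": "us-west", "nyc": "us-east"}  # city -> region

sales = [{"city": "sf", "amount": 10},
         {"city": "la", "amount": 20},
         {"city": "nyc", "amount": 5}]

def denormalize(rows):
    # Copy each row with its region stamped in (the flattening step).
    return [dict(r, region=parents[r["city"]]) for r in rows]

def rollup(rows, key):
    # Aggregate the flattened rows by any stamped-in dimension.
    out = {}
    for r in rows:
        out[r[key]] = out.get(r[key], 0) + r["amount"]
    return out

print(rollup(denormalize(sales), "region"))
```

The point of the pipeline is that the expensive work (denormalizing) happens once, so distribution-stage queries stay simple and fast.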
13. CLOUD 1.0 DISAPPOINTED
- Connect to Source: Connect securely; Extract data (upload data; full, incremental)
- Distribute Applications: Publish pre-digested data (reports); Create interactive analysis (dashboards); Ad hoc/visualization; Embed in apps
15. WHAT THIS GIVES YOU
- Speed: Rapid deployment; Quick iteration
- Flexibility: No limits; Any app; Multiple deployments; Multiple access models
- Power: Full analytic capabilities
17. END-TO-END CAPABILITIES
- Connect to Source: Connect securely; Extract data (full, incremental); Consolidate sources; Cleanse data
- Denormalize Data: Produce "aggregatable" data; Create/flatten hierarchies for roll-ups; Snapshots; Slowly changing attributes
- Create Dimensional Model: Identify what can be aggregated; Manage changes and history
- Create Business Model: Semantic layer; Allows business users to create queries without knowing SQL or the underlying physical structure
- Distribute Applications: Publish pre-digested data (reports); Create interactive analysis (dashboards); Ad hoc/visualization; Embed in apps
18. ABOUT BIRST
Key Birst Facts:
- #1 Cloud BI Provider: Market & Product Leader
- More than 1,000 organizations rely on Birst
- Founded in 2005
20. BI in the Cloud: Adoption Trends
Wayne Eckerson
www.bileadership.com
21. Using the Cloud for any part of your BI program?
Yes: 36%
No: 64%
BI Leadership Forum Survey of 112 BI Directors, June 2011
22. What type of Cloud Infrastructure?
Public Cloud: 53%
Private Cloud: 18%
Hybrid: 30%
23. Top Reason to use the Cloud?
Speed of implementation: 30%
Reduced HW/SW maintenance: 30%
Flexibility: 19%
Cost: 11%
Performance: 5%
Other: 5%
24. Planning to Increase in Next 12 Months?
Increase: 65%
Stay the same: 16%
Not sure: 16%
Decrease: 3%
25. Top Reason NOT to use the Cloud?
Security: 33%
Other: 32%
Performance: 9%
No executive support: 7.2%
Corporate policy: 5.8%
Vendor lock-in: 4%
Reliability: 3%
Difficult to use: 3%
Pricing too complex: 1%
26. Questions
BI has proven more challenging to do in the cloud than other software segments, largely because it involves custom development. In other words, you can't purchase a one-size-fits-all BI package. While some cloud BI vendors have gone the package route, you have not.
- Do you plan to deliver BI packages for various industries or functional areas?
- How do you automate the development process to deliver functional applications over the cloud?
- How does your business model differ from cloud BI vendors offering packaged BI cloud applications? In other words, how is it possible to make money building custom applications using a subscription pricing model?
www.bileader.com
27. Questions
- With the advent of your BI appliance, are you a cloud BI vendor with some on-premise software, or are you an on-premise vendor with some cloud software?
- Is it possible for users to create a hybrid BI environment, where some processing is done in the cloud and some on premise? What types of processing are best suited to each platform?
28. Bio/Contact Information
- Thought leader in the BI field
- Founder, BI Leadership Forum
- Director of Research, TechTarget
- Former director of research at The Data Warehousing Institute
- Author
Wayne W. Eckerson
weckerson@bileadership.com
www.bileadership.com