The Briefing Room with Rick Sherman and Actian
Slides from the Live Webcast on Aug. 28, 2012
The appetite for high-powered analytics is greater than ever these days, with increasing numbers of business users clamoring for insights. At the same time, source systems are proliferating, and the nature of questions being asked is getting more complex. Indeed, the entire landscape of analytics is changing in fundamental ways. How can your organization stay ahead of the curve?
Register for this episode of The Briefing Room to learn from veteran Analyst Rick Sherman how a variety of technologies can change the manner in which analytics are done. He'll be briefed by Fred Gallagher of Actian, who will explain how his company's Vectorwise technology leverages vector processing to expedite even the most complex queries when compared to traditional columnar or relational databases.
For more information visit: http://www.insideanalysis.com
Demand for BI continues to grow in 2012 and for the foreseeable future. The demand for BI/analytics has driven companies to invest in "Big Data" and to try to outdo competitors by exploiting the advantages hidden in data patterns.
Embedded Analytics: The Next Mega-Wave of Innovation - Inside Analysis
Could embedded analytics change the way consumers do business? A whole range of Web-based and traditional software providers are now embedding analytical power into their applications such that users can do more complex analysis of their data. The use cases span such industries as eCommerce, telecom, security and other such data-intensive verticals. As a result of this trend, the providers and their customers can gain greater insights about their businesses and thus improve decisions.
Check out this episode of The Briefing Room to hear Analyst John Myers of EMA explain how delivering embedded analytics can expand the value of analysis to customers and partners all over the world, while raising the bar for how business is done. Myers will be briefed by Susan Davis of Infobright, who will tout her company’s success in enabling solution providers to deliver real-time analytical capabilities to their customers.
Left Brain, Right Brain: How to Unify Enterprise Analytics - Inside Analysis
The Briefing Room with Robin Bloor and Teradata
Live Webcast on Jan. 29, 2013
Despite its name, effective Data Science requires a certain amount of artistic flair. Analysts must be creative about how and where they find the insights that will drive business value. One classic roadblock to that kind of frictionless process? Programming. Not everyone can code Java, which makes the unstructured domain of Hadoop quite challenging for the average business analyst.
Check out the slides from this episode of the Briefing Room to hear veteran Analyst Dr. Robin Bloor explain how a new generation of analytical platforms will solve the complexity of unifying structured and unstructured data. He'll be briefed by Steve Wooledge of Teradata Aster who will tout his company's Big Data Appliance, which leverages the SQL-H bridge, an innovation designed to connect Hadoop with SQL.
Visit: http://www.insideanalysis.com
Revolution R Enterprise - 100% R and More Webinar Presentation - Revolution Analytics
R users already know why the R language is the lingua franca of statisticians today: because it's the most powerful statistical language in the world. Revolution Analytics builds on the power of open source R, and adds performance, productivity and integration features to create Revolution R Enterprise. In this presentation, author and blogger David Smith will introduce the additional capabilities of Revolution R Enterprise.
White Paper - Data Warehouse Documentation Roadmap - David Walker
All projects need documentation and many companies provide templates as part of a methodology. This document describes the templates, tools and source documents used by Data Management & Warehousing. It serves two purposes:
• To serve as a checklist for projects that use other methodologies or create their own set of documents, ensuring the documentation covers the essential areas for describing the data warehouse.
• To demonstrate our approach to our clients by describing the templates and deliverables that are produced.
Documentation, methodologies and templates are inherently both incomplete and flexible. Projects may wish to add, change, remove or ignore any part of any document. Some may also believe that aspects of one document would sit better in another. If this is the case then users of this document and these templates are encouraged to change them to fit their needs.
Data Management & Warehousing believes that the methodology for building a data warehouse should be a series of guides and checklists. This frees small teams of relatively skilled developers to cover all aspects of the project while dealing with the specific issues of their environment and delivering exceptional solutions, rather than imposing a rigid methodology whose purpose is to ensure that large teams of relatively unskilled staff meet a minimum standard.
The Briefing Room with John Myers and Alteryx
Live Webcast on Nov. 27, 2012
What's the biggest challenge with Big Data so far? By and large, it's the difficulty of delivering the right data in a timely fashion, and in a way that decision-makers can easily use. That's quickly changing because of the tremendous demand for tools that even non-technical business users can effectively employ. Software vendors large and small are designing capabilities that provide easier access and more intuitive ways of working with Big Data. Even so, the effort to make Big Data useful is very much a work in progress.
Check out this episode of The Briefing Room to hear veteran Analyst John Myers of EMA explain why Big Data poses challenges and opportunities for professionals looking to better understand their markets, prospects and customers. Myers will be briefed by Paul Ross of Alteryx, who will tout his company's efforts to "humanize" Big Data using its strategic analytics platform, designed to: 1) facilitate access to Big Data, especially in combination with other data sets; 2) give analysts an intuitive, workflow-based approach for building the targeted analytics their business needs; and 3) make the consumption of these analytics by decision-makers as simple as using the apps they use at home.
Visit: http://www.insideanalysis.com
Slides: Moving from a Relational Model to NoSQL - DATAVERSITY
Businesses are quickly moving to NoSQL databases to power their modern applications. However, a technology migration involves risk, especially if you have to change your data model. What if you could host a relatively unmodified RDBMS schema on your NoSQL database, then optimize it over time?
We’ll show you how Couchbase makes it easy to:
• Use SQL for JSON to query your data and create joins
• Optimize indexes and perform HashMap queries
• Build applications and analysis with NoSQL
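To make the migration idea above concrete, here is a minimal, hypothetical Python sketch (the table, field names, and key format are invented for illustration, and no Couchbase SDK calls are shown) of how a relational row can be carried over as a JSON document with its schema initially unmodified, then optimized later:

```python
import json

# Hypothetical example: a row from a relational "customers" table carried
# over to a document store as JSON, with the schema initially unmodified.
relational_row = {"id": 42, "name": "Acme Corp", "country": "US"}

# The document key mimics the relational primary key; the body keeps the
# same column names, so existing queries need minimal rewriting and the
# document can be denormalized/optimized incrementally over time.
doc_key = "customer::{}".format(relational_row["id"])
document = json.dumps(relational_row)

print(doc_key)  # customer::42
```

Because the document mirrors the relational schema one-to-one, a SQL-for-JSON query over it can stay close to the original SQL while the model evolves.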
Accelerating Data Lakes and Streams with Real-time Analytics - Arcadia Data
As organizations modernize their data and analytics platforms, the data lake concept has gained momentum as a shared enterprise resource for supporting insights across multiple lines of business. The perception is that data lakes are vast, slow-moving bodies of data, but innovations like Apache Kafka for streaming-first architectures put real-time data flows at the forefront. Combining real-time alerts and fast-moving data with rich historical analysis lets you respond quickly to changing business conditions with powerful data lake analytics to make smarter decisions.
Join this complimentary webinar with industry experts from 451 Research and Arcadia Data who will discuss:
- Business requirements for combining real-time streaming and ad hoc visual analytics.
- Innovations in real-time analytics using tools like Confluent’s KSQL.
- Machine-assisted visualization to guide business analysts to faster insights.
- Elevating user concurrency and analytic performance on data lakes.
- Applications in cybersecurity, regulatory compliance, and predictive maintenance on manufacturing equipment that benefit from streaming visualizations.
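As a rough illustration of the kind of continuous query KSQL expresses over a stream, here is a toy Python sketch (this is not KSQL itself; the event data, field names, and window size are invented) of a tumbling-window count:

```python
from collections import defaultdict

# A toy stand-in for the kind of streaming aggregation KSQL expresses as:
#   SELECT page, COUNT(*) FROM clicks
#   WINDOW TUMBLING (SIZE 60 SECONDS) GROUP BY page;
def tumbling_window_counts(events, window_seconds=60):
    """events: iterable of (timestamp_seconds, key) pairs.
    Returns {(window_start, key): count} for fixed, non-overlapping windows."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_seconds)  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "home"), (30, "home"), (65, "pricing"), (70, "home")]
print(tumbling_window_counts(events))
# {(0, 'home'): 2, (60, 'pricing'): 1, (60, 'home'): 1}
```

A real deployment would run this continuously over a Kafka topic; the sketch only shows the windowing arithmetic.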
Information Management: Answering Today’s Enterprise Challenge - Bob Rhubart
As presented by George Lumpkin at OTN Architect Day, Redwood Shores, CA, 7/22/09.
Find an OTN Architect Day event near you: http://www.oracle.com/technology/architect/archday.html
Interact with Architect Day presenters and participants on Oracle Mix: https://mix.oracle.com/groups/15511
Estimating the Total Costs of Your Cloud Analytics Platform - DATAVERSITY
Organizations today need a broad set of enterprise data cloud services with key data functionality to modernize applications and utilize machine learning. They need a platform designed to address multi-faceted needs by offering multi-function Data Management and analytics to solve the enterprise’s most pressing data and analytic challenges in a streamlined fashion. They need a worry-free experience with the architecture and its components.
When Worlds Collide: Intelligence, Analytics and Operations - Inside Analysis
The Briefing Room with Shawn Rogers and Composite Software
Slides from the Live Webcast on May 15, 2012
Everyone wants more data these days, though often for different reasons. Business analysts, data scientists and front-line workers all know the value of having that extra piece of information. The big question remains -- how can all these needs be supported without taxing IT and without breaking the bank? And how can the worlds of traditional Business Intelligence, Big Data Analytics and Transaction Systems combine to improve business outcomes?
In this episode of The Briefing Room, veteran Analyst Shawn Rogers of Enterprise Management Associates explains what is needed to take advantage of today's hybrid data ecosystem. He'll be briefed by Bob Eve of Composite Software, who will explain how innovative enterprises are using data virtualization to gain insight across these worlds with greater agility and at lower cost.
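The core idea behind data virtualization can be sketched in a few lines of Python (the sources, field names, and values are invented for illustration): a single logical query joins records that physically live in different systems, without first copying them into one store.

```python
# Hypothetical sources: one set of rows from, say, a data warehouse, and a
# lookup from, say, a CRM web service. Data virtualization presents them as
# one queryable view; here we join them on the fly in memory.
warehouse_sales = [{"cust_id": 1, "total": 250.0}]
crm_contacts = {1: {"name": "Dana", "region": "EMEA"}}

def virtual_join(sales, contacts):
    """Enrich each sales row with contact attributes, joined on cust_id."""
    return [{**row, **contacts.get(row["cust_id"], {})} for row in sales]

print(virtual_join(warehouse_sales, crm_contacts))
# [{'cust_id': 1, 'total': 250.0, 'name': 'Dana', 'region': 'EMEA'}]
```

A virtualization layer does this federation, pushdown, and caching for you; the sketch only shows the logical result the analyst sees.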
For more information visit: http://www.insideanalysis.com
Watch us on YouTube: http://www.youtube.com/playlist?list=PL5EE76E2EEEC8CF9E
Slides: Why You Need End-to-End Data Quality to Build Trust in Kafka - DATAVERSITY
By adopting streaming architectures like Apache Kafka as a way to ingest and move large amounts of data very quickly, organizations are making major investments to access real-time data – and fundamentally changing how they do business. However, the advantages of Kafka can quickly be outweighed by the threat of poor Data Quality. Without Data Quality, all of the time and resources spent in building a new framework will fail to return the benefits that a Kafka platform offers.
Join Infogix’s Jeff Brown as he shares how data trust in your Kafka streaming framework is achievable when you put the proper validations and Data Quality components in place.
In this webinar, you’ll learn:
• Why organizations are moving to a streaming-based architecture
• What challenges are being faced when adopting Kafka messages as a new system-to-system communication method
• How to build data trust within your organization and its streaming framework
• Key directions on how to reconcile, balance, validate, and apply Data Quality to your streaming Data Architecture
• What customers are saying about their Kafka investment and how they’re working with Infogix to deliver data trust
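As a hedged illustration of the validation and balancing checks described above (the field names, rules, and totals are invented, and no Kafka client is involved), a minimal Python sketch might look like:

```python
# Sketch of in-stream Data Quality checks: validate each message before
# trusting it downstream, then reconcile the batch against a control total
# from the source system (a "balance" check).
def validate_message(msg, required=("order_id", "amount")):
    """Return a list of rule violations; an empty list means the message passes."""
    errors = ["missing {}".format(f) for f in required if f not in msg]
    if "amount" in msg and not isinstance(msg["amount"], (int, float)):
        errors.append("amount not numeric")
    return errors

def reconcile(batch, expected_total):
    """Do the amounts of the valid messages sum to the source's control total?"""
    actual = sum(m["amount"] for m in batch if not validate_message(m))
    return actual == expected_total

batch = [{"order_id": 1, "amount": 10.0}, {"order_id": 2, "amount": 5.0}]
print(reconcile(batch, 15.0))  # True
```

In a real streaming framework, the same rules would run per message (or per micro-batch) as data flows through the topic, with failures routed to an error stream.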
MLOps - Getting Machine Learning Into Production - Michael Pearce
Creating autonomy and self-sufficiency by giving people what they need in order to do the things they need to do! What gets in the way, and how can we overcome those barriers? How do we get started quickly, effectively and safely? We'll come together to look at what MLOps entails, some of the tools available and what common MLOps pipelines look like.
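One common MLOps pipeline shape is: train, evaluate against a quality gate, and only then promote to production. A minimal, hypothetical Python sketch (the stage functions and threshold are invented stand-ins for real jobs):

```python
# Toy MLOps pipeline: each stage is a callable so it can be swapped for a
# real training job, a test-set evaluation, or a model-registry promotion.
def run_pipeline(train_fn, eval_fn, deploy_fn, min_accuracy=0.9):
    model = train_fn()
    accuracy = eval_fn(model)
    if accuracy >= min_accuracy:   # quality gate: block bad models
        deploy_fn(model)
        return "deployed"
    return "rejected"

# Stand-in stages for illustration:
result = run_pipeline(lambda: "model-v1", lambda m: 0.93, lambda m: None)
print(result)  # deployed
```

Real pipelines add versioning, data validation, and monitoring around these stages, but the gate-before-deploy structure is the same.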
The Agile Analyst: Solving the Data Problem with Virtualization - Inside Analysis
The Briefing Room with Radiant Advisors and Cisco
Live Webcast Jan. 21, 2014
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=05e9d4ccbd2505ce15bc8de699f9c961
Today’s business analyst needs data from all kinds of places: the data warehouse, data marts, web services as well as local and departmental files and spreadsheets. The fact is, even seasoned analysts typically spend more than half their time hunting and gathering data, which impedes analytical insights and limits time to value. Increasingly, innovative organizations are turning to data virtualization as a faster path to analytics, thus expediting business impact.
Register for this episode of The Briefing Room to hear Analysts Lindy Ryan and John O'Brien of Radiant Advisors explain how analytical sandboxes and data virtualization can enable true analytic agility. They will be briefed by Marc Breissinger of Cisco's Data Virtualization Business Unit, who will tout his company’s upcoming analytic platform Data Collage, a desktop tool designed for analysts who need agile access to enterprise data. He will discuss how Data Collage allows users to easily combine data and accelerate the development of new analytics.
Visit InsideAnalysis.com for more information.
Enabling Flexible Governance for All Data Sources - Inside Analysis
The Briefing Room with Robin Bloor and Birst
Live Webcast on Feb. 5, 2013
All the effort that goes into data governance can quickly be lost if effective guard rails aren't in place. However, end users invariably need additional data sets in order to get a complete picture of what's happening. All too often, some or all of those additional data sources have not yet run the gauntlet of governance. Striking a balance between core and contextual data can help ensure that your business stays on top of opportunities without straying from the path.
Check out this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor, who will explain the nuances of integrating governed and ungoverned data in ways that business users can easily leverage. He'll be briefed by Brad Peters of Birst who will demonstrate how managed data mashups can provide the kind of flexibility and agility that can lead to valuable insights. He'll explain how Birst's architecture can significantly lighten the load on IT without sacrificing data integrity, security or governance.
Visit: http://www.insideanalysis.com
Master Data Management (MDM) is one of the hot technology areas striving to solve the age-old data quality and data management problems of master data such as Customer, Product, and Chart of Accounts (COA). Of late, given the ever-increasing capabilities of hardware, global single instances of packaged applications, and mergers and acquisitions, it has become apparent that the data quality problems associated with master data continue to worsen. It is in this context that MDM solutions try to address the management of master data with robust data quality capabilities. The Trading Community Architecture (TCA) framework is Oracle's answer to the problem of managing customer data, and of late TCA has evolved to manage much more: location data, supplier data, citizen data, and so on. The objective of this session is to provide an overview of Master Data Management (MDM) and Oracle's Trading Community Architecture (TCA), and to show how TCA can be used to model the customer data in an enterprise. This is an entry-level session, and anyone with a keen interest in learning what MDM and TCA are can attend. Attendees will learn the basics of MDM, MDM for Customer, and Oracle's Trading Community Architecture (TCA); learn about the importance of MDM to an enterprise; and take a brief look at TCA's logical data model and the power and flexibility of that model for customer data.
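To illustrate the kind of master-data problem the session covers (this sketch is hypothetical and is not Oracle TCA; the record fields and the matching rule are invented), here is a toy Python de-duplication that merges duplicate customer records into a single master record:

```python
# Toy MDM matching: group records by a normalized name and keep the most
# complete record as the "golden" master. Real MDM uses far richer match
# rules (addresses, identifiers, fuzzy scoring) and survivorship policies.
def normalize(name):
    return " ".join(name.lower().replace(",", " ").replace(".", " ").split())

def merge_duplicates(records):
    masters = {}
    for rec in records:
        key = normalize(rec["name"])
        best = masters.get(key)
        completeness = sum(v is not None for v in rec.values())
        if best is None or completeness > sum(v is not None for v in best.values()):
            masters[key] = rec  # survivorship: keep the most complete record
    return list(masters.values())

records = [
    {"name": "ACME Corp.", "phone": None},
    {"name": "acme corp", "phone": "555-0100"},
]
print(merge_duplicates(records))  # one master record, with the phone number
```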
Data Works Berlin 2018 - Worldpay - PCI Compliance - David Walker
A presentation from the Data Works conference in 2018 that looks at how Worldpay, a major payments provider, deployed a secure Hadoop cluster to meet business requirements and, in the process, became one of the few fully PCI-certified clusters in the world.
Presentation slides of Dr. Jarkko Suhonen at X International Interdisciplinary Scientific Research Congress (X CIC), June 12-14, 2014, Santo Domingo, Dominican Republic.
Research in Multimorbidity: The Role of Clinical Practice Guidelines… - GuíaSalud
Presentation given by Alexandra Prados Torres (EpiChron chronic disease research group) of the IACS at the GuíaSalud scientific conference "Guías de Práctica Clínica y Pluripatología", Madrid, February 21, 2013.
A Strategic View of Enterprise Reporting and Analytics: The Data Funnel - Inside Analysis
The Briefing Room with Colin White and Jaspersoft
Slides from the Live Webcast on June 12, 2012
As the corporate appetite for analytics and reporting grows, companies must find a way to secure a strategic view of their information architecture. End users with varying degrees of expertise need a wide range of data and reports delivered in a timely fashion. As the audience for analytics expands, that puts pressure on IT infrastructure and staff. And now with the promise of Hadoop and MapReduce, the organization's desire for business insight becomes even more significant.
In this episode of The Briefing Room, veteran Analyst Colin White of BI Research will explain the value of being strategic with enterprise reporting. White will be briefed by Karl Van den Bergh of Jaspersoft, who will tout his company's “data funnel” concept, which is designed to strategically manage an organization's information architecture. By aligning information assets along this funnel, IT can effectively address the spectrum of analytical needs – from simple reporting to complex, ad hoc analysis – without over-taxing personnel and system resources.
Technically Speaking: How Self-Service Analytics Fosters Collaboration - Inside Analysis
The Briefing Room with Wayne Eckerson and Tableau Software
Achieving self-service analytics requires tight collaboration between business users and their technical counterparts. Data sources, interfaces, business rules and governance guard rails must be designed to accommodate particular topical domains and levels of expertise. The process of creating such customized solutions tends to result in very productive collaboration, not just across the IT-business boundary, but also among business users and IT professionals alike.
Check out this episode of The Briefing Room to hear veteran Analyst Wayne Eckerson explain how self-service solutions should be designed and implemented. He'll be briefed by Ellie Fields of Tableau Software who will discuss how to create, adjust and share visualizations that can help you understand and communicate your information effectively.
http://www.insideanalysis.com
At the Tipping Point: Considerations for Cloud BI in a Multi-platform BI Ente... - Inside Analysis
The Briefing Room with Wayne Eckerson and Birst
Live Webcast on March 4, 2014
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=573944bdf3e01bfb977fa9f3d1d623c2
At the Tipping Point: Considerations for Cloud BI in a Multi-platform BI Enterprise - Mar. 4
The Briefing Room with Wayne Eckerson and Birst
Cloud BI, once a novelty, is now going mainstream. More and more organizations are adopting Cloud BI and folding it into their BI strategy. Learn what considerations are necessary to determine if Cloud BI should be part of your multi-platform BI architecture.
Register for this episode of The Briefing Room to learn from veteran Analyst Wayne Eckerson as he explains what to consider when choosing cloud-based solutions. He will be briefed by Brad Peters of Birst, who will tout his company’s enterprise-caliber cloud BI platform. Peters will describe how the coexistence of cloud applications and traditional environments can provide the kind of agility and flexibility that can lead to faster time to value.
Visit InsideAnalysis.com for more information.
All Grown Up: Maturation of Analytics in the Cloud - Inside Analysis
The Briefing Room with Wayne Eckerson and Birst
Live Webcast on Nov. 6, 2012
The desire for analytics today extends far beyond the traditional domain of Business Intelligence. The challenge is that operational systems come in countless shapes and sizes. Furthermore, each application treats data somewhat differently. But there are patterns of data flow and transformation that pervade all such systems. And there's one big place where all these data types and use cases have come together architecturally: the Cloud.
Watch this episode of the Briefing Room to hear veteran Analyst Wayne Eckerson explain how Cloud computing is ushering in a new era of analytics and intelligence. He'll be briefed by Brad Peters of Birst who will tout his company's purpose-built analytics platform. He'll discuss how the Birst engine processes and delivers raw data from disparate systems, offering the deployment flexibility of Software-as-a-Service, together with the capabilities of enterprise-class BI.
Data Discovery and BI - Is there Really a Difference? - Inside Analysis
The Briefing Room with John O'Brien and Birst
Live Webcast Dec. 3, 2013
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?AT=pb&SP=EC&rID=7869542&rKey=1f6574abc879ca42
While the disciplines of business intelligence and discovery certainly overlap, there are key distinctions between the two, both in terms of design point and user interface. It has traditionally been believed that different architectures are required to address these differing analytic needs, but is that really the case? Or is discovery simply another key capability within an overall BI platform?
Register for this episode of The Briefing Room to learn from veteran Analyst John O'Brien of Radiant Advisors as he outlines best practices for enabling high-quality business intelligence and discovery, and the architectural capabilities to enable both. He'll be briefed by Brad Peters of Birst who will tout his company's cloud BI platform. In particular, Peters will demonstrate how the Birst architecture was especially designed for enterprise-caliber BI and argue for a more inclusive future BI architecture.
Visit InsideAnalysis.com for more information
The Big Picture: Big Data for the New Wave of Analytics - Inside Analysis
The Briefing Room with Neil Raden and MarkLogic
Live Webcast on Oct. 2, 2012
Understanding context is a critical success factor for any decision-maker. Getting a clear view of the big picture can help guide all kinds of important decisions. That's why many organizations are focused on weaving together structured and "unstructured" data, to create a strategic view of enterprise issues and opportunities. The answers are usually found somewhere in between a SQL query and a Google-style search.
Check out this episode of The Briefing Room to learn from veteran Analyst Neil Raden of Hired Brains, who will explain how a new breed of analytical applications can generate a wide range of targeted insights. He'll be briefed by Steve Guttman of MarkLogic, who will tout his company's Enterprise NoSQL database, which combines the durability of traditional relational databases, with the versatility of modern Big Data engines. He'll also discuss real-world examples of new applications for various industries.
Analytic Excellence - Saying Goodbye to Old Constraints - Inside Analysis
The Briefing Room with Dr. Robin Bloor and Actian
Live Webcast August 6, 2013
http://www.insideanalysis.com
With all the innovations in compute power these days, one of the hardest hurdles to overcome is the tendency to think in old ways. By and large, the processing constraints of yesterday no longer apply. The new constraints revolve around the strategic management of data, and the effective use of business analytics. How can your organization take the helm in this new era of analysis?
Register for this episode of The Briefing Room to find out! Veteran Analyst Wayne Eckerson of The BI Leadership Forum, will explain how a handful of key innovations has significantly changed the game for data processing and analytics. He'll be briefed by John Santaferraro of Actian, who will tout his company's unique position in "scale-up and scale-out" for analyzing data.
ADV Slides: 2021 Trends in Enterprise Analytics - DATAVERSITY
It is a fascinating, explosive time for enterprise analytics.
It is from the position of analytics leadership that the mission will be executed, and company leadership will emerge. The data professional is absolutely sitting on the performance of the company in this information economy and has an obligation to demonstrate the possibilities and originate the architecture, data, and projects that will deliver analytics. After all, no matter what business you’re in, you’re in the business of analytics.
The coming years will be full of big changes in enterprise analytics and Data Architecture. William will kick off the third year of the Advanced Analytics series with a discussion of the trends winning organizations should build into their plans, expectations, vision, and awareness now.
Seeing Redshift: How Amazon Changed Data Warehousing Forever - Inside Analysis
The Briefing Room with Claudia Imhoff and Birst
Live Webcast April 9, 2013
What a difference a day can make! When Amazon announced their new Redshift offering – a data warehouse in the cloud – the entire industry of information management changed. The most notable disruption? Price. At a whopping $1,000 per terabyte per year, Redshift achieved a price-point improvement that amounts to at least two orders of magnitude, if not three, when compared to its top-tier competitors. But pricing is just one change; there's also the entire process by which data warehousing is done.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Claudia Imhoff explain why a new cloud-based reality for data warehousing significantly changes the game for business intelligence and analytics. She'll be briefed by Brad Peters of Birst who will tout his company's BI solution, which has been specifically architected for cloud-based hosting. Peters will discuss several key intricacies of doing BI in the cloud, including the unique provisioning, loading and modeling requirements. Founded in 2004, Birst has nearly a decade of experience in cloud-based BI and analytics.
Visit: http://www.insideanalysis.com
Big Data in Action – Real-World Solution Showcase - Inside Analysis
The Briefing Room with Radiant Advisors and IBM
Live Webcast on February 25, 2014
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=53c9b7fa2000f98f5b236747e3602511
The power of Big Data depends heavily upon the context in which it's used, and most organizations are just beginning to figure out where, how and when to leverage it. One key to success is integration with existing information systems, many of which still rely on relational database technologies. Finding ways to blend these two worlds can help companies generate measurable business value in fairly short order.
Register for this episode of The Briefing Room to hear Analysts Lindy Ryan and John O'Brien as they explain how the combination of traditional Business Intelligence with Big Data Analytics can provide game-changing results in today's information economy. They'll be briefed by Eric Poulin and Paul Flach of Stream Integration who will share best practices for designing and implementing Big Data solutions. They'll discuss the components of IBM BigInsights, and explain how BigSheets can empower non-technical users who need to explore self-structured data.
Visit InsideAnalysis.com for more information.
Bridging the Gap: Analyzing Data in and Below the Cloud - Inside Analysis
The Briefing Room with Dean Abbott and Tableau Software
Live Webcast July 23, 2013
http://www.insideanalysis.com
Today’s desire for analytics extends well beyond the traditional domain of Business Intelligence. That’s partly because business users are realizing the value of mixing and matching all kinds of data, from all kinds of sources. One emerging market driver is Cloud-based data, and the desire companies have to analyze this data cohesively with their on-premise data sets.
Register for this episode of The Briefing Room to learn from Analyst Dean Abbott, who will explain how the ability to access data in the cloud can play a critical role for generating business value from analytics. He’ll be briefed by Ellie Fields of Tableau Software who will tout Tableau’s latest release, which includes native connectors to cloud-based applications like Salesforce.com, Amazon Redshift, Google Analytics and BigQuery. She’ll also demonstrate how Tableau can combine cloud data with other data sources, including spreadsheets, databases, cubes and even Big Data.
Accelerate Self-Service Analytics with Data Virtualization and Visualization - Denodo
Watch full webinar here: https://bit.ly/3fpitC3
Enterprise organizations are shifting to self-service analytics because business users need real-time access to holistic, consistent views of data, regardless of its location, source or type, in order to arrive at critical decisions.
Data Virtualization and Data Visualization work together through a universal semantic layer. Learn how they enable self-service data discovery and improve performance of your reports and dashboards.
In this session, you will learn:
- Challenges faced by business users
- How data virtualization enables self-service analytics
- Use case and lessons from customer success
- Overview of highlighted features in Tableau
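The semantic-layer idea above can be sketched with nothing but SQLite from the standard library: two independent "source systems" are exposed through one connection and a single view, so the consumer queries one logical schema while the data stays where it lives. This illustrates the principle only, not Denodo's implementation; all database names and schemas are invented:

```python
import sqlite3

# Two independent "source systems", modeled as shared in-memory SQLite
# databases. Names and schemas are illustrative.
sales = sqlite3.connect("file:sales?mode=memory&cache=shared", uri=True)
crm = sqlite3.connect("file:crm?mode=memory&cache=shared", uri=True)
sales.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
sales.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 100.0), (2, 250.0)])
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])
sales.commit(); crm.commit()

# The "virtual layer": one connection that ATTACHes both sources and
# exposes a single consistent view, without copying data into a warehouse.
hub = sqlite3.connect("file:sales?mode=memory&cache=shared", uri=True)
hub.execute("ATTACH DATABASE 'file:crm?mode=memory&cache=shared' AS crm")
hub.execute("""
    CREATE TEMP VIEW customer_spend AS
    SELECT c.name, SUM(o.amount) AS total
    FROM orders o JOIN crm.customers c ON c.id = o.customer_id
    GROUP BY c.name
""")
rows = hub.execute("SELECT name, total FROM customer_spend ORDER BY name").fetchall()
print(rows)  # [('Acme', 100.0), ('Globex', 250.0)]
```

A report or dashboard pointed at `customer_spend` never needs to know there are two sources behind it, which is exactly the decoupling a universal semantic layer provides.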
Modernizing Your IT Infrastructure with Hadoop - Cloudera Summer Webinar Seri... - Cloudera, Inc.
You will also learn how to understand key challenges when deploying a Hadoop cluster in production, how to manage the entire Hadoop lifecycle using a single management console, and how to deliver integrated management of the entire cluster to maximize IT and business agility.
When and How Data Lakes Fit into a Modern Data Architecture - DATAVERSITY
Whether to take data ingestion cycles off the ETL tool and the data warehouse or to facilitate competitive Data Science and building algorithms in the organization, the data lake – a place for unmodeled and vast data – will be provisioned widely in 2020.
Though it doesn’t have to be complicated, the data lake has a few key design points that are critical, and it does need to follow some principles for success. Build the data lake, not the data swamp! The tool ecosystem is building up around the data lake, and soon many organizations will have both a robust lake and a data warehouse. We will discuss policies to keep them straight, send data to its best platform, and keep users’ confidence in their data platforms high.
Data lakes will be built in cloud object storage. We’ll discuss the options there as well.
Get this data point for your data lake journey.
Take Action: The New Reality of Data-Driven Business - Inside Analysis
The Briefing Room with Dr. Robin Bloor and WebAction
Live Webcast on July 23, 2014
Watch the archive:
https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=360d371d3a49ad256942f55350aa0a8b
The waiting used to be the hardest part, but not anymore. Today’s cutting-edge enterprises can seize opportunities faster than ever, thanks to an array of technologies that enable real-time responsiveness across the spectrum of business processes. Early adopters are solving critical business challenges by enabling the rapid-fire design, development and production of very specific applications. Functionality can range from improved customer engagement to dynamic machine-to-machine interactions.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor, who will tout a new era in data-driven organizations, and why a data flow architecture will soon be critical for industry leaders. He’ll be briefed by Sami Akbay of WebAction, who will showcase his company’s real-time data management platform, which combines all the component parts needed to access, process and leverage data big and small. He’ll explain how this new approach can provide game-changing power to organizations of all types and sizes.
Visit InsideAnalysis.com for more information.
DevOps is to Infrastructure as Code, as DataOps is to...? - Data Con LA
Data Con LA 2020
Description
The idea that “Software Is Eating the World” has been with us for over a decade. The flexibility and agility that it provides has been central to its impact on every industry and even to the process of its own production. Software development itself underwent a revolution through the Agile Movement, central to which was the shift in organizational dynamics embodied by DevOps, which was rooted in collaboration between development and operations teams to enable rapid, fluid delivery of software to customers. The DevOps organizational shift was enabled by Infrastructure As Code (IaC), the complementary technological shift that advocated for infrastructure provisioning and configuration to be automated through flexible, reliable software instead of complex, error-prone, manual processes. It unlocked speed and consistency in software development and deployment, between development and production. Together, DevOps and IaC as the paired organizational and technological shifts were the proverbial grease that streamlined software development.
Today, we see the movie replaying. Except, we are graduating from a Software (driven) World to a Data (driven) World. DataOps has emerged as the organizational mindset shift essential to unlocking the value of data in the enterprise. What will be the concomitant technological shift that will enable the modern, data driven enterprise? We posit that Data Transformation will be the accompanying parallel, technological shift in the data world that will enable rapid creation and reconfiguration of advanced analytics using software and in a manner that preserves consistency between development and production.
In this talk, we will discuss:
- Motivations behind the movements in both worlds
- The core constructs of each of the movements in both worlds
- The impact of being software driven in both worlds
- The elegant symmetry of the Software and Data Worlds and how they interoperate
- How Data Transformation can set the foundation for the modern, data-driven enterprise and enable a quantum leap in Digital Transformation
Speaker
Karun Bakshi, Xcalar, VP of Product Marketing
Moving Targets: Harnessing Real-time Value from Data in Motion - Inside Analysis
The Briefing Room with David Loshin and Datawatch
Live Webcast Feb. 17, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=4a053043c45cf0c2f6453dfb8577c72a
Patience may be a virtue, but when it comes to streaming analytics, waiting is not an option. Between Big Data and the Internet of Things, businesses are faced with more data and greater complexity than ever before. Traditional information architectures simply cannot support the kind of processing necessary to make use of this fast-moving resource. The modern context requires a shorter path to analytics, one that narrows the gap between governance and discovery.
Register for this episode of The Briefing Room to hear veteran Analyst David Loshin as he explains how the prevalence of streaming data is changing business pace and processes. He’ll be briefed by Dan Potter of Datawatch, who will tout his company’s real-time data discovery platform for data in motion. He will show how self-service data preparation can lead to faster insights, ultimately fostering the ability to make precise decisions at the right time.
Visit InsideAnalysis.com for more information.
No Time-Outs: How to Empower Round-the-Clock Analytics
Smart companies know that business intelligence surfaces insights. With complex analytics, data mining and everything in between, it takes many moving parts to serve up the big picture. The key is to provide full-stack visibility into the entire BI environment, ensuring solid service and system performance.
Learn more at http://www.insideanalysis.com
Agile, Automated, Aware: How to Model for Success - Inside Analysis
The Briefing Room with David Loshin and Embarcadero
Live Webcast October 27, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=eea9877b71c653c499c809c5693eae8fe
Data management teams face some tough challenges these days. Organizations need business-driven visibility that enables understanding and awareness of enterprise data assets – without worrying about definitions and change management. But with information architectures evolving into a hybrid mix of data objects and data services built over relational databases as well as big data stores, serving up accurately defined, reusable data can become a complex issue.
Register for this episode of The Briefing Room to learn from veteran Analyst David Loshin as he explains the importance of agile, automated workflows in today’s enterprise. He’ll be briefed by Ron Huizenga of Embarcadero, who will discuss how his company’s ER/Studio suite approaches data modeling and management from a modern architecture standpoint. He will explain that unifying the way information is represented can not only eliminate the need for costly workarounds, but also foster collaboration between data architects, developers and business users.
Visit InsideAnalysis.com for more information.
First in Class: Optimizing the Data Lake for Tighter Integration - Inside Analysis
The Briefing Room with Dr. Robin Bloor and Teradata RainStor
Live Webcast October 13, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=012bb2c290097165911872b1f241531d
Hadoop data lakes are emerging as peers to corporate data warehouses. However, successful data management solutions require a fusion of all relevant data, new and old, which has proven challenging for many companies. With a data lake that’s been optimized for fast queries, solid governance and lifecycle management, users can take data management to a whole new level.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he discusses the relevance of data lakes in today’s information landscape. He’ll be briefed by Mark Cusack of Teradata, who will explain how his company’s archiving solution has developed into a storage point for raw data. He’ll show how the proven compression, scalability and governance of Teradata RainStor combined with Hadoop can enable an optimized data lake that serves as both reservoir for historical data and as a “system of record” for the enterprise.
Visit InsideAnalysis.com for more information.
Fit For Purpose: Preventing a Big Data Letdown - Inside Analysis
The Briefing Room with Dr. Robin Bloor and RedPoint Global
Live Webcast October 6, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=9982ad3a2603345984895f279e849d35
Gartner recently placed Big Data in its “trough of disillusionment,” reflective of many leaders’ struggle to prove the value of Hadoop within their organization. While the promise of enhanced data integration and enrichment is obvious, measurable results have remained elusive. This episode of The Briefing Room will outline how to successfully tie Big Data to existing business applications, preventing your next Hadoop project from being another “Big Data letdown.”
Register today to learn from veteran Analyst Dr. Robin Bloor as he discusses the importance of converging enterprise data integration with intelligence and scalability. He’ll be briefed by George Corugedo of RedPoint Global, who will provide concrete examples of how the convergence of scalable cloud platforms, ever-expanding data sources and intelligent execution can turn the Big Data hype into demonstrable business value.
Visit InsideAnalysis.com for more information.
To Serve and Protect: Making Sense of Hadoop Security - Inside Analysis
The Briefing Room with Dr. Robin Bloor and HP Security Voltage
Live Webcast September 22, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=45ece7082b1d7c2cc8179bc7a1a69ea5
Hadoop is rapidly becoming a development platform and dominant server environment, and organizations are keen to take advantage of its massively scalable – and relatively inexpensive – resources. It is not, however, without its limitations, and it often requires a contingent of complementary components in order to behave within an information architecture. One area often overlooked is security, a factor that, if not considered from the outset, can introduce great risk when putting sensitive data in Hadoop.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he discusses how security was never a design point for Hadoop and what organizations can do about it. He’ll be briefed by Sudeep Venkatesh of HP Security Voltage, who will explain the intricacies surrounding a secure Hadoop implementation. He will show how techniques like format-preserving and partial-field encryption can allow for analytics over protected data, with zero performance impact.
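Why format-preserving, deterministic protection keeps data analyzable can be shown with a toy sketch: equal plaintexts map to equal ciphertexts of the same shape, so joins, group-bys and partial-field rules still work over protected values. This is a simplified stand-in built on a keyed hash, not the standardized FPE constructions a product like HP Security Voltage actually uses; the key and field choices are invented:

```python
import hmac, hashlib

KEY = b"demo-key"  # illustrative only; real systems use managed keys

def protect_digits(digits: str) -> str:
    """Deterministically map a digit string to another digit string of the
    same length. Same input + key always gives the same output, so equality
    joins and GROUP BYs over the protected column still behave correctly."""
    digest = hmac.new(KEY, digits.encode(), hashlib.sha256).digest()
    # Toy digit derivation from the keyed digest (ignores modulo bias).
    return "".join(str(digest[i % len(digest)] % 10) for i in range(len(digits)))

def protect_ssn(ssn: str) -> str:
    """Partial-field protection: keep the last four digits in the clear,
    protect the leading digits, and preserve the original formatting."""
    head, tail = ssn[:-4], ssn[-4:]
    enc = iter(protect_digits("".join(c for c in head if c.isdigit())))
    return "".join(next(enc) if c.isdigit() else c for c in head) + tail

protected = protect_ssn("123-45-6789")
print(protected)  # same length and hyphen positions, last four in the clear
```

Because the output keeps the original length and format, it can flow through schemas, validations and analytics that expect an SSN-shaped value, which is the core appeal of the technique described above.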
Visit InsideAnalysis.com for more information.
The Hadoop Guarantee: Keeping Analytics Running On Time - Inside Analysis
The Briefing Room with Dr. Robin Bloor and Pepperdata
Live Webcast September 15, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=32f198185d9d0c4cf32c27bdd1498b2a
Industry researchers agree: the importance of Hadoop will continue to grow as more companies recognize the range of benefits they can reap, from lower-cost storage to better business insights. At the same time, advances in the Hadoop ecosystem are addressing many of the key concerns that have hampered adoption, including performance and reliability. As a result, Hadoop is fast becoming a first-class citizen in the world of enterprise computing.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor explain how the Hadoop ecosystem is evolving into a mature foundation for managing enterprise data. He’ll be briefed by Sean Suchter of Pepperdata, who will explain how his company’s software brings predictability and reliability to Hadoop through dynamic, policy-based controls and monitoring. He’ll show how to guarantee service-level agreements by slowing down low-priority tasks as needed. He’ll also discuss the holy grail of Hadoop: how to enable mixed workloads.
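The policy idea of slowing low-priority work so that high-priority jobs meet their SLAs can be sketched in a few lines: a pacing delay is charged only to low-priority tasks. This is a toy model of the control principle, not Pepperdata's software; the priorities, delays and task names are invented:

```python
import time

# Policy: pacing delay per unit of work, by priority class. Under pressure,
# low-priority tasks absorb the slowdown so high-priority tasks can keep
# meeting their SLA. Numbers are illustrative.
DELAY = {"high": 0.0, "low": 0.005}

def run_tasks(tasks):
    """tasks: list of (name, priority, units_of_work). Returns completion order."""
    finished = []
    # High-priority work is scheduled first; low-priority work is throttled.
    for name, prio, units in sorted(tasks, key=lambda t: t[1] != "high"):
        for _ in range(units):
            time.sleep(DELAY[prio])  # throttle point: policy-based pacing
        finished.append(name)
    return finished

order = run_tasks([("nightly-report", "low", 3), ("fraud-scoring", "high", 3)])
print(order)  # ['fraud-scoring', 'nightly-report']
```

A real cluster applies this kind of control dynamically across CPU, memory and I/O rather than with a fixed sleep, but the shape of the policy is the same: deliberately delay the work that can wait.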
Visit InsideAnalysis.com for more information.
Special Edition with Dr. Robin Bloor
Live Webcast September 9, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=e8b9ac35d8e4ffa3452562c1d4286a975
Do the math: algebra will transform information management. Just as the relational database revolutionized the information landscape, so will a just-released, complete algebra of data overhaul the industry itself. So says Dr. Robin Bloor in his new book, the Algebra of Data, which he’ll outline in this special one-hour webcast.
Once organizations learn how to express their data sets algebraically, the benefits will be significant and far-reaching. Data quality problems will slowly subside; queries will run orders of magnitude faster; integration challenges will fade; and countless tedious jobs in the data management space will bid their farewell. But first, software companies must evolve, and that will take time.
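For flavor, here is what treating data sets algebraically can look like, using ordinary relational-algebra-style operators closed over sets of tuples. This is a generic illustration of the idea, not the specific algebra presented in Dr. Bloor's book:

```python
# Relations as sets of tuples, with operators that take relations and
# return relations, so expressions compose like ordinary algebra.
A = {("alice", "nyc"), ("bob", "sfo")}   # relation: (name, city)
B = {("nyc", "US"), ("sfo", "US")}       # relation: (city, country)

def join(r, s):
    # Natural-join-style composition on the shared attribute (city).
    return {(n, c, k) for (n, c) in r for (c2, k) in s if c == c2}

def project(r, *idx):
    # Keep only the listed attribute positions.
    return {tuple(t[i] for i in idx) for t in r}

result = project(join(A, B), 0, 2)       # name, country
print(sorted(result))  # [('alice', 'US'), ('bob', 'US')]
```

Because every operator is closed over relations, whole pipelines become rewritable expressions, which is the kind of property that lets an engine optimize queries algebraically.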
Visit InsideAnalysis.com for more information.
The Role of Data Wrangling in Driving Hadoop Adoption - Inside Analysis
The Briefing Room with Mark Madsen and Trifacta
Live Webcast September 1, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=eb655874d04ba7d560be87a9d906dd2fd
Like all enterprise software solutions, Hadoop must deliver business value in order to be a success. Much of the innovation around the big data industry these days therefore addresses usability. While there will always be a technical side to the Hadoop equation, the need for user-friendly tools to manage the data will continue to focus on business users. That’s why self-service data preparation or "data wrangling" is a serious and growing trend, one which promises to move Hadoop beyond the early adopter phase and more into the mainstream of business.
Register for this episode of The Briefing Room to hear veteran Analyst Mark Madsen of Third Nature explain why business users will play an increasingly important role in the evolution of big data. He’ll be briefed by Trifacta's Will Davis and Alon Bartur, who will demonstrate how Trifacta's solution empowers business users to “wrangle" data of all shapes and sizes faster and easier than ever before. They’ll discuss why a new approach to accessing and preparing diverse data is required and how it can accelerate and broaden the use of big data within organizations.
Visit InsideAnalysis.com for more information.
Ahead of the Stream: How to Future-Proof Real-Time Analytics - Inside Analysis
Business seems to move faster by the day, with the most cutting-edge companies taking advantage of real-time data streams for heavy-duty analytics. But with so much innovation happening in so many places, how can companies stay ahead of the game? One answer is to future-proof your analytics architecture with an abstraction layer that translates your business use case or workflow to any of many leading technologies, addressing the growing number of use cases in this dynamic field.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor, as he explains how a data flow architecture can harness a wide range of streaming solutions. He'll be briefed by Anand Venugopal of Impetus Technologies, who will showcase his company's StreamAnalytix platform, which was designed from the ground up to leverage multiple major streaming engines available today, including Apache Spark, Apache Storm and others. He'll demonstrate how StreamAnalytix provides enterprise-class performance while incorporating best-of-breed open-source components.
View the archive at: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=925d1e9b639b78c6cf76a1bbbf485b2b
All Together Now: Connected Analytics for the Internet of Everything - Inside Analysis
The Briefing Room with Mark Madsen and Cisco
Live Webcast August 18, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=0eff120f8b2879b582b77f4ff207ee54
Today's digital enterprises are seeing an explosion of data at the edge. The Internet of Everything is fast approaching a critical mass that will demand a sea change in how companies process data. This new world of information is widely distributed, streaming, and overall becoming too big to move. Experts predict that within two to three years, the bulk of analytic processing will take place on the fringes of information architectures. As a result, forward-thinking companies are dramatically shifting their analytic strategies.
Register for this episode of The Briefing Room to hear veteran Analyst Mark Madsen of Third Nature explain how a new era of information architectures is now unfolding, paving the way to much more responsive and agile business models. He'll be briefed by Kim Macpherson of the Cisco Data and Analytics Business Unit, who will explain how her company's platform is uniquely suited for this new, federated analytic paradigm. She'll demonstrate how edge analytics can help companies address opportunities quickly and effectively.
Visit InsideAnalysis.com for more information.
Goodbye, Bottlenecks: How Scale-Out and In-Memory Solve ETL - Inside Analysis
The Briefing Room with Dr. Robin Bloor and Splice Machine
Live Webcast August 11, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=e1b33c9d45b178e13784b4a971a4c1349
The ETL process was born out of necessity, and for decades it has been the glue between data sources and target applications. But as data growth soars and increased competition demands real-time data, standard ETL has become brittle and often unmanageable. Scaling up resources can do the trick, but it’s very costly, and it is only a matter of time before the processes hit another bottleneck. When outmoded ETL stands in the way of real-time analytics, it might be time to consider a completely new approach.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he explains how modern, data-driven architectures must adopt an equally capable data integration strategy. He’ll be briefed by Rich Reimer of Splice Machine, who will discuss how his company solves ETL performance issues and enables real-time analytics and reports on big data. He will show that by leveraging the scale-out power of Hadoop and the in-memory speed of Spark, users can bring both analytical and operational systems together, eventually performing transformations only when needed.
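The "transform only when needed" idea, deferring work the way Spark defers transformations until an action forces evaluation, can be sketched with plain Python generators: building the pipeline does no work at all, and only the rows actually requested are ever cleansed. Names and data are invented; this shows the laziness principle, not Splice Machine's engine:

```python
# Raw "landed" data: loaded as-is, with no upfront ETL pass.
raw = [{"amt": "100", "region": "emea"}, {"amt": "bad", "region": "apac"},
       {"amt": "250", "region": "emea"}]

def parsed(rows):
    # Lazy transformation: cleansing happens per-row, on demand.
    for r in rows:
        if r["amt"].isdigit():
            yield {"amt": int(r["amt"]), "region": r["region"].upper()}

def where(rows, pred):
    # Lazy filter: also evaluated only when rows are pulled through.
    return (r for r in rows if pred(r))

# Building the pipeline runs no transformation code at all...
pipeline = where(parsed(raw), lambda r: r["region"] == "EMEA")
# ...work happens only when an "action" (here, sum) pulls rows through.
total = sum(r["amt"] for r in pipeline)
print(total)  # 350
```

In a distributed engine the same deferral lets the planner fuse, reorder and skip transformations, which is what makes on-demand transformation cheaper than an eager ETL pass.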
Visit InsideAnalysis.com for more information.
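The "perform transformations only when needed" idea can be sketched, independently of any Splice Machine or Spark API, as a lazy pipeline: transformations are composed up front but run only when a consumer actually pulls results. A minimal pure-Python illustration (all names here are invented for the sketch):

```python
# Lazy "transform on demand" sketch: the pipeline is composed up front,
# but no transformation work runs until a consumer pulls a record.
# Illustrative only -- not a Splice Machine or Spark API.

def extract(rows):
    # Yield raw records one at a time, with no upfront staging area.
    for row in rows:
        yield row

def transform(records):
    # Normalize types lazily; nothing executes until iteration begins.
    for rec in records:
        yield {"id": int(rec["id"]), "amount": round(float(rec["amount"]), 2)}

raw = [{"id": "1", "amount": "10.50"}, {"id": "2", "amount": "3.5"}]
pipeline = transform(extract(raw))  # composed, not yet executed

first = next(pipeline)  # transformation happens only on demand
print(first)            # {'id': 1, 'amount': 10.5}
```

Because each stage is a generator, records that are never requested are never transformed, which is the essence of deferring ETL work until it is needed.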
The Biggest Picture: Situational Awareness on a Global Level (Inside Analysis)
The Briefing Room with Dr. Robin Bloor and Modus Operandi
Live Webcast July 28, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=efc4082d9b0b0adfcd753a7435d2d6a1b
The analytic bottlenecks of yesterday need not apply today. The boundaries are also falling thanks in large part to the abundance of third-party data. The most data-driven companies these days are finding creative ways to dynamically incorporate data from within and beyond the firewall, thus building highly accurate, multidimensional views of their business, customer, competition or other subject areas.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor as he explains the magnitude of change that's occurring in the world of data, why it's happening now, and how you can take advantage. He'll be briefed by Mike Gilger and Boris Pelakh of Modus Operandi, who will showcase their company's enterprise analytics platform, which combines a range of battle-tested functionality to deliver dynamic situational awareness that can leverage a comprehensive array of data sets. They'll explain how the platform's reasoner benefits from a highly scalable rules engine, and a flexible modeling capability that can optimize data storage virtually on the fly.
Visit InsideAnalysis.com for more information.
Structurally Sound: How to Tame Your Architecture (Inside Analysis)
The Briefing Room with Krish Krishnan and Teradata
Live Webcast July 21, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=602b2a8413e8719d39465f4d6291d505
Technology changes all the time, but the basic needs of the business are the same: BI and analytics. With new types of data, various analytics engines and multiple systems, giving business users seamless access to enterprise data can be a rather daunting process. One solution is to provide a complete fabric that spans the organization, touching all data points and masking the complexity behind disparate sources.
Register for this episode of The Briefing Room to learn from veteran Analyst Krish Krishnan as he explores how and why architectures have changed over the years. He’ll be briefed by Imad Birouty of Teradata, who will discuss his company’s QueryGrid, an analytics solution designed to provide access to data across all systems. He will show how QueryGrid essentially creates a logical data warehouse and enables users to leverage SQL over multiple data types.
Visit InsideAnalysis.com for more information.
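The "logical data warehouse" idea — one SQL statement spanning physically separate stores — can be imitated in miniature with SQLite's ATTACH, which lets a single query join tables that live in different database files. This is only an analogy for what a product like QueryGrid does at enterprise scale; the schema and data below are invented for illustration:

```python
import os
import sqlite3
import tempfile

# Two separate SQLite files stand in for disparate enterprise systems.
# Schema and data are invented for illustration -- this is an analogy
# for a logical data warehouse, not Teradata QueryGrid itself.
tmp = tempfile.mkdtemp()
sales_path = os.path.join(tmp, "sales.db")
crm_path = os.path.join(tmp, "crm.db")

with sqlite3.connect(sales_path) as db:
    db.execute("CREATE TABLE orders (cust_id INTEGER, total REAL)")
    db.execute("INSERT INTO orders VALUES (1, 99.0), (2, 150.0)")

with sqlite3.connect(crm_path) as db:
    db.execute("CREATE TABLE customers (cust_id INTEGER, name TEXT)")
    db.execute("INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex')")

# One connection, one SQL statement, two physically separate stores.
con = sqlite3.connect(sales_path)
con.execute(f"ATTACH DATABASE '{crm_path}' AS crm")
rows = con.execute(
    "SELECT c.name, o.total FROM orders o "
    "JOIN crm.customers c ON c.cust_id = o.cust_id "
    "ORDER BY o.total DESC"
).fetchall()
con.close()
print(rows)  # [('Globex', 150.0), ('Acme', 99.0)]
```

The user writes one query; the engine resolves which store each table lives in, which is the behavior a data fabric generalizes across heterogeneous systems.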
SQL In Hadoop: Big Data Innovation Without the Risk (Inside Analysis)
The Briefing Room with Dr. Robin Bloor and Actian
Live Webcast July 14, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=bbd4395ea2f8c60a03cfefc68c7aa823
Innovation often implies risk, which is why businesses have many issues to weigh when considering change. Yet the remarkable growth of data is driving many traditional systems into the ground, forcing information workers to take a critical look at their existing tools. Technologies like Hadoop offer economical solutions to big data management, but to truly take advantage of Hadoop's capabilities, organizations must modernize their infrastructure.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he explains how and why organizations should improve legacy systems. He’ll be briefed by Todd Untrecht of Actian, who will tout his company’s Actian Vortex, a SQL-in-Hadoop solution. He will show how integrating a SQL engine directly in the Hadoop cluster can lead to faster analytics and greater control, while still maintaining existing investments.
Visit InsideAnalysis.com for more information.
The Briefing Room with Dr. Robin Bloor and SYSTAP
Live Webcast June 30, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=0ff3889293f6c090483295fd7362c5a4
There's a reason why the biggest Web companies these days leverage graph technology: it is incredibly powerful for revealing a wide range of insights. Unlike other analytical databases, graph databases can very quickly identify the kinds of patterns that lead to better business decisions. Though relatively nascent in existing data centers, graph databases are proving to be well-suited for all kinds of business use cases, from clustering and hypothesis generation to failure detection and cyber analytics.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he discusses how semantic technology fits in the spectrum of database and discovery solutions. He’ll be briefed by Brad Bebee of SYSTAP, who will showcase his company’s Blazegraph products and Mapgraph technology. He will explain how SYSTAP’s approach overcomes the challenge of scalability, and how graph technology’s powerful data management capabilities can deliver better enterprise performance and analytics using GPUs and other approaches.
Visit InsideAnalysis.com for more information.
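One reason graph engines excel at pattern discovery is that patterns like triangles — the building block of the clustering use case mentioned above — reduce to cheap neighborhood lookups. A toy pure-Python sketch of the idea, nothing like Blazegraph's actual scale or API:

```python
# Toy triangle detection over an adjacency-set graph. Illustrative only;
# production graph databases run such pattern queries over billions of
# edges, often GPU-accelerated.
from itertools import combinations

edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]

# Build an undirected adjacency map: node -> set of neighbors.
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

# A triangle is any trio of mutually connected nodes.
triangles = [
    trio for trio in combinations(sorted(adj), 3)
    if all(b in adj[a] for a, b in combinations(trio, 2))
]
print(triangles)  # [('a', 'b', 'c')]
```

Every membership check here is a constant-time set lookup; graph databases organize storage so that this kind of neighborhood traversal stays cheap even at massive scale.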
A Revolutionary Approach to Modernizing the Data Warehouse (Inside Analysis)
Hot Technologies with Rick Sherman, Dr. Robin Bloor and Snowflake Computing
Live Webcast June 25, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=e6e6de6cdfa8926e7a9d52e099a1a08e2
Enterprise software tends to advance in one of two ways: evolutionary or revolutionary. Evolutionary advances happen through incremental improvements made to an existing code base over a long period of time. Revolutionary advances happen when a new solution is designed from scratch, breaking cleanly from legacy approaches to take advantage of technology innovations that can span from hardware to software and methodologies.
Register for this episode of Hot Technologies to hear veteran analysts Rick Sherman of Athena IT Solutions and Dr. Robin Bloor along with Bob Muglia, CEO of Snowflake Computing, explain how a confluence of advances in the data world have opened up new doors for revolutionary advances in data warehousing. They will discuss new technology innovations and how they can be used to create data warehouses with the power, flexibility, and resiliency that modern enterprises need without the complexities and latencies inherent to traditional approaches.
Visit InsideAnalysis.com for more information.
The Maturity Model: Taking the Growing Pains Out of Hadoop (Inside Analysis)
The Briefing Room with Rick van der Lans and Think Big, a Teradata Company
Live Webcast on June 16, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=197f8106531874cc5c14081ca214eaff
Hadoop is arguably one of the most disruptive technologies of the last decade. Once lauded solely for its ability to transform the speed of batch processing, it has marched steadily forward and promulgated an array of performance-enhancing accessories, notably Spark and YARN. Hadoop has evolved into much more than a file system and batch processor, and it now promises to stand as the data management and analytics backbone for enterprises.
Register for this episode of The Briefing Room to learn from veteran Analyst Rick van der Lans, as he discusses the emerging roles of Hadoop within the analytics ecosystem. He’ll be briefed by Ron Bodkin of Think Big, a Teradata Company, who will explore Hadoop’s maturity spectrum, from typical entry use cases all the way up the value chain. He’ll show how enterprises that already use Hadoop in production are finding new ways to exploit its power and build creative, dynamic analytics environments.
Visit InsideAnalysis.com for more information.
Rethinking Data Availability and Governance in a Mobile World (Inside Analysis)
The Briefing Room with Malcolm Chisholm and Druva
Live Webcast on June 9, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=baf82d3835c5dfa63202dcbe322a3ad7
The emergence of the mobile workforce has left an indelible mark on the enterprise; every employee is now mobile, and business data continues to be dispatched to the far reaches of the enterprise. While this has added enormous opportunity for increased productivity, it has also muddied the waters when it comes to controlling and protecting valuable data assets. As companies quickly evolve to address the new set of challenges posed by this shift in data usage, IT must ensure that all data, no matter where it’s generated or stored, is available and governed just as if it were still safely behind the corporate firewall.
Register for this episode of The Briefing Room to hear veteran Analyst Malcolm Chisholm as he explains the myriad challenges that mobile data introduces when addressing regulations and compliance needs, requiring new approaches to data governance. He’ll be briefed by Dave Packer of Druva, who will outline his company’s converged data protection strategy, which brings data center class capabilities to backup, availability and governance for the mobile workforce. He will share strategies to meet regional data residency, data recovery, legal hold and eDiscovery requirements and more.
Visit InsideAnalysis.com for more information.
3. ! Reveal the essential characteristics of enterprise software, good and bad
! Provide a forum for detailed analysis of today's innovative technologies
! Give vendors a chance to explain their product to savvy analysts
! Allow audience members to pose serious questions... and get answers!
Twitter Tag: #briefr
5. ! Analytics has always been about discovering insights that lead to better business decisions.
! More organizations are demanding faster time-to-insight, while at the same time expecting connectivity to and analytics on a wide variety of data and data sources.
! Clever vendors look for ways to provide solutions that not only scale at lightning speed, but deliver actionable insights.
Twitter Tag: #briefr
6. Robin Bloor is Chief Analyst at The Bloor Group.
Robin.Bloor@Bloorgroup.com
Twitter Tag: #briefr
8. Rick Sherman is the founder of Athena IT Solutions, a Massachusetts-based firm that provides business intelligence, data integration and data warehouse consulting, training and vendor services.
In addition to having more than 20 years of experience in BI solutions, Rick writes on IT topics and is a frequent speaker at industry events.
He blogs at The Data Doghouse and can be reached at: rsherman@athena-solutions.com
Twitter Tag: #briefr
9. ! Actian Corporation is a database and software development company.
! Its premier database platform is Vectorwise, a highly performant analytic engine that implements parallelism at every level, from the processor core to data storage.
! Actian offers a cloud development platform for building Action Apps, lightweight applications that automate business actions triggered by real-time changes in data.
Twitter Tag: #briefr
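"Parallelism at every level, from the processor core" refers in part to vectorized execution: processing a whole column of values per operation rather than interpreting one row at a time. NumPy offers a loose analogy — this illustrates the general principle, not Vectorwise's actual engine:

```python
import numpy as np

# A column of 100,000 synthetic sale amounts (illustrative data only).
amounts = np.random.default_rng(0).uniform(0, 100, 100_000)

# Row-at-a-time style: dispatch and interpret one value per iteration.
total_loop = 0.0
for x in amounts:
    total_loop += x

# Column-at-a-time (vectorized) style: one call over the whole column,
# executed in tight native loops the CPU can pipeline and SIMD-accelerate.
total_vec = amounts.sum()

print(bool(np.isclose(total_loop, total_vec)))  # True: same answer, far less overhead
```

The two totals agree, but the vectorized form avoids per-row interpretation overhead, which is the same economics that makes vectorized analytic engines fast on large scans.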
10. Fred Gallagher is GM, Vectorwise at Actian Corporation and is responsible for managing the business activities for this breakthrough product. He joined Actian in 2006 as vice president of business development.
Before joining Actian, Fred worked for Qlusters, where he was responsible for worldwide sales, marketing, and business development. At Qlusters, he successfully launched the industry's first open source systems management project. Prior to that, he worked at VMware, where he was responsible for worldwide software alliances and established 15 successful strategic alliances during a high-growth period of two years. Previously he was at Seagate Technology, where he was vice president of worldwide channels and business development for Seagate's XIOtech subsidiary. Fred holds a Bachelor of Arts and an MBA from Stanford University.
Twitter Tag: #briefr
11. Briefing Room
August 28
Speakers:
Rick Sherman
Fred Gallagher, General Manager Vectorwise, Actian Corporation
12. Actian Today
• Global reach, growing, with strong balance sheet
• Highly profitable with strong cash balances
• 200 employees across 11 offices
Vectorwise:
• World's fastest Big Data analytical engine
• Experiencing RAPID GROWTH
• Affordable; leverages standard hardware and software
Ingres:
• 10,000+ mission-critical applications, including data warehouses
• Very high client satisfaction
• Still innovating: GeoSpatial features
Action Apps:
• Next generation of actionable BI: connect insight to action
• Analyzing events and data
13. But most enterprises still only use 1-5% of their data. What if that number doubled, tripled, quadrupled…?
Source: Forrester
14. Enormous Opportunities for Big Data
• $300bn value per year
• €250bn value per year
• $600bn value per year
SOURCE: McKinsey Global Institute analysis
15. “Big” Data Management Challenge
Local Data:
! Data silos, distributed and disparate
! Existing solutions are not real-time
Data in the Cloud:
! Expensive, inefficient or inflexible
! Scalability not there
16. The Need for Speed and Agility
! Business users require interactivity
! Desktop users tolerate 10 to 20 seconds
! Mobile users tolerate 2 to 5 seconds
! Increases in user concurrency
! Take action in “business time”
! Be faster than your competition
! Optimize your business
17. Reduce Latency Between Events and Action
[Diagram: value declines across the time latencies between an event occurring, its capture, its analysis, an informed decision being ready to be made, and action being taken]
Attribution: Jean-Michel Franco of Business & Decision
18. Vectorwise: Affordable Performance – Proven!
Fastest TPC-H QphH@1TB Benchmark (non-clustered):
June '12   Vectorwise   445,529
May '11    Vectorwise   436,788
Aug '11    SQL Server   219,888
June '11   Oracle       209,534
Sept '11   Oracle       201,487
Apr '11    SQL Server   173,962
Dec '10    Sybase IQ    164,747
Apr '10    Oracle       140,181
Dec '11    SQL Server   134,117
Source: www.tpc.org / June 15, 2012
19. Vectorwise: Affordable Performance – Proven!
Fastest TPC-H QphH@1TB Benchmark (non-clustered), with hardware cost (excluding discounts):
June '12   Vectorwise   445,529   $57,146
May '11    Vectorwise   436,788   $85,621
Aug '11    SQL Server   219,888   $460,869
June '11   Oracle       209,534   $2,402,706
Sept '11   Oracle       201,487   $753,392
Apr '11    SQL Server   173,962   $278,527
Dec '10    Sybase IQ    164,747   $1,229,968
Apr '10    Oracle       140,181   $1,249,967
Dec '11    SQL Server   134,117   $258,880
Source: www.tpc.org / June 15, 2012
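Dividing each hardware cost by its QphH figure turns this slide into a price-performance comparison; the arithmetic, using three of the results listed, looks like this:

```python
# Price-performance from the slide's figures: hardware cost / QphH@1TB.
results = {
    "Vectorwise (June '12)": (445_529, 57_146),
    "SQL Server (Aug '11)": (219_888, 460_869),
    "Oracle (June '11)": (209_534, 2_402_706),
}

for system, (qphh, cost) in results.items():
    print(f"{system}: ${cost / qphh:.2f} per QphH")

# Roughly $0.13 per QphH for Vectorwise versus about $2.10 and $11.47
# for the SQL Server and Oracle results shown, on these figures.
```

This cost-per-query-throughput ratio is the "affordable" half of the slide's claim, complementing the raw QphH ranking.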
21. Customer Stories: Sheetz and Zoho
Sheetz
! Leader in convenience stores
! Problem: customer data growing rapidly; multiple data sources; ease of use for self-service BI; need to control costs
! Vectorwise results: expand data to analyze two years; manage growth for three years
Zoho
! SaaS company with 6 million users
! Problem and requirements: affordability; huge data growth; 200,000 users of Zoho Reports
! Vectorwise results: exceptional performance; affordability for a SaaS offering
23. Customer Stories: Badoo
! Fastest Growing Social Network
! Problem
! Limited slice and dice analytics
! Better target ad campaigns
! Huge data growth
! Vectorwise results
! Detailed answers in seconds
! Immediate actions
24. Summary
! Successful businesses require speed and agility
! BI solutions must address these requirements
! Recommendations for how to get started and succeed:
! Align IT goals and organization with user needs and business goals
! Include operational processes in requirements (business and IT)
! POC with Vectorwise for affordable performance and scalability
34. • The query patterns for BI business analytics are very different from those of transactional processing. What are the key differences? How do you address them?
• Traditionally BI implementations required a sophisticated data architecture
including a DW, data marts (dimensional), OLAP, “flattened” datasets,
aggregated/summarized tables and other reporting data stores. Also maybe
an ODS (operational data store), staging tables and various data shadow
systems. Do you reduce the complexity of the traditional data architecture?
• A key component of developing business analytics is to define what data the business needs and how they plan to analyze it, design the queries, tune the database, etc. And then do it again for each query. How do you change that?
• Business analytics typically involves a variety of BI tools such as reports,
dashboards, scorecards, ad-hoc analytics, data visualization, data discovery
and predictive analytics. How do you interact with these tools?
Twitter Tag: #briefr
35. • Business analytics/BI, data integration and DW require a lot of varied skills. What types of skills are needed to successfully implement your solutions? Does your solution raise the bar on the skills needed for implementation?
• The limiting factor on many enterprise-wide BI and DW programs has been
cost, but emerging technology is perceived as more expensive. How do you
lower the TCO?
• Assume that most enterprises have a DW (maybe even MDM) in order to enable consistent and conformed data. How does your solution leverage the DW? Does your solution lessen the need for a DW?
• There was a lot of hype regarding BI Appliances a while ago, and many vendors used that term to label various hardware and software combinations. From the hype, what has emerged to impact BI, and how? What are the “pretender” technologies that have not delivered on the hype (no vendor names!)?
Twitter Tag: #briefr