The Briefing Room with Mike Ferguson and Alteryx
Live Webcast on Feb. 12, 2013
Today's savvy organizations know that a streamlined approach to data and applications can put the power of predictive analytics right where it needs to be: in the hands of the user. Sure, training is still required, but a real revolution is underway for the graphic design of such user interfaces. Central to this overhaul of design is the concept of intelligent, simple workflow, which enables users to get things done in an orderly fashion.
Check out the slides for this episode of The Briefing Room to hear analyst Mike Ferguson of Intelligent Business Strategies as he explains why interface design and workflow must go hand-in-hand. He will be briefed by Matt Madden of Alteryx, who will tout his company’s predictive platform, a solution that leverages an array of traditional and Big Data analytics applications, designed for problem solvers and decision makers. Madden will also provide several customer use cases that demonstrate the new normal in predictive analytics.
Change a gear up with Evolutionary Architecture | Luca Grulla
In the fast-moving world of technology startups, change is the only constant. As engineers and technologists, we should embed change in our thinking. By making change a first-class citizen in our engineering philosophy, via an Agile mindset paired with evolutionary architecture, the Signal AI Technology Team can act as a catalyst for product innovation and business opportunities.
Red Hat: Self driving IT is here, and it's real | Dynatrace
Join the always thought-provoking Chris Morgan for an update on the current state of the containerized IT landscape and how Red Hat and Dynatrace are transforming the current approach, leading the way to self-driving IT.
Dynatrace: Davis - Hololens - AI update - Cloud announcements - Self driving IT | Dynatrace
Dynatrace announced new features for their AI assistant Davis including notifications for Amazon Echo and Slack. They also discussed plans to further integrate Davis with Dynatrace search capabilities. The company announced a new Innovator Program providing hardware, workshops and early access to new features for 15 participants paying an annual $25k fee. Finally, they demonstrated a new integration with Microsoft HoloLens and discussed how Davis is built on Dynatrace APIs to provide multimodal interfaces for the future.
Dynatrace: Accelerate your cloud innovation Welcome to Perform 2018 | Dynatrace
Quick look at the incredible growth and success of the new Dynatrace solution. Why has the world realized it needs our unique capabilities and why are our customers so keen to transition with us?
Ayush Tiwari [PTC] | Unlock IoT Value with PTC’s ThingWorx Platform & InfluxD... | InfluxData
PTC enables global manufacturers to realize double-digit impact with software solutions that enable them to accelerate product and service innovation, improve operational efficiency and increase workforce productivity. For developing IIoT solutions, PTC has partnered with InfluxData to manage time series data at scale. With PTC’s ThingWorx platform capabilities to rapidly build IIoT applications coupled with InfluxData’s leading time series data storage platform, customers are set on a path to success in their digital transformation journey. Learn how selecting the ThingWorx solution and InfluxDB will unlock your IoT value.
Qlik Sense is powerful software that plays a key role in business analytics and data discovery. This tool has improved considerably since its inception.
The Alteryx Designer addresses these challenges by delivering an intuitive workflow for data blending and advanced analytics that leads to deeper insights in hours, not the weeks typical of traditional approaches. The Alteryx Designer empowers data analysts by combining data blending, predictive analytics, spatial analytics, reporting, visualization, and analytic apps into one workflow.
Smarter Business i praktiken - IBM Smarter Business 2011 | IBM Sverige
Presentation from IBM Smarter Business 2011. Track: Developing products and services cost-effectively.
Technological development is racing ahead, and customers and consumers demand better, smarter solutions that simplify their business and everyday lives. Does speed to market matter to you? Then don't miss this session, in which Colin Williams shows how the best and most successful development teams work.
Speaker: Colin Williams, Software Sales Executive, WW Rational Tiger Team.
More information at www.smarterbusiness.se
Building Adaptive Apps with APIs and Data: I Love APIs 2014 CTO Keynote | Apigee | Google Cloud
Day two of the I Love APIs 2014 conference kicked off with a keynote from Anant Jhingran, Apigee's CTO. He unveiled the latest version of Apigee Insights and outlined just how businesses become adaptive. He helped another packed auditorium of API lovers grasp the power of API-driven adaptive apps and how data scientists and developers can now collaborate.
https://www.youtube.com/watch?v=KEMiFWebJtc&index=4&list=PLIXjuPlujxxy72-kEorgBF6DxXZRsj1y
Dynatrace: Meet our captain of product and all things awesome, Steve Tack | Dynatrace
Through his work with our global customers, Steve is an expert on the rapid changes playing out in today's IT environments and the challenges they present now and in the future. In this session Steve will dive a little deeper into how we're shaking up the industry; he'll hit on our latest innovation, and you'll learn why you need to join us on a journey of transition today.
Starbucks: Building a new dev culture and freeing time for innovation: A Star... | Dynatrace
Naveen Dronavalli is the Manager of Application Development at Starbucks. He discussed Starbucks' journey in implementing the Dynatrace application performance monitoring (APM) solution to improve monitoring of their applications and reduce mean time to resolution for issues. Some key benefits realized included proactively identifying user logout, database performance, and logging issues. Starbucks has continued expanding use of Dynatrace and exploring its synthetic monitoring and AI capabilities to further support innovation.
This document provides an overview of architecting a digital enterprise. It discusses key aspects of business and technical architecture, including digital products and consumer experience, digital environments, platforms, and reference architectures. The technical architecture section describes a layered "system of systems" reference architecture with systems of engagement, integration, record, and automation, along with an API-enabled digital platform approach.
Knowage official presentation 2018. Knowage is the only open source, full-capability suite for modern business analysis.
Feel free to use this presentation to present and promote the Knowage open source suite!
Alteryx and Tableau: Iron Mountain's Sherpa to business insight | MattSemple1
Presentation made to Alteryx Inspire Conference (London 2019) by Matt Semple (Iron Mountain) and David Turley (InterWorks) explaining the data journey undertaken by Iron Mountain to transform its analytics function and deliver advanced commercial and operational analytics to its global management teams.
A transformation journey leveraging the advanced technology of Alteryx and Tableau aligned to a clear vision of the data roadmap. All supported by a great consulting partnership with the InterWorks team in London.
5 Reasons DevOps Toolchain Needs Time-Series Based Monitoring | DevOps.com
Monolithic architectures are being replaced by microservices-driven apps, and cloud-based infrastructure is being tied together and instrumented by DevOps processes. This is driving the need for greater visibility and better monitoring. Legacy monitoring solutions fail to deliver the much-needed sub-second visibility. Let’s take a look at time-series platforms and how they are delivering the level of visibility and monitoring needed by today’s DevOps initiatives.
In this webinar, we will take a look at Time-Series Data Platforms and outline how InfluxData’s leading Time-Series data platform can deliver the next-gen monitoring for your DevOps projects.
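The sub-second visibility the abstract describes usually comes down to evaluating each incoming metric sample against a rolling baseline. As a minimal illustrative sketch (plain Python, not InfluxData's actual implementation), a detector can flag any sample that sits several standard deviations above a sliding window of recent values:

```python
from collections import deque
from statistics import mean, stdev

def make_spike_detector(window=30, threshold=3.0):
    """Return a checker that flags values more than `threshold`
    standard deviations above the rolling-window mean."""
    history = deque(maxlen=window)  # recent samples only

    def check(value):
        is_spike = (
            len(history) >= 5            # need a minimal baseline
            and stdev(history) > 0
            and value > mean(history) + threshold * stdev(history)
        )
        history.append(value)
        return is_spike

    return check

detect = make_spike_detector()
samples = [100, 102, 98, 101, 99, 100, 103, 97, 500]  # request latencies, ms
flags = [detect(s) for s in samples]
print(flags)  # only the final 500 ms outlier is flagged
```

In a real time-series platform the window would be keyed per metric series and evaluated continuously as points arrive; the window size and threshold here are arbitrary illustrative choices.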
RightScale Roadtrip - Accelerate To Cloud | RightScale
The Accelerate to Cloud keynote will help you understand the current state of cloud adoption, identify the business value for your organization, and provide you a framework to plot your course to cloud adoption.
As developers, we all want to be more productive. Serverless helps you do just that, by letting you focus on the business logic while shifting operations somewhere else. As more companies discover this emerging technology, we also discover drawbacks like state management. In this session, we’ll focus on what serverless is, how it helps developers, what potential drawbacks exist, and how we can add state management into serverless.
SpagoBI version 6 rebranded as Knowage offers unpaired analytical experience,... | OW2
On May 3rd 2017, Engineering released the new version of SpagoBI, the OW2 flagship project on Business Intelligence. Starting from this version, the project is now branded as Knowage, and is available in two editions:
- the Community Edition, entirely open source, released and downloadable from the OW2 forge, supported by the open source community
- the Enterprise Edition, fully supported by Engineering under a subscription model
The Knowage suite is composed of several modules, each one conceived for a specific analytical domain. They can be used individually as a complete solution for a certain task, or combined with one another to ensure full coverage of users’ requirements, allowing users to build a tailored product.
The presentation focuses on the Community Edition and provides examples, demos and use cases on the most important functional evolutions of the product:
- the brand new and responsive user interface based on Angular.js
- the extended set of graphical widgets to perform advanced data visualization
- the self-service capabilities to perform data discovery
- the new web-based user interface to define the metadata layer by exploiting the data federation principle
- the possibility to define and import advanced analytic algorithms, written in R or Python, into a shared function catalog
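The shared function catalog in the last bullet can be pictured as a named registry that analysts populate and documents query at evaluation time. The class and method names below are hypothetical, a minimal sketch of the idea rather than Knowage's actual API:

```python
class FunctionCatalog:
    """Toy registry of named analytic functions (illustrative only)."""

    def __init__(self):
        self._functions = {}

    def register(self, name, func, description=""):
        # Store the callable with a human-readable description
        self._functions[name] = {"func": func, "description": description}

    def run(self, name, *args, **kwargs):
        # Look up a registered function by name and invoke it
        return self._functions[name]["func"](*args, **kwargs)

catalog = FunctionCatalog()
catalog.register("zscore", lambda x, mu, sigma: (x - mu) / sigma,
                 description="Standard score of x given mean and stdev")
print(catalog.run("zscore", 12, 10, 2))  # → 1.0
```

In the product itself the registered algorithms would be R or Python scripts executed by an engine, but the catalog pattern (register by name, invoke by name) is the same.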
Navigating the Digital Transformation Landscape | WSO2
The document discusses digital transformation and provides guidance on how organizations can navigate it. Digital transformation means improving customer experience through new digital products and business models, optimizing operations, and facilitating digital-native customers and employees. It involves adopting new technologies like APIs, mobile/IoT integration, and data insights, as well as shifting to cloud platforms. Organizations are advised to build developer ecosystems, fail fast through prototypes, empower employees, address business problems, and learn continuously throughout the process. The key message is that digital transformation is ultimately a business transformation powered by software and technology.
The document discusses foundational technologies for data-driven businesses. It describes how data is growing exponentially and outlines challenges in using data due to issues like inconsistency, duplication, and size. It then presents an intelligent data lifecycle framework involving ingesting, interpreting, and transforming data. Key foundational technologies are discussed like messaging systems, data virtualization, rules engines, machine learning, business process management, and robotic process automation. An anti-money laundering use case is presented using these technologies in an open system architecture.
Extending Operations from On-premises Solutions Towards Hybrid and Cloud - Da... | Codit
This document discusses extending operations from on-premises solutions to hybrid and cloud environments. It covers moving from release cycles of 6-18 months and a maintenance focus to continuous delivery, microservices, and agility. Continuous monitoring is discussed as an important part of the new operating model, and how machine learning and AI can help with monitoring. Key features of monitoring in Azure are outlined, including Azure Monitor, Metrics, Logs, and the use of machine learning for anomaly detection and support bots.
The Future of Integration - Toon Vanhoutte @CONNECT19 | Codit
The heart of the digital transformation story for many businesses lies in integration. Today’s app marketplace allows businesses to pick and choose from a smorgasbord of apps, and these apps need to get connected to improve transparency across the company and make it easier to make high-level decisions. Businesses need a flexible integration solution that not only helps them get connected to employees and customers, but also paves the way to the future.
The Agile Analyst: Solving the Data Problem with Virtualization | Inside Analysis
The Briefing Room with Radiant Advisors and Cisco
Live Webcast Jan. 21, 2014
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=05e9d4ccbd2505ce15bc8de699f9c961
Today’s business analyst needs data from all kinds of places: the data warehouse, data marts, web services as well as local and departmental files and spreadsheets. The fact is, even seasoned analysts typically spend more than half their time hunting and gathering data, which impedes analytical insights and limits time to value. Increasingly, innovative organizations are turning to data virtualization as a faster path to analytics, thus expediting business impact.
Register for this episode of The Briefing Room to hear Analysts Lindy Ryan and John O'Brien of Radiant Advisors explain how analytical sandboxes and data virtualization can enable true analytic agility. They will be briefed by Marc Breissinger of Cisco's Data Virtualization Business Unit, who will tout his company’s upcoming analytic platform Data Collage, a desktop tool designed for analysts who need agile access to enterprise data. He will discuss how Data Collage allows users to easily combine data and accelerate the development of new analytics.
Visit InsideAnalysis.com for more information.
Technically Speaking: How Self-Service Analytics Fosters Collaboration | Inside Analysis
This document summarizes an upcoming webinar series from Bloor Research Group on enterprise software and business intelligence technologies. The webinars will take place monthly from June to November, covering topics like intelligence, disruption, analytics, integration, databases, and cloud computing. Attendees can ask questions of presenters and get detailed analysis of innovative technologies. The webinars aim to reveal enterprise software characteristics and give vendors a chance to explain their products to analysts.
Left Brain, Right Brain: How to Unify Enterprise Analytics | Inside Analysis
The Briefing Room with Robin Bloor and Teradata
Live Webcast on Jan. 29, 2013
Despite its name, effective Data Science requires a certain amount of artistic flair. Analysts must be creative about how and where they find the insights that will drive business value. One classic roadblock to that kind of frictionless process? Programming. Not everyone can code Java, which makes the unstructured domain of Hadoop quite challenging for the average business analyst.
Check out the slides from this episode of the Briefing Room to hear veteran Analyst Dr. Robin Bloor explain how a new generation of analytical platforms will solve the complexity of unifying structured and unstructured data. He'll be briefed by Steve Wooledge of Teradata Aster who will tout his company's Big Data Appliance, which leverages the SQL-H bridge, an innovation designed to connect Hadoop with SQL.
Visit: http://www.insideanalysis.com
Smart companies know that business intelligence surfaces insights. With complex analytics, data mining and everything in between, it takes many moving parts to serve up the big picture. The key is to provide full-stack visibility into the entire BI environment, ensuring solid service and system performance.
Learn more at http://www.insideanalysis.com
EAP - Accelerating behavioral analytics at PayPal using Hadoop | DataWorks Summit
PayPal today generates massive amounts of data, from clickstream logs to transactions and routine business events. Analyzing customer behavior across this data can be a daunting task. The Data Technology team at PayPal has built a configurable engine, Event Analytics Pipeline (EAP), using Hadoop to ingest and process massive amounts of customer interaction data, match business-defined behavioral patterns, and generate entities and interactions matching those patterns. The pipeline is an ecosystem of components built using HDFS, HBase, a data catalog, and seamless connectivity to enterprise data stores. EAP's data definition, data processing, and behavioral analysis can be adapted to many business needs. Leveraging Hadoop to address the problems of size and scale, EAP promotes agility by abstracting the complexities of big-data technologies using a set of tools and metadata that allow end users to control the behavioral-centric processing of data. EAP abstracts the massive data stored on HDFS as business objects, e.g., customer and page impression events, allowing analysts to easily extract patterns of events across billions of rows of data. The rules system built using HBase allows analysts to define relationships between entities and extrapolate them across disparate data sources to truly explore the universe of customer interaction and behaviors through a single lens.
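The core idea of matching business-defined behavioral patterns across event streams can be sketched in a few lines. This is an illustrative toy, not EAP's actual implementation: it scans time-ordered (customer, event) pairs and reports which customers produced a given pattern as an ordered subsequence:

```python
from collections import defaultdict

def match_pattern(events, pattern):
    """Return the customers whose event stream contains `pattern`
    as an ordered (not necessarily contiguous) subsequence.
    `events` is an iterable of (customer_id, event_type) in time order."""
    progress = defaultdict(int)  # customer -> index of next pattern step
    matched = set()
    for customer, event_type in events:
        if customer in matched:
            continue  # pattern already completed for this customer
        if event_type == pattern[progress[customer]]:
            progress[customer] += 1
            if progress[customer] == len(pattern):
                matched.add(customer)
    return matched

# Hypothetical event stream: c1 completes the funnel, c2 does not
events = [
    ("c1", "page_view"), ("c2", "page_view"),
    ("c1", "add_to_cart"), ("c2", "logout"),
    ("c1", "purchase"),
]
print(match_pattern(events, ["page_view", "add_to_cart", "purchase"]))
# → {'c1'}
```

At PayPal scale this matching would run as a distributed job over HDFS-resident events with the pattern rules stored in HBase, but the single-pass, per-customer state machine shown here is the essence of subsequence-style behavioral matching.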
Big data analytics can provide acquirers with revenue advantages, improved knowledge of customer needs, and greater operational efficiencies. It allows for enhanced fraud management, loyalty programs, and merchant services through analysis of large, diverse transaction datasets. Realizing these benefits requires integrating multiple data sources and deploying analytical tools to glean insights from both structured and unstructured payment information.
In this SlideShare we present an overview of what Floown is. The short answer? A SaaS platform for teams, organizations & businesses to effectively work with different people. The long answer? Well, you're gonna have to click through the SlideShare.
People are slowly beginning to realize that the times, they are a-changing. When it comes to the future of work and automation, it’s not a question of how, but when. We usually only react when it’s already too late. But this time, the writing on the wall is too overwhelming to just ignore.
Now don’t get me wrong. I’m not saying that you should stock up on guns, build a shelter and prepare for Skynet. But it’s probably a good idea to at least start considering the idea that things might change faster than you think. And in the end, we would hate to say we told you so. So start preparing right now with these 6 crucial tips to survive the second machine age.
LoQutus helps organisations to innovate with analytics and to get insights with data visualisation. We also build large scale data layers to enable interaction with core data, and develop data-driven applications to deliver the insights our customers need. During this session we’ll share what we have learned along the way. We’ll show you our framework for self-service analytics & insights, and some successful case studies.
Strategy session 5 - unlocking the data dividend - andy steerAndy Steer
"A recent study completed by IDC examined the economic benefits accrued to organisations that made basic levels of investment in distinct areas of analytics and data management compared with the benefits accrued by organisations that opted for a broader and more diverse set of investments. The conclusion was that the leading organisations expect to capture in excess of $1.5 trillion more in value from their data and analytics initiatives over the next 4 years. This represents a 60% higher data dividend for the leading organisations.
To achieve these benefits organisations need to embrace the changing reality of the new data driven society and make a break from the beliefs and best practices inherent in traditional Business Intelligence programmes.
During the presentation Andy will expand on the data dividend concept, outline the 4 key investment areas that should be getting your attention and perhaps most importantly, explain how your existing SAP BusinessObjects technology can help you take your share of the estimated £53 billion UK data dividend."
Sumyag Insights provides data science and analytics services. They have a diverse team of over 15 data scientists and engineers with expertise in areas like machine learning, natural language processing, computer vision, and IoT. Their solutions include data wrangling, predictive modeling, prescriptive analytics, and building custom applications and dashboards. They follow an agile approach with sprints and focus on rapid prototyping to provide quick insights and business value to clients in industries like banking, insurance, retail, and manufacturing.
Metric Arts offers business intelligence solutions tailored to the specific objectives of each company that contacts us, with the freedom to select the best methodologies and tools. Our position as a leader in the development and implementation of business intelligence solutions for different types of companies has allowed us to build a corporate culture focused on continuous improvement, integrating technological advances and the expertise of our professionals to solve problems across different business areas.
This document discusses business analytics and next-generation business intelligence tools. It describes how business analytics is used to gain insights from data to inform business decisions and optimize processes. It also explains that successful business analytics depends on data quality, skilled analysts, and organizational commitment to data-driven decision making. The document then profiles the capabilities of next-generation BI tools, including their support for top-down reporting, bottom-up analysis, self-service capabilities, and their ability to provide insights quickly through in-memory processing and interactive visualizations.
Accenture: Analytics journey to roi Feb 2013Brian Crotty
The document outlines a four stage journey to analytical maturity:
1) Data - how to explore and utilize multi-channel data from various sources
2) Analytics - identifying the right analytics and tools to drive outcomes and build capabilities
3) Insights - providing analytically-informed, issue-based insights at scale through services and expertise
4) Actions - embedding insights into key decisions and processes and scaling capabilities across organizations
Innovate Analytics with Oracle Data Mining & Oracle RCapgemini
This document summarizes a presentation about innovating analytics with Oracle Data Mining and R. The presentation introduces data mining and R, how they can be used with Oracle BI 11g, and Oracle's predictive analytics stack. It provides examples of data mining use cases and encourages organizations to start predictive analytics projects by leveraging existing BI investments. The presentation aims to provide an understanding of data mining and R, how predictive analytics can benefit organizations, and how to get started with a predictive analytics project.
Webinar: Business Intelligence From The Inside OutCorSourceTechPDX
There are a lot of terms thrown around in the world of business intelligence and analytics. Presented as a webinar, this deck is an introduction to the terminology and power of business intelligence to transform companies.
This document discusses open analytics and its benefits. Open analytics combines open tools and agile engineering techniques to enable organizations to deliver analysis products more efficiently. This allows businesses to gain competitive advantages like increased growth, cost reductions, and innovation. Open analytics leverages open source software, open architectures, and open innovation to provide solutions for data processing, search, machine learning, storage, and visualization in a way that is easily extensible and mission agile, pairing analysis teams with technology. When applied properly, with a focus on solving real business problems, open analytics can deliver significant economic value for organizations.
A Strategic View of Enterprise Reporting and Analytics: The Data FunnelInside Analysis
The Briefing Room with Colin White and Jaspersoft
Slides from the Live Webcast on June 12, 2012
As the corporate appetite for analytics and reporting grows, companies must find a way to secure a strategic view of their information architecture. End users with varying degrees of expertise need a wide range of data and reports delivered in a timely fashion. As the audience for analytics expands, that puts pressure on IT infrastructure and staff. And now with the promise of Hadoop and MapReduce, the organization's desire for business insight becomes even more significant.
In this episode of The Briefing Room, veteran Analyst Colin White of BI Research will explain the value of being strategic with enterprise reporting. White will be briefed by Karl Van den Bergh of Jaspersoft, who will tout his company's “data funnel” concept, which is designed to strategically manage an organization's information architecture. By aligning information assets along this funnel, IT can effectively address the spectrum of analytical needs – from simple reporting to complex, ad hoc analysis – without over-taxing personnel and system resources.
Agile Data Science is a lean methodology that is adopted from Agile Software Development. At the core it centers around people, interactions, and building minimally viable products to ship fast and often to solicit customer feedback. In this presentation, I describe how this work was done in the past with examples. Get started today with our help by visiting http://www.alpinenow.com
Find how to add more value to your Business Intelligence and Performance Management solutions by incorporating predictive analytics using IBM Cognos 10. Learn about the integration of Predictive Analytics and SPSS functionality, and how it fits with the Cognos platform. View the video recording and download this deck: http://www.senturus.com/resources/ibm-cognos-10-demo-predictive-analytics/.
Senturus, a business analytics consulting firm, has a resource library with hundreds of free recorded webinars, trainings, demos and unbiased product reviews. Take a look and share them with your colleagues and friends: http://www.senturus.com/resources/.
TechWise with Eric Kavanagh, Dr. Robin Bloor and Dr. Kirk Borne
Live Webcast on July 23, 2014
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=59d50a520542ee7ed00a0c38e8319b54
Analytical applications are everywhere these days, and for good reason. Organizations large and small are using analytics to better understand any aspect of their business: customers, processes, behaviors, even competitors. There are several critical success factors for using analytics effectively: 1) know which kind of apps make sense for your company; 2) figure out which data sets you can use, both internal and external; 3) determine optimal roles and responsibilities for your team; 4) identify where you need help, either by hiring new employees or using consultants; 5) manage your program effectively over time.
Register for this episode of TechWise to learn from two of the most experienced analysts in the business: Dr. Robin Bloor, Chief Analyst of The Bloor Group, and Dr. Kirk Borne, Data Scientist, George Mason University. Each will provide their perspective on how companies can address each of the key success factors in building, refining and using analytics to improve their business. There will then be an extensive Q&A session in which attendees can ask detailed questions of our experts and get answers in real time. Registrants will also receive a consolidated deck of slides, not just from the main presenters, but also from a variety of software vendors who provide targeted solutions.
Visit InsideAnalysis.com for more information.
This document discusses cognitive computing and analytics technologies. It provides examples of how cognitive systems can be applied, such as a toy that learns from child interactions. The document outlines a cognitive strategy and foundation that includes collecting and analyzing both structured and unstructured data. It also discusses the importance of cloud services, infrastructure, and security for cognitive systems. Finally, the document describes some of the cognitive computing APIs available from IBM Watson and how the set of APIs has expanded over time.
Big Data : From HindSight to Insight to ForesightSunil Ranka
When it comes to analytics and reporting, there is a fine line between hindsight, insight, and foresight. With the evolution of big data technology, there is a need to derive value from larger datasets that were not available in the past. But even before we can start using the new shiny technologies, we need to understand what is categorized as reporting, business intelligence, or big data and analytics. In my experience, people struggle to distinguish between reporting, analytics, and business intelligence.
Manthan provides solutions and services across various domains including analytics, information management, big data, social media intelligence, mobile dashboards, master data management, and data quality. It has over 700 associates with expertise in research and development, different engagement models, and over 350 accelerators and solution templates. Services include consulting, implementation, custom development, and managed services.
How to Identify, Train or Become a Data ScientistInside Analysis
The Briefing Room with Neil Raden and Actian
Live Webcast Sept. 3, 2013
Visit: www.insideanalysis.com
Respected research institutes keep saying we have a shortage of data scientists, which makes sense because the title is so new. But most business analysts and serious data managers have at least some of the necessary training to fill this new role. And any number of curious, diligent professionals can learn how to be a data scientist, if they can get access to the right tools and education.
Register for this episode of The Briefing Room to hear veteran Analyst Neil Raden of Hired Brains offer insights about how to identify the key characteristics of a data scientist role. He'll then explain how professionals can incrementally improve their data science skills. He'll be briefed by John Santaferraro of Actian, who will showcase his company's Data Flow Engine, which provides unprecedented visual access to highly complex data flows. This, coupled with Actian's multiple analytics database technologies, opens the door to whole new avenues of possible insights.
The document discusses the growth of big data and analytics. It provides statistics showing massive growth in digital data from various sources. It then discusses the evolution of Hadoop and MapReduce for analyzing large, unstructured datasets. The document promotes Think Big Analytics as a pure-play big data consulting firm and solutions provider that partners with Amazon Web Services (AWS) to build enterprise analytics solutions for Fortune 1000 companies. Case studies and solution frameworks are presented for financial, online advertising, and other industries.
Four Key Considerations for your Big Data Analytics StrategyArcadia Data
This document discusses considerations for big data analytics strategies. It covers how big data analytics have evolved from focusing on structured data and batch processing to also including real-time, multi-structured data from various sources. It emphasizes that discovery is key and requires visual exploration of granular data details. Native big data analytics platforms are needed that can handle real-time streaming data and provide self-service capabilities through customizable applications. The document provides examples of how various companies are using big data analytics for applications like cybersecurity, customer analytics, and supply chain optimization.
Similar to The New Normal: Predictive Power on the Front Lines
Agile, Automated, Aware: How to Model for SuccessInside Analysis
The Briefing Room with David Loshin and Embarcadero
Live Webcast October 27, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=eea9877b71c653c499c809c5693eae8fe
Data management teams face some tough challenges these days. Organizations need business-driven visibility that enables understanding and awareness of enterprise data assets – without worrying about definitions and change management. But with information architectures evolving into a hybrid mix of data objects and data services built over relational databases as well as big data stores, serving up accurately defined, reusable data can become a complex issue.
Register for this episode of The Briefing Room to learn from veteran Analyst David Loshin as he explains the importance of agile, automated workflows in today’s enterprise. He’ll be briefed by Ron Huizenga of Embarcadero, who will discuss how his company’s ER/Studio suite approaches data modeling and management from a modern architecture standpoint. He will explain that unifying the way information is represented can not only eliminate the need for costly workarounds, but also foster collaboration between data architects, developers and business users.
Visit InsideAnalysis.com for more information.
First in Class: Optimizing the Data Lake for Tighter IntegrationInside Analysis
The Briefing Room with Dr. Robin Bloor and Teradata RainStor
Live Webcast October 13, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=012bb2c290097165911872b1f241531d
Hadoop data lakes are emerging as peers to corporate data warehouses. However, successful data management solutions require a fusion of all relevant data, new and old, which has proven challenging for many companies. With a data lake that’s been optimized for fast queries, solid governance and lifecycle management, users can take data management to a whole new level.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he discusses the relevance of data lakes in today’s information landscape. He’ll be briefed by Mark Cusack of Teradata, who will explain how his company’s archiving solution has developed into a storage point for raw data. He’ll show how the proven compression, scalability and governance of Teradata RainStor combined with Hadoop can enable an optimized data lake that serves as both reservoir for historical data and as a "system of record” for the enterprise.
Visit InsideAnalysis.com for more information.
Fit For Purpose: Preventing a Big Data LetdownInside Analysis
The Briefing Room with Dr. Robin Bloor and RedPoint Global
Live Webcast October 6, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=9982ad3a2603345984895f279e849d35
Gartner recently placed Big Data in its “trough of disillusionment,” reflective of many leaders’ struggle to prove the value of Hadoop within their organization. While the promise of enhanced data integration and enrichment is obvious, measurable results have remained elusive. This episode of The Briefing Room will outline how to successfully tie Big Data to existing business applications, preventing your next Hadoop project from being another “Big Data letdown.”
Register today to learn from veteran Analyst Dr. Robin Bloor as he discusses the importance of converging enterprise data integration with intelligence and scalability. He’ll be briefed by George Corugedo of RedPoint Global, who will provide concrete examples of how the convergence of scalable cloud platforms, ever-expanding data sources and intelligent execution can turn the Big Data hype into demonstrable business value.
Visit InsideAnalysis.com for more information.
To Serve and Protect: Making Sense of Hadoop Security Inside Analysis
HP Security Voltage provides data-centric security solutions to protect sensitive data in Hadoop environments. Their solutions leverage tokenization and encryption to safeguard data at rest, in motion, and in use across the data lifecycle. They presented use cases where their technology helped secure financial, healthcare, and telecommunications customer data in Hadoop and other platforms. Questions from analysts focused on implementation experience, performance impacts, integration with authentication, costs, and supported environments and partnerships.
The Hadoop Guarantee: Keeping Analytics Running On TimeInside Analysis
The Briefing Room with Dr. Robin Bloor and Pepperdata
Live Webcast September 15, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=32f198185d9d0c4cf32c27bdd1498b2a
Industry researchers agree: the importance of Hadoop will continue to grow as more companies recognize the range of benefits they can reap, from lower-cost storage to better business insights. At the same time, advances in the Hadoop ecosystem are addressing many of the key concerns that have hampered adoption, including performance and reliability. As a result, Hadoop is fast becoming a first-class citizen in the world of enterprise computing.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor explain how the Hadoop ecosystem is evolving into a mature foundation for managing enterprise data. He’ll be briefed by Sean Suchter of Pepperdata, who will explain how his company’s software brings predictability and reliability to Hadoop through dynamic, policy-based controls and monitoring. He’ll show how to guarantee service-level agreements by slowing down low-priority tasks as needed. He’ll also discuss the holy grail of Hadoop: how to enable mixed workloads.
Visit InsideAnalysis.com for more information.
Special Edition with Dr. Robin Bloor
Live Webcast September 9, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=e8b9ac35d8e4ffa3452562c1d4286a975
Do the math: algebra will transform information management. Just as the relational database revolutionized the information landscape, so will a just-released, complete algebra of data overhaul the industry itself. So says Dr. Robin Bloor in his new book, the Algebra of Data, which he’ll outline in this special one-hour webcast.
Once organizations learn how to express their data sets algebraically, the benefits will be significant and far-reaching. Data quality problems will slowly subside; queries will run orders of magnitude faster; integration challenges will fade; and countless tedious jobs in the data management space will bid their farewell. But first, software companies must evolve, and that will take time.
Visit InsideAnalysis.com for more information.
The Role of Data Wrangling in Driving Hadoop AdoptionInside Analysis
The Briefing Room with Mark Madsen and Trifacta
Live Webcast September 1, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=eb655874d04ba7d560be87a9d906dd2fd
Like all enterprise software solutions, Hadoop must deliver business value in order to be a success. Much of the innovation around the big data industry these days therefore addresses usability. While there will always be a technical side to the Hadoop equation, the need for user-friendly tools to manage the data will continue to focus on business users. That’s why self-service data preparation or "data wrangling" is a serious and growing trend, one which promises to move Hadoop beyond the early adopter phase and more into the mainstream of business.
Register for this episode of The Briefing Room to hear veteran Analyst Mark Madsen of Third Nature explain why business users will play an increasingly important role in the evolution of big data. He’ll be briefed by Trifacta's Will Davis and Alon Bartur, who will demonstrate how Trifacta's solution empowers business users to “wrangle" data of all shapes and sizes faster and easier than ever before. They’ll discuss why a new approach to accessing and preparing diverse data is required and how it can accelerate and broaden the use of big data within organizations.
Visit InsideAnalysis.com for more information.
Ahead of the Stream: How to Future-Proof Real-Time AnalyticsInside Analysis
Business seems to move faster by the day, with the most cutting-edge companies taking advantage of real-time data streams for heavy-duty analytics. But with so much innovation happening in so many places, how can companies stay ahead of the game? One answer is to future-proof your analytics architecture with an abstraction layer that can translate your business use case or workflow to any of several leading innovative technologies, addressing the growing number of use cases in this dynamic field.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor, as he explains how a data flow architecture can harness a wide range of streaming solutions. He'll be briefed by Anand Venugopal of Impetus Technologies, who will showcase his company's StreamAnalytix platform, which was designed from the ground up to leverage multiple major streaming engines available today, including Apache Spark, Apache Storm and others. He'll demonstrate how StreamAnalytix provides enterprise-class performance while incorporating best-of-breed open-source components.
View the archive at: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=925d1e9b639b78c6cf76a1bbbf485b2b
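The abstraction-layer idea can be sketched with a toy strategy pattern: a pipeline is described once, engine-agnostically, and a pluggable engine adapter decides how to execute it. This is purely illustrative and assumes nothing about StreamAnalytix's real API; a production adapter would emit Spark or Storm topologies rather than run locally.

```python
class Pipeline:
    """Engine-neutral description of a streaming job: an ordered list of named stages."""
    def __init__(self):
        self.stages = []
    def map(self, fn):
        self.stages.append(("map", fn))
        return self
    def filter(self, fn):
        self.stages.append(("filter", fn))
        return self

class LocalEngine:
    """A stand-in engine; a real adapter would translate stages into a Spark or Storm job."""
    def run(self, pipeline, stream):
        out = stream
        for kind, fn in pipeline.stages:
            out = map(fn, out) if kind == "map" else filter(fn, out)
        return list(out)

# The same Pipeline object could be handed to any engine adapter unchanged.
p = Pipeline().map(lambda x: x * 2).filter(lambda x: x > 4)
print(LocalEngine().run(p, [1, 2, 3]))  # [6]
```

The design point is that the business logic lives in the engine-neutral `Pipeline`; swapping streaming engines means swapping the adapter, not rewriting the job.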
All Together Now: Connected Analytics for the Internet of EverythingInside Analysis
The Briefing Room with Mark Madsen and Cisco
Live Webcast August 18, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=0eff120f8b2879b582b77f4ff207ee54
Today's digital enterprises are seeing an explosion of data at the edge. The Internet of Everything is fast approaching a critical mass that will demand a sea change in how companies process data. This new world of information is widely distributed, streaming, and overall becoming too big to move. Experts predict that within two to three years, the bulk of analytic processing will take place on the fringes of information architectures. As a result, forward-thinking companies are dramatically shifting their analytic strategies.
Register for this episode of The Briefing Room to hear veteran Analyst Mark Madsen of Third Nature explain how a new era of information architectures is now unfolding, paving the way to much more responsive and agile business models. He'll be briefed by Kim Macpherson of the Cisco Data and Analytics Business Unit, who will explain how her company's platform is uniquely suited for this new, federated analytic paradigm. She'll demonstrate how edge analytics can help companies address opportunities quickly and effectively.
Visit InsideAnalysis.com for more information.
Goodbye, Bottlenecks: How Scale-Out and In-Memory Solve ETLInside Analysis
The Briefing Room with Dr. Robin Bloor and Splice Machine
Live Webcast August 11, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=e1b33c9d45b178e13784b4a971a4c1349
The ETL process was born out of necessity, and for decades it has been the glue between data sources and target applications. But as data growth soars and increased competition demands real-time data, standard ETL has become brittle and often unmanageable. Scaling up resources can do the trick, but it’s very costly and only a matter of time before the processes hit another bottleneck. When outmoded ETL stands in the way of real-time analytics, it might be time to consider a completely new approach.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he explains how modern, data-driven architectures must adopt an equally capable data integration strategy. He’ll be briefed by Rich Reimer of Splice Machine, who will discuss how his company solves ETL performance issues and enables real-time analytics and reports on big data. He will show that by leveraging the scale-out power of Hadoop and the in-memory speed of Spark, users can bring both analytical and operational systems together, eventually performing transformations only when needed.
Visit InsideAnalysis.com for more information.
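The "transformations only when needed" idea above can be illustrated with lazy evaluation: rather than an up-front ETL pass over the whole dataset, transformations are recorded once and applied row by row only as a query pulls data. This is a minimal, hypothetical Python sketch of the deferral principle, not Splice Machine's implementation, and no Hadoop or Spark is involved.

```python
def lazy_transform(rows, *transforms):
    """Yield rows with the transforms applied one row at a time, on demand."""
    for row in rows:
        for t in transforms:
            row = t(row)
        yield row

# Pretend source table: a large stream of raw string-valued records.
raw = ({"amount": str(i)} for i in range(1_000_000))

# Record the transformation; nothing is processed yet.
parsed = lazy_transform(raw, lambda r: {"amount": int(r["amount"])})

# The query pulls only what it needs; the untouched rows are never transformed.
first_three = [next(parsed) for _ in range(3)]
print(first_three)  # [{'amount': 0}, {'amount': 1}, {'amount': 2}]
```

In a scale-out system the same deferral happens across a cluster (Spark's lazy DAG evaluation is the well-known example), which is what lets transformation cost track actual analytical demand instead of total data volume.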
The Biggest Picture: Situational Awareness on a Global LevelInside Analysis
The Briefing Room with Dr. Robin Bloor and Modus Operandi
Live Webcast July 28, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/onstage/g.php?MTID=efc4082d9b0b0adfcd753a7435d2d6a1b
The analytic bottlenecks of yesterday need not apply today. The boundaries are also falling thanks in large part to the abundance of third-party data. The most data-driven companies these days are finding creative ways to dynamically incorporate data from within and beyond the firewall, thus building highly accurate, multidimensional views of their business, customer, competition or other subject areas.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor as he explains the magnitude of change that's occurring in the world of data, why it's happening now, and how you can take advantage. He'll be briefed by Mike Gilger and Boris Pelakh, who will showcase their company's enterprise analytics platform, which combines a range of battle-tested functionality to deliver dynamic situational awareness that can leverage a comprehensive array of data sets. They'll explain how the platform's reasoner benefits from a highly scalable rules engine, and a flexible modeling capability that can optimize data storage virtually on the fly.
Visit InsideAnalysis.com for more information.
Structurally Sound: How to Tame Your ArchitectureInside Analysis
The Briefing Room with Krish Krishnan and Teradata
Live Webcast July 21, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=602b2a8413e8719d39465f4d6291d505
Technology changes all the time, but the basic needs of the business are the same: BI and analytics. With new types of data, various analytics engines and multiple systems, giving business users seamless access to enterprise data can be a rather daunting process. One solution is to provide a complete fabric that spans the organization, touching all data points and masking the complexity behind disparate sources.
Register for this episode of The Briefing Room to learn from veteran Analyst Krish Krishnan as he explores how and why architectures have changed over the years. He’ll be briefed by Imad Birouty of Teradata, who will discuss his company’s QueryGrid, an analytics solution designed to provide access to data across all systems. He will show how QueryGrid essentially creates a logical data warehouse and enables users to leverage SQL over multiple data types.
Visit InsideAnalysis.com for more information.
SQL In Hadoop: Big Data Innovation Without the RiskInside Analysis
The Briefing Room with Dr. Robin Bloor and Actian
Live Webcast July 14, 2015
Watch the Archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=bbd4395ea2f8c60a03cfefc68c7aa823
Innovation often implies risk, which is why businesses have many issues to weigh when considering change. Yet the remarkable growth of data is driving many traditional systems into the ground, forcing information workers to take a critical look at their existing tools. Technologies like Hadoop offer economical solutions to big data management, but to truly take advantage of its capabilities, organizations must modernize their infrastructure.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor as he explains how and why organizations should improve legacy systems. He’ll be briefed by Todd Untrecht of Actian, who will tout his company’s Actian Vortex, a SQL-in-Hadoop solution. He will show how integrating a SQL engine directly in the Hadoop cluster can lead to faster analytics and greater control, while still maintaining existing investments.
Visit InsideAnalysis.com for more information.
The document discusses SYSTAP and their graph database product Blazegraph. It provides an overview of SYSTAP and Blazegraph, highlighting that Blazegraph can scale to handle large graph datasets with billions or trillions of edges through various deployment options including embedded, high availability, scale-out, and GPU acceleration configurations. The document also discusses how Blazegraph is being used by organizations for applications like knowledge graphs, genomics, and defense/intelligence.
A Revolutionary Approach to Modernizing the Data WarehouseInside Analysis
The document discusses an upcoming panel discussion on hot technologies for 2015. It introduces the host and three analysts who will be participating: Rick Sherman from Athena IT Solutions, Dr. Robin Bloor from The Bloor Group, and Bob Muglia from Snowflake Computing. The panel will discuss modernizing the data warehouse and new database technologies.
The Maturity Model: Taking the Growing Pains Out of HadoopInside Analysis
The Briefing Room with Rick van der Lans and Think Big, a Teradata Company
Live Webcast on June 16, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=197f8106531874cc5c14081ca214eaff
Hadoop is arguably one of the most disruptive technologies of the last decade. Once lauded solely for its ability to transform the speed of batch processing, it has marched steadily forward and promulgated an array of performance-enhancing accessories, notably Spark and YARN. Hadoop has evolved into much more than a file system and batch processor, and it now promises to stand as the data management and analytics backbone for enterprises.
Register for this episode of The Briefing Room to learn from veteran Analyst Rick van der Lans, as he discusses the emerging roles of Hadoop within the analytics ecosystem. He’ll be briefed by Ron Bodkin of Think Big, a Teradata Company, who will explore Hadoop’s maturity spectrum, from typical entry use cases all the way up the value chain. He’ll show how enterprises that already use Hadoop in production are finding new ways to exploit its power and build creative, dynamic analytics environments.
Visit InsideAnalysis.com for more information.
Rethinking Data Availability and Governance in a Mobile WorldInside Analysis
The Briefing Room with Malcolm Chisholm and Druva
Live Webcast on June 9, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=baf82d3835c5dfa63202dcbe322a3ad7
The emergence of the mobile workforce has left an indelible mark on the enterprise; every employee is now mobile, and business data continues to be dispatched to the far reaches of the enterprise. While this has added enormous opportunity for increased productivity, it has also muddied the waters when it comes to controlling and protecting valuable data assets. As companies quickly evolve to address the new set of challenges posed by this shift in data usage, IT must ensure that all data, no matter where it’s generated or stored, is available and governed just as if it were still safely behind the corporate firewall.
Register for this episode of The Briefing Room to hear veteran Analyst Malcolm Chisholm as he explains the myriad challenges that mobile data introduces when addressing regulations and compliance needs, requiring new approaches to data governance. He’ll be briefed by Dave Packer of Druva, who will outline his company’s converged data protection strategy, which brings data center class capabilities to backup, availability and governance for the mobile workforce. He will share strategies to meet regional data residency, data recovery, legal hold and eDiscovery requirements and more.
Visit InsideAnalysis.com for more information.
The document discusses a new approach to application middleware called EnterpriseWeb that uses a unified object model, shared memory, and goal-oriented software agents to enable responsive and interconnected distributed processes. It aims to simplify application development and management by harmonizing different resource representations and providing common services. In contrast to traditional application stacks, EnterpriseWeb presents an application fabric that can dynamically compose and orchestrate processes and resources across diverse infrastructure. It has won several awards for its innovative semantic platform technology.
This document discusses the need for thought leadership and innovative thinking over a sole focus on technology and data. It argues that meta-ideas, rather than just metadata, are driving innovations today. Interdisciplinary thought across industries and non-traditional hires are needed to develop new perspectives and break from traditional views. As data grows exponentially, new approaches are required that combine different data techniques rather than relying on single technologies. Advanced data modeling is needed to capture human concepts and link data to real-world contexts and objectives.
This document discusses how Hadoop can help solve challenges with key corporate data known as "small data". Small data refers to structured data that is critical to main business activities. It discusses issues with small data like multiple sources and definitions causing inconsistencies. It proposes using Hadoop to iteratively detect errors and inconsistencies in small data to allow normalization. Normalization combines subject knowledge and rules to make small data more consistent and meaningful for analysis. The document argues Hadoop provides a flexible, fast environment for data analytics that can help address requirements around understanding, preparing, and maintaining small but critical corporate data.
Dandelion Hashtable: beyond billion requests per second on a commodity serverAntonios Katsarakis
This slide deck presents DLHT, a concurrent in-memory hashtable. Despite efforts to optimize hashtables, which go as far as sacrificing core functionality, state-of-the-art designs still incur multiple memory accesses per request and block request processing in three cases. First, most hashtables block while waiting for data to be retrieved from memory. Second, open-addressing designs, which represent the current state of the art, either cannot free index slots on deletes or must block all requests to do so. Third, index resizes block every request until all objects are copied to the new index. Defying folklore wisdom, DLHT forgoes open addressing and adopts a fully-featured and memory-aware closed-addressing design based on bounded cache-line chaining. This design (1) offers lock-free index operations and deletes that free slots instantly, (2) completes most requests with a single memory access, (3) utilizes software prefetching to hide memory latencies, and (4) employs a novel non-blocking and parallel resizing. On a commodity server with a memory-resident workload, DLHT surpasses 1.6B requests per second and provides 3.5x (12x) the throughput of the state-of-the-art closed-addressing (open-addressing) resizable hashtable on Gets (Deletes).
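The closed-addressing idea can be illustrated with a toy, single-threaded Python sketch: each bucket chains small fixed-capacity nodes (loosely standing in for cache-line-sized nodes), deletes free slots instantly for reuse, and a resize reinserts entries into a larger index. DLHT's lock-free operations, prefetching, and parallel resizing are far beyond this sketch; the class name and capacities here are invented for illustration.

```python
class BoundedChainTable:
    """Toy closed-addressing hashtable with bounded chaining.

    Each bucket holds a chain of fixed-capacity nodes; deletes free
    slots instantly and later inserts reuse them. This is a plain,
    single-threaded sketch, not DLHT's lock-free design."""

    NODE_SLOTS = 4  # entries per node, standing in for one cache line

    def __init__(self, n_buckets=8):
        self.n_buckets = n_buckets
        self.buckets = [[] for _ in range(n_buckets)]  # list of nodes
        self.size = 0

    def _bucket(self, key):
        return self.buckets[hash(key) % self.n_buckets]

    def put(self, key, value):
        bucket = self._bucket(key)
        free = None
        for node in bucket:
            for i, entry in enumerate(node):
                if entry is None:
                    free = free or (node, i)     # remember a freed slot
                elif entry[0] == key:
                    node[i] = (key, value)       # update in place
                    return
        if free:                                 # reuse a freed slot
            node, i = free
            node[i] = (key, value)
        else:                                    # chain a new node
            node = [None] * self.NODE_SLOTS
            node[0] = (key, value)
            bucket.append(node)
        self.size += 1
        if self.size > 2 * self.n_buckets * self.NODE_SLOTS:
            self._resize()

    def get(self, key, default=None):
        for node in self._bucket(key):
            for entry in node:
                if entry and entry[0] == key:
                    return entry[1]
        return default

    def delete(self, key):
        for node in self._bucket(key):
            for i, entry in enumerate(node):
                if entry and entry[0] == key:
                    node[i] = None               # slot freed instantly
                    self.size -= 1
                    return True
        return False

    def _resize(self):
        # Blocking resize for simplicity; DLHT's is non-blocking/parallel.
        old = self.buckets
        self.n_buckets *= 2
        self.buckets = [[] for _ in range(self.n_buckets)]
        self.size = 0
        for bucket in old:
            for node in bucket:
                for entry in node:
                    if entry:
                        self.put(*entry)
```

The key contrast with open addressing is visible in `delete`: the slot is cleared in one step and immediately reusable, with no tombstones and no need to block other requests.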
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/temporal-event-neural-networks-a-more-efficient-alternative-to-the-transformer-a-presentation-from-brainchip/
Chris Jones, Director of Product Management at BrainChip , presents the “Temporal Event Neural Networks: A More Efficient Alternative to the Transformer” tutorial at the May 2024 Embedded Vision Summit.
The expansion of AI services necessitates enhanced computational capabilities on edge devices. Temporal Event Neural Networks (TENNs), developed by BrainChip, represent a novel and highly efficient state-space network. TENNs demonstrate exceptional proficiency in handling multi-dimensional streaming data, facilitating advancements in object detection, action recognition, speech enhancement and language model/sequence generation. Through the utilization of polynomial-based continuous convolutions, TENNs streamline models, expedite training processes and significantly diminish memory requirements, achieving notable reductions of up to 50x in parameters and 5,000x in energy consumption compared to prevailing methodologies like transformers.
Integration with BrainChip’s Akida neuromorphic hardware IP further enhances TENNs’ capabilities, enabling the realization of highly capable, portable and passively cooled edge devices. This presentation delves into the technical innovations underlying TENNs, presents real-world benchmarks, and elucidates how this cutting-edge approach is positioned to revolutionize edge AI across diverse applications.
AppSec PNW: Android and iOS Application Security with MobSFAjin Abraham
Mobile Security Framework - MobSF is a free and open source automated mobile application security testing environment designed to help security engineers, researchers, developers, and penetration testers to identify security vulnerabilities, malicious behaviours and privacy concerns in mobile applications using static and dynamic analysis. It supports all the popular mobile application binaries and source code formats built for Android and iOS devices. In addition to automated security assessment, it also offers an interactive testing environment to build and execute scenario based test/fuzz cases against the application.
This talk covers:
Using MobSF for static analysis of mobile applications.
Interactive dynamic security assessment of Android and iOS applications.
Solving Mobile app CTF challenges.
Reverse engineering and runtime analysis of Mobile malware.
How to shift left and integrate MobSF/mobsfscan SAST and DAST in your build pipeline.
Essentials of Automations: Exploring Attributes & Automation ParametersSafe Software
Building automations in FME Flow can save time, money, and help businesses scale by eliminating data silos and providing data to stakeholders in real-time. One essential component to orchestrating complex automations is the use of attributes & automation parameters (both formerly known as “keys”). In fact, it’s unlikely you’ll ever build an Automation without using these components, but what exactly are they?
Attributes & automation parameters enable the automation author to pass data values from one automation component to the next. During this webinar, our FME Flow Specialists will cover leveraging the three types of these output attributes & parameters in FME Flow: Event, Custom, and Automation. As a bonus, they’ll also be making use of the Split-Merge Block functionality.
You’ll leave this webinar with a better understanding of how to maximize the potential of automations by making use of attributes & automation parameters, with the ultimate goal of setting your enterprise integration workflows up on autopilot.
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyScyllaDB
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
How information systems are built or acquired puts information, which is what they should be about, in a secondary place. Our language adapted accordingly, and we no longer talk about information systems but applications. Applications evolved in a way that breaks data into diverse fragments, tightly coupled with applications and expensive to integrate. The result is technical debt, which is repaid by taking even bigger "loans", resulting in ever-increasing technical debt. Software engineering and procurement practices work in sync with market forces to maintain this trend. This talk demonstrates how natural this situation is. The question is: can something be done to reverse the trend?
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able to lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc.
- Practical examples and best practices to implement right away
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
Northern Engraving | Nameplate Manufacturing Process - 2024Northern Engraving
Manufacturing custom quality metal nameplates and badges involves several standard operations. Processes include sheet prep, lithography, screening, coating, punch press and inspection. All decoration is completed in the flat sheet with adhesive and tooling operations following. The possibilities for creating unique durable nameplates are endless. How will you create your brand identity? We can help!
Main news related to the CCS TSI 2023 (2023/1695)Jakub Marek
An English 🇬🇧 translation of a presentation to the speech I gave about the main changes brought by CCS TSI 2023 at the biggest Czech conference on Communications and signalling systems on Railways, which was held in Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). Attended by around 500 participants and 200 on-line followers.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
Digital Banking in the Cloud: How Citizens Bank Unlocked Their MainframePrecisely
Inconsistent user experience and siloed data, high costs, and changing customer expectations – Citizens Bank was experiencing these challenges while it was attempting to deliver a superior digital banking experience for its clients. Its core banking applications run on the mainframe and Citizens was using legacy utilities to get the critical mainframe data to feed customer-facing channels, like call centers, web, and mobile. Ultimately, this led to higher operating costs (MIPS), delayed response times, and longer time to market.
Ever-changing customer expectations demand more modern digital experiences, and the bank needed to find a solution that could provide real-time data to its customer channels with low latency and operating costs. Join this session to learn how Citizens is leveraging Precisely to replicate mainframe data to its customer channels and deliver on their “modern digital bank” experiences.
2. Welcome
Host:
Eric Kavanagh
eric.kavanagh@bloorgroup.com
Twitter Tag: #briefr The Briefing Room
3. Mission
! Reveal the essential characteristics of enterprise software, good and bad
! Provide a forum for detailed analysis of today's innovative technologies
! Give vendors a chance to explain their product to savvy analysts
! Allow audience members to pose serious questions... and get answers!
6. Analyst: Mike Ferguson
Mike Ferguson is Managing Director of Intelligent Business Strategies Limited. As an independent analyst and consultant, he specializes in business intelligence, data management and enterprise business integration. With more than 30 years of IT experience, Mike has consulted for dozens of companies, spoken at events all over the world and written numerous articles. Formerly he was a principal and co-founder of Codd and Date Europe Limited – the inventors of the Relational Model, a Chief Architect at Teradata on the Teradata DBMS and European Managing Director of DataBase Associates, where he was a partner with Colin White.
7. Alteryx
! Alteryx provides an enterprise-class analytics platform which enables users to combine Big Data with information assets across the organization
! Analysts can perform predictive and spatial analytics, as well as produce sharable apps
! Alteryx's Strategic Analytics Software is a desktop-to-cloud solution that combines business data, industry content and spatial processing
8. Matt Madden
Matt Madden is Senior Product Marketing Manager at Alteryx. He has over 13 years of experience helping organizations realize the power and benefits of analytics, in sales and marketing roles.
33. Alteryx In The Briefing Room
Mike Ferguson
Managing Director
Intelligent Business Strategies
February 2013
www.intelligentbusiness.biz
Twitter: @mikeferguson1
34. Traditional Data Warehousing and Business Intelligence
[Diagram: operational data flows through integration/DQ tools into a DW platform holding the data warehouse and data marts; BI tools query it and deliver reports and analytics through a web portal]
What is Data Warehousing? Data warehousing is the process of building an analytical system by cleaning and integrating data from multiple data sources. The analytical system can consist of one or more databases.
What is Business Intelligence? Business Intelligence is actionable business insight that is produced by querying and analysing data in a data warehouse or a data mart using BI tools. A typical organisation has information producers and information consumers.
35. What Is Self Service BI?
“The creation of a BI environment whereby business users can create and access BI reports, queries, and analytics without the need for IT involvement”
§ Business users need to be able to:
• Be more self-sufficient
• Collaborate with others to share insights and make decisions
• Access personalised business insight
§ Self-service BI options
• Data discovery and visualisation tools
• Analytical workflow and visualisation tools
§ Self-service BI is NOT about self-service data warehousing
• Data governance and common data definitions are critical to maximising the use of trusted data and facilitating common understanding
36. Self-Service BI Data Discovery and Visualisation Tools Allow Users to Quickly Produce Insight – e.g. Insurance
e.g. Calculate Net Premiums and Claims even when re-insurance data is not in the DW
[Diagram: a data discovery and visualisation tool, backed by an in-memory data visualisation server with in-memory columnar storage, pulls data and a predictive model from the underwriting system, DW ultimates data and re-insurance data; users publish and share insights with the community, who consume, enhance, re-publish and act on them]
37. Self-Service Analytical Workflow Development & Visualisation Tools Allow Users to Quickly Produce Insight – e.g. Insurance
e.g. Calculate Net Premiums and Claims even when re-insurance data is not in the DW
[Diagram: an analytical workflow development and visualisation tool, backed by a workflow execution server, runs analytical workflows over data and a predictive model from the underwriting system, DW ultimates data and re-insurance data; users publish and share insights with the community, who consume, enhance, re-publish and act on them]
38. Predictive Analytics Are Now Becoming Available In Self-Service BI Tools – But Do Users Know How to Use Them?
[Diagram: predictive models run on a data discovery & visualisation or analytical workflow server over the underwriting system, DW ultimates data and re-insurance data; business analysts publish and share insights with the community, who consume, enhance, re-publish and act on them]
The challenge is making it easy for non-statistically trained business analysts to select the right algorithms for the business questions they are trying to answer.
39. Impact of Self-Service BI/Analytical Tools on Data Management
§ Business users needing data from multiple sources are using front-end tools for data integration rather than for data analysis and visualisation
§ Potentially inconsistent data definitions and calculations for the same data created by every user doing their own data integration
§ Potentially a major increase in the proliferation of overlapping data sets created by self-service BI business users not connecting to data via a BI platform semantic layer
§ Potential for multiple versions of unmanaged data scattered throughout the enterprise
• Potential for multiple versions of reference data
§ Potential for inconsistent data everywhere, and not just created by Excel users
40. Simplifying And Governing Data Access to Improve Self-Service BI – One Approach is Via Data Virtualisation
[Diagram: business analysts use data discovery & visualisation or analytical workflow servers that connect through a data virtualization layer, underpinned by data management, to transaction & office systems, personal data, the DW and predictive models; insights are published and shared with the community, who consume, enhance and re-publish them]
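The data-virtualisation approach on this slide can be sketched in miniature: a single query interface joins back-end sources at request time, so analysts never copy data by hand. The source names and schemas below are invented for illustration; a real virtualisation layer would push queries down to databases rather than iterate over Python lists.

```python
# Two stand-in back-end sources behind the virtualisation layer.
SOURCES = {
    "transactions": [{"cust": 1, "amount": 120.0},
                     {"cust": 2, "amount": 80.0}],
    "dw_customers": [{"cust": 1, "name": "Acme"},
                     {"cust": 2, "name": "Globex"}],
}

def virtual_view(fact, dim, key):
    """Join two sources on `key` at query time, without materialising
    a combined copy anywhere."""
    lookup = {row[key]: row for row in SOURCES[dim]}
    for row in SOURCES[fact]:
        merged = dict(lookup.get(row[key], {}))  # dimension attributes
        merged.update(row)                       # fact attributes win
        yield merged

rows = list(virtual_view("transactions", "dw_customers", "cust"))
print(rows[0])  # {'cust': 1, 'name': 'Acme', 'amount': 120.0}
```

The point of the sketch is governance: every analyst who queries `virtual_view` gets the same join logic and the same data definitions, instead of each building a private integration.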
41. Governing Information Distribution Is Also Important – Information Producers and Information Consumers
[Diagram: on the producer side (business & financial analysts, IT developers, some managers), govern who can produce, what data they can access and how they name data; on the consumer side (executives, managers, frontline workers, customers, partners, suppliers), govern what they can access and what devices they can use; a shared business glossary underpins both sides, and the information distribution between them is itself governed]
42. New Data Sources Have Emerged Inside And Outside The Enterprise That Business Now Wants To Analyse
[Diagram: the front office (service, sales, marketing, facing customers), product/service lines 1 to n, and the back office (finance, supply chain, procurement, HR, planning, operations, with credit verification facing suppliers) all generate data, alongside external sources such as RFID tag sensor networks and weather data; data volume, data variety and the number of sources are all growing]
43. Big Data Has Taken Us Beyond The Traditional Data Warehouse – New Big Data Analytical Workloads
1. Complex analysis of structured data
2. Analysis of data in motion
3. Exploratory analysis of un-modeled multi-structured data
4. Graph analytics
5. Accelerating ETL and analytical processing of un-modeled data to enrich data in a data warehouse or analytical appliance
6. The storage and re-processing of archived data
44. The Changing Landscape – We Now Have Different Platforms Optimised For Different Analytical Workloads
Big Data workloads result in multiple platforms now being needed for analytical processing
[Diagram: streaming data feeds a streaming DBMS; advanced analytics on multi-structured data run on NoSQL DBMSs (e.g. graph DBs) and Hadoop data stores; the EDW and DW marts run on data warehouse RDBMSs; advanced analytics on structured data run on analytical RDBMS appliances]
45. Hadoop ‘Sandboxes’ Are Common for Data Scientist Led Investigative Analysis of Multi-structured Data
[Diagram: un-modelled data such as seismic data, web logs and sensor data is loaded into Hadoop sandboxes, where ETL and MapReduce applications (batch analysis) produce new insights]
46. ETL Acceleration Is Also A Popular Big Data Use Case For Bringing Additional Insights Into Data Warehouses
e.g. Deriving insight from huge volumes of social web content on sites like Twitter, Facebook, Digg, MySpace, TripAdvisor, LinkedIn… for sentiment analytics
[Diagram: hundreds of terabytes up to petabytes of cloud data, together with data extracted from operational systems, are transformed with MapReduce (e.g. Pig, JAQL) on HDFS, and the relevant insight is loaded into the DW for cloud data analytical applications]
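The map/reduce step in this pipeline can be shown in miniature in plain Python, standing in for a Hadoop job written in Pig or JAQL. The brand names, posts and sentiment word lists are invented for illustration; real sentiment analytics would use far richer text processing.

```python
from collections import defaultdict

# Toy sentiment word lists, invented for illustration only.
POSITIVE = {"great", "love", "good"}
NEGATIVE = {"bad", "awful", "hate"}

def map_post(post):
    """Map phase: emit (brand, +1/-1) pairs from one raw social post."""
    brand, words = post["brand"], post["text"].lower().split()
    for word in words:
        if word in POSITIVE:
            yield brand, 1
        elif word in NEGATIVE:
            yield brand, -1

def reduce_scores(pairs):
    """Reduce phase: sum sentiment scores per brand."""
    totals = defaultdict(int)
    for brand, score in pairs:
        totals[brand] += score
    return dict(totals)

posts = [
    {"brand": "acme", "text": "love the new acme gadget great stuff"},
    {"brand": "acme", "text": "awful battery life"},
    {"brand": "globex", "text": "good value"},
]
pairs = (kv for post in posts for kv in map_post(post))
print(reduce_scores(pairs))  # {'acme': 1, 'globex': 1}
```

Only the small, aggregated result (the per-brand totals) would be loaded into the DW; the raw petabyte-scale content stays in HDFS.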
47. This Requires Parsing & Extraction From Multi-Structured Data While Integrating Data In A Big Data Environment
[Diagram: semi-structured e-mail and unstructured text move through a Load → Parse → Extract → Transform … pipeline]
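The Load → Parse → Extract → Transform sequence for semi-structured e-mail can be illustrated with Python's standard `email` module. The message, the extracted field names and the "premium" check are invented for illustration.

```python
from email import message_from_string

RAW = """\
From: jane@example.com
Subject: Claim update
Date: Tue, 12 Feb 2013 09:30:00 +0000

The net premium looks wrong on policy 1234.
"""

def load(raw):
    """Load: bring the raw text into the pipeline as a message object."""
    return message_from_string(raw)

def parse(msg):
    """Parse: split structured headers from the unstructured body."""
    return dict(msg.items()), msg.get_payload()

def extract(headers, body):
    """Extract: pull out the fields of analytical interest."""
    return {
        "sender": headers.get("From"),
        "subject": headers.get("Subject"),
        "mentions_premium": "premium" in body.lower(),
    }

def transform(record):
    """Transform: normalise values before integration downstream."""
    record["sender"] = record["sender"].lower()
    return record

record = transform(extract(*parse(load(RAW))))
print(record)
```

Each stage hands a progressively more structured record to the next, which is exactly what lets multi-structured sources be integrated alongside warehouse data.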
48. Data Deluge – Need To Accelerate And Automate Data Filtering To Consume Data That Is Arriving Faster Than We Can Consume It
[Diagram: incoming data passes through a data filter before reaching enterprise systems]
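The filter in this diagram can be sketched as a small stream filter that passes only relevant events through to enterprise systems. The rules and event fields below are invented for illustration.

```python
def data_filter(events, rules):
    """Pass through only the events some rule marks as relevant,
    discarding the rest before they reach enterprise systems."""
    for event in events:
        if any(rule(event) for rule in rules):
            yield event

# Hypothetical rules: keep error readings and out-of-range temperatures.
rules = [
    lambda e: e.get("status") == "error",
    lambda e: e.get("temp", 0) > 90,
]
stream = [
    {"id": 1, "status": "ok", "temp": 20},
    {"id": 2, "status": "error", "temp": 21},
    {"id": 3, "status": "ok", "temp": 95},
]
print(list(data_filter(stream, rules)))  # keeps events 2 and 3
```

Because `data_filter` is a generator, it processes events as they arrive rather than buffering the whole stream, which is the point when data arrives faster than it can be consumed.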
49. Data Management Tools Are Being Extended To Embrace And Exploit MPP Hadoop Clusters AND Embed Analytics
Approaches:
• Custom code
• Data management tool suites
• Self-service analytical workflow development tools???
[Diagram: data management tools can load data into Hadoop, discover data in Hadoop, parse & prepare and transform & cleanse data in Hadoop (MapReduce), invoke custom analytics on Hadoop, and extract data from Hadoop]
Trends: Expect MUCH more from data management tool vendors, including generation of MapReduce code to clean and transform data
50. New Analytical Platforms Breed New Requirements – Cross Silo Analytics for Harder Business Questions
[Diagram: how do we analyse across all the silos – streaming data feeding RT analytics, advanced analytics on multi-structured data in NoSQL DBs (e.g. graph DBs), the EDW with DW marts and appliances, and advanced analytics on structured data?]
51. Cross Silo Analytics Option – Multi-Platform Analytical Workflows Need Analytics Embedded in ETL Processing
• Support parsing and extraction of data from multi-structured data sources
• Help automate analysis and consumption of data
• Move the data to the best platform to do the analytics
• Support analytical processing across multiple analytical platforms
[Diagram: Step 1 – Extract and Load data from sources such as a NoSQL DB (e.g. graph DB); Step 2 – Parse, Clean and Transform; Step 3 – Analyse, landing the insights in the EDW]
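The three steps on this slide can be sketched end to end in plain Python. The source rows and the region/amount schema are invented for illustration; in practice each step would run on whichever platform suits it best.

```python
def step1_extract(source):
    """Step 1: extract and load raw records from a source platform
    (here a plain list stands in for a NoSQL/graph connector)."""
    return list(source)

def step2_transform(records):
    """Step 2: parse, clean and transform - drop incomplete rows and
    normalise region names and amounts."""
    cleaned = []
    for r in records:
        if r.get("amount") is None:      # cleanse: skip incomplete rows
            continue
        cleaned.append({"region": r["region"].strip().lower(),
                        "amount": float(r["amount"])})
    return cleaned

def step3_analyse(records):
    """Step 3: analyse - total amount per region, the insight that
    would land in the EDW."""
    totals = {}
    for r in records:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

graph_db_rows = [                        # stand-in source data
    {"region": " North ", "amount": "100.5"},
    {"region": "south", "amount": None},
    {"region": "north", "amount": "49.5"},
]
insights = step3_analyse(step2_transform(step1_extract(graph_db_rows)))
print(insights)  # {'north': 150.0}
```

Embedding the analytics (`step3_analyse`) in the same workflow as the ETL steps is the slide's point: one pipeline carries data from source platform to insight instead of stopping at the load.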
52. Discussion Points
§ Competitive positioning
• Where does Alteryx fit in the analytical competitive landscape?
§ Product positioning
• Is Alteryx for Data Warehousing, Self-service BI or both?
§ Data Governance
• How does Alteryx support data consistency and reuse?
§ Analytical workloads
• What kinds of analytical workload is Alteryx providing solutions for?
• Big Data – How does Alteryx work with Big Data and NoSQL platforms?
§ Performance
• How does Alteryx scale to handle concurrent users analysing and consuming business insights?
• How does Alteryx exploit underlying analytical platforms to get performance with high-volume multi-structured data?
54. Upcoming Topics
This month: Analytics
March: Operational Intelligence
April: Intelligence
May: Integration
www.insideanalysis.com
55. Thank You for Your Attention
Certain images and/or photos on this page are the copyrighted property of 123RF Limited, their Contributors or Licensed Partners and are being used with permission under license. These images and/or photos may not be copied or downloaded without permission from 123RF Limited.