Talk given on Dec. 3, 2014 at MIT, sponsored by Hack/Reduce. This talk looks at the history of Business Intelligence, from first-generation OLAP tools through modern data discovery and visualization tools. Looking forward, it asks what we can learn from that evolution as numerous new tools and architectures for analytics emerge in the Big Data era.
You probably have heard about Big Data, but ever wondered what it exactly is? And why should you care?
Mobile is playing a large part in driving this explosion in data. Data is also created by apps and other services running in the background. As people move toward digital channels, enormous amounts of data are being created. This data can be used in many ways, both personal and professional. Big Data and mobile apps are converging and interacting in the enterprise, transforming the whole mobile ecosystem.
The Present - the History of Business Intelligence (Phocas Software)
Learn the history of business intelligence in this three-part series. In part one, we discussed how business intelligence software used to be (the past). In part two, we discuss business intelligence as it is in the present.
Big Data Analytics and a Chartered Accountant (Bharath Rao)
Big Data Analytics is a growing field that many businesses are capitalizing on. Businesses leverage Big Data to gain a keen understanding of consumer behavior and the market. Additionally, Big Data can be used in fields such as financial audit, control assurance, and forensics.
This presentation provides insight into the opportunities available to a Chartered Accountant to create value with Big Data Analytics.
This presentation was made during my GMCS 2 course at the Mangalore branch of SIRC of ICAI and hence has a limited number of slides.
Data Services and the Modern Data Ecosystem (Middle East) (Denodo)
Watch full webinar here: https://bit.ly/3xdSTIU
Digital Transformation has changed the way IT delivers information services. The pace of business engagement and the rise of Digital IT (formerly known as “Shadow IT”) have increased demands on IT, especially in the area of Data Management. Data Services exploit widely adopted interoperability standards, providing a strong framework for information exchange, and have enabled the growth of robust systems of engagement that can now exploit information that was previously locked away in internal silos.
Join us for this Middle East Webinar series episode, “Data Services and the Modern Data Ecosystem,” presented by Chief Evangelist MEA, Alexey Sidorov. Tune in as we explore how a business can easily support and manage a Data Service ecosystem, providing a more flexible approach to information sharing that supports an ever more diverse community of consumers.
Watch this on-demand webinar to learn:
- Why Data Services are a critical part of a modern data ecosystem
- How IT teams can manage Data Services and the increasing demand by businesses
- How Digital IT can benefit from Data Services and how this can support the need for rapid prototyping allowing businesses to experiment with data and fail fast where necessary.
- How a good Data Virtualization platform can encourage a culture of Data amongst business consumers (internally and externally)
Big Data LDN 2018: CONNECTING SILOS IN REAL-TIME WITH DATA VIRTUALIZATION (Matt Stubbs)
Date: 14th November 2018
Location: Keynote Theatre
Time: 13:50 - 14:20
Speaker: Becky Smith
Organisation: Denodo
About: How many users inside and outside of your organization access your organization’s data? Dozens? Hundreds is probably more like it, each with their own structure and content requirements as well as different access rights. As a result, many organizations have witnessed the formation of “data delivery mills,” in various shapes and sizes. How does one create order and reliability in this world of chaotic data streams? Quite easily, if it’s done with data virtualization.
According to Gartner, “through 2020, 50% of enterprises will implement some form of data virtualization as one enterprise production option for data integration.” Data virtualization enables organizations to gain data insights from multiple, distributed data sources without the time-consuming processes of data extraction and loading. This allows for faster insights and fact-based decisions, which help businesses realize value sooner.
Join us to find out more about:
• What data virtualization actually means and how it differs from traditional data integration approaches.
• How you can connect and combine all your data in real-time, without compromising on scalability, security or governance.
• The benefits of data virtualization and its most important use cases.
Modern Integrated Data Environment - Whitepaper | Qubole (Vasu S)
A whitepaper about building a modern data platform for data-driven organisations, using a cloud data warehouse within a modern data platform architecture.
https://www.qubole.com/resources/white-papers/modern-integrated-data-environment
Business intelligence, Data Analytics & Data Visualization (Muthu Natarajan)
Business Intelligence, Cloud Computing, Data Analytics, Data Scrubbing, Data Mining, Big Data & Intelligence, Turning Data into Information, Decision-Based Methods for Business Intelligence, Advanced Analytics, OLAP, Multidimensional Data, Data Visualization
Accelerating Data-Driven Enterprise Transformation in Banking, Financial Serv... (Denodo)
Watch full webinar here: https://bit.ly/3c6v8K7
Banking, Financial Services and Insurance (BFSI) organizations are globally accelerating their digital journey, making rapid strides with their digitization efforts, and adding key capabilities to adapt and innovate in the new normal.
Many companies find digital transformation challenging because they rely on established systems that are often not only poorly integrated but also highly resistant to modernization without downtime. Hear how the BFSI industry is leveraging data virtualization, which facilitates digital transformation via a modern data integration and data delivery approach, to gain greater agility, flexibility, and efficiency.
In this session from Denodo, you will learn:
- Key industry trends and challenges driving the digital transformation mandate and platform modernization initiatives
- Key concepts of Data Virtualization, and how it can enable BFSI customers to develop critical capabilities for real-time / near real-time data integration
- Success stories of organizations that already use data virtualization to differentiate themselves from the competition.
M365 Saturday Saskatchewan 2020 - Build your #PowerPlatform #Governance (Nicolas Georgeault)
Sites from my session:
Managing the Microsoft Power Platform can sometimes be very complex, and because your users have access to the various Power Apps and Power Automate options from other services, it's important to understand the intricacies of those options. Understanding the differences between default environments and the others will allow you to better manage the platform and better control costs.
We'll also discuss the risks of letting unchecked developments proliferate, at the risk of repeating situations already encountered with Microsoft Access and Excel: applications that have become critical to your business but are completely absent from your service contract.
Claudia Imhoff of the Boulder BI Brain Trust gives the lowdown on integrating real-time data to leverage modern BI practices for your business in this Information Builders Innovation Session presentation.
How to Swiftly Operationalize the Data Lake for Advanced Analytics Using a Lo... (Denodo)
Watch full webinar here: https://bit.ly/3mfFJqb
Presented at Chief Data Officer Live Series 2021, ASEAN (August Edition)
While big data initiatives have become necessary for any business to generate actionable insights, a big data fabric has become a necessity for any successful big data initiative. A best-of-breed big data fabric should deliver actionable insights to business users with minimal effort, provide end-to-end security for the entire enterprise data platform, and provide real-time data integration while delivering a self-service data platform to business users.
Watch this on-demand session to learn how big data fabric enabled by Data Virtualization:
- Provides lightning fast self-service data access to business users
- Centralizes data security, governance, and data privacy
- Fulfills the promise of data lakes to provide actionable insights
A Business Analytics solution implementation: a web-based decision support solution that collates data from various sources, performs the required analysis, and presents the results in a customised format to end users.
AYATA created a new infographic - “The Evolution of Big Data Analytics” - using the latest data available from industry analysts and its own findings. The infographic shows how Big Data Analytics continues to evolve toward the ‘final phase,’ Prescriptive Analytics.
Given at Oracle Open World 2011: Not to be confused with Oracle Database Vault (a commercial db security product), Data Vault Modeling is a specific data modeling technique for designing highly flexible, scalable, and adaptable data structures for enterprise data warehouse repositories. It has been in use globally for over 10 years now but is not widely known. The purpose of this presentation is to provide an overview of the features of a Data Vault modeled EDW that distinguish it from the more traditional third normal form (3NF) or dimensional (i.e., star schema) modeling approaches used in most shops today. Topics will include dealing with evolving data requirements in an EDW (i.e., model agility), partitioning of data elements based on rate of change (and how that affects load speed and storage requirements), and where it fits in a typical Oracle EDW architecture. See more content like this by following my blog http://kentgraziano.com or follow me on twitter @kentgraziano.
Evolution of Data Analytics: the past, the present and the future (Varun Nemmani)
This paper delves into advanced analytics, the current industry demand to utilize and analyze huge and diverse amounts of data, and how big data analytics is becoming part of the decision-making process and is used to anticipate trends. The paper takes the reader from Analytics era 1.0 to the current Analytics era 3.0, shows future projections for big data analytics, and identifies the current leaders of the Big Data Analytics market.
This presentation has been uploaded by Public Relations Cell, IIM Rohtak to help the B-school aspirants crack their interview by gaining basic knowledge on IT.
How to build your own Delve: combining machine learning, big data and SharePoint (Joris Poelmans)
You experience the benefits of machine learning every day through product recommendations on Amazon and Bol.com, credit card fraud prevention, and more. So how can we leverage machine learning together with SharePoint and Yammer? We will first look into the fundamentals of machine learning and big data solutions, and then explore how we can combine tools such as Windows Azure HDInsight, R, and Azure Machine Learning to extend and support collaboration and content management scenarios within your organization.
ADV Slides: When and How Data Lakes Fit into a Modern Data Architecture (DATAVERSITY)
Whether to take data ingestion cycles off the ETL tool and the data warehouse or to facilitate competitive Data Science and building algorithms in the organization, the data lake – a place for unmodeled and vast data – will be provisioned widely in 2020.
Though it doesn’t have to be complicated, the data lake has a few key design points that are critical, and it does need to follow some principles for success. Build the data lake, but avoid building the data swamp! The tool ecosystem is building up around the data lake, and soon many organizations will have both a robust lake and a data warehouse. We will discuss policy to keep them straight, send data to its best platform, and keep users’ confidence up in their data platforms.
Data lakes will be built in cloud object storage. We’ll discuss the options there as well.
Get this data point for your data lake journey.
Analyzing Billions of Data Rows with Alteryx, Amazon Redshift, and Tableau (DATAVERSITY)
Got lots of data? So does Amaysim, a leading Australian telecom provider, with its billions of rows of data. The organization successfully empowers its small team of data analysts with self-service data analytics platforms so they can easily access the data they need, perform advanced analytics, and visualize findings for all stakeholders. Register for this session and learn how Amaysim uses the Alteryx-Redshift-Tableau BI stack to easily and quickly:
- Extract data from their data warehouse and blend and enrich it with other sources
- Give data analytical context by running statistical, predictive, and deep geo-spatial analytics
- Create visualizations from analytics and then update Tableau Workbooks directly from Alteryx, or publish the results in Amazon Redshift, for easy direct access for their stakeholders from Tableau
Hear from Adrian Loong, Alteryx Analytics Certified Expert (ACE), and product marketers from AWS and Alteryx on how organizations can use Alteryx, Amazon Redshift and Tableau to enable data analysts to spin up new self-service analytics instances to enable fast investigation for critical business decisions.
Data Lakehouse, Data Mesh, and Data Fabric (r1) (James Serra)
So many buzzwords of late: Data Lakehouse, Data Mesh, and Data Fabric. What do all these terms mean and how do they compare to a data warehouse? In this session I’ll cover all of them in detail and compare the pros and cons of each. I’ll include use cases so you can see what approach will work best for your big data needs.
We recently presented our technology solution for metadata discovery to the Boulder Business Intelligence Brain Trust in Colorado (www.bbbt.us).
The whole session was also videoed, and there is a link to the recording at the end of the presentation.
Data Lakehouse, Data Mesh, and Data Fabric (r2) (James Serra)
So many buzzwords of late: Data Lakehouse, Data Mesh, and Data Fabric. What do all these terms mean, and how do they compare to a modern data warehouse? In this session I’ll cover all of them in detail and compare the pros and cons of each. They all may sound great in theory, but I'll dig into the concerns you need to be aware of before taking the plunge. I’ll also include use cases so you can see which approach will work best for your big data needs. And I'll discuss Microsoft's version of the data mesh.
Building Modern Data Platform with Microsoft Azure (Dmitry Anoshin)
This presentation will cover Cloud history and Microsoft Azure Data Analytics capabilities. Moreover, it has a real-world example of DW modernization. Finally, we will check the alternative solution on Azure using Snowflake and Matillion ETL.
Architecting for Big Data: Trends, Tips, and Deployment Options (Caserta)
Joe Caserta, President at Caserta Concepts addressed the challenges of Business Intelligence in the Big Data world at the Third Annual Great Lakes BI Summit in Detroit, MI on Thursday, March 26. His talk "Architecting for Big Data: Trends, Tips and Deployment Options," focused on how to supplement your data warehousing and business intelligence environments with big data technologies.
For more information on this presentation or the services offered by Caserta Concepts, visit our website: http://casertaconcepts.com/.
Building an Effective Data Warehouse Architecture (James Serra)
Why use a data warehouse? What is the best methodology to use when creating a data warehouse? Should I use a normalized or dimensional approach? What is the difference between the Kimball and Inmon methodologies? Does the new Tabular model in SQL Server 2012 change things? What is the difference between a data warehouse and a data mart? Is there hardware that is optimized for a data warehouse? What if I have a ton of data? During this session James will help you to answer these questions.
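As a minimal sketch of the dimensional (Kimball-style) approach contrasted above, here is a toy star schema: a fact table keyed to dimension tables and aggregated with a star join. The table and column names are illustrative assumptions, not taken from the session.

```python
# Hypothetical star schema: one fact table, two dimension tables (sqlite3 stdlib).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (product_key INTEGER, date_key INTEGER, amount REAL);

INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO dim_date    VALUES (20240101, 2024), (20230101, 2023);
INSERT INTO fact_sales  VALUES (1, 20240101, 10.0), (1, 20230101, 5.0),
                               (2, 20240101, 7.5);
""")

# Star join: aggregate the fact table by attributes of its dimensions.
rows = con.execute("""
    SELECT p.name, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d    ON d.date_key    = f.date_key
    GROUP BY p.name, d.year
    ORDER BY p.name, d.year
""").fetchall()
print(rows)
# [('gadget', 2024, 7.5), ('widget', 2023, 5.0), ('widget', 2024, 10.0)]
```

In a normalized (Inmon-style) model the same data would instead live in many 3NF tables; the dimensional form trades redundancy for simpler, faster analytical queries.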
ADV Slides: Platforming Your Data for Success – Databases, Hadoop, Managed Ha... (DATAVERSITY)
Thirty years is a long time for a technology foundation to remain as active as relational databases have. Are their replacements here? In this webinar, we say no.
Databases have not sat around while Hadoop emerged. The Hadoop era generated a ton of interest and confusion, but is it still relevant as organizations are deploying cloud storage like a kid in a candy store? We’ll discuss what platforms to use for what data. This is a critical decision that can dictate two to five times additional work effort if it’s a bad fit.
Drop the herd mentality. In reality, there is no “one size fits all” right now. We need to make our platform decisions amidst this backdrop.
This webinar will distinguish these analytic deployment options and help you platform 2020 and beyond for success.
Jan 2017 Investment Recommendation for Tableau (paulchenuva)
“Buy Tableau” was one of my stock pitches in 2017. I was predicting a 2x return; instead, Tableau's stock has increased over 4x since. Great to see Salesforce's acquisition today. Would love to hear your comments and feedback.
State of ICS and IoT Cyber Threat Landscape Report 2024 preview (Prayukth K V)
The IoT and OT threat landscape report was prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
"Impact of front-end architecture on development cost", Viktor Turskyi (Fwdays)
I have heard many times that architecture is not important for the front-end. Also, many times I have seen how developers implement features on the front-end just following the standard rules for a framework and think that this is enough to successfully launch the project, and then the project fails. How to prevent this and what approach to choose? I have launched dozens of complex projects and during the talk we will analyze which approaches have worked for me and which have not.
Connector Corner: Automate dynamic content and events by pushing a button (DianaGray10)
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
- Create a campaign using Mailchimp with merge tags/fields
- Send an interactive Slack channel message (using buttons)
- Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
- Your campaign sent to target colleagues for approval
- If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
- If the “Reject” button is pushed instead, colleagues will be alerted via a Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -... (DanBrown980551)
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Key Trends Shaping the Future of Infrastructure.pdf (Cheryl Hung)
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
This keynote covers the key trends across hardware, cloud, and open source, exploring how these areas are likely to mature and develop over the short and long term, and considering how organisations can position themselves to adapt and thrive.
Accelerate your Kubernetes clusters with Varnish Caching (Thijs Feryn)
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
DevOps and Testing slides at DASA Connect (Kari Kakkonen)
Slides by Rik Marselis and me from the DASA Connect conference on 30 May 2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We also held a lovely workshop with the participants, trying to find different ways to think about quality and testing in different parts of the DevOps infinity loop.
JMeter webinar - integration with InfluxDB and GrafanaRTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
From Siloed Products to Connected Ecosystem: Building a Sustainable and Scala...
From Business Intelligence to Big Data - hack/reduce Dec 2014
1. From Business Intelligence to Big Data:
The Evolution of Business Analytics
@hackreduce – Dec. 3, 2014
Adam Ferrari
@AJFerrari
(All opinions expressed are my own / I’m not here representing my employers)
5. This talk
What did I learn as CTO of a BI product company as we
jumped into the BI market mid-stream, and then later as we
were acquired by one of the biggest “traditional BI” vendors?
Most Importantly:
Stay focused on real business value, not technology.
Note: My context is very “product provider” oriented, but I believe the lessons
are equally interesting to “product consumers” – after all, we’re all interested
in where the toolset is going and why
6. A note about scope
Analytics is a highly overloaded term
The vast majority of my experience, and the focus of this talk, is
around “BI-style” analytics, i.e.,
Delivering historical and aggregate views of data (e.g.,
charts, reports, dashboards, etc.) to business decision makers
There are many other important forms of “analytics”
E.g., Data mining, statistics, data science, etc.
These are very important and complementary,
but not in my scope here
7. Part 1 (of 3)
Some Ancient History
(or, a bunch of important stuff that happened before my time)
8. In the beginning…
…there was the cube
(well, there was a bunch of stuff before that – Hans Peter Luhn coins the term Business
Intelligence in 1958, Edgar Codd invents the relational data model in 1970, etc…
but we’ll start with the beginning of modern Business Intelligence, which is OLAP)
Image source: oracle.com
9. Research sponsored by Arbor Software in 1993
defined the "12 Rules for OLAP Products"
Rule #1 – "Multidimensional Conceptual View"
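The "multidimensional conceptual view" of Rule #1 is easy to see in code: the same fact table, summarized along whichever dimensions the analyst picks. A minimal sketch in plain Python (the data and dimension names here are made up for illustration):

```python
from collections import defaultdict

# Toy fact table: (region, product, quarter, sales)
facts = [
    ("East", "Widgets", "Q1", 100),
    ("East", "Gadgets", "Q1", 150),
    ("West", "Widgets", "Q1", 200),
    ("West", "Widgets", "Q2", 250),
]

def rollup(facts, dims):
    """Aggregate the sales measure over the requested dimensions."""
    index = {"region": 0, "product": 1, "quarter": 2}
    totals = defaultdict(int)
    for row in facts:
        key = tuple(row[index[d]] for d in dims)
        totals[key] += row[3]
    return dict(totals)

# The same facts, sliced along different dimensions of the "cube":
by_region = rollup(facts, ["region"])              # {("East",): 250, ("West",): 450}
by_product_quarter = rollup(facts, ["product", "quarter"])
```

An OLAP engine pre-computes and indexes these aggregates rather than scanning the facts each time, but the conceptual model, measures summarized along chosen dimensions, is exactly this.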
12. ROLAP Modeling
• Manage mapping between physical data stores, the "logical view" (core dimensional model), and the "business view"
• Definition of metrics and dimensions
• Management of pre-computed aggregates
Image source: rittmanmead.com
13. Data Warehousing: go big or go home
HW
• Teradata
• Netezza (IBM)
• Oracle Exadata
SW – Traditional DBMS
• Oracle
• MS SQL Server
• IBM DB2
SW – Analytical DBMS
• Vertica (HP)
• ParAccel (/ Redshift)
• SAP HANA
Image source: teradata.com
14. ETL – Extract/Transform/Load
Image source: informatica.com
Notable ETL Products
• Informatica Power Center
• Ascential DataStage (IBM)
• Ab Initio
• … numerous others
• Capture history
• Manage dimensions
– E.g., what happens if a customer moves? ("slowly changing dimensions")
• Pre-compute aggregates
• Serve as the versionable, managed record of how the dimensional model of the warehouse is derived from the raw data
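The "customer moves" problem on this slide is classically handled by Type 2 slowly changing dimensions: instead of overwriting the row, the ETL closes it out and inserts a new version, so old facts still join to the attributes that were true at the time. A minimal sketch (field names and dates are hypothetical):

```python
from datetime import date

# Type 2 slowly changing dimension: each change closes the current row
# and opens a new one, preserving history for point-in-time joins.
customer_dim = [
    {"customer_id": 42, "city": "Boston",
     "valid_from": date(2010, 1, 1), "valid_to": None, "current": True},
]

def apply_scd2(dim, customer_id, new_city, change_date):
    """Expire the current row for this customer and append the new version."""
    for row in dim:
        if row["customer_id"] == customer_id and row["current"]:
            row["valid_to"] = change_date
            row["current"] = False
    dim.append({"customer_id": customer_id, "city": new_city,
                "valid_from": change_date, "valid_to": None, "current": True})

apply_scd2(customer_dim, 42, "Cambridge", date(2014, 6, 1))
# Sales facts from 2012 still join to the "Boston" row;
# new facts join to the "Cambridge" row.
```

Real ETL tools generate surrogate keys and handle this per-column (some attributes overwrite, others version), but this is the core of the technique.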
17. Business Analytics 1.0 - Pros & Cons
• Governance, re-use, and quality
– “One Version of the Truth” – correct, agreed upon, reusable definitions of core
business metrics and dimensions
But…
• Poor Agility – development process requires:
– Creating or modifying a dimensional model
– Creating ETL to populate the new model
– Creating report or dashboard content on top of the model
– Iterating to make the model perform
• Lack of self-service for end users
• Historically, poor user experience for end consumers
• Cost and Complexity – large, complex stack of
components, code, and configuration to manage, scale,
troubleshoot, etc.
18. Part 2 (of 3)
Some Recent History
(or, where I joined the story already in progress)
19. Data Discovery & Visualization
Key Features
• Visual data presentation
• Interactive data exploration –
“facets,” “lassos,” etc.
• Simplified stack – DBMS and Server optional
• Self-service: data loading & content creation,
no dimensional modeling
Notable products:
• QlikView (Qlik Tech)
• Tableau
• Spotfire (TIBCO)
• Endeca Latitude
(now Oracle Information Discovery)
• EdgeSpring (now Salesforce.com Wave)
• Business Objects Explorer
Image source: tibco.com
Image source: sap.com
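The "facets" these tools expose are essentially value counts per attribute, recomputed under the user's current filter selections. A minimal sketch of that interaction model (the records and field names are invented for illustration):

```python
from collections import Counter

records = [
    {"type": "shirt", "color": "red", "size": "M"},
    {"type": "shirt", "color": "blue", "size": "L"},
    {"type": "hat", "color": "red", "size": "M"},
]

def facet_counts(records, filters):
    """Apply the selected filters, then count remaining values per field."""
    matching = [r for r in records
                if all(r[f] == v for f, v in filters.items())]
    facets = {}
    for field in records[0]:
        facets[field] = Counter(r[field] for r in matching)
    return facets

# Selecting color=red narrows every other facet interactively:
facets = facet_counts(records, {"color": "red"})
# facets["type"] is Counter({"shirt": 1, "hat": 1})
```

Discovery tools back this with columnar or in-memory indexes so the counts update instantly as filters change, which is where their scalability limits come from.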
24. Data Discovery Lessons
• Improved User Experience, Self-service
But…
• BI is still really hard
– Reading from raw, real-world operational schemas is messy and
complicated
– And the requisite history may not even be available
• The usability benefits of discovery tools come with significant
scalability limitations
• Additional data types – so-called "unstructured" data (logs, text, etc.) is even harder, as discovery tools (generally) target structured, tabular data (i.e., they didn't address "Big Data")
And…
• Traditional BI tools are rapidly adding better UX, Visualization,
and Self-service
25. Part 3 (of 3) (woohoo!)
Future History
(or, stuff that’s still anyone’s guess)
26. Our analytics ambitions have only grown!
We want BIG, EASY, DEEP analytics
• [BIG] the headline grabber:
More data from more sources, aka: Big Data
• [EASY] the real issue (IMHO):
Faster time to value, at lower cost of ownership
• [DEEP] increasingly important:
Deeper intelligence from data…
not just data, but actions, predictions, etc…
… Can we solve these problems without creating an
ever larger mess of technology and products?
27. [BIG]: the Hadoop Solution
Posits that what we need is a better, more flexible and
scalable foundation for the Data Warehouse – more like a
“data operating system” than a DBMS
Image source: cloudera.com
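The programming model underneath that "data operating system" is MapReduce: a map phase emits key/value pairs, the framework shuffles them by key, and a reduce phase aggregates each group. A single-process sketch of the model (not the actual Hadoop API) using the canonical word count:

```python
from collections import defaultdict

def map_phase(documents):
    # Mapper: emit (word, 1) for every word in every document.
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def shuffle(pairs):
    # Shuffle: group emitted values by key, as the framework
    # does between the map and reduce phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reducer: sum the counts for each word.
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data", "big analytics"]
counts = reduce_phase(shuffle(map_phase(docs)))
# counts is {"big": 2, "data": 1, "analytics": 1}
```

Hadoop's value is running exactly this pattern in parallel across a cluster, with HDFS providing the scalable, schema-on-read storage underneath.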
28. [BIG] and [EASY] “On-Hadoop” Solutions
Image source: gigaom.com
Platfora Architecture
Posits that although Hadoop is indeed a powerful platform, its complexity needs to be wrapped in a BI / analytics application
Notable Products
• Platfora
• Datameer
• Oracle Big Data Discovery
(based on Endeca)
29. [BIG+]: The Logical Data Warehouse*
Posits that what is needed is a variety of data stores to constitute the
“Data Warehouse,” along with integration to allow data to be stored
and processed where most appropriate with little or no additional
development effort or operational management overhead
Image source: teradata.com
* From Understanding the Logical Data Warehouse: The Emerging
Practice, 21 June 2012, Mark A. Beyer and Roxane Edjlali
30. [EASY] The Cloud Solution
• Agility via all of the traditional cloud benefits –
reduced setup, less customization, reduced
ongoing management, etc…
• SaaS-based BI tools, such as
– GoodData
– Domo
• SaaS-based BI applications, such as
– Numerify (IT analytics on ServiceNow, etc.)
– InsightSquared (Sales analytics on Salesforce)
31. Other notable examples
• [DEEP] and [EASY]: BeyondCore – data discovery
with automatic/algorithmic analysis of attribute
relationships
• [DEEP]: Ayasdi – deeper insight into data based
on novel topological data visualization
• [DEEP] Alteryx – democratizing more complex
analytical workflows
• [EASY 2.0]: Looker – lightweight BI without
sacrificing modeling, yet avoiding the need for a
warehouse
• [BIG] and [EASY]: Tamr, Trifacta - curating and
wrangling data into usable forms
32. My guesses about the future?
• I voted with my feet. My beliefs:
– Fast time to real value is of paramount importance
• Zero-friction SaaS applications targeted to specific
business problems are an essential enabler – essential to
amortizing the cost of developing meaningful analytics
and quickly disseminating best practice updates – DIY just
doesn’t cut it any more in many cases.
– Our ability to do basic BI (dashboards, data
discovery, etc.) is mature, and the real action is in
deeper analysis of data
• Yet highly custom data science efforts are at odds with
fast time to value, and hard to advance in many cases
33. Crisply – quantified work for CRM
model & activity
activity
quantified
work
• Algorithmic quantification of the human effort behind each customer,
opportunity, support case, etc.
• Determine the true cost to acquire a specific customer or type of customer,
and understand the true profitability of that customer or segment over time
34. Thanks!
And stay focused on
the value that analytics creates
(the technology will follow from that)
Editor's Notes
http://www.tcsnycmarathon.org/analytics
Data Discovery tools improved agility and UX, and enabled more powerful self-service / DIY
But did these “model-less tools” truly advance Business Analytics, or just expand the toolset?
How will their impact trend as traditional tools become more agile and visual, at the same time that more modern tools advance the functional envelope?
Large organizations are still sorting out the impact of data discovery on their BI strategies, even as the picture changes quickly with new tools emerging and incumbent standards improving.