Check out this presentation from Pentaho and ESRG to learn why product managers should understand Big Data and hear about real-life products that have been elevated with these innovative technologies.
Learn more in the brief that inspired the presentation, Product Innovation with Big Data: http://www.pentaho.com/resources/whitepaper/product-innovation-big-data
Explore how data integration (or “mashups”) can maximize analytic value and help business teams create streamlined data pipelines that enable ad-hoc analytic inquiries. You’ll learn why businesses are increasingly focused on blending data on demand and at the source, the concrete analytic advantages this approach delivers, and the types of architectures required for delivering trusted, blended data. We provide a checklist to assess your data integration needs and capabilities, and review some real-world examples of how blending various data types has created significant analytic value and concrete business impact.
Pentaho Analytics for MongoDB - presentation from MongoDB World 2014 (Pentaho)
Bo Borland presentation at MongoDB World in NYC, June 24, 2014. Data Integration and Advanced Analytics for MongoDB: Blend, Enrich and Analyze Disparate Data in a Single MongoDB View
What's in store for Big Data in 2015? Will the 'Internet of Things' fuel the Industrial Internet? Will Big Data get Cloudy? Check out the top five Big Data predictions for 2015 according to Quentin Gallivan, CEO, Pentaho
Up Your Analytics Game with Pentaho and Vertica (Pentaho)
Big Data is a game-changer.
In the face of exploding volumes and varieties of data, traditional data management and ETL systems just aren’t cutting it anymore. A new way of sifting through vast volumes of data to find the most relevant information, and of combining it with other data sources to extract faster insights, is desperately needed. Enter HP Vertica and Pentaho with a proven solution for lightning-fast queries and blended data and analytics capabilities for your business users.
30 for 30: Quick Start Your Pentaho Evaluation (Pentaho)
These slides are from our recent 30 for 30 webinar, tailored to people who have downloaded the Pentaho evaluation and want to know more about the data integration and business analytics components included in the trial, how to easily integrate data, and best practices for installing and developing content.
With the combination of Pentaho and MongoDB, it’s drastically simpler and faster to build single analytical views of clients by aggregating and blending data from a variety of internal sources (customer, transaction, position data) and external sources (social networking, central bank, news, pricing) with fast response times.
Webinar covers:
An insider’s view of new ways financial services companies are using MongoDB to rapidly store and consume unlimited shapes and sizes of data
How Pentaho makes it easy to enrich data in MongoDB with predictive scoring, visual data integration tools, reports, interactive dashboards, and data visualizations
A live demo of blending Twitter, equity pricing, and news data into a single analytical view that unlocks market intelligence to create investment opportunities
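As a rough, hand-rolled illustration of the single-analytical-view idea described above, the Python sketch below blends documents from MongoDB with an external pricing extract and writes the result back as one collection. It is not the Pentaho tooling demonstrated in the webinar, and the database, collection, field, and file names (analytics, tweets, single_view, prices.csv) are hypothetical.

```python
# Minimal sketch: blend MongoDB documents with an external pricing feed
# into a single analytical view. All names here are illustrative placeholders.
import pandas as pd
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["analytics"]

# Pull social data (e.g. tweets tagged with a ticker symbol) out of MongoDB.
tweets = pd.DataFrame(list(db.tweets.find(
    {"symbol": "ACME"},
    {"_id": 0, "symbol": 1, "created_at": 1, "sentiment": 1},
)))

# Blend with an external equity-pricing extract (columns: symbol, date, close).
prices = pd.read_csv("prices.csv", parse_dates=["date"])
tweets["date"] = pd.to_datetime(tweets["created_at"]).dt.normalize()
view = tweets.merge(prices, on=["symbol", "date"], how="inner")

# Persist the blended result back into MongoDB as one queryable collection.
if not view.empty:
    db.single_view.insert_many(view.to_dict("records"))
```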
Open Analytics 2014 - Pedro Alves - Innovation through Open Source (OpenAnalytics Spain)
Delivering the Future of Analytics: Innovation through Open Source. Pentaho was born out of the desire to achieve positive, disruptive change in the business analytics market, dominated by bureaucratic megavendors offering expensive, heavyweight products built on outdated technology platforms. Pentaho’s open, embeddable data integration and analytics platform was developed with a strong open source heritage. This gave Pentaho a first-mover advantage to engage early with adopters of big data technologies and solve the difficult challenges of integrating both established and emerging data types to drive analytics. Continued technology innovations to support the big data ecosystem have kept customers ahead of the big data curve. With the ability to drastically reduce the time to design, develop and deploy big data solutions, Pentaho counts numerous big data customers, both large and small, across the financial services, retail, travel, healthcare and government industries around the world.
Informatica Becomes Part of the Business Data Lake Ecosystem (Capgemini)
Informatica is now part of the Business Data Lake ecosystem developed by Capgemini and Pivotal. Customers worldwide will now be able to leverage Informatica’s data integration software in addition to Pivotal’s advanced big data, analytics and application software, and Capgemini’s industry and implementation expertise. Informatica will deliver certified technologies for Data Integration, Data Quality and Master Data Management (MDM) to help enterprises distill raw data into actionable insights.
http://www.capgemini.com/resources/the-business-data-lake-delivering-the-speed-and-accuracy-to-solve-your-big-data-problems
MongoDB IoT City Tour EINDHOVEN: Analysing the Internet of Things: Davy Nys, ... (MongoDB)
Drawing on Pentaho's wide experience in solving customers' big data issues, Davy Nys will position the importance of analytics in the IoT:
[-] Understanding the challenges behind data integration & analytics for IoT
[-] Future proofing your information architecture for IoT
[-] Delivering IoT analytics, now and tomorrow
[-] Real customer examples of where Pentaho can help
Traditional BI vs. Business Data Lake – A Comparison (Capgemini)
Traditional Business Intelligence (BI) systems provide various levels and kinds of analyses on structured data but they are not designed to handle unstructured data.
For these systems Big Data brings big problems because the data that flows in may be either structured or unstructured. That makes them hugely limited when it comes to delivering Big Data benefits.
The way forward is a complete rethink of the way we use BI - in terms of how the data is ingested, stored and analyzed.
More information: http://www.capgemini.com/big-data-analytics/pivotal
BI congres 2016-2: Diving into weblog data with SAS on Hadoop - Lisa Truyers... (BICC Thomas More)
9th BI congress of BICC-Thomas More: 24 March 2016
The amount of data collected via weblogs keeps growing. Using a practical case, Lisa Truyers explains how Keyrus tackled this.
Unlocking data science in the enterprise - with Oracle and Cloudera (Cloudera, Inc.)
Today, leading organizations struggle to make their data scientists productive in their modern data platforms. Data scientists find it difficult to use their existing open source languages (e.g. Python, R) and libraries with Hadoop, especially when the clusters are secured with Kerberos. At the same time, IT doesn't want to give special access to these users, who require very diverse and specific environment configurations to run their experiments. As a result, most data science teams work away from the big data cluster, often on their laptops or in other data silos. The negative business impacts are a lack of insight and agility for the most advanced users, and the security, governance, and cost issues that arise from data silos.
Sudhir Menon, Vice President of Enterprise Information Management, on Hilton’s innovation/renovation journey to establish data as an enterprise asset. The data framework, using Hortonworks Hadoop as the platform, is the single source and repository for any enterprise-class data for reporting, analytics and data science. To achieve this transformation and leverage data as a true enterprise asset, we focused on a roadmap with 3 major objectives:
• API based delivery of data enabling real-time use
• Decommissioning legacy tools/environments
• Managing the data architecture for all IT investments in a Big Data model with scalability over years
Platform and framework to accomplish this roadmap include:
• Repository of ‘master’ data
• Real-time processing of data for the enterprise
• Best-in-class BI tools to analyze and visualize data
• Data science tools to identify underlying trends in data
Our VISION
We enable travel & hospitality market disruption through data & analytics innovation
Our MISSION
We drive Hilton’s performance with actioned, integrated insights, through market-leading, differentiated expertise and continuous innovation.
Our STRATEGY
1. Create an aspirational and unrivaled hospitality Data & Analytics team that attracts the best talent
2. Become a trusted strategic business partner, driving untapped incremental value.
3. Provide timely access to quality data and innovative solutions.
Here is a case study that I developed to explain the different sets of functionality in the Pentaho Suite. I focused on the functionality, features, illustrative tools and key strengths, and I've provided guidance for evaluating BI tools when selecting vendors. Enjoy!
Watch full webinar here: https://bit.ly/3eEax0F
iOCO enables companies to overcome challenges, so they can engage in successful digital transformations, gain all of their expected benefits, and remain competitive in the face of digital disruption. Listen to this session to learn how iOCO facilitates digital transformations that accelerate time-to-market while improving agility.
A modern approach to streaming data integration and event processing with a big data (kappa-style) architecture. Key patterns are discussed, with the pros and cons of newer approaches and open source technologies. Focus on Oracle and GoldenGate technology. OpenWorld 2018 presentation.
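To make the kappa-style pattern concrete, below is a minimal sketch (not from the presentation) in which change events land on a single replayable Kafka log and a derived view is recomputed by simply re-reading that log. The topic name, the JSON payload shape, and the use of the kafka-python client are assumptions; GoldenGate can publish change records to such a topic, but those integration details are out of scope here.

```python
# Kappa-style sketch: one append-only event log, one streaming job,
# and a derived view that can always be rebuilt by replaying the log.
import json
from collections import defaultdict

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "orders.changes",                     # hypothetical change-event topic
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",         # reprocessing = re-reading the log
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

revenue_by_customer = defaultdict(float)  # the continuously updated view

for message in consumer:
    change = message.value                # e.g. {"op": "I", "customer": "c1", "amount": 42.0}
    if change.get("op") in ("I", "U"):
        revenue_by_customer[change["customer"]] += float(change.get("amount", 0.0))
    # A production pipeline would also handle deletes, keys, and late/out-of-order events.
```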
Filling the Data Lake - Strata + HadoopWorld San Jose 2016 Preview Presentation (Pentaho)
Preview of the Strata + Hadoop World San Jose 2016 session about truly scalable and automated data onboarding for Hadoop
Attend the presentation at the conference to learn how to tackle repeatable, self-service Hadoop ingestion without coding
Filling the Data Lake
Thursday, March 31 11:50a-12:30p
Room 230B
http://conferences.oreilly.com/strata/hadoop-big-data-ca/public/schedule/detail/50677
Users and customers don't just want products and services anymore - they also want the data and analytics that are under the hood! The good news is that delivering value with data is more achievable than ever before thanks to greater access to diverse data sources and the ability to process, blend, and refine data at unprecedented scale.
The OneMap project works with government, civil society, ethnic groups, and communities to produce, enhance, and share high-quality, accurate data on land and other natural resources.
The open-access, online OneMap spatial data platform democratizes access to multi-sectoral data. It aims to function as an effective basis for transparent and accountable land governance and development planning by government and citizens.
By supporting government and civil society alike, OneMap provides space for multi-stakeholder production and verification of key datasets, and thereby allows different perspectives on land to be equally represented.
A bank must be able to face many challenges in order to get the most value from its data and generate new business opportunities. In this context, Big Data and analytics play a key role in defining a "data-driven" company. Pursuing these objectives involves evolving the architecture, introducing new methodologies, and bringing in new skills. The goal of this talk is to present how these themes play out within the BNL context.
Talk given at Fronteers 2015 in Amsterdam.
In a world where many of our digital spaces are becoming more closed than ever, open data is a concept that is rapidly on the rise.
In this talk we'll explore what open data is (and what it isn't), and why we should care about it. We'll look at how you can introduce it into your projects with regards to practical publication and consumption, and discuss some useful tools and reference points.
Open data isn't just dry and technical - it gives us great scope to be creative, and throughout this talk we'll go through some of the amazing things that it has been used for globally in the hope that it will inspire you to create something amazing yourself.
This presentation reports on data governance best practices. Based on a definition of fundamental terms and the business rationale for data governance, a set of case studies from leading companies is presented. The content of this presentation is a result of the Competence Center Corporate Data Quality (CC CDQ) at the University of St. Gallen, Switzerland.
Bridging the gap from data science to service (dmoisset)
Recent years have brought an explosion of algorithms, models and software libraries for doing data science that allow unprecedented possibilities for solving problems. But providing a data science service as a consultant or a company involves more than just tools. In this talk, I will share the most useful lessons that I learned while working at a company providing these services.
Big Data has been a "buzz word" for a few years now, and it's generated a fair amount of hype. But, while the technology landscape is still evolving, product companies in the software, web, and hardware areas have actually led the way in delivering real value from data sources like weblogs, sensors, and social media as well as systems like Hadoop, NoSQL, and Analytical Databases. These organizations have built "Big Data Apps" that leverage fast, flexible data frameworks to solve a wide array of user problems, scale to massive audiences, and deliver superior predictive intelligence.
Join this webinar to learn why product managers should understand Big Data and hear about real-life products that have been elevated with these innovative technologies. You will hear from:
- Ben Hopkins, Product Marketing Manager at Pentaho, who will discuss what Big Data means for product strategy and why it represents a new toolset for product teams to meet user needs and build competitive advantage
- Jim Stascavage, VP of Engineering at ESRG, who will discuss how his company has innovated with Big Data and predictive analytics to deliver technology products that optimize fuel consumption and maintenance cycles in the maritime and heavy industry sectors, leveraging trillions of sensor data points a year.
Who Should Attend
Product Managers, Product Marketing Managers, Project Managers, Development Managers, Product Executives, and anyone responsible for addressing customer needs & influencing product strategy.
Real life use cases from across Europe (Walid Aoudi - Cognizant)
This presentation will cover some of Cognizant’s Big Data clients’ experiences in continental Europe and the UK. The main focus will be on use cases, presented through the business drivers behind these projects. Key highlights of the big data architectures and solution approaches will be presented. Finally, the business outcomes, in terms of the ROI delivered by the implemented solutions, will be discussed.
ADV Slides: How to Improve Your Analytic Data Architecture Maturity (DATAVERSITY)
Many organizations are immature when it comes to data use. The answer lies in delivering a greater level of insight from data, straight to the point of need. Enter: machine learning.
In this webinar, William will look at categories of organizational response to the challenge across strategy, architecture, modeling, processes, and ethics. Machine learning maturity levels tend to move in harmony across these categories. As a general principle of maturity models, you can’t skip levels in any category, nor can you advance in one category well beyond the others.
Vis-à-vis ML, attaining and retaining momentum up the model is paramount for success. You will ascend the model through concerted efforts delivering business wins utilizing progressive elements of the model, and thereby increasing your machine learning maturity. The model will evolve. No plateaus are comfortable for long.
With ML maturity markers, sequencing, and tactics, this webinar provides a plan for how to build analytic Data Architecture maturity in your organization.
Is your big data journey stalling? Take the Leap with Capgemini and Cloudera (Cloudera, Inc.)
Transitioning to a Big Data architecture is a big step, and the complexity of moving existing analytical services onto modern platforms like Cloudera can seem overwhelming.
How Data Virtualization Puts Enterprise Machine Learning Programs into Produc... (Denodo)
Watch full webinar here: https://bit.ly/3offv7G
Presented at AI Live APAC
Advanced data science techniques, like machine learning, have proven an extremely useful tool to derive valuable insights from existing data. Platforms like Spark, and complex libraries for R, Python and Scala put advanced techniques at the fingertips of the data scientists. However, these data scientists spend most of their time looking for the right data and massaging it into a usable format. Data virtualization offers a new alternative to address these issues in a more efficient and agile way.
Watch this on-demand session to learn how companies can use data virtualization to:
- Create a logical architecture to make all enterprise data available for advanced analytics exercise
- Accelerate data acquisition and massaging, providing the data scientist with a powerful tool to complement their practice
- Integrate popular tools from the data science ecosystem: Spark, Python, Zeppelin, Jupyter, etc.
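As a rough sketch of what the second point can look like in practice, the snippet below pulls a governed, virtualized view straight into pandas over ODBC. The DSN (denodo_vdp), credentials, and view name (customer_360) are placeholders, and ODBC is only one of several ways such a layer is typically exposed (JDBC and REST are others).

```python
# Minimal sketch: a data scientist querying a virtualized view into pandas.
# Connection details and the view/column names are hypothetical placeholders.
import pandas as pd
import pyodbc

conn = pyodbc.connect("DSN=denodo_vdp;UID=analyst;PWD=secret")

df = pd.read_sql(
    "SELECT customer_id, region, lifetime_value FROM customer_360 WHERE region = ?",
    conn,
    params=["APAC"],
)

# From here the usual data science ecosystem applies unchanged
# (scikit-learn, Spark, notebooks such as Jupyter or Zeppelin, etc.).
print(df.describe())
```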
MongoDB IoT City Tour STUTTGART: Analysing the Internet of Things. By Pentaho (MongoDB)
Dominik Claßen, Sales Engineering Team Lead at Pentaho
Drawing on Pentaho's wide experience in solving customers' big data issues, Dominik positions the importance of analytics in the IoT.
[-] Understanding the challenges behind data integration & analytics for IoT
[-] Future proofing your information architecture for IoT
[-] Delivering IoT analytics, now and tomorrow
[-] Real customer examples of where Pentaho can help
Watch full webinar here: https://bit.ly/3mdj9i7
You will often hear that "data is the new gold". In this context, data management is one of the areas that has received the most attention from the software community in recent years. From artificial intelligence and machine learning to new ways to store and process data, the landscape for data management is in constant evolution. From the privileged perspective of an enterprise middleware platform, we at Denodo have the advantage of seeing many of these changes happen.
In this webinar, we will discuss the technology trends that will drive the enterprise data strategies in the years to come. Don't miss it if you want to keep yourself informed about how to convert your data to strategic assets in order to complete the data-driven transformation in your company.
Watch this on-demand webinar as we cover:
- The most interesting trends in data management
- How to build a data fabric architecture?
- How to manage your data integration strategy in the new hybrid world
- Our predictions on how those trends will change the data management world
- How can companies monetize the data through data-as-a-service infrastructure?
- What is the role of voice computing in future data analytics
Topics include: the transformative value of real-time data and analytics, and current barriers to adoption; the importance of an end-to-end solution for data-in-motion that includes ingestion, processing, and serving; and Apache Kudu’s role in simplifying real-time architectures.
Capgemini Leap Data Transformation Framework with Cloudera (Capgemini)
https://www.capgemini.com/insights-data/data/leap-data-transformation-framework
The complexity of moving existing analytical services onto modern platforms like Cloudera can seem overwhelming. Capgemini’s Leap Data Transformation Framework helps clients by industrializing the entire process of bringing existing BI assets and capabilities to next-generation big data management platforms.
During this webinar, you will learn:
• The key drivers for industrializing your transformation to big data at all stages of the lifecycle – estimation, design, implementation, and testing
• How one of our largest clients reduced the transition to modern data architecture by over 30%
• How an end-to-end, fact-based transformation framework can deliver IT rationalization on top of big data architectures
Why Your Data Science Architecture Should Include a Data Virtualization Tool ... (Denodo)
Watch full webinar here: https://bit.ly/35FUn32
Presented at CDAO New Zealand
Advanced data science techniques, like machine learning, have proven an extremely useful tool to derive valuable insights from existing data. Platforms like Spark, and complex libraries for R, Python, and Scala put advanced techniques at the fingertips of the data scientists.
However, most architecture laid out to enable data scientists miss two key challenges:
- Data scientists spend most of their time looking for the right data and massaging it into a usable format
- Results and algorithms created by data scientists often stay out of the reach of regular data analysts and business users
Watch this session on-demand to understand how data virtualization offers an alternative to address these issues and can accelerate data acquisition and massaging. And a customer story on the use of Machine Learning with data virtualization.
The CSC Big Data Analytics Insights service enables clients who do not have an analytics capability to implement the business, data and technology changes to gain business benefit from an initial set of analytics based on a roadmap of changes created by CSC or provided from a compatible set of inputs.
CSC Analytic Insights Implementation has four phases:
Stage 1: Analytic Engagement
Stage 2: Analytic Discovery
Stage 3: Implementation Planning
Stage 4: Embedding Analysis.
Connector Corner: Automate dynamic content and events by pushing a button (DianaGray10)
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf (91mobiles)
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality (Inflectra)
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Search and Society: Reimagining Information Access for Radical Futures (Bhaskar Mitra)
The field of Information retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs while dismantling the artificial separation between the work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build inspired by diverse explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies needs to be explicitly articulated, and we need to develop theories of change in context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
The Art of the Pitch: WordPress Relationships and Sales (Laura Byrne)
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Accelerate your Kubernetes clusters with Varnish Caching (Thijs Feryn)
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... (Ramesh Iyer)
In today's fast-changing business world, companies that adapt and embrace new ideas often need help to keep up with the competition. However, fostering a culture of innovation takes a lot of work. It takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... (James Anderson)
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Epistemic Interaction - tuning interfaces to provide information for AI support (Alan Dix)
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Neuro-symbolic is not enough, we need neuro-*semantic* (Frank van Harmelen)
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Essentials of Automations: Optimizing FME Workflows with Parameters (Safe Software)
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
GraphRAG is All You need? LLM & Knowledge Graph (Guy Korland)
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
14. ESRG Customer Case Example
Industrial Predictive Analytics
Jim Stascavage – VP, Engineering
15. Presentation overview
• Introduction
• OstiaEdge industrial predictive analytics and Pentaho integration
• Marine case example: Saving fuel and avoiding failure
• Liquid packaging case example: Predicting time to failure
• Conclusion
16. ESRG overview
History: Founded 2000; 10+ years of product development
Expertise: Product, reliability engineering, software & architecture, Big Data
Currently remotely monitoring 3,000+ assets daily; comprehensive, tiered solution
Markets served: Defense & commercial; maritime, process, power gen
Focus & results: Turn data into actionable information; examples:
• $70,000 annual savings per gas turbine
• Defer ~60% of in-person equipment assessments
17. Industrial Analytics to create significant value…
Cisco estimates 50B devices connected, creating an additional $14T in profits over the next decade
McKinsey & Co. estimates up to $6.2T in annual value created by 2025
General Electric estimates the market for industrial internet technology and services will grow to $500B by 2020
Sources: Manyika, James, et al., “Disruptive technologies: Advances that will transform life, business and the global economy,” McKinsey Global Institute, May 2013; Chambers, John, “Internet of Everything,” Cisco, February 21, 2013; General Electric press release, June 18, 2013
18. …ESRG uses data to improve return on industrial assets
• Avoid breakdowns & downtime
• Optimize maintenance
• Improve environmental compliance
• Improve operations
• Improve energy efficiency
Per-year savings potential: marine vessel, $500K-$1.5M; liquid filling, 10-20% uptime and 10-50% maintenance cost
Source: Bringing the industrial internet to the maritime industry and ships into the cloud; http://www.esrgtech.com/company/ESRGcontent/
19. Our technology monitors 3,000+ assets
~3,000 total assets monitored: AC plants, hydraulic systems, compressors, GT engines, GT lube oil systems, diesel engines, GT generators, reduction gears/transmissions, refrigeration, desalinization, fuel flow meters, misc.
20. ESRG uses analytics to turn Big Data into actionable information
1000s of sensors per asset (mechanical equipment; fuel/energy consumption; control & operations; emissions, discharges, etc.) …across an enterprise
Automated analytics avoid the need for a large department to analyze “Big Data”; automated analytics & experts overcome Big Data challenges
Once-per-second data = 86,400 points/day; 1,000 data points = 2.6B points/month; 100 ships = 3.1 trillion data points per year
Automated analytics provide fuel/energy efficiency and equipment health
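A quick back-of-the-envelope check of the volume figures on this slide, assuming 1 Hz sampling and roughly 1,000 sensor channels per ship (both assumptions taken from the slide itself):

```python
# Reproduce the slide's data-volume arithmetic.
seconds_per_day = 24 * 60 * 60                 # 86,400 samples/day per channel at 1 Hz
channels_per_ship = 1000
days_per_month = 30
ships = 100

points_per_ship_month = seconds_per_day * channels_per_ship * days_per_month
print(f"{points_per_ship_month:,}")            # 2,592,000,000  (~2.6B points/month)

points_per_fleet_year = seconds_per_day * channels_per_ship * 365 * ships
print(f"{points_per_fleet_year:,}")            # 3,153,600,000,000  (~3.1 trillion points/year)
```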
21. OstiaEdge overview
Plant Edition (on-site/onboard, local):
• Local data acquisition & qualification
• Local analytics and presentation
• Real-time data viewer
• Machine state and condition analytics
• Embedded Kettle (PDI)
Central Edition and Business Intelligence (central/shore, central or cloud):
• Web/cloud presentation
• Workflow management
• Security/user management
• Dashboards & analyzers
• ETL
• Email reports
• Mobile
• NEW: PDI with R & Weka embedded
22. Example data flow and integration
[Diagram showing the flow across: data input; external ETL; PDI; RUL algorithm & engine with DSP R-plugin (new); customer maintenance system (external); OstiaEdge analytics; OstiaEdge presentation with analyzers & dashboards, custom dashboards, BI cube(s), email reports, and configuration management.]
23. Saving Fuel and Avoiding Failure
Case example: Maritime
Situation: Large ship owner trying to reduce fuel and failures/downtime:
• $5-10M in fuel cost per year
• $10,000 per day for vessel downtime
Solution: Use OstiaEdge & embedded Pentaho ETL to make better operations & maintenance decisions:
• Embedded Pentaho ETL for generator optimization & dashboards
• OstiaEdge analytics for failure avoidance
Savings potential: optimize generators $50K-$250K; tune equipment $50K-$150K; avoid failure $10K-$500K
24. New: Predicting Time to Failure
Case example: Liquid packaging
R & Weka-based RUL algorithms predict failure; standard & custom dashboards provide executive transparency
Situation: Global liquid packaging OEM with two goals:
• Improve customer uptime
• Reduce unnecessary maintenance and extra parts consumption
Solution: Leverage Pentaho PDI + OstiaEdge to predict Remaining Useful Life (RUL):
• ETL to bring in enterprise-level data
• Data Science Pack (R & Weka) used to design algorithms to predict RUL
• Customized embedded dashboard
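For readers who want a feel for what a Remaining Useful Life (RUL) model involves, here is a generic sketch that fits a simple regressor on synthetic sensor features. It is illustrative only: the implementation described on this slide used R and Weka algorithms delivered through Pentaho's Data Science Pack, and the feature names and data below are invented.

```python
# Toy RUL sketch: predict remaining useful life (hours) from sensor features.
# Synthetic data; not ESRG's algorithm.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-machine-hour features: vibration, temperature, pressure.
X = rng.normal(size=(5000, 3))
# Synthetic target loosely driven by the first two features.
rul_hours = 500 - 40 * X[:, 0] - 25 * X[:, 1] + rng.normal(scale=10, size=5000)

X_train, X_test, y_train, y_test = train_test_split(
    X, rul_hours, test_size=0.2, random_state=0
)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("Predicted RUL (hours) for three test machines:",
      model.predict(X_test[:3]).round(1))
```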
25. Industrial Analytics Opportunity
OstiaEdge + Pentaho:
• ETL
• Diagnostics
• Analyzers & dashboards
• New: Prognostics with R & Weka
Lower cost & faster:
• Small team
• Rapid & agile algorithm development
• Easy integration
• Flexible implementation
Benefits: avoid breakdowns & downtime, optimize maintenance, improve environmental compliance, improve operations, improve energy efficiency
26. Q and A…
Ask questions. Our team is standing by to help.
The webinar slides will be posted to our website and our Slideshare.net/aipmm page.
The webinar recording will be posted at AIPMM.net for members.