This document discusses getting started with big data analytics using Hadoop and Pentaho. It provides an overview of installing and configuring Hadoop and Pentaho on a single machine or cluster. Dell's Crowbar tool is presented as a way to quickly deploy Hadoop clusters on Dell hardware in about two hours. The document also covers best practices like leveraging different technologies, starting with small datasets, and not overloading networks. A demo is given and contact information provided.
Check out this presentation from Pentaho and ESRG to learn why product managers should understand Big Data and hear about real-life products that have been elevated with these innovative technologies.
Learn more in the brief that inspired the presentation, Product Innovation with Big Data: http://www.pentaho.com/resources/whitepaper/product-innovation-big-data
Explore how data integration (or “mashups”) can maximize analytic value and help business teams create streamlined data pipelines that enable ad-hoc analytic inquiries. You’ll learn why businesses are increasingly focused on blending data on demand and at the source, the concrete analytic advantages this approach delivers, and the type of architectures required for delivering trusted, blended data. We provide a checklist to assess your data integration needs and capabilities, and review real-world examples of how blending various data types has created significant analytic value and concrete business impact.
30 for 30: Quick Start Your Pentaho EvaluationPentaho
These slides are from our recent 30 for 30 webinar, tailored toward people who have downloaded the Pentaho evaluation and want to know more about all the data integration and business analytics components included in the trial, how to easily integrate data, and best practices for installing and developing content.
What's in store for Big Data in 2015? Will the 'Internet of Things' fuel the Industrial Internet? Will Big Data get Cloudy? Check out the top five Big Data predictions for 2015 according to Quentin Gallivan, CEO, Pentaho
Pentaho Analytics for MongoDB - presentation from MongoDB World 2014Pentaho
Bo Borland presentation at MongoDB World in NYC, June 24, 2014. Data Integration and Advanced Analytics for MongoDB: Blend, Enrich and Analyze Disparate Data in a Single MongoDB View
Up Your Analytics Game with Pentaho and Vertica Pentaho
Big Data is a game-changer.
In the face of exploding volumes and varieties of data, traditional data management and ETL systems just aren’t cutting it anymore. What’s desperately needed is a new way of sifting through vast volumes of data to find the most relevant information and combining it with other data sources to extract faster insights. Enter HP Vertica and Pentaho with a proven solution for lightning-fast queries and blended data and analytics capabilities for your business users.
MongoDB IoT City Tour EINDHOVEN: Analysing the Internet of Things: Davy Nys, ...MongoDB
Drawing on Pentaho's wide experience in solving customers' big data issues, Davy Nys will position the importance of analytics in the IoT:
[-] Understanding the challenges behind data integration & analytics for IoT
[-] Future proofing your information architecture for IoT
[-] Delivering IoT analytics, now and tomorrow
[-] Real customer examples of where Pentaho can help
Pentaho - Jake Cornelius - Hadoop World 2010Cloudera, Inc.
Putting Analytics in Big Data Analytics
Jake Cornelius
Director of Product Management, Pentaho Corporation
Learn more @ http://www.cloudera.com/hadoop/
Webinar | Using Hadoop Analytics to Gain a Big Data AdvantageCloudera, Inc.
Learn about:
Why big data matters to your business: realize revenue, increase customer loyalty, and pinpoint effective strategies
The business and technical challenges of big data solutions
How to leverage big data for competitive advantage
The “must haves” of an effective big data solution
Real-world examples of Cloudera, Pentaho and Dell big data solutions in action
With the combination of Pentaho and MongoDB, it’s drastically simpler and faster to build single analytical views of clients by aggregating and blending data from a variety of internal sources (customer, transaction, position data) and external sources (social networking, central bank, news, pricing) with fast response times.
Webinar covers:
An insider’s view of new ways financial services companies are using MongoDB to rapidly store and consume unlimited shapes and sizes of data
How Pentaho makes it easy to enrich data in MongoDB with predictive scoring, visual data integration tools, reports, interactive dashboards, and data visualizations
A live demo of blending Twitter, equity pricing, and news data into a single analytical view that unlocks market intelligence to create investment opportunities
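The "single analytical view" idea described above can be sketched in miniature: take per-source records keyed by a ticker symbol and fold them into one blended document per ticker. The following Python sketch is illustrative only; the field names and record shapes are hypothetical, and this is not Pentaho's or MongoDB's actual pipeline.

```python
# Illustrative sketch of blending three feeds (tweets, equity prices, news)
# into one analytical document per ticker -- the "single view" idea.
from collections import defaultdict

def build_single_view(tweets, prices, news):
    """Merge per-source records into one document per ticker symbol."""
    view = defaultdict(lambda: {"tweets": [], "price": None, "news": []})
    for t in tweets:
        view[t["ticker"]]["tweets"].append(t["text"])
    for p in prices:
        view[p["ticker"]]["price"] = p["close"]  # keep the latest close seen
    for n in news:
        view[n["ticker"]]["news"].append(n["headline"])
    return dict(view)

# Hypothetical sample records from each source
tweets = [{"ticker": "ACME", "text": "ACME beats estimates"}]
prices = [{"ticker": "ACME", "close": 101.25}]
news   = [{"ticker": "ACME", "headline": "ACME announces buyback"}]

single_view = build_single_view(tweets, prices, news)
print(single_view["ACME"]["price"])  # 101.25
```

In a real deployment the blended documents would be written back to a MongoDB collection so dashboards can query one document per entity instead of joining sources at read time.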
RCG has developed a unique approach to helping its clients solve business problems using data. Whether you are interested in learning how to use technology to expose more value from your data through analytics solutions or understanding whether statistical analysis would surface new insights, RCG is ready to help with its Data & Analytics Practice.
BI congres 2016-2: Diving into weblog data with SAS on Hadoop - Lisa Truyers...BICC Thomas More
9th BI congress of the BICC-Thomas More: 24 March 2016
The amount of data collected via weblogs keeps growing. Using a practical case, Lisa Truyers explains how Keyrus got to work with this data.
The Value of the Modern Data Architecture with Apache Hadoop and Teradata Hortonworks
This webinar discusses why Apache Hadoop is most typically the technology underpinning "Big Data," how it fits into a modern data architecture, and the current landscape of databases and data warehouses already in use.
Open Analytics 2014 - Pedro Alves - Innovation through Open SourceOpenAnalytics Spain
Delivering the Future of Analytics: Innovation through Open Source. Pentaho was born out of the desire to achieve positive, disruptive change in the business analytics market, dominated by bureaucratic megavendors offering expensive heavyweight products built on outdated technology platforms. Pentaho’s open, embeddable data integration and analytics platform was developed with a strong open source heritage. This gave Pentaho a first-mover advantage to engage early with adopters of big data technologies and solve the difficult challenges of integrating both established and emerging data types to drive analytics. Continued technology innovations to support the big data ecosystem have kept customers ahead of the big data curve. With the ability to drastically reduce the time to design, develop and deploy big data solutions, Pentaho counts numerous big data customers, both large and small, across the financial services, retail, travel, healthcare and government industries around the world.
Cisco cdr reporting it’s easy if you do it smartReza Mousavi
Call reporting and analytics services have become increasingly common across major organizations and companies. If you are planning to introduce such software in your business and are wondering how exactly it would improve your customer support department, you have landed on the right page. The information below will help a great deal; without further ado, let’s take a closer look so you can make an informed decision.
Hadoop World 2011: Data Ingestion, Egression, and Preparation for Hadoop - Sa...Cloudera, Inc.
One of the first challenges Hadoop developers face is accessing all the data they need and getting it into Hadoop for analysis. Informatica PowerExchange accesses a variety of data types and structures at different latencies (e.g. batch, real-time, or near real-time) and ingests data directly into Hadoop. The next step is to parse the data in preparation for analysis in Hadoop. Informatica provides a visual IDE to deploy pre-built parsers or design specific parsers for complex data formats and deploy them on Hadoop. Once the analysis is complete, Informatica PowerExchange delivers the resulting output to other information management systems such as a data warehouse. Learn in this session from Informatica and one of their customers how to get all the data you need into Hadoop, parse a variety of data formats and structures, and egress the resultant output to other systems.
Big Data Expo 2015 - Pentaho The Future of AnalyticsBigDataExpo
Learn how Pentaho can help blend and enrich both legacy data and unstructured (Big) data from different sources in order to create value for your organization. Practical examples illustrate how Pentaho has already achieved this at many organizations.
See how organizations use Pentaho to, among other things:
• solve problems with overly long ETL jobs so that Data Warehouse loads complete again,
• lower the cost of data integration,
• prevent traditional Data Warehouses from overflowing and incurring additional costs,
• bring Data Quality and Data Governance into your process, and
• analyze all of this embedded in your applications.
CDR-Stats : VoIP Analytics Solution for Asterisk and FreeSWITCH with MongoDBAreski Belaid
CDR-Stats is free and open source call detail record analysis and reporting software for FreeSWITCH, Asterisk and other types of VoIP switch. It allows you to interrogate CDRs to produce reports and statistics via a simple-to-use, powerful web interface.
It is based on the Django Python Framework, Celery, SocketIO, Gevent and MongoDB.
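As a rough illustration of the kind of aggregate a CDR reporting tool computes, here is a small Python sketch that groups raw CDR records by destination prefix and reports call counts and average duration. The field names and prefix logic are hypothetical, not CDR-Stats' actual schema.

```python
# Hypothetical CDR aggregation: call counts and average duration per
# destination prefix. Field names are illustrative only.
from collections import defaultdict

def summarize_cdrs(cdrs, prefix_len=2):
    """Group CDRs by the first `prefix_len` digits of the destination."""
    totals = defaultdict(lambda: {"calls": 0, "total_duration": 0})
    for cdr in cdrs:
        prefix = cdr["destination"][:prefix_len]
        totals[prefix]["calls"] += 1
        totals[prefix]["total_duration"] += cdr["duration"]
    # Convert running totals into the report shape: count + average
    return {
        prefix: {"calls": t["calls"],
                 "avg_duration": t["total_duration"] / t["calls"]}
        for prefix, t in totals.items()
    }

cdrs = [
    {"destination": "4915551234", "duration": 60},
    {"destination": "4915559876", "duration": 120},
    {"destination": "3315550000", "duration": 30},
]
report = summarize_cdrs(cdrs)
print(report["49"]["avg_duration"])  # 90.0
```

In a system like CDR-Stats these aggregates would be computed inside the datastore (e.g. a MongoDB aggregation) rather than in application code, but the grouping logic is the same.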
This is the presentation from the successful completion of 'Kanthaka', a Big Data CDR (Call Detail Record) Analyzer: a system to support near real-time complex promotions at telecom operators. It includes the details of technology selection, the system architecture, and final test results on a dual-core machine with 3GB RAM and on a cluster of two such nodes.
Hadoop Summit San Jose 2015: What it Takes to Run Hadoop at Scale Yahoo Persp...Sumeet Singh
Since 2006, Hadoop and its ecosystem components have evolved into a platform that Yahoo has begun to trust for running its businesses globally. In this talk, we will take a broad look at some of the top software, hardware, and services considerations that have gone into making the platform indispensable for nearly 1,000 active developers, including the challenges that come from scale, security and multi-tenancy. We will cover the current technology stack that we have built or assembled, infrastructure elements such as configurations, deployment models, and network, and what it takes to offer hosted Hadoop services to a large customer base.
Oracle Big Data Appliance and Big Data SQL for advanced analyticsjdijcks
Overview presentation showing Oracle Big Data Appliance and Oracle Big Data SQL in combination, and why this really matters. Big Data SQL brings you the unique ability to analyze data across the entire spectrum of systems: NoSQL, Hadoop and Oracle Database.
Oracle Openworld Presentation with Paul Kent (SAS) on Big Data Appliance and ...jdijcks
Learn about the benefits of Oracle Big Data Appliance and how it can drive business value underneath applications and tools. This includes a section by Paul Kent, VP Big Data SAS describing how SAS runs well on Oracle Engineered Systems and on Oracle Big Data Appliance specifically.
What it takes to bring Hadoop to a production-ready stateClouderaUserGroups
While Hadoop may be a hot topic and is probably the buzziest big data term, the fact is that many Hadoop projects get stuck in pilot mode. We hear a number of reasons for this.
• “It’s too complicated.”
• “I don’t have the right resources.”
• “Security and compliance are never going to approve this.”
This session digs deep into why certain projects seem destined to remain in development. We’ll also cover what it takes to bring Hadoop to a production-ready state and convince management that it’s time to start using Hadoop to store and analyze real business data.
Journey to SAS Analytics Grid with SAS, R, PythonSumit Sarkar
Big data, compliance and a highly skilled workforce are driving organizations to transform their current analytical infrastructure to deliver enterprise computing environments that can support the latest in data science and analytics practices. SAS remains a popular choice for statistical programming languages, but there is growing demand for R and Python. Data engineers are now being tasked to deliver scalable and highly available computing resources to support analytics for a growing number of users and increasing data volumes while maintaining security for their customers.
Hitachi Data Systems Hadoop Solution. Customers are seeing exponential growth of unstructured data, from their social media websites to operational sources. Their enterprise data warehouses are not designed to handle such high volumes and varieties of data. Hadoop, the latest software platform that scales to process massive volumes of unstructured and semi-structured data by distributing the workload through clusters of servers, gives customers a new option to tackle data growth and deploy big data analysis to better understand their business. Hitachi Data Systems is launching its latest Hadoop reference architecture, pre-tested with the Cloudera Hadoop distribution to provide a faster time to market for customers deploying Hadoop applications. HDS, Cloudera and Hitachi Consulting will present together and explain how to get you there.
Attend this WebTech and learn how to:
- Solve big-data problems with Hadoop
- Deploy Hadoop in your data warehouse environment to better manage your unstructured and structured data
- Implement Hadoop using the HDS Hadoop reference architecture
For more information on the Hitachi Data Systems Hadoop Solution please read our blog: http://blogs.hds.com/hdsblog/2012/07/a-series-on-hadoop-architecture.html
Hp Converged Systems and Hortonworks - Webinar SlidesHortonworks
Our experts will walk you through some key design considerations when deploying a Hadoop cluster in production. We'll also share practical best practices around HP and Hortonworks Data Platform to get you started on building your modern data architecture.
Learn how to:
- Leverage best practices for deployment
- Choose a deployment model
- Design your Hadoop cluster
- Build a Modern Data Architecture and vision for the Data Lake
Pentaho Big Data Analytics with Vertica and HadoopMark Kromer
Overview of the Pentaho Big Data Analytics Suite from the Pentaho + Vertica presentation at Big Data Techcon 2014 in Boston for the session called "The Ultimate Selfie | Picture Yourself with the Fastest Analytics on Hadoop with HP Vertica and Pentaho"
Turning Petabytes of Data into Profit with Hadoop for the World’s Biggest Ret...Cloudera, Inc.
PRGX is the world's leading provider of accounts payable audit services and works with leading global retailers. As new forms of data started to flow into their organization, standard RDBMS systems did not allow them to scale. Now, by using Talend with Cloudera Enterprise, they are able to achieve a 9-10x performance benefit in processing data, reduce errors, and provide more innovative products and services to end customers.
Watch this webinar to learn how PRGX worked with Cloudera and Talend to create a high-performance computing platform for data analytics and discovery that allows them to rapidly process, model, and serve massive amounts of structured and unstructured data.
Filling the Data Lake - Strata + HadoopWorld San Jose 2016 Preview PresentationPentaho
Preview of the Strata + Hadoop World Strata San Jose 2016 session about truly scalable and automated data onboarding for Hadoop
Attend the presentation at the conference to learn how to tackle repeatable, self-service Hadoop ingestion without coding
Filling the Data Lake
Thursday, March 31 11:50a-12:30p
Room 230B
http://conferences.oreilly.com/strata/hadoop-big-data-ca/public/schedule/detail/50677
James Dixon has a unique perspective on the big data space - he coined the term "data lake." In this on-demand webinar the Big Data Maverick talks big data - watch to learn more about the technology landscape and evolving use cases. He covers topics such as:
- What are today's technologies of choice - where did they come from and why?
- Why is the emergence and definition of these use cases so important?
- What technologies are likely to come next?
- Why did the data explosion start and will it continue?
- Why are data scientists in such huge demand?
- What is the role of open source in big data, and the role of big data in open source?
Users and customers don't just want products and services anymore - they also want the data and analytics that are under the hood! The good news is that delivering value with data is more achievable than ever before thanks to greater access to diverse data sources and the ability to process, blend, and refine data at unprecedented scale.
Predictive Analytics with Pentaho Data Mining - Análisis Predictivo con Penta...Pentaho
This webinar is in Spanish -
The use of predictive analytics and data mining is on the rise. Worldwide, more and more companies are hiring specialized data analysis services to help set them apart from the competition. At the same time, the growing volume of data, along with its changing and complex nature, makes traditional analysis unmanageable and calls for cutting-edge technology and consulting based on advanced mathematical models. Pentaho Corporation and Matrix CPM Solutions invite you to the webinar "Análisis Predictivo con Pentaho Data Mining," which reviews the great opportunities that exist for its use and application.
DevOps and Testing slides at DASA ConnectKari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what Testing in DevOps looks like. We closed with a lovely workshop in which participants explored different ways to think about quality and testing in the different parts of the DevOps infinity loop.
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to release software to market, combined with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their application supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Generating a custom Ruby SDK for your web service or Rails API using Smithyg2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
6. Data Warehouse Optimization
[Slide diagram: data sources (ERP, CRM, CDR, logs, other data) feed a big data architecture of raw data, parsed data, analytic datasets, and master data, which loads the data warehouse (master & transactional data) and analytic data marts, with a tape archive for offloaded data.]
8. Data Architecture and Integration Challenges
[Slide diagram: the same architecture as slide 6, with a NoSQL store in place of the tape archive.]
9. Data Architecture and Integration Challenges
[Slide diagram: the architecture from slide 8, adding Extract/Transform/Load and MapReduce (MR) under an orchestration & integration layer.]
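The MR (MapReduce) box in the architecture above can be illustrated with a Hadoop Streaming-style mapper and reducer. This is a hedged local sketch, not the deck's actual workload: the log lines and status-code counting job are invented for illustration, and the shuffle phase that Hadoop performs between map and reduce is simulated with a sort and group.

```python
from itertools import groupby

# Hypothetical web-server log lines; a real job would read these from HDFS.
LOG_LINES = [
    "GET /index.html 200",
    "GET /img/logo.png 404",
    "GET /index.html 200",
]

def mapper(line):
    """Emit (status_code, 1) for each request, as a streaming mapper would via stdout."""
    _, _, status = line.split()
    yield status, 1

def reducer(key, values):
    """Sum the counts for one key; Hadoop guarantees values arrive grouped by key."""
    return key, sum(values)

# Simulate the shuffle phase: sort intermediate pairs by key, then group.
pairs = sorted(kv for line in LOG_LINES for kv in mapper(line))
counts = dict(reducer(k, (v for _, v in g)) for k, g in groupby(pairs, key=lambda kv: kv[0]))
print(counts)  # {'200': 2, '404': 1}
```

In a real cluster the mapper and reducer would run as separate processes on many nodes; the orchestration layer shown in the diagram is what schedules and sequences such jobs.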
10. Data Architecture and Integration Challenges
[Slide diagram: a repeat of the full architecture build from slide 9.]
13. Global Marketing
"Well, we are ready, but how will the hardware team know how to size and design the Hadoop cluster?"
"I don't know… and it may take a long time to build the Hadoop cluster."
"Time is a critical factor; we need to get this project moving."
14. Reduce risk & increase flexibility with Dell
• Reduce time to cluster sizing, design & deployment
• Faster time to productive operations
• Optimize and adapt for your needs
• Deliver the best return on investment
Dell.com/Crowbar
15. Dell | Hadoop Solution
"Dell … was one of the first of the hardware vendors to grasp the fact that cloud is about provisioning services, not about the hardware." (Maxwell Cooter, Cloud Pro)
Excels at supporting complex big data analyses across large collections of structured and unstructured data:
• Hadoop handles a variety of workloads, including search, log processing, data warehousing, recommendation systems, and video/image analysis
• Works on the most modern scale-out architectures using a clean-sheet-design data framework
• Without vendor lock-in
Solution components: Apache Hadoop software; Crowbar software framework with a Hadoop barclamp; PowerEdge C8000 Series, C6220, R720, and R720XD servers; Force10 or PowerConnect switches; Reference Architecture; Deployment Guide; joint service and support; proven solutions and components; partner ecosystem.
16. Crowbar Software Framework
A modular, open source framework that accelerates multi-node deployments, simplifies maintenance, and streamlines ongoing updates.
• Built with DevOps: provides an operational model for managing big data clusters and cloud
• Field-proven technologies: built on a locally deployed Chef server; raw servers to full cluster in under 2 hours; hardened with more than a year of deployments
• Apache 2 open source: multi-app (Hadoop & OpenStack) and multi-OS (Ubuntu, RHEL, CentOS, SUSE)
• NOT limited to Dell hardware
17. Deploy a Hadoop cluster in ~2 hours
Use Crowbar to:
• Automate the deployment and configuration of a Hadoop cluster
• Quickly provision bare-metal servers from box to cluster with minimal intervention
• Maintain, upgrade, and evolve your Hadoop cluster over time
• Leverage an open source framework backed by a growing global developer ecosystem
Crowbar software framework: reduce software licensing fees 100%; reduce development time 4–6 mo. Evolve to meet your needs over time with built-in DevOps.
19. Leverage developer expertise worldwide
• Download the open source software: https://github.com/dellcloudedge/crowbar
• Participate in an active community: http://lists.us.dell.com/mailman/listinfo/crowbar
• Get resources on the Wiki: https://github.com/dellcloudedge/crowbar/wiki
• Visit Dell.com/Crowbar or email Crowbar@Dell.com
TAKE-AWAYS: Pentaho provides complete, integrated DI+BI for every leading big data platform.
The company decided to invest in Hadoop to ingest the raw CDR data, along with other data, into Hadoop. This frees DW capacity for high-value transactional data while also lowering cost and meeting compliance requirements. And since the data is rescued from tape, it becomes available for reporting and analysis.
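The "raw data → parsed data" step these notes describe for the ingested CDRs can be sketched in a few lines of Python. This is purely illustrative: the pipe-delimited field layout (caller, callee, start time, duration) is invented for the example, since real CDR formats vary by switch vendor.

```python
from datetime import datetime

# Hypothetical pipe-delimited raw CDR lines: caller|callee|start|duration_secs
RAW_CDRS = """15551230001|15559870002|2012-06-01T08:15:02|182
15551230001|15559870003|2012-06-01T09:40:11|37
"""

def parse_cdr(line):
    """Parse one raw CDR line into a structured record ready for analysis."""
    caller, callee, start, duration = line.split("|")
    return {
        "caller": caller,
        "callee": callee,
        "start": datetime.fromisoformat(start),
        "duration_secs": int(duration),
    }

records = [parse_cdr(line) for line in RAW_CDRS.strip().splitlines()]
total_secs = sum(r["duration_secs"] for r in records)
print(len(records), total_secs)  # 2 219
```

In the deck's architecture, a transformation like this would run at scale inside Hadoop (for example as a Pentaho Data Integration job or a MapReduce step), turning tape-bound raw CDRs into queryable analytic datasets.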
Delivered as a hardware, software, and services Reference Architecture (RA) which can scale from 6 nodes up to 720 nodes. Currently utilizes PowerEdge C2100/C6100/C6105, R720, and R720XD servers and PowerConnect 6248 or Force10 switches.
• Dell Crowbar: automated solution deployment and configuration (bare metal, OS, solution stack, and monitoring)
• CDH3 Enterprise: Cloudera Hadoop distribution, Cloudera management tools, Cloudera support
• Partner ecosystem: software and services capabilities to address broader customer needs around Hadoop, including enabling non-technical business users to leverage Hadoop, simplifying getting data into Hadoop, and intuitive analytics reporting and dashboards
Solution provided via: Reference Architecture, Deployment Guide, Dell Digital Locker, and Dell Deployment Services.
First OpenStack cloud solution provider in the market and a pioneer OpenStack partner (the only Day 1 hardware provider). The most history with the OpenStack technology means expertise, plus RAs that have been tested longer and more fully than newcomers'. Dell offers a deep partner ecosystem, with a single point of support and purchase to reduce the problem of dealing with multiple vendors. Dell is the ONLY company providing automated software to do multi-node OpenStack provisioning: Crowbar, software that Dell developed and open-sourced in the community.