This document discusses JPMorgan Chase's consideration of using Hadoop in the enterprise. It outlines the potential for Hadoop to reduce costs through lower hardware expenses and more efficient use of resources. Hadoop could also enable new types of data analysis and disrupt existing technologies. The document then describes JPMorgan Chase's active proof-of-concept projects evaluating Hadoop and how it positions Hadoop relative to traditional data warehousing. It concludes by identifying additional features needed to better support enterprise use of Hadoop.
Learn why 451 Research believes Infochimps is well-positioned with an easy-to-consume managed service for those without Hadoop expertise, as well as a stack of technologically interesting projects for the 'devops' crowd.
Opening with a market positioning statement and ending with a competitive and SWOT analysis, Matt Aslett provides a comprehensive impact report.
Data Discovery, Visualization, and Apache Hadoop - Hortonworks
In this webinar, we will discuss how Apache Hadoop works with your current infrastructure and how you can use data discovery and visualization tools to gain deeper insights from new data types stored in Hadoop alongside your existing data center investments.
How Pig and Hadoop fit in a data processing architecture - Kovid Academy
Pig, developed by Yahoo Research in 2006, enables programmers to write data transformation programs for Hadoop quickly and easily, without the cost and complexity of hand-written MapReduce programs.
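To make the contrast concrete, here is a minimal local simulation (in Python, with an invented two-line input standing in for HDFS files) of the map -> shuffle -> reduce stages a hand-written word count walks through; Pig Latin expresses the same pipeline in a handful of statements (LOAD, TOKENIZE, GROUP, COUNT):

```python
# A sketch of the stages a raw MapReduce word count performs.
from itertools import groupby
from operator import itemgetter

def mapper(line):
    for word in line.split():
        yield (word.lower(), 1)        # emit (key, 1) per word

def reducer(word, counts):
    return (word, sum(counts))         # total occurrences per key

lines = ["hadoop pig hive", "pig pig hadoop"]                # stand-in input
pairs = sorted(kv for line in lines for kv in mapper(line))  # the "shuffle"
for word, group in groupby(pairs, key=itemgetter(0)):
    print(reducer(word, (count for _, count in group)))
```

On a real cluster each stage also carries job configuration, input splits, and serialization boilerplate, which is precisely what Pig generates for you.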
The past year was punctuated by significant advancements in Apache Hadoop and increasingly wider adoption of Hadoop technology across the enterprise. Companies are continuing to use Hadoop in exciting new ways to better serve their customers, inform product development and drive operational efficiency like never before. Join Mike Olson, founder and CEO of Cloudera, as he shares his twelve major predictions for Hadoop in 2012. He will also unveil predictions from key industry analysts.
Olson will discuss predictions for:
- Where new opportunities for Hadoop will be found within the enterprise
- How new projects being developed for and on Apache Hadoop will expand data analysis capabilities
- Ways that Apache Hadoop will help companies solve short term and long term business challenges
Enterprise Apache Hadoop: State of the Union - Hortonworks
So what's in store for 2014? This deck is from the State of the Union webinar delivered by Shaun Connolly (VP of Strategy, Hortonworks).
In this deck, you'll find:
- Reflection on Enterprise Hadoop Market in 2013
- The latest releases and innovations within the open source community
- Highlights of what's in store for Apache Hadoop and Big Data in 2014
IT @ Intel: Preparing the Future Enterprise with the Internet of Things - Intel IT Center
The Internet of Things (IoT) is the concept of diverse machines, devices, and technologies connecting, interacting, and negotiating with each other to help improve and enrich our lives. No longer is this limited to computers or smartphones. Everyday items such as household appliances, cars and even toys can connect to the internet to integrate with other computing things, processes and services. This new paradigm is changing how data is collected and used, and introducing new challenges for enterprises.
Big Data, Hadoop, Hortonworks and Microsoft HDInsight - Hortonworks
Big Data is everywhere. And at the center of the big data discussion is Apache Hadoop, a next-generation enterprise data platform that allows you to capture, process and share the enormous amounts of new, multi-structured data that doesn’t fit into traditional systems.
With Microsoft HDInsight, powered by Hortonworks Data Platform, you can bridge this new world of unstructured content with the structured data we manage today. Together, we bring Hadoop to the masses as an addition to your current enterprise data architectures so that you can amass net new insight without net new headache.
Neustar is a fast-growing provider of enterprise services in telecommunications, online advertising, Internet infrastructure, and advanced technology. Neustar has engaged Think Big Analytics to leverage Hadoop to expand its data analysis capacity. This session describes how Hadoop has expanded Neustar's data warehouse capacity, increased agility for data analysis, reduced costs, and enabled new data products. We look at the challenges and opportunities in capturing hundreds of TBs of compact binary network data, ad hoc analysis, integration with a scale-out relational database, more agile data development, and building new products that integrate multiple big data sets.
Talk and presentation: Ανδρέας Τσαγκάρης, VP & Chief Technology Officer, Performance Technologies
Presentation title: “Big Data on Linux on Power Systems”
Originally Published on Sep 23, 2014
IBM InfoSphere BigInsights, an enterprise-ready distribution of Hadoop, is designed to address the challenges of big data and modern IT by analyzing larger volumes of data more cost-effectively. Deployed on the cloud, it enables rapid cluster provisioning and real-time analytics.
FYI: The value of Hadoop and many more questions will be pondered at this year’s Strata/Hadoop World event in NYC (October 15-17, 2014) and certainly at IBM Insight (October 26-30, 2014).
The Value of the Modern Data Architecture with Apache Hadoop and Teradata - Hortonworks
This webinar discusses why Apache Hadoop is most typically the technology underpinning "Big Data", how it fits into a modern data architecture, and where it sits in the current landscape of databases and data warehouses already in use.
Better Total Value of Ownership (TVO) for Complex Analytic Workflows with the... - ModusOptimum
Customers are looking for ways to streamline analytic decisioning: quicker deployments, faster time to value, lower risk of failure, and higher revenues and profits. The IBM & Hortonworks solution delivers on these customer needs.
https://event.on24.com/eventRegistration/EventLobbyServlet?target=reg20.jsp&eventid=1789452&sessionid=1&eventid=1789452&sessionid=1&mode=preview&key=E0F94DE1191C59223B6522A075023215
What if you could get over $3 back for every $1 you invest in big data technology? Recent research* by IDC shows that big data ROI is for real, and it can be huge, at an average of 382% 3-year ROI for the organizations that were studied.
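The arithmetic connecting those two figures, with ROI defined the usual way:

\[ \text{ROI} = \frac{\text{benefit} - \text{cost}}{\text{cost}} = 382\% \quad\Rightarrow\quad \text{benefit} \approx 4.82 \times \text{cost} \]

so each $1 invested returns about $4.82 gross, or $3.82 net, which is where "over $3 back for every $1" comes from.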
In this deck, Carl Olofson, Research Vice President, Data Management Software Research for IDC, shares his findings on nine MapR customers and discusses:
+ The business value they gained from their big data deployments
+ An average of 42% reduction in cost over alternative big data systems
+ 31% higher productivity for data scientists
+ 39% increased productivity for application developers
Dale Kim, Sr. Director of Industry Solutions at MapR Technologies, then explains how the MapR Converged Data Platform advantages drive significant ROI for customers.
*Research comes from IDC Document #US40870615
Get the report here: http://www.mapr.com/idc-researches-business-value-mapr?source=Social&campaign=2016_Content_IDCReportMapRBusinessValue&utm_source=Social&utm_medium=Slideshare&utm_campaign=IDC+Report
Emulex Presents Why I/O is Strategic Global Survey Results - Emulex Corporation
This webcast is the first in a monthly series on why I/O is strategic for the data center. Emulex will present findings from a global survey of more than 1,500 IT professionals that demonstrate the strategic importance of I/O in the data center across four key technology trends: virtualization, cloud, big data and convergence.
Hadoop World 2011: The Blind Men and the Elephant - Matthew Aslett - The 451 ... - Cloudera, Inc.
Who is contributing to the Hadoop ecosystem, what are they contributing, and why? Who are the vendors that are supplying Hadoop-related products and services and what do they want from Hadoop? How is the expanding ecosystem benefiting or damaging the Apache Hadoop project? What are the emerging alternatives to Hadoop and what chance do they have? In this session, the 451 Group will seek to answer these questions based on their latest research and present their perspective of where Hadoop fits in the total data management landscape.
Comprehensive Security for the Enterprise III: Protecting Data at Rest and In... - Cloudera, Inc.
This webinar discusses how you can use Navigator capabilities such as Encrypt and Key Trustee to secure data and enable compliance. Additionally, we will discuss our joint work with Intel on Project Rhino (an initiative to improve data security in Hadoop). We also hear from a security architect at a financial services company that is using encryption and key management to meet financial regulatory requirements.
Explores the notion of "Hadoop as a Data Refinery" within an organisation, be it one with an existing Business Intelligence system or none, and looks at 'agile data' as a benefit of using Hadoop as the store for historical, unstructured and very-large-scale datasets.
The final slides look at the challenge of an organisation becoming "data driven".
Hadoop as Data Refinery - Steve Loughran, JAX London
Apache Hadoop is often described as a "Big Data Platform", but what does that mean? One way to better understand Hadoop is to talk about how Hadoop is used. This talk discusses using Hadoop as a "Data Refinery", which is a common use case. The concept is very much like a traditional oil refinery except with data: pulling in large quantities of "crude data" over pipelines, refining some into useful business intelligence, and refining other pieces into slightly less crude data that stays in the cluster until needed later. This metaphor proves useful when considering how Hadoop could be adopted in an organisation that already has data warehousing and business intelligence systems, and when contemplating how to hook up a Hadoop cluster to the sources of data inside and outside that organisation. A key point to remember is that storing data in Hadoop is no more an end in itself than storing data in a database is: the goal is extracting information from that data. Using Hadoop as a front-end "data refinery" means that it can integrate with existing Business Intelligence systems, while providing the platform for new applications.
(EMC World 2012): Apache Hadoop is now enterprise-ready. This session reviews the features and roadmap of Hadoop. We will review some of the key capabilities of GPHD 1.x and our plans for 2012.
Hadoop Reporting and Analysis - Jaspersoft, Hortonworks
Hadoop is deployed for a variety of uses, including web analytics, fraud detection, security monitoring, healthcare, environmental analysis, social media monitoring, and other purposes.
Transform Your Business with Big Data and Hortonworks - Pactera_US
Customer insight and marketplace predictions are a few of the profitable benefits found in big data technology. Leading companies are using the advanced analytics solution to find new revenue streams, increase customer satisfaction and optimize the supply chain.
Hadoop as a Service (as offered by a handful of niche vendors today) is a cloud computing solution that makes medium- and large-scale data processing accessible, easy, fast and inexpensive. It achieves this by eliminating the operational challenges of running Hadoop, so one can focus on business growth.
Making the Case for Hadoop in a Large Enterprise - British Airways, DataWorks Summit
Alan Spanos, Data Exploitation Manager, British Airways
Jay Aubby, Architect, British Airways
Realizing the Promise of Big Data with Hadoop - Cloudera Summer Webinar Serie... - Cloudera, Inc.
Apache Hadoop, an open-source platform, is increasingly gaining adoption within organizations trying to draw insight from all the big data being generated. Hadoop, and a handful of open-source tools that complement it, are promising to make gigantic and diverse datasets easily and economically available for quick analysis. A burgeoning partner ecosystem is also essential to helping organizations turn big data into business value.
One of my old presentations to our management, covering the following topics:
History and Milestones
Traditional Data Warehouse
Key trends breaking the traditional data warehouse
Modern Data Warehouse
Massively parallel processing (MPP) architecture
Hadoop Ecosystem
Technical Innovation on Hadoop
With the rise of Apache Hadoop, a next-generation enterprise data architecture is emerging that connects the systems powering business transactions and business intelligence. Hadoop is uniquely capable of storing, aggregating, and refining multi-structured data sources into formats that fuel new business insights. Apache Hadoop is fast becoming the de facto platform for processing Big Data. Hadoop started from a relatively humble beginning as a point solution for small search systems. Its growth into an important technology for the broader enterprise community dates back to Yahoo's 2006 decision to evolve Hadoop into a system for solving its internet-scale big data problems. Eric will discuss the current state of Hadoop and what is coming from a development standpoint as Hadoop evolves to meet more workloads.
The Business Advantage of Hadoop: Lessons from the Field – Cloudera Summer We... - Cloudera, Inc.
Join 451 analyst Matt Aslett, Cloudera CEO Mike Olson, and Cloudera customers RIM and YP (formerly AT&T Interactive) to learn:
» Why Cloudera customers have chosen CDH to get started with Hadoop
» The business value resulting from analyzing new data sources in new ways
» How Hadoop will change these customers' business and industry over the next 3-5 years
Webinar | From Zero to Big Data Answers in Less Than an Hour – Live Demo Slides - Cloudera, Inc.
Slides describing Cloudera and Karmasphere, and how their combined products can install a Hadoop cluster, import data, run queries and generate results.
Similar to Hw09 Data Processing In The Enterprise
Cloudera Data Impact Awards 2021 - Finalists - Cloudera, Inc.
This annual program recognizes organizations who are moving swiftly towards the future and building innovative solutions by making what was impossible yesterday, possible today.
The winning organizations' implementations demonstrate outstanding achievements in fulfilling their mission, technical advancement, and overall impact.
The 2021 Data Impact Awards recognize organizations' achievements with the Cloudera Data Platform in seven categories:
Data Lifecycle Connection
Data for Enterprise AI
Cloud Innovation
Security & Governance Leadership
People First
Data for Good
Industry Transformation
2020 Cloudera Data Impact Awards Finalists - Cloudera, Inc.
Cloudera is proud to present the 2020 Data Impact Awards Finalists. This annual program recognizes organizations running the Cloudera platform for the applications they've built and the impact their data projects have on their organizations, their industries, and the world. Nominations were evaluated by a panel of independent thought-leaders and expert industry analysts, who then selected the finalists and winners. Winners exemplify the most cutting-edge data projects and represent innovation and leadership in their respective industries.
Machine Learning with Limited Labeled Data 4/3/19 - Cloudera, Inc.
Cloudera Fast Forward Labs’ latest research report and prototype explore learning with limited labeled data. This capability relaxes the stringent labeled-data requirement in supervised machine learning and opens up new product possibilities. It is industry-invariant, addresses the labeling pain point, and enables applications to be built faster and more efficiently.
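The report itself covers the techniques in depth; as a flavor of one of them, here is a minimal uncertainty-sampling (active learning) loop in Python. The synthetic dataset, initial 20 labels, and 10-query budget are illustrative assumptions, not the report's setup:

```python
# Minimal active learning via least-confident uncertainty sampling:
# start with few labels, repeatedly label the example the model is
# least sure about. Data and budgets here are invented.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
rng = np.random.RandomState(0)
labeled = list(rng.choice(len(X), size=20, replace=False))
pool = sorted(set(range(len(X))) - set(labeled))

model = LogisticRegression(max_iter=1000)
for _ in range(10):                               # query budget
    model.fit(X[labeled], y[labeled])
    proba = model.predict_proba(X[pool])
    least_confident = int(np.argmin(proba.max(axis=1)))
    labeled.append(pool.pop(least_confident))     # "oracle" supplies the label
print("accuracy with %d labels: %.3f" % (len(labeled), model.score(X, y)))
```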
Data Driven With the Cloudera Modern Data Warehouse 3.19.19 - Cloudera, Inc.
In this session, we cover how to move beyond structured, curated reports based on known questions on known data, to ad-hoc exploration of all data to optimize business processes, and on to unknown questions on unknown data, where machine learning and statistically motivated predictive analytics are shaping business strategy.
Introducing Cloudera DataFlow (CDF) 2.13.19 - Cloudera, Inc.
Watch this webinar to understand how Hortonworks DataFlow (HDF) has evolved into the new Cloudera DataFlow (CDF). Learn about key capabilities that CDF delivers, such as:
-Powerful data ingestion powered by Apache NiFi
-Edge data collection by Apache MiNiFi
-IoT-scale streaming data processing with Apache Kafka (see the sketch after this list)
-Enterprise services to offer unified security and governance from edge-to-enterprise
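As one concrete illustration of the Kafka item above, a minimal produce/consume round trip using the kafka-python client; the broker address and topic name are assumptions, not from the webinar:

```python
# Send one message to a topic, then read the topic back.
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("sensor-readings", b'{"device": "thermostat-1", "temp_c": 21.5}')
producer.flush()                        # block until delivery is confirmed

consumer = KafkaConsumer(
    "sensor-readings",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",       # read from the start of the topic
    consumer_timeout_ms=5000,           # stop iterating once drained
)
for message in consumer:
    print(message.offset, message.value)
```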
Introducing Cloudera Data Science Workbench for HDP 2.12.19 - Cloudera, Inc.
Cloudera’s Data Science Workbench (CDSW) is available for Hortonworks Data Platform (HDP) clusters for secure, collaborative data science at scale. During this webinar, we provide an introductory tour of CDSW and a demonstration of a machine learning workflow using CDSW on HDP.
Shortening the Sales Cycle with a Modern Data Warehouse 1.30.19 - Cloudera, Inc.
Join Cloudera as we outline how we use Cloudera technology to strengthen sales engagement, minimize marketing waste, and empower line of business leaders to drive successful outcomes.
Leveraging the cloud for analytics and machine learning 1.29.19 - Cloudera, Inc.
Learn how organizations are deriving unique customer insights, improving product and services efficiency, and reducing business risk with a modern big data architecture powered by Cloudera on Azure. In this webinar, you'll see how fast and easy it is to deploy a modern data management platform—in your cloud, on your terms.
Modernizing the Legacy Data Warehouse – What, Why, and How 1.23.19 - Cloudera, Inc.
Join us to learn about the challenges of legacy data warehousing, the goals of modern data warehousing, and the design patterns and frameworks that help to accelerate modernization efforts.
Leveraging the Cloud for Big Data Analytics 12.11.18 - Cloudera, Inc.
Learn how organizations are deriving unique customer insights, improving product and services efficiency, and reducing business risk with a modern big data architecture powered by Cloudera on AWS. In this webinar, you'll see how fast and easy it is to deploy a modern data management platform—in your cloud, on your terms.
Explore new trends and use cases in data warehousing including exploration and discovery, self-service ad-hoc analysis, predictive analytics and more ways to get deeper business insight. Modern Data Warehousing Fundamentals will show how to modernize your data warehouse architecture and infrastructure for benefits to both traditional analytics practitioners and data scientists and engineers.
Extending Cloudera SDX beyond the Platform - Cloudera, Inc.
Cloudera SDX is by no means restricted to just the platform; it extends well beyond. In this webinar, we show you how Bardess Group’s Zero2Hero solution leverages the shared data experience to coordinate Cloudera, Trifacta, and Qlik to deliver complete customer insight.
Federated Learning: ML with Privacy on the Edge 11.15.18 - Cloudera, Inc.
Join Cloudera Fast Forward Labs Research Engineer, Mike Lee Williams, to hear about their latest research report and prototype on Federated Learning. Learn more about what it is, when it’s applicable, how it works, and the current landscape of tools and libraries.
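For a feel of the core mechanism (the report and prototype go much deeper), here is a minimal federated-averaging (FedAvg) step in Python; the model weights and client data sizes are invented:

```python
# The server combines locally trained weights, weighted by each
# client's data volume, without ever seeing the raw client data.
import numpy as np

client_weights = [np.array([0.2, 1.0]),
                  np.array([0.4, 0.8]),
                  np.array([0.1, 1.2])]   # weights trained on-device
client_sizes = [100, 300, 50]             # examples held by each client

total = sum(client_sizes)
global_weights = sum(w * (n / total)
                     for w, n in zip(client_weights, client_sizes))
print(global_weights)                     # the new shared model
```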
Analyst Webinar: Doing a 180 on Customer 360 - Cloudera, Inc.
451 Research Analyst Sheryl Kingstone and Cloudera’s Steve Totman recently discussed how a growing number of organizations are replacing legacy Customer 360 systems with Customer Insights Platforms.
Build a modern platform for anti-money laundering 9.19.18 - Cloudera, Inc.
In this webinar, you will learn how Cloudera and BAH riskCanvas can help you build a modern AML platform that reduces false positive rates, investigation costs, technology sprawl, and regulatory risk.
Introducing the data science sandbox as a service 8.30.18 - Cloudera, Inc.
How can companies integrate data science into their businesses more effectively? Watch this recorded webinar and demonstration to hear more about operationalizing data science with Cloudera Data Science Workbench on Cazena’s fully-managed cloud platform.
Hw09 Data Processing In The Enterprise
1. Hadoop In the Enterprise?
Sih Lee & Peter Krey, Innovation & Shared Services
Firmwide Engineering & Architecture
Hadoop World, New York City, October 2nd, 2009
© 2009 JPMorgan Chase & Co.
All rights reserved.
Confidential and proprietary to JPMorgan Chase & Co.
2. Agenda
JPMorgan Chase + Open Source
Hadoop In The Enterprise?
Active POC Pipeline
Hadoop Positioning
Cost Comparisons
Hadoop Additions & Must Haves
Q&A
3. JPMorgan Chase + Open Source
Established Multi-Year Open Source History
Big Supporter of Industry Standards & Open Source Projects
Numerous Production Open Source Implementations
QPID (AMQP) - Top Level Apache Project (http://qpid.apache.org/)
Tyger - Apache + Tomcat + Spring - Fully Integrated App Server Environment, 30+ OS Components
Compute Backbone (CBB) HPC Grid - 1000's of Linux-Based Compute Servers
MuleSoft.org (a.k.a. MuleSource) Enterprise Message Bus
others …
4. Hadoop In The Enterprise – Economics Driven
Many Big Data Lessons Learned From Web 2.0 Community
Potential For Large Capex and Opex "Dislocation"
Reduced Consumption of Enterprise Premium Resources
Grid Computing Economics Brought To Data Intensive Computing
Stagnant Data Innovation
Enabling & Potentially Disruptive Platform
Many Historical Similarities
Java, Linux, Tomcat, Web / Internet, …
Mini's to Client / Server, Client / Server to Web, Solaris to Linux, …
Key Question: What Can Be Built On Top of and Enabled by Hadoop?
5. Hadoop In The Enterprise – Choice Driven
Overuse of Relational Database Containers
Institutional “Muscle Memory” … Not Much Else to Choose From
Increasingly Large Percentage of Static Data Stored In Proprietary Transactional DB's
Over-Normalized Schemas … Still Make Sense With Cheap Compute & Storage?
Enterprise Storage "Prisoners"
Captive To The Economics & Technology of "A Few" Vendors
Developers Need More Choice
Too Much Proprietary, Single-Source Data Infrastructure
Increasing Need For Minimal / No System + Storage Admins
6. Hadoop In The Enterprise – Other Drivers
Growing Developer Interest In "No SQL" Data Technologies
Open Source, Distributed, Non-relational Databases
Growing Influence Of Web 2.0 Technologies & Thinking On Enterprise
Hadoop, Cassandra, HBase, Hive, CouchDB, HadoopDB, …, others
memcached For Caching
FSI Industry Drivers
Increased Regulatory Oversight + Reporting = More Data Needed Over Longer Period Of Time
Growing Need For Less Expensive Data Repository / Store
Increasing Need To Support "One Off" Analysis On Large Data
7. Active POC Pipeline
Growing Stream of Real Projects To Gauge Hadoop "Goodness of Fit"
Broad Spectrum of Use Cases
Driven By Need To Impact / Dislocate OPEX + CAPEX
Evaluated On Metric-Based Performance, Functional, And Economic Measures
8. Hadoop Positioning
[Positioning chart. Horizontal axis: data volume, from GB's to TB's -> PB's. Vertical axis: from lower-latency, index-based access for updates / transactions, through index-based access for analysis, to higher-latency, semi-structured analysis. Map/Reduce + HDFS sits in the high-latency, semi-structured, TB's-PB's corner; anonymized data warehouses (DW1-DW7), SQL databases (SQLDB1-SQLDB4), and an in-memory store (InMemory1) occupy the lower-latency, index-based regions.]
9. Comparative Storage Cost Bar Graph Slide
"Normalized" SAN + NAS $ per GB per month versus HDFS $ per GB per month
[Bar chart comparing normalized monthly storage cost per GB across SAN, NAS, and Hadoop (HDFS) configurations.]
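To illustrate how such a per-GB-per-month normalization might be computed for the HDFS bars (every number below is invented for illustration; the deck's actual figures are not in this transcript):

\[ \$/\text{GB-month} = \frac{C_{\text{node}}}{(\text{raw GB} / r) \times L} = \frac{4000}{(4096 / 3) \times 36} \approx 0.08 \]

assuming a hypothetical $4,000 commodity node with 4 TB of raw disk, HDFS replication factor r = 3, and a 36-month amortization L. A comparable SAN or NAS figure would fold in array, fabric, and administration costs, which is what the "normalized" bars are meant to capture.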
10. Enterprise Data Warehousing Costs
"normalized” bar chart utilizing retail $ per TB
Data Warehouse S/W -- $K per TB
$250
$200
$150
Hadoop In The Enterprise ?
$100
$50
$0
Products
9
11. Hadoop Additions & Must Haves
Improved SQL Front-end Tool Interoperability
Better Interop With Skills & Content That Firms Already Have
Improved Security & ACL enforcement … Kerberos integration?
Grow Developer Programming Model Skill Sets
Improve Relational Container Integration & Interop For Data Archival
Management & Monitoring Tools
Improved Developer & Debugging Tools
Reduce Latency Via Integration With Open Source Data Caching
memcached, others (see the sketch after this slide)
Invitation To FSI or Enterprise Roundtable
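On the data-caching bullet above, a minimal cache-aside sketch in Python; the pymemcache client, server address, and backing-store function are illustrative assumptions (the deck names memcached but shows no code):

```python
# Cache-aside: try memcached first, fall back to the slow store on a miss.
from pymemcache.client.base import Client

cache = Client(("localhost", 11211))       # assumes a local memcached

def fetch_from_backing_store(key):
    # Stand-in for an expensive HDFS read or warehouse query.
    return ("expensive result for %s" % key).encode()

def get_with_cache(key, ttl=300):
    value = cache.get(key)                 # fast path: in-memory hit
    if value is None:
        value = fetch_from_backing_store(key)
        cache.set(key, value, expire=ttl)  # populate for later readers
    return value

print(get_with_cache("positions/2009-10-02"))
```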
12. Q&A
Sih Lee, Head of Innovation & Shared Services
Firmwide Engineering & Architecture
W# 212-622-3038
sih.x.lee@jpmchase.com
Peter Krey, Consultant, Innovation & Shared Services
Firmwide Engineering & Architecture
W# 212-622-2926
peter.j.krey@jpmchase.com