Hadoop Reporting and Analysis - Jaspersoft

Hadoop is deployed for a variety of uses, including web analytics, fraud detection, security monitoring, healthcare, environmental analysis, social media monitoring, and other purposes.

Slide Notes

  • I can’t really talk about Hortonworks without first taking a moment to talk about the history of Hadoop. What we now know of as Hadoop really started back in 2005, when Eric Baldeschwieler – known as “E14” – started work on a project to build a large-scale data storage and processing technology that would allow Yahoo to store and process massive amounts of data to underpin its most critical application, Search. The initial focus was on building out the technology – the key components being HDFS and MapReduce – that would become the core of what we think of as Hadoop today, and on continuing to innovate it to meet the needs of this specific application. By 2008, Hadoop usage had greatly expanded inside of Yahoo, to the point that many applications were now using this data management platform, and as a result the team’s focus extended to include Operations: now that applications were beginning to propagate around the organization, sophisticated capabilities for operating it at scale were necessary. It was also at this time that usage began to expand well beyond Yahoo, with many notable organizations (including Facebook and others) adopting Hadoop as the basis of their large-scale data processing and storage applications, necessitating a focus on operations to support what was by now a large variety of critical business applications. In 2011, recognizing that more mainstream adoption of Hadoop was beginning to take off and with an objective of facilitating it, the core team left – with the blessing of Yahoo – to form Hortonworks. The goal of the group was to facilitate broader adoption by addressing the Enterprise capabilities that would enable a larger number of organizations to adopt and expand their usage of Hadoop. [Note: if useful as a talk track, Cloudera was formed in 2008, well BEFORE the operational expertise of running Hadoop at scale was established inside of Yahoo.]
  • While overly simplistic, this graphic represents what we commonly see as a general data architecture: a set of data sources producing data; a set of data systems to capture and store that data, most typically a mix of RDBMS and data warehouses; and a set of applications that leverage the data stored in those data systems. These could be packaged BI applications (Business Objects, Tableau, etc.), enterprise applications (e.g. SAP) or custom applications (e.g. custom web applications), ranging from ad-hoc reporting tools to mission-critical enterprise operations applications. Your environment is undoubtedly more complicated, but conceptually it is likely similar.
  • As the volume of data has exploded, we increasingly see organizations acknowledge that not all data belongs in a traditional database. The drivers are both cost (as volumes grow, database licensing costs can become prohibitive) and technology (databases are not optimized for very large datasets). Instead, we increasingly see Hadoop – and HDP in particular – being introduced as a complement to the traditional approaches. It is not replacing the database; it is a complement, and as such it must integrate easily with existing tools and approaches. This means it must interoperate with: existing applications such as Tableau, SAS, Business Objects, etc.; existing databases and data warehouses, for loading data to and from the data warehouse; development tools used for building custom applications; and operational tools for managing and monitoring.
  • It is for that reason that we focus on HDP interoperability across all of these categories. Data systems: HDP is endorsed by and embedded with SQL Server, Teradata and more. BI tools: HDP is certified for use with the packaged applications you already use, from Microsoft to Tableau, MicroStrategy, Business Objects and more. Development tools: for .NET developers, Visual Studio, used to build more than half the custom applications in the world, certifies with HDP to enable Microsoft app developers to build custom apps with Hadoop; for Java developers, Spring for Apache Hadoop makes it quick and easy to build Hadoop-based applications with HDP. Operational tools: integration with System Center and with Teradata Viewpoint.
  • In summary, by addressing these elements we can provide an Enterprise Hadoop distribution which includes the Core Services, Platform Services, Data Services and Operational Services required by the Enterprise user. All of this is done in 100% open source and tested at scale by our team (together with our partner Yahoo) to bring Enterprise process to an open source approach. And finally, this is the distribution that is endorsed by the ecosystem to ensure interoperability in your environment.
  • Across all of our user base, we have identified just three separate usage patterns – sometimes more than one is used in concert during a complex project, but the patterns are distinct nonetheless. These are Refine, Explore and Enrich. The first of these, the Refine case, is probably the most common today. It is about taking very large quantities of data and using Hadoop to distill the information down into a more manageable data set that can then be loaded into a traditional data warehouse for use with existing tools. This is relatively straightforward and allows an organization to harness a much larger data set for its analytics applications while leveraging its existing data warehousing and analytics tools. Using the graphic here, in step 1 data is pulled from a variety of sources into the Hadoop platform, in step 2 it is processed, and then in step 3 it is loaded into a data warehouse for analysis by existing BI tools. (A minimal code sketch of this refine step appears after these notes.)
  • A second use case is called Application Enrichment. This is about incorporating data stored in HDP to enrich an existing application. This could be an online application in which we want to surface custom information to a user based on their particular profile. For example, if a user has been searching the web for information on home renovations, in the context of your application you may want to use that knowledge to surface a custom offer for a product that you sell related to that category. Large web companies such as Facebook and others are very sophisticated in the use of this approach. In the diagram, this is about pulling data from disparate sources into HDP in step 1, storing and processing it in step 2, and then interacting with it directly from your applications in step 3, typically in a bi-directional manner (e.g. request data, return data, store response). (A minimal code sketch of that interaction appears after these notes.)
  • The final use case is what we would refer to as Data Exploration – this is the use case in question most commonly when people talk about “Data Science”. In simplest terms, it is about using Hadoop as the primary data store rather than performing the secondary step of moving data into a data warehouse. To support this use case you’ve seen all the BI tool vendors rally to add support for Hadoop – and most commonly HDP – as a peer to the database, and in so doing allow for rich analytics on extremely large datasets that would be both unwieldy and costly in a traditional data warehouse. Hadoop allows for interaction with a much richer dataset and has spawned a whole new generation of analytics tools that rely on Hadoop (HDP) as the data store. To use the graphic, in step 1 data is pulled into HDP, it is stored and processed in step 2, before being surfaced directly into the analytics tools for the end user in step 3. (A minimal query sketch for this pattern appears after these notes.)
  • We live in a world where organizations must now compete on their differential use of time and information. Because we can have no more of the former, and because we have an unlimited amount of the latter, there is a new responsibility to harness information more effectively in order to compete with speed and agility. Is traditional Business Intelligence able to address this newfound opportunity to put much more information to work? We know that, today, only a small fraction of information workers actually use a traditional BI tool during the course of a day. In fact, according to most industry analysts, only about 25% of information workers use BI today. Why? Because those tools are too complex and too costly, which prevents the widespread use of timely, actionable data. The bigger issue, though, is that most information workers do NOT spend their day inside of a BI tool... nor do they want to! We simply can’t expect even the best workers to go and find the right report or data that is relevant to their question or issue at hand.
  • So, what’s the solution? Bring timely, actionable data TO the users. Information workers today truly need information that finds them, not the other way around. This information should be delivered within the software applications and business processes that are used every day by information workers. From pipeline dashboards within the CRM system to visualized compensation data within the HR system and on to interactive charts inside the native, mobile customer service application – the information generated by these business processes and (transactional) software applications should be put to greater use. At Jaspersoft, we call this a “data-driven” application and our mission is to be the Intelligence Inside.
  • To truly deliver integrated intelligence within a software application or business process, there are three primary requirements: 1. it must be a simple self-service reporting and analysis environment that allows any user profile, from an executive to a data analyst, to get the information they need; 2. it must be easy to embed and integrate within the application or process, enabling different techniques to liberate the data generated by the application and encouraging widespread use of it as information; and 3. it must be affordable even on a large scale, so there is no question about the value of delivering more information to any user who could benefit. Jaspersoft has become the intelligence inside tens of thousands of software applications and business processes globally, because we’ve set the standard for highly embeddable and affordable self-service BI. Each day, our software touches millions of people and enables them to make decisions faster using timely, actionable data. Our customers have made Jaspersoft the Intelligence Inside.
  • Today Jaspersoft is the Intelligence Inside over 130,000 applications of every type in every industry. For example, Red Hat integrates Jaspersoft within its Enterprise Virtualization software (RHEV) and exposes system health and monitoring information to allow its customers to better manage their virtualization environment. Verizon embeds us inside their customer portal to share billing information with their customers. Virgin Money embeds Jaspersoft within its charitable “Giving” multi-tenant SaaS application, providing reports and analysis to describe sources and uses of funds. The Naval Safety Center embeds us inside their internally built application to report on Naval incidents. British Telecom has built a comprehensive statistical data warehouse of customer information, using Jaspersoft for customer service reports and analysis that enable reduced call times and improved service levels. Groupon uses Jaspersoft with Hadoop to drive optimized campaigns to better target users with discount offers. FICO’s Entiera division uses us with Vertica to do large scale marketing analytics. With each of these customer examples, Jaspersoft was chosen because of its modern, embeddable architecture that delivers a rich self-service experience at a fraction of the cost of the alternatives.
  • But we’re not just focused on delivering the Intelligence Inside of applications and business processes today. Our mission is to become the de facto standard for reporting and analysis in the New IT Stack. Specifically, we want to provide BI Builders with a reporting and analytic service inside their preferred Cloud platform, running on any Big Data store, so they can build Intelligence Inside their internal or commercial applications. As part of that mission, we have delivered a number of business intelligence industry firsts, including being: the first, and to date only, BI service on VMware’s PaaS, Cloud Foundry; the first, and to date only, BI service on Red Hat’s PaaS, OpenShift; the first, and to date only, BI service on GoGrid’s IaaS marketplace; the first BI vendor to be certified on Amazon’s new data warehouse service, Redshift (in fact, we were the first BI vendor Amazon approached to support their new service, because of our open source model and community); and the first and only BI provider to connect directly (no ETL) to non-SQL Big Data stores like MongoDB and Hadoop HBase. What this means is that BI Builders who are looking to build applications on these new stacks can build in intelligence using Jaspersoft today.
  • Users can also create and interact with beautiful dashboards that include charts, widgets, maps, etc. These are perfect for executives or managers to get a quick understanding of the business and its KPIs.
  • And data analysts or power users can do analysis using traditional OLAP or interactive visualizations to slice and dice their data and get insights into their business.
  • What is unique about Jaspersoft is that all the functionality we provide can be used to power the Intelligence Inside any application, portal or website. As an example, Tata, the Indian technology giant, uses Jaspersoft to power its Mosaic product, which allows media executives to track their media assets from creation all the way through to distribution. As you can see from the screenshot, this product looks nothing like the Jaspersoft product or what you’d think of as a typical business intelligence product. And that’s the point of the Intelligence Inside: you want business users to use reporting, dashboards and analytics in the context of their preferred application, without having to go to a separate BI system. Another example here is from eBuilder, a cloud-based application which helps companies automate their business processes like travel and expenses, procurement, order fulfillment and after-sales management. All of the reporting and analysis is powered by Jaspersoft. The screenshot here shows the after-sales dashboard where managers can track the performance of different fulfillment centers geographically in delivering product to customers. Finally, there is the example of Virgin Money’s non-profit Giving site, a multi-tenant SaaS application powered by Jaspersoft that allows charities to track and optimize their fundraising activities. All of JasperReports Server’s capabilities are available here, from interactive reporting through to ad hoc query, report and analysis, all branded to look like the Virgin Money website.
  • We are able to do all of this because of our world-class BI platform. The platform is 100% based on open web standards, from the backend Java server to the CSS (Cascading Style Sheet)-driven, HTML5 user interface. The product has a full suite of capabilities, from reporting to dashboards, analysis and visualization, that can be viewed and interacted with in a browser or on a tablet or mobile phone. Underlying these capabilities is a columnar in-memory engine that allows the user to work with an in-memory data set for faster performance. The engine is intelligent enough to push expensive aggregations down to the underlying database when that makes sense, for example if you have a high-performance analytic database like Amazon’s Redshift. We have a business metadata layer that allows BI Builders to define business-user-friendly data objects that abstract away the underlying data complexity. This layer can connect directly to our data connectors. Alternatively, customers can leverage our powerful data integration layer, which allows them to extract, transform and load data to create a data mart or a data warehouse. If they don’t want to move the data but still need to merge multiple data sources, they can use our data virtualization layer, which federates queries across multiple data sources so that they look like a single source to the business user. The data connectivity layer provides access to any data source, from relational databases to Big Data stores like Hadoop, NoSQL stores like MongoDB and Cassandra, and other data stores such as files. All of this power is exposed through an extensive set of APIs, from HTTP to SOAP and REST-based web services, that allow BI Builders to integrate the server capabilities into their applications. (A hedged example of calling the server over HTTP appears after these notes.)
  • Only Jaspersoft offers all three approaches, giving users the ability to meet any use-case requirements.
  • Jaspersoft is an active sponsor of BigDataUniversity.com. This is a FREE online learning portal to develop Big Data expertise and practical skills. We encourage everyone to register today and learn more about Big Data there.
  • So, now we’re going to see a demo of Jaspersoft 5, our most recent product release. You’ll see many of the capabilities we’ve discussed already. The thing to note about version 5 is that it highlights the strength of our architecture and vision. When creating this product we had the vision to deliver the power of what today you can only get in a desktop visualization tool like Tableau or QlikTech but to do so completely within a browser. Whereas Tableau and Qlik require the user to create the visualizations using a desktop tool, Jaspersoft allows the user to do that from a browser. There are many benefits to this approach. Apart from avoiding the obvious issue of having to manage desktop software, this approach allows BI Builders to embed this functionality inside their internal and commercial applications, portals or websites. This is not possible with the desktop tools. Now, let’s see the product in action.
  • At Hortonworks today, our focus is very clear: we develop, distribute and support a 100% open source distribution of Enterprise Apache Hadoop. We employ the core architects, builders and operators of Apache Hadoop and drive the innovation in the open source community. We distribute the only 100% open source Enterprise Hadoop distribution: the Hortonworks Data Platform. Given our operational expertise running some of the largest Hadoop infrastructure in the world at Yahoo, our team is uniquely positioned to support you. Our approach is also uniquely endorsed by some of the biggest vendors in the IT market. Yahoo is both an investor and a customer, and most importantly, a development partner: we partner to develop Hadoop, and no distribution of HDP is released without first being tested on Yahoo’s infrastructure using the same regression suite that they have used for years as they grew to have the largest production cluster in the world. Microsoft has partnered with Hortonworks to include HDP in both their off-premise offering on Azure and their on-premise offering under the product name HDInsight; this also includes integration with Visual Studio for application development and with System Center for operational management of the infrastructure. Teradata includes HDP in their products in order to provide the broadest possible range of options for their customers.
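
The Refine pattern described in the notes above boils down to a batch job that turns raw captured data into a compact, structured result which can then be exported to the warehouse (for example with Sqoop). The following is a minimal, hypothetical sketch of that step as a Java MapReduce job that counts hits per page from raw web-log files; the HDFS paths, log layout and field positions are illustrative assumptions, not something taken from the presentation.

```java
// Hypothetical "refine" job: raw web logs in, per-page hit counts out.
// Paths and log format are assumptions for illustration only.
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class PageHitRefinery {

  // Step 2 (Process): parse each raw log line and emit (page, 1).
  public static class PageMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
    private static final LongWritable ONE = new LongWritable(1);
    private final Text page = new Text();

    @Override
    protected void map(LongWritable offset, Text line, Context ctx)
        throws IOException, InterruptedException {
      // Assumes a whitespace-delimited log line whose 7th field is the requested URL.
      String[] fields = line.toString().split("\\s+");
      if (fields.length > 6) {
        page.set(fields[6]);
        ctx.write(page, ONE);
      }
    }
  }

  // Aggregate the counts per page into the refined, warehouse-sized result.
  public static class SumReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
    @Override
    protected void reduce(Text page, Iterable<LongWritable> counts, Context ctx)
        throws IOException, InterruptedException {
      long total = 0;
      for (LongWritable c : counts) {
        total += c.get();
      }
      ctx.write(page, new LongWritable(total));
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "page-hit-refinery");
    job.setJarByClass(PageHitRefinery.class);
    job.setMapperClass(PageMapper.class);
    job.setCombinerClass(SumReducer.class);
    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(LongWritable.class);
    FileInputFormat.addInputPath(job, new Path("/raw/weblogs"));         // step 1: captured raw data
    FileOutputFormat.setOutputPath(job, new Path("/refined/pagehits")); // refined output
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Under those assumptions, the refined /refined/pagehits output is small enough to push into an existing EDW table with a standard Sqoop export, which is the "Exchange" step described above.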
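
For the Application Enrichment pattern, the online application typically reads precomputed results out of a low-latency store on the Hadoop cluster and writes responses back for the next processing cycle. This is a minimal sketch using the classic (pre-1.0) Apache HBase Java client that was current for HDP at the time; the table name, column family, qualifiers and row key are hypothetical and exist only to illustrate the request-data / store-response round trip from the note above.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class ProfileEnricher {
  public static void main(String[] args) throws Exception {
    // Reads hbase-site.xml from the classpath to locate the cluster.
    Configuration conf = HBaseConfiguration.create();

    // "user_profiles" with a "behavior" column family is an assumed schema.
    HTable table = new HTable(conf, "user_profiles");
    try {
      // Request data: look up the interest precomputed for this user in HDP.
      Get get = new Get(Bytes.toBytes("user-12345"));
      Result result = table.get(get);
      byte[] interest = result.getValue(Bytes.toBytes("behavior"), Bytes.toBytes("top_interest"));
      if (interest != null) {
        System.out.println("Surface an offer related to: " + Bytes.toString(interest));
      }

      // Store response: record the user's reaction for the next batch cycle.
      Put put = new Put(Bytes.toBytes("user-12345"));
      put.add(Bytes.toBytes("behavior"), Bytes.toBytes("last_offer_shown"),
          Bytes.toBytes("home_renovation"));
      table.put(put);
    } finally {
      table.close();
    }
  }
}
```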
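
For the Data Exploration pattern, BI tools most commonly reach the full data set in HDP through Hive's SQL interface rather than a warehouse copy. This sketch shows the same idea from plain Java via the HiveServer2 JDBC driver; the host name, credentials and the weblogs table are assumptions for illustration.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveExploration {
  public static void main(String[] args) throws Exception {
    // HiveServer2 JDBC driver shipped with Hive/HDP.
    Class.forName("org.apache.hive.jdbc.HiveDriver");
    Connection conn = DriverManager.getConnection(
        "jdbc:hive2://hdp-master:10000/default", "hive", "");
    try {
      Statement stmt = conn.createStatement();
      // Ad-hoc aggregation over the raw data held in HDP, no warehouse load required.
      ResultSet rs = stmt.executeQuery(
          "SELECT referrer_domain, COUNT(*) AS visits "
        + "FROM weblogs GROUP BY referrer_domain ORDER BY visits DESC LIMIT 20");
      while (rs.next()) {
        System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
      }
    } finally {
      conn.close();
    }
  }
}
```

A BI tool that supports Hive can point at the same jdbc:hive2:// URL, which is what lets the analytics tools in step 3 sit directly on Hadoop.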
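
As one concrete example of the HTTP/REST surface mentioned in the platform note above, JasperReports Server can return a rendered report from its rest_v2 report service with a single authenticated GET. This is a hedged sketch only: the server URL, context path, default jasperadmin credentials and the sample report path assume a stock demo install and will differ in a real deployment.

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import javax.xml.bind.DatatypeConverter;

public class EmbeddedReportFetch {
  public static void main(String[] args) throws Exception {
    // Assumed demo install: host, context path, credentials and report path are placeholders.
    String reportUrl = "http://localhost:8080/jasperserver/rest_v2/reports"
        + "/reports/samples/AllAccounts.pdf";

    HttpURLConnection conn = (HttpURLConnection) new URL(reportUrl).openConnection();
    String auth = DatatypeConverter.printBase64Binary("jasperadmin:jasperadmin".getBytes("UTF-8"));
    conn.setRequestProperty("Authorization", "Basic " + auth);

    // Stream the rendered PDF to disk; an embedding application would return it to the user instead.
    try (InputStream in = conn.getInputStream()) {
      Files.copy(in, Paths.get("AllAccounts.pdf"), StandardCopyOption.REPLACE_EXISTING);
    }
    System.out.println("HTTP " + conn.getResponseCode() + ", report saved to AllAccounts.pdf");
  }
}
```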

Transcript

  • 1. Hadoop Reporting & Analysis What Architecture is Best for Me?
  • 2. ©2013 Jaspersoft Corporation. 2 Presenters. Jim Walker, Director Product Marketing, Hortonworks: twenty years of experience building products and bringing them to market; his expertise includes data loss prevention, master data management and now big data. Ben Connors, Worldwide Head of Alliances, Jaspersoft: prior to Jaspersoft, Ben was at HP, Oracle, Viador, and other BI companies; he has over 20 years of experience in databases and business intelligence. Matt Dahlman, Technical Director of Alliances, Jaspersoft: prior to Jaspersoft, Matt was with Oracle, Netonomy, and Sybase; he brings over 15 years of database and business intelligence experience to his role.
  • 3. Agenda: Hadoop in the Modern Data Architecture; Hadoop Usage Patterns; Jaspersoft (Company, BI Suite); Jaspersoft/Hortonworks Integration; Demo; The Future of Interactive Hadoop; Q&A. ©2013 Jaspersoft Corporation. Proprietary and Confidential 3
  • 4. © Hortonworks Inc. 2013 A Brief History of Apache Hadoop Page 4 (timeline: 2004 to 2013). Focus on INNOVATION, 2005: Yahoo! creates team under E14 to work on Hadoop. Focus on OPERATIONS, 2008: Yahoo team extends focus to operations to support multiple projects & growing clusters; Yahoo! begins to operate at scale. Focus on STABILITY, 2011: Hortonworks created to focus on “Enterprise Hadoop“; starts with 24 key Hadoop engineers from Yahoo. Milestones shown: Apache Project Established, Enterprise Hadoop, Hortonworks Data Platform.
  • 5. © Hortonworks Inc. 2013 Existing Data Architecture Page 5. APPLICATIONS: Business Analytics, Custom Applications, Enterprise Applications. DATA SYSTEMS: traditional repos (RDBMS, EDW, MPP). DATA SOURCES: traditional sources (RDBMS, OLTP, OLAP, POS systems). Plus OPERATIONAL TOOLS (manage & monitor) and DEV & DATA TOOLS (build & test).
  • 6. © Hortonworks Inc. 2013 An Emerging Data Architecture Page 6. APPLICATIONS: Business Analytics, Custom Applications, Enterprise Applications. DATA SYSTEMS: traditional repos (RDBMS, EDW, MPP) plus the HORTONWORKS DATA PLATFORM. DATA SOURCES: traditional sources (RDBMS, OLTP, OLAP, POS systems) and new sources (web logs, email, sensor data, social media, mobile data). Plus OPERATIONAL TOOLS (manage & monitor) and DEV & DATA TOOLS (build & test).
  • 7. © Hortonworks Inc. 2013 Interoperating With Your Tools Page 7. APPLICATIONS (apps). DATA SYSTEMS: traditional repos and the HORTONWORKS DATA PLATFORM. DATA SOURCES: traditional sources (RDBMS, OLTP, OLAP, POS systems) and new sources (web logs, email, sensor data, social media, mobile data). OPERATIONAL TOOLS (manage & monitor). DEV & DATA TOOLS (build & test).
  • 8. © Hortonworks Inc. 2013 HDP: Enterprise Hadoop Distribution Page 8. The HORTONWORKS DATA PLATFORM (HDP) runs on OS, Cloud, VM or Appliance. Layers: OPERATIONAL SERVICES (manage & operate at scale), DATA SERVICES (store, process and access data), HADOOP CORE (distributed storage & processing), PLATFORM SERVICES (enterprise readiness: HA, DR, snapshots, security, ...). Components shown: HDFS, YARN (in 2.0), WebHDFS, MapReduce, HCatalog, Hive, Pig, HBase, Sqoop, Flume, Oozie, Ambari. Hortonworks Data Platform (HDP) Enterprise Hadoop: the ONLY 100% open source and complete distribution; enterprise grade, proven and tested at scale; ecosystem endorsed to ensure interoperability.
  • 9. © Hortonworks Inc. 2013 Operational Data Refinery Page 9 (Refine / Explore / Enrich: Refine). Collect data and apply a known algorithm to it in a trusted operational process. Step 1, Capture: capture all data from traditional sources (RDBMS, OLTP, OLAP) and new sources (web logs, email, sensor data, social media). Step 2, Process: parse, cleanse, apply structure & transform in the HORTONWORKS DATA PLATFORM. Step 3, Exchange: push to the existing data warehouse (traditional repos: RDBMS, EDW, MPP) for use with existing analytic tools (Business Analytics, Custom Applications, Enterprise Applications).
  • 10. © Hortonworks Inc. 2013 Application Enrichment Page 10 (Refine / Explore / Enrich: Enrich). Collect data, analyze and present salient results for online apps. Step 1, Capture: capture all data from traditional sources (RDBMS, OLTP, OLAP) and new sources (web logs, email, sensor data, social media). Step 2, Process: parse, cleanse, apply structure & transform in the HORTONWORKS DATA PLATFORM. Step 3, Exchange: incorporate data directly into applications (Custom Applications, Enterprise Applications; NOSQL shown alongside traditional repos: RDBMS, EDW, MPP).
  • 11. © Hortonworks Inc. 2013 Big Data Exploration & Visualization Page 11 (Refine / Explore / Enrich: Explore). Collect data and perform iterative investigation for value. Step 1, Capture: capture all data from traditional sources (RDBMS, OLTP, OLAP) and new sources (web logs, email, sensor data, social media). Step 2, Process: parse, cleanse, apply structure & transform in the HORTONWORKS DATA PLATFORM. Step 3, Exchange: explore and visualize with analytics tools supporting Hadoop (Business Analytics), alongside traditional repos (RDBMS, EDW, MPP).
  • 12. The Intelligence Inside
  • 13. Competing on Time and Information ©2013 Jaspersoft Corporation. Proprietary and Confidential 13. “The New Factors of Production: Time and Information,” Brian Gentile, Jaspersoft. But business users don’t have access to timely, actionable data. Why? Most don’t spend their day inside a BI tool... nor do they want to!
  • 14. We Need “Intelligence Inside” ©2013 Jaspersoft Corporation. Proprietary and Confidential 14. We want information to FIND US, not the other way round. “We need Intelligence Inside the applications and business processes we use every day.” Examples: pipeline dashboard inside SaaS CRM app; performance report inside partner portal; salary data visualizations inside HR intranet; portfolio analytics inside client website; tickets crosstab inside custom helpdesk app; interactive charts inside native mobile app.
  • 15. Jaspersoft: The Intelligence Inside ©2013 Jaspersoft Corporation. Proprietary and Confidential 15 Self-Service BI + Embeddable + Affordable “We empower millions of people every day to make decisions faster by delivering timely, actionable data to them inside their apps and business process through an embeddable, cost-effective reporting and analytics platform.”
  • 16. Intelligence Inside Example Customers: Commercial Apps, Customer Portals, Cloud Apps, Internal Apps, Big Data Analytics. The Intelligence Inside Business. ©2013 Jaspersoft Corporation. Proprietary and Confidential 16
  • 17. The Intelligence Inside the New IT Stack. Inaugural BI service: on VMware Cloud Foundry; on Red Hat OpenShift. Jaspersoft Certified: Amazon Redshift and RDS; connects directly (no ETL) to non-SQL like MongoDB and HBase. ©2013 Jaspersoft Corporation. Proprietary and Confidential 17. “Our mission is to become the de facto reporting and analytic service in the New IT Stack, enabling BI Builders to build the Intelligence Inside internal and commercial apps on the leading Cloud platforms, powered by the new Big Data stores.”
  • 18. Broad Recognition, Strong Partnerships. 50%+ ACV Growth Every Year. Magic Quadrants. 18 ©2013 Jaspersoft Corporation. Proprietary and Confidential. World’s Most Widely Deployed BI • Commercial Open Source BI Suite • Nearly 200 people in US, EMEA, APAC • 16,000,000 downloads • 325,000 community members • 130,000 embedded applications • 15,000 paying customers • 1,800 subscription customers. Jaspersoft: High Growth and Momentum
  • 19. Product Overview
  • 20. Design Any Report . . . ©2013 Jaspersoft Corporation. Proprietary and Confidential 20
  • 21. … Dashboard 21 ©2013 Jaspersoft Corporation. Proprietary and Confidential
  • 22. … or Analytic View 22 ©2013 Jaspersoft Corporation. Proprietary and Confidential
  • 23. … using Any Data Type: Relational, Big Data, Files (including POJO files, Redshift, BigQuery). ©2013 Jaspersoft Corporation. Proprietary and Confidential 23
  • 24. ©2013 Jaspersoft Corporation. Proprietary and Confidential 24 … bringing Intelligence to Any App
  • 25. … with a World-Class BI Platform ©2013 Jaspersoft Corporation. Proprietary and Confidential 25. Platform stack: HTML5 Browser, Native Mobile Apps; Reporting, Dashboards, Visualization, OLAP Analysis; Columnar-Based In-Memory Engine; Business Metadata Layer; Data Integration, Data Virtualization, Direct Hadoop; Data Connectivity to Any Data (RDBMS, Other Data); 100% Web Standards: CSS, JS, JSP, Java; Extensive APIs: HTTP, SOAP, REST.
  • 26. Three Approaches to Big Data Analysis ©2013 Jaspersoft Corporation. Proprietary and Confidential. (1) Data Exploration. Use case: for data analysts and data scientists who want to discover real-time patterns as they emerge from their Big Data content; Latency: low; Big Data: HBase, NoSQL, Analytic DBMS; Connectivity: native; Architecture: BI Platform with In-Memory Engine, native connection, Multi-Dimensional Analysis. (2) Operational Reporting. Use case: for executives and operational managers who want summarized, pre-built daily reports on Big Data content; Latency: medium; Big Data: Hive, NoSQL, Analytic DBMS; Connectivity: native, SQL; Architecture: BI Platform with native SQL connection, Reports & Dashboards. (3) Analytics. Use case: for data analysts and operational managers who want to analyze historical trends based upon pre-defined questions in their Big Data content; Latency: high; Big Data: Hadoop, NoSQL, Analytic DBMS; Connectivity: ETL; Architecture: BI Platform with OLAP Engine, Data Mart, ETL, Multi-Dimensional Analysis.
  • 27. Jaspersoft’s Hadoop Difference. Advanced Hadoop integration: the only BI provider that can support three approaches to Hadoop analytics (Live Exploration, Batch Analysis, Batch Reporting); direct, native connectors to Hive and HBase. Broad partnerships. Deep knowledge and ecosystem. 27 ©2013 Jaspersoft Corporation. Proprietary and Confidential
  • 28. Jaspersoft 5 Demo 28 “We've taken the desktop power of data visualization tools, built it to scale on the HTML5 web, and made it embeddable within any app, device or portal.” ©2013 Jaspersoft Corporation. Proprietary and Confidential
  • 29. © Hortonworks Inc. 2013 Hortonworks Snapshot Page 29 • We distribute the only 100% Open Source Enterprise Hadoop Distribution: Hortonworks Data Platform • We engineer, test & certify HDP for enterprise usage • We employ the core architects, builders and operators of Apache Hadoop • We drive innovation within Apache Software Foundation projects • We are uniquely positioned to deliver the highest quality of Hadoop support • We enable the ecosystem to work better with Hadoop Develop Distribute Support We develop, distribute and support the ONLY 100% open source Enterprise Hadoop distribution Endorsed by Strategic Partners Headquarters: Palo Alto, CA Employees: 180+ and growing Investors: Benchmark, Index, Yahoo
  • 30. © Hortonworks Inc. 2013 Hortonworks Approach: identify and introduce enterprise requirements into the public domain; work with the community to advance and incubate open source projects; apply Enterprise Rigor to provide the most stable and reliable distribution. Community Driven Enterprise Apache Hadoop
  • 31. The Intelligence Inside Thank You www.jaspersoft.com BigData@jaspersoft.com