Consolidate and Integrate Oracle Business Processes with Enterprise and Big Data Workloads


Usage Rights

© All Rights Reserved

  • How does infrastructure affect the value of data? Having the right infrastructure is essential to ensure that the right application has the right data at the right time. When we talk about infrastructure, we mean a proven hardware architecture, like Cisco’s Common Platform Architecture, combined with an automation solution, like Tidal Enterprise Scheduler, that enables seamless data operations. Automation solutions help IT deal with the complexity of managing the data lifecycle, from gathering and ingestion through processing to data warehouse and BI integration. A wide variety of application adapters for all common structured and unstructured data sources makes data integration an easy, three-step process of where, what, and when.

    First of all, I’d like to thank David and John for clearly articulating the necessary steps to gain the most value from the data you have. As David pointed out, there are really two parts to the Big Data challenge: the management of the data, which John walked us through, and the infrastructure where the data is processed. If either one is not aligned to data center operations, Big Data projects will have a difficult time delivering value.

    To be competitive in the market today, IT needs to be ready for the deluge of data that is coming. As John pointed out, more data has been created in the last two years than in all the history of record keeping. From the infrastructure side, the critical elements for delivering the greatest value from the data are:
    - Speed and flexibility built into the data center infrastructure, so that IT can quickly adapt to changing business needs, which now include more and more Big Data analytics requests from the business.
    - Coordination and orchestration across all of the infrastructure's moving parts. The more integrated the components and the operations software are, the more efficient the throughput.
    - Simplified, streamlined Big Data service delivery. As IT moves from a static delivery structure to a cloud-based service delivery model, the complexity of all of the moving parts becomes difficult to manage; streamlining lowers IT OpEx costs and increases productivity, allowing better alignment with business needs.
    - Automation, which plays a key role in delivering an infrastructure that increases the value of data. Automation needs to be integrated into each of the different data center operations steps, and those steps need to be integrated together.
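The "where, what, when" framing above can be sketched as a minimal job definition. This is purely illustrative: TES is configured through its own UI and APIs, and every field name below is invented for the example.

```python
from dataclasses import dataclass

@dataclass
class DataIntegrationJob:
    """Illustrative 'where, what, when' job definition (not the TES schema)."""
    source: str    # where: the data source an adapter connects to
    action: str    # what: the operation to run against that source
    schedule: str  # when: a time trigger or a dependency on another job

# Hypothetical jobs: a nightly extract, then a job that runs only after it.
jobs = [
    DataIntegrationJob("oracle_ebs", "extract_open_orders", "daily@02:00"),
    DataIntegrationJob("hdfs://logs", "run_sentiment_analysis",
                       "after:extract_open_orders"),
]

for job in jobs:
    print(f"{job.schedule}: {job.action} on {job.source}")
```

The point of the sketch is that once each job declares its where, what, and when, a scheduler can order and run the whole set without hand-written glue between steps.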
  • Not only does Big Data infrastructure management need to be automated, it also needs to work across the rest of the data center landscape. Cisco has developed reference architectures both for traditional data center applications, such as ERP and enterprise data warehouse analytics processing, and for Big Data use cases that combine direct-attached storage with an integrated network fabric. Cisco component management applications are built to be seamless across both sides of the data center. As discussed previously, Cisco has:
    - Integrated data center network management
    - Both server and fabric management built into a single solution
    - Integrated infrastructure provisioning, process automation, and service catalog capabilities to enable rapid uptime, scalability, and dynamic elasticity
    - An end-to-end workload automation solution that connects and runs processes for ERP and Big Data applications
    Here are some of the benefits the Cisco solution delivers for Big Data:
    - All configuration and identity information is part of the Service Profile
    - The fabric infrastructure offers programmable, policy-based management
    - Server provisioning time is reduced from weeks to hours
    - Fully redundant active-active fabric cluster interconnect
    - 66% fewer switch ports and cables
  • We agree with David's idea of a data "supply chain." The importance of this is magnified as larger and more varied data sets are introduced into data processing environments, and as David pointed out, data variety accounts for the greatest challenge in these large data sets. As an example of the complexity of processing these supply chains, let's look at a customer sentiment use case workload.

    As we step through the different parts of the workload, you see the need for a 360-degree view of all your data sources. Your ability to see and direct data sources both inside and outside your firewall is crucial to having the best opportunity to make the right decisions for your business. It is also necessary to have a seamless operational view of each application along the chain. By integrating each step through the API layer, efficiency of programmability increases. This integration allows you to deliver services faster to the business, increasing the value of IT. Automating your workloads with a solution that has end-to-end connectivity and deep API integrations into the data sources and processing applications delivers:
    - Faster time to market
    - Efficient OpEx management
    - Reduction of errors and throughput risk
    - Focus on business user requirements
    - Lower TCO of the entire Big Data delivery infrastructure
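The supply-chain workload can be sketched as a small dependency-ordered runner: each step declares which steps must finish before it may run, which is how a scheduler like TES sequences a chain such as gather, load, process, distribute. All step names and their bodies are hypothetical stand-ins for the real adapter-driven jobs.

```python
# Toy step implementations; each writes its result into a shared context.
def gather(ctx):     ctx["raw"] = ["call_log", "web_click", "data_feed"]
def load(ctx):       ctx["loaded"] = [r.upper() for r in ctx["raw"]]
def process(ctx):    ctx["results"] = len(ctx["loaded"])
def distribute(ctx): ctx["published"] = True

# Each step lists the steps that must complete before it may run.
steps = {
    "gather":     (gather, []),
    "load":       (load, ["gather"]),
    "process":    (process, ["load"]),
    "distribute": (distribute, ["process"]),
}

def run(steps):
    """Run every step whose dependencies are already satisfied."""
    done, ctx = set(), {}
    while len(done) < len(steps):
        for name, (fn, deps) in steps.items():
            if name not in done and all(d in done for d in deps):
                fn(ctx)
                done.add(name)
    return ctx

ctx = run(steps)
print(ctx["published"])  # True once the whole chain has run
```

In a real workload automation tool the dependency graph is declared in job definitions rather than code, but the ordering guarantee is the same: a downstream step never fires before every upstream step has completed.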
  • Can you discuss a specific use case that will help our viewers? Sure. Like many organizations, Cisco is working on translating data into business value. Cisco Value Chain IT came up with a 360 Data Foundation framework for analyzing a variety of data sources, such as sales information, partner information, and service requests opened by customers, to identify new sales opportunities. Given the variety and volume, a Big Data platform was a natural fit for this project. They stood up a Hadoop-based platform on our UCS servers and used Tidal Enterprise Scheduler (TES) as the workload automation solution. TES triggers the Informatica ETL processing to periodically load the data into the Big Data platform. Once the data is loaded, TES triggers the MapReduce jobs that analyze the data against business rules. The output is then handed back to Informatica for further normalization and loading into our enterprise data warehouse (EDW). TES also triggers the workflow that automatically refreshes the renewal opportunity details in the partner dashboards, making the whole integration look very simple.

    To unlock the value in our customer data, Cisco IT turned to Hadoop, an open-source software framework that supports data-intensive, distributed applications. Hadoop behaves like an affordable supercomputing platform. It moves compute resources to where the data is stored, which mitigates the disk I/O bottleneck and provides almost linear scalability. Hadoop enabled us to consolidate the islands of data scattered throughout the enterprise. Cisco IT needed to design and implement an enterprise platform that could support appropriate service-level agreements (SLAs) for availability and performance.
    Cisco IT built a Hadoop big data analytics platform using the Cisco® Common Platform Architecture (CPA) for Big Data, which is based on the Cisco Unified Computing System™ and Intel® Xeon® processors, and uses solutions from ISV partners like Informatica for data consolidation and cleansing. The platform provides high performance in a multitenant environment, anticipating that internal users will continually find more use cases for big data analytics.

    Data is normalized and loaded into the Hadoop data processing platform. Next, the data is processed against a set of behavioral rules defined in collaboration with the business groups. The processed data sets are then delivered to our presentation layer in the form of a web-based dashboard that is easily accessed and customizable for our internal users as well as our partners.

    Automation of the workloads is critical both in lowering operational costs and in removing the human error and risk associated with manual administration. This platform takes advantage of Cisco Tidal Enterprise Scheduler (TES) to facilitate job scheduling and workload automation. With API connectors to Hadoop, TES minimizes programming and debugging tasks and saves hours of development on each job.

    This solution processes 1.5 billion records daily, and in the first month of operations we identified $40 million in incremental revenue from partners and new service opportunities. The Cisco hardware and software infrastructure played a critical part in quickly developing this Big Data solution, and operational automation continues to deliver value by lowering the ongoing TCO and adding scalability and flexibility to the service. Cisco IT plans to offer this Big Data platform to other business units and has over 30 proofs of concept in the pipeline. Without an integrated, automated infrastructure platform this would not be possible.
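The MapReduce stage in such a pipeline can be illustrated with a pure-Python map, shuffle, and reduce over a few sample records. The record format and the "business rule" here are invented for the example; the production jobs run as Hadoop MapReduce over data in HDFS.

```python
from collections import defaultdict

# Toy records standing in for partner/service-request data (format invented).
records = [
    {"partner": "A", "event": "renewal_due"},
    {"partner": "B", "event": "support_case"},
    {"partner": "A", "event": "renewal_due"},
]

def map_phase(record):
    # Emit (key, 1) for events a hypothetical business rule flags as opportunities.
    if record["event"] == "renewal_due":
        yield (record["partner"], 1)

def reduce_phase(key, values):
    # Sum the counts emitted for one key.
    return key, sum(values)

# Shuffle: group mapped pairs by key, as the framework does between phases.
grouped = defaultdict(list)
for record in records:
    for key, value in map_phase(record):
        grouped[key].append(value)

opportunities = dict(reduce_phase(k, v) for k, v in grouped.items())
print(opportunities)  # {'A': 2}
```

The same three phases scale out on Hadoop because map and reduce are independent per record and per key, which is what gives the near-linear scalability mentioned above.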
    Manually managing data sets or your infrastructure limits your ability to scale. Cisco's big data platform has removed such limitations. Not only does it bring disparate data sets together for analytical purposes, it processes 25 percent more data in 10 percent of the time, with a lower total cost of ownership than the legacy system.

Presentation Transcript

  • Consolidate and integrate Oracle business processes with enterprise and Big Data workloads Andrew Blaisdell Product Marketing Manager Product, Industry and Solutions Marketing
  • Oracle Solutions Run On Cisco UCS (Uniquely Cisco)
    - Enterprise applications and middleware: application choice with Cisco Validated Designs, benchmarks, and sizing guides for major applications (Oracle E-Business Suite, PeopleSoft, Siebel, JD Edwards, Oracle Fusion; Oracle WebLogic, Oracle SOA Suite, OracleAS)
    - Database: scale-up/scale-out for Oracle Database 12c, 11g, or 10g, or Oracle NoSQL for Big Data
    - Operating system: Oracle Linux, Solaris, Red Hat, Novell, Windows
    - Virtualization: hypervisor choice of Oracle VM, VMware, or Hyper-V
    - Storage: EMC (Vblock), NetApp (FlexPod), HDS, Oracle, IBM
    - Consumption options: sizing guides, sample configurations, and tools for Oracle on UCS; 27 Oracle record performance benchmarks; FlexPod and Vblock for Oracle DB/RAC; exclusive NoSQL Big Data certification
    © 2013 Cisco and/or its affiliates. All rights reserved. Cisco Public
  • Workload Automation: delivering the right data to the right systems and people at the right time. Elastic cloud, events, security, business modeling, dependencies, API integrations, predictive analytics.
  • Cisco Tidal Enterprise Scheduler Adapters (architecture slide):
    - Clients: mobile app, self-service portal, browser-based UI, CLI, exposed TES API, Client Manager with its database (Oracle/MS)
    - Core business logic and high availability: primary master, backup master, fault monitor, DataMover, workload analytics
    - OS agents: Windows, UNIX/Linux, Solaris, HPUX, AIX, Tru64, mainframe, VMware
    - Data center application adapters (Adapter Host Framework): SAP ERP, PeopleSoft, Oracle EBS, JD Edwards, Informatica
    - BI and database adapters: Cognos, SAP BOBJ, Oracle 10/11, MS-SQL, LDAP/AD
    - Big Data adapters for Cloudera, Pivotal, and MapR: MapReduce, Sqoop, Hive
    - Private and public cloud adapters: Amazon EC2, Amazon S3
    - Standards-based adapters: SSH, REST/SOAP, SMTP, JMS, RJA, JDBC
  • Cisco Tidal Enterprise Scheduler for Oracle Database  Runs PL/SQL statements or Stored Procedures  Oracle DB, Oracle Enterprise Manager, or Cisco TES  Provides access to PL/SQL jobs output without the need to create an intermediary file  Displays familiar Oracle job names, valid commands, and steps in the Cisco TES interface
  • Cisco Tidal Enterprise Scheduler for Oracle E-Business Suite  Seamless API integration with the Oracle Concurrent Manager  Support for multiple Oracle E-Business Suite instances  Unlimited dependencies for Oracle requests and report sets
  • Cisco Tidal Enterprise Scheduler for PeopleSoft Enterprise  Consolidates PeopleSoft jobs under one solution  Enables user to monitor files that are used as event triggers for the initiation of run requests  Supports all PeopleSoft process types, including SQR Reports, Cobol SQL, Crystal Reports, PS/nVision, and the Message Agent API PS Job Search
  • Cisco Tidal Enterprise Scheduler for JD Edwards  Controls JD Edwards processes using Universal Batch Engine (UBE)  Developed using JD Edwards APIs  Easy distribution of JDE reports UBE Search
  • Adding Big Data to end-to-end Workloads
  • Infrastructure Matters More than Ever IT as a Service: Speed and Flexibility Efficient Throughput: Compute, Network, Storage Reduced complexity Integrated Automation
  • Infrastructure Management & Automation Workload Automation FTP/WS/Cloud ERP/CRM/SCM Big Data BI/ETL/DW Data Center Assurance Orchestration and Process Automation Infrastructure Provisioning Policy-based component management Compute Storage Network
  • Single Management Stack Data Center Applications Big Data Applications Integrated Network Management ERP CRM HR SCM EDW Cisco UCS B-series Integrated Cisco UCS Server/Fabric Management Integrated Process Automation End to End Workload Automation Cisco UCS C-series w/ Direct Attach Storage
  • End to End Workload Automation – Holistic Big Data Example: Product Sentiment Analysis Load Data Gather Data Data Feeds Call logs Web Clicks Oracle NoSQL Data Processing Data Processing Map Reduce Map Reduce Map Reduce Data Integration Analytics & Distribution Hive BI Analytics
  • Cisco CVC-IT: 360 Data Foundation Cisco Tidal Enterprise Scheduler Source Data Target Systems ETL Technical Support Big Data Storage and Processing New Opportunity 360 Data Foundation Tables Sales Partners Validation Result Set Daily Input Data Daily Output Data EDW Partner Dashboard Data Behavioral Rules/Auditing
  • Thank you!