The challenge this session’s speaker and his colleagues faced in learning more about customer experiences was that insights were fragmented across systems such as Oracle Eloqua, CRM, and web analytics. To better understand their contacts, they started with the corporate data warehouse, which was missing much of this lower-value, detailed data. When they considered expanding the data warehouse, it was difficult to define in advance what questions they wanted to answer, because those questions vary with each campaign they run. They therefore embarked on building a Hadoop-based data lake, which gives them the flexibility of an ad hoc, schema-on-read approach to ask any question against any customer data set at varying levels of detail, and so better understand what their visitors want to consume.
Breakout Session
Wednesday, Apr 26, 5:45 p.m. | Mandalay Bay D
https://oracle.rainfocus.com/scripts/catalog/oracleCx17.jsp?search=BRK1098
Building a Hybrid Data Pipeline for Salesforce and Hadoop
Sumit Sarkar
My team embarked on building a data lake for our sales and marketing data to better understand customer journeys. This required building a hybrid data pipeline to connect our cloud CRM with the new Hadoop data lake. One challenge was that IT was not in a position to provide support until we proved value, and marketing did not have the experience, so we took on the project ourselves within the product marketing team for our line of business at Progress. The key to delivering on this was connecting the systems through a bi-directional data pipeline built on standard interfaces. On the Salesforce side, we got frictionless, clicks-not-code access to the data lake via OData. On the Hadoop side, we ingested data from Salesforce using JDBC with Apache Sqoop. Join us to hear best practices and lessons learned.
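The OData side of a pipeline like this boils down to issuing parameterized query URLs against a service endpoint. As a minimal sketch (the host, service path, entity name, and fields below are invented for illustration, not the actual Progress endpoint), building an OData v4 query with `$select`, `$filter`, and `$top` options looks like:

```python
from urllib.parse import urlencode

# Hypothetical OData service root exposing a data lake; illustrative only.
BASE_URL = "https://hdp.example.com/api/odata4/hadooplake"

def build_odata_query(entity, select=None, filter_expr=None, top=None):
    """Build an OData v4 query URL with optional $select/$filter/$top."""
    params = {}
    if select:
        params["$select"] = ",".join(select)
    if filter_expr:
        params["$filter"] = filter_expr
    if top is not None:
        params["$top"] = str(top)
    query = urlencode(params)
    return f"{BASE_URL}/{entity}" + (f"?{query}" if query else "")

url = build_odata_query(
    "web_visits",
    select=["contact_id", "page", "visit_date"],
    filter_expr="visit_date ge 2017-01-01",
    top=100,
)
print(url)
```

A client tool that speaks OData (Salesforce Connect, for instance) generates URLs of this shape for you; that is what makes the access clicks-not-code.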
Data APIs Don't Discriminate [API World Stage Talk]
Sumit Sarkar
The exploding API economy, combined with an advanced analytics market projected to reach $30 billion by 2019, is driving market demand to expose more data through APIs. Business analysts, data engineers, and data scientists have been left behind by existing API strategies. This is because many APIs are designed to integrate with applications and extend functionality; these data workers, however, are looking for APIs that facilitate direct data access to support analytics. Data APIs are designed specifically to provide that frictionless data access experience across standard interoperable interfaces such as OData (REST) or ODBC/JDBC (SQL). Consider expanding your API strategy to serve the developers in this $30 billion market.
REST API debate: OData vs GraphQL vs ORDS
Sumit Sarkar
Learn the latest industry trends surrounding REST API standardization and what they mean for your roadmap. OData is an OASIS-standard REST API that has been established among tech companies such as Microsoft, SAP, CA, IBM, and Salesforce. GraphQL was created at Facebook and open sourced in 2015, and has already been deployed at tech companies such as Facebook, Shopify, and Intuit. ORDS (Oracle REST Data Services) is Oracle's REST API and delivers similar standardization for Oracle-centric applications.
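To make the contrast concrete, here is the same request ("the five most recent opportunities, with name and amount") sketched in each of the three styles. The endpoint paths, field names, and schema are invented for illustration; each API's real shape depends entirely on the service that exposes it.

```python
# OData: query options are URL parameters on an entity set.
odata_query = (
    "/odata/Opportunities"
    "?$select=Name,Amount&$orderby=CloseDate desc&$top=5"
)

# GraphQL: the client posts a query document naming exactly the
# fields it wants; argument names here follow a common Relay-style
# convention and are illustrative.
graphql_query = """
{
  opportunities(orderBy: CLOSE_DATE_DESC, first: 5) {
    name
    amount
  }
}
"""

# ORDS: auto-enabled REST endpoints over Oracle tables accept a JSON
# filter object via ?q= plus a limit parameter (sketched, not verbatim).
ords_query = (
    '/ords/hr/opportunities/'
    '?q={"$orderby":{"close_date":"DESC"}}&limit=5'
)
```

The trade-off in one line: OData and ORDS standardize the URL surface so generic tools can query without code, while GraphQL standardizes the query document so clients control the response shape.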
Journey to SAS Analytics Grid with SAS, R, Python
Sumit Sarkar
Big data, compliance, and a highly skilled workforce are driving organizations to transform their current analytical infrastructure into enterprise computing environments that can support the latest data science and analytics practices. SAS remains a popular statistical programming language, but there is growing demand for R and Python. Data engineers are now being tasked to deliver scalable and highly available computing resources that support analytics for a growing number of users and increasing data volumes while maintaining security for their customers.
Cloud applications are seeing a deluge of requests to support the exploding advanced analytics market. “Open analytics” is the emerging strategy to deliver that data through an open data access layer, in the cloud, to be directly consumed by external analytics tools and popular programming languages. An increasing number of data engineers and data scientists use a variety of platforms and advanced analytics languages such as SAS, R, Python, and Java, as well as frameworks such as Hadoop and Spark. Cloud APIs are commonly designed to support application integration, representing a disconnect with the analytics ecosystem. These combined trends create significant demand for a “bring-your-own-analytics” (BYOA) capability for cloud applications. Your cloud may already be smart, but giving users frictionless access to your data will make everyone smarter.
Salesforce analytics and BI continues to be a trending, hot topic as organizations implement new platforms to improve their customer intelligence. But what’s the best way to access the data? SOQL is the popular query language for Salesforce. However, SQL may be better suited for accessing data for analytics. Join us in the great SOQL vs. SQL query debate to see which one is best for your analytics project.
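The core of the debate is how each language expresses relationships. A sketch of the same question ("open opportunities for large accounts") in both languages, with illustrative object and field names: SOQL traverses the parent-to-child relationship directly, while SQL expresses it as a join.

```python
# SOQL: the child relationship name ("Opportunities") is navigated
# inside a subquery; no explicit join condition is written.
soql = """
SELECT Name,
       (SELECT Name, Amount FROM Opportunities WHERE IsClosed = false)
FROM Account
WHERE AnnualRevenue > 1000000
"""

# SQL: the same question as an explicit join on the foreign key,
# the way a BI tool querying relational-style access would write it.
sql = """
SELECT a.Name, o.Name, o.Amount
FROM Account a
JOIN Opportunity o ON o.AccountId = a.Id
WHERE a.AnnualRevenue > 1000000
  AND o.IsClosed = 'false'
"""
```

For analytics the join-based form matters because most BI tools generate SQL, not SOQL, which is why SQL-over-Salesforce connectivity keeps coming up.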
The document discusses a new data pipeline called Progress DataDirect Hybrid Data Pipeline. It transforms how clouds access data by providing firewall-friendly and secure connectivity to on-premises and other cloud data sources. It acts as a single interface to various cloud APIs and exposes data sources as standard SQL and REST. This allows for expanded connectivity options and helps solve challenges around hybrid cloud integration and accessing data located in different environments or clouds.
OData External Data Integration Strategies for SaaS
Sumit Sarkar
This document discusses OData integration strategies for SaaS applications. It provides an overview of the OData standard and why SaaS vendors are adopting it. It then describes how Oracle Service Cloud uses OData accelerators to integrate with external data sources like Salesforce and Siebel. These accelerators allow agents to access and edit external data without leaving the Service Cloud interface.
Firewall friendly pipeline for secure data access
Sumit Sarkar
This webinar discusses how to establish secure connections between cloud applications and on-premises data behind firewalls. It presents common connection options like VPNs, SSH tunneling, and reverse proxies, and recommends a vendor-agnostic service that provides a managed open data connection. The webinar covers best practices for scalability, availability, and end-to-end monitoring. It provides examples of how BOARD and Intuit leverage such a connection service to access on-premises data from their cloud applications.
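Of the connection options mentioned, SSH tunneling is the easiest to sketch concretely. The snippet below builds the standard `ssh -L` local port-forward that exposes an on-premises database on a local port via a bastion host; all host names, ports, and the user name are assumptions for illustration.

```python
import shlex

def ssh_tunnel_command(bastion, local_port, db_host, db_port, user="tunnel"):
    """Return an ssh command forwarding local_port to db_host:db_port
    through the given bastion host."""
    return [
        "ssh", "-N",                              # no remote shell, forward only
        "-L", f"{local_port}:{db_host}:{db_port}",  # local:remote-host:remote-port
        f"{user}@{bastion}",
    ]

cmd = ssh_tunnel_command("bastion.example.com", 15432, "pg.internal", 5432)
print(shlex.join(cmd))
```

A cloud application would then connect to `localhost:15432` as if the database were local. The webinar's point is that hand-managing tunnels like this (keys, restarts, monitoring) is exactly the operational burden a managed connection service removes.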
Salesforce External Objects for Big Data
Sumit Sarkar
Transform Salesforce into the system of engagement for your big data. Discuss best practices and lessons learned in accessing external data sets in Hadoop or Spark using Salesforce Connect. Leave the big data sets behind the firewall, and get on demand access for your users to big data insights using external objects with Salesforce Connect.
In this session we will cover:
Intro to Salesforce Connect
Intro to Big Data Landscape
How to connect Salesforce to Big Data using External Data Sources
Lessons Learned accessing Big Data using External Objects for native reporting, writes, lookups, search and more
Resources (How to learn more)
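Once an external data source is configured, Salesforce Connect surfaces each external table as an external object, whose API name carries an `__x` suffix (versus `__c` for ordinary custom objects), so it can be queried with ordinary SOQL. The object and field names below are invented for illustration:

```python
# Hypothetical external object backed by a Hadoop table via Salesforce
# Connect; external objects get the "__x" API-name suffix, and their
# custom fields keep the usual "__c" suffix.
soql = """
SELECT contact_id__c, page__c, visit_date__c
FROM Web_Activity__x
WHERE contact_id__c = '0030000000AbcDe'
"""
```

The data behind `Web_Activity__x` stays in the external system; each query is fetched on demand rather than copied into Salesforce, which is the "leave the big data sets behind the firewall" point above.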
Salesforce shops, including ourselves, have been eagerly anticipating external object support with reports. Starting in Winter ’17, you can build native reports with on-demand access to external data sources such as Oracle, SQL Server, Greenplum, Amazon Redshift, IBM DB2 or Hadoop Big Data Platforms. External objects are powered by Salesforce Connect and provide clicks-not-code data access for admins, devs and general users. But is all of this too good to be true?
During this webinar, you’ll learn:
- External Objects and their new capabilities for Reporting and Wave trending in Winter ‘17
- How to set up Salesforce reports with external data sources
- How to produce OData from warehouses, marts, lakes, or other reporting systems
- Report considerations and limitations with Salesforce Connect
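On the "produce OData" point: the wire format a consumer like Salesforce Connect expects is a JSON payload with an `@odata.context` annotation and a `value` array of entities. The sketch below shows only that response shape (service root and entity set name are invented); a real producer must also serve `$metadata` and honor query options and paging.

```python
def to_odata_response(service_root, entity_set, rows):
    """Wrap relational rows in the minimal OData v4 JSON response shape."""
    return {
        "@odata.context": f"{service_root}/$metadata#{entity_set}",
        "value": rows,
    }

payload = to_odata_response(
    "https://warehouse.example.com/odata",   # hypothetical service root
    "Orders",
    [{"Id": 1, "Total": 99.5}, {"Id": 2, "Total": 12.0}],
)
```

In practice you would put an off-the-shelf OData producer in front of the warehouse rather than hand-rolling this, but the payload it emits has this shape.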
The document discusses Oracle's data integration products and big data solutions. It outlines five core capabilities of Oracle's data integration platform, including data availability, data movement, data transformation, data governance, and streaming data. It then describes eight core products that address real-time and streaming integration, ELT integration, data preparation, streaming analytics, dataflow ML, metadata management, data quality, and more. The document also outlines five cloud solutions for data integration including data migrations, data warehouse integration, development and test environments, high availability, and heterogeneous cloud. Finally, it discusses pragmatic big data solutions for data ingestion, transformations, governance, connectors, and streaming big data.
How to Prepare Your Toolbox for the Future of SharePoint Development
Progress
SharePoint is changing: instead of learning the Microsoft version of a technology that’s rapidly becoming outdated, developers can now use the latest and greatest in jQuery and Angular (or Knockout.js, React.js, etc.) and create great SharePoint UI.
The future of SharePoint development and customization is the SharePoint Framework (SPFx), a client-side framework that allows JavaScript customizations to work on top of SharePoint Online/Office 365. Let’s put to work a toolset of web technologies, including Angular, Webpack and Kendo UI controls, to build a simple yet useful application and get started with the web stack today.
Download this whitepaper to:
* Get excited about the new SharePoint Framework (SPFx) and related web stack technologies
* See a great set of tools in action
* Learn how to build a practical SharePoint business application using modern web technology
This whitepaper is by SharePoint Gurus, an award-winning consultancy based in Sydney, Australia, that specializes in improving productivity through configuring and developing Microsoft SharePoint technologies.
This document discusses accessing NoSQL databases like MongoDB from SQL. It begins with an introduction to NoSQL and examples of JSON documents and key-value stores. It then covers the benefits of NoSQL like high performance, availability, and scalability. Common NoSQL implementations like MongoDB, Cassandra, and MarkLogic are described. The challenges of connecting to NoSQL databases from SQL are discussed. DataDirect connectors are presented as a solution for providing SQL access to NoSQL databases. They normalize the NoSQL data model and provide full ANSI SQL support. Performance and real-world case studies are also discussed.
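The "normalize the NoSQL data model" step can be illustrated simply: a nested document becomes a parent row plus child rows linked by a key, the way a SQL-over-NoSQL connector exposes one collection as joinable tables. This is a toy version of the idea, with an invented document shape, not the connectors' actual algorithm.

```python
def normalize(doc):
    """Split one MongoDB-style document into a parent row and child rows.

    Array-valued fields become a child table keyed back to the parent id;
    scalar fields stay on the parent row.
    """
    child_rows = [
        {"_parent_id": doc["_id"], **item}
        for item in doc.get("orders", [])
    ]
    parent_row = {k: v for k, v in doc.items() if not isinstance(v, list)}
    return parent_row, child_rows

doc = {
    "_id": 7,
    "name": "Acme",
    "orders": [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}],
}
customer, orders = normalize(doc)
```

With that mapping in place, a SQL client can `JOIN` the customer table to the orders table on `_parent_id`, even though the source stored one nested document.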
Presenter: Mike Johnson
The Big Data ecosystem is disrupting things for the good and not so good. Learn how we deal with this from a connectivity perspective to get insights about the ecosystem, including the latest commercial and open source projects we’re tracking.
The document discusses Oracle Enterprise Metadata Management (OEMM) which allows users to manage metadata, data lineage, and business glossaries. It harvests metadata from popular platforms including BI tools, ETL tools, databases, and big data tools. OEMM provides vertical lineage that shows traceability from business terms to IT artifacts, and horizontal lineage that traces columns and fields across multiple systems. It allows interactive exploration of metadata relationships through zooming and filtering capabilities.
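Horizontal lineage of the kind described is, at its core, walking a graph of column-to-source edges. A toy illustration (the column names and single-source mapping are invented; real lineage metadata is many-to-many and harvested, not hand-written):

```python
# Each target column maps to the source column it is derived from.
LINEAGE = {
    "report.revenue": "warehouse.fact_sales.amount",
    "warehouse.fact_sales.amount": "crm.opportunity.amount",
}

def trace(column):
    """Follow lineage edges from a column back to its ultimate source."""
    path = [column]
    while path[-1] in LINEAGE:
        path.append(LINEAGE[path[-1]])
    return path

print(trace("report.revenue"))
```

Vertical lineage works the same way in the other dimension, linking business glossary terms down to columns like these instead of columns to columns.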
Navigating Your Product's Growth with Embedded Analytics
Progress
Presenter: Guarav Verma
Learn from real life applications for embedded product analytics from Telerik. In today’s data driven world, how can you leverage analytics to know your audience, improve their experience, focus on your loyal users to drive more revenue, and optimize your engineering effort to accelerate your business results? Know what the future of Telerik Analytics is like and be a part of it.
This document provides information about Aetna, a health insurance company. It summarizes that Aetna serves about 46 million customers, helping them make healthcare decisions and manage healthcare spending. Aetna offers various medical, pharmacy, dental, life, and disability insurance plans as well as Medicaid services and behavioral health programs. As of March 2015, Aetna had approximately 23.7 million medical members, 15.5 million dental members, and 15.4 million pharmacy members. Aetna works with over 1.1 million healthcare professionals, including more than 674,000 primary care doctors and specialists, across 5,589 hospitals in the US and globally.
The document discusses Oracle's big data platform and how it can extend Hortonworks' data platform. It provides an overview of Oracle's enterprise big data architecture and the key components of its big data platform. It also discusses how Oracle's platform provides rich SQL access across different data sources and describes some big data solutions for adaptive marketing and predictive maintenance.
From BI Developer to Data Engineer with Oracle Analytics Cloud, Data Lake
Rittman Analytics
In this session, we'll look at the role of the data engineer in designing, provisioning, and enabling an Oracle Cloud data lake using Oracle Analytics Cloud Data Lake Edition. We’ll also examine the use of data flow and data pipeline authoring tools and how machine learning and AI can be applied to this task. Furthermore, we’ll explore connecting to database and SaaS sources along with sources of external data via Oracle Data-as-a-Service. Finally we’ll delve into how traditional Oracle Analytics developers can transition their skills into this role and start working as data engineers on Oracle Public Cloud data lake projects.
FlexPod with SAP HANA and SAP Applications
Lishantian
This document discusses Cisco and NetApp solutions for implementing SAP HANA, including:
1) The FlexPod approach which provides a simplified architecture for deploying SAP HANA appliances on Cisco UCS and NetApp storage up to 48TB.
2) Implementing SAP HANA using Tailored Data Center Integration (TDI) on FlexPod, which provides more flexibility compared to appliance-based deployments.
3) Two use cases for SAP HANA TDI involving running multiple SAP HANA production systems on a single Cisco UCS, and reusing an existing data center network rather than network components included in the solution.
Hortonworks Oracle Big Data Integration
Hortonworks
Slides from joint Hortonworks and Oracle webinar on November 11, 2014. Covers the Modern Data Architecture with Apache Hadoop and Oracle Data Integration products.
The document discusses extending data governance in Hadoop ecosystems using Apache Atlas and partner solutions including Waterline Data, Attivo, and Trifacta. It highlights how these vendors have adopted Apache's open source community commitment and are integrating their products with Atlas to provide a rich, innovative community with a common metadata store backed by Atlas. The session will showcase how these three vendors extend governance capabilities by integrating their products with Atlas.
Presenter: Sumit Sarkar
The CMO will overtake the CIO on technology spend by 2017. We’re entering a new era of IT and sales/marketing collaboration. Learn about the latest methods for accessing data for deeper analytics from sales and marketing cloud applications across Eloqua, Marketo, Google Analytics, Salesforce and more.
Oracle provides modern cloud applications including CX Cloud, HCM Cloud, ERP Cloud, SCM Cloud, and Data Cloud. The applications are complete, data-driven, personalized, connected, and secure. Oracle has many customers using its cloud applications worldwide. The document discusses Oracle's leadership in the SaaS market and how its applications and services provide benefits such as faster delivery, better products, feedback and innovation. It also introduces new Adaptive Intelligent applications that will add business value by providing smarter and more contextual experiences using data and machine learning.
SAP Inside Track: SAP Integration Strategy
shetkars
1) SAP's integration strategy focuses on APIs and prepackaged integration content, flexible deployment models across cloud and on-premise, a rich integration technology portfolio, and a holistic methodology and governance approach.
2) Key building blocks of the strategy include pre-defined integration content to accelerate projects, a choice of integration technologies, and the SAP Integration Solution Advisor methodology to simplify integration.
3) Going forward, SAP aims to provide additional guidance to help customers transition their integration landscapes, including guidance for SAP S/4HANA integration, integration monitoring, and usage of SAP Cloud Platform Integration and SAP Data Hub.
Digital Business with SAP B1 - Introductionjzelynlim95
The document discusses how SAP Business One can help companies transform into digital businesses. It explains key technologies like analytics, cloud, mobile, machine learning, big data, IoT, and APIs that SAP Business One leverages to provide digitization. Specifically, it outlines features for analytics, reporting, dashboards, predictive analysis and more. It also provides examples of how technologies like IoT, machine learning, and big data can benefit businesses using SAP Business One.
Sapwebinar2 how 2transition2s4hanagetyourdatacleanandkeepitclean1569951002523Steffen König
This document discusses preparing data for an SAP S/4HANA transition. It outlines the data transformation journey of preparing, migrating, and governing master data. When moving to S/4HANA, data challenges include business and technical conversions. Typical scenarios that drive data migration and governance are discussed. The challenges of data migration in transformation projects are highlighted. SAP Advanced Data Migration and SAP Master Data Governance are presented as solutions to orchestrate complex data migrations and govern master data.
Unleash the Potential of Big Data on SalesforceDreamforce
Salesforce hosts billions of customer records on Salesforce App Cloud. Making timely decisions on this invaluable data demands a new set of capabilities. From interacting with data real-time to leveraging a fluid integration with Salesforce Analytics, these capabilities are just around the corner. Join us in this roadmap session to see what the near-future of Big Data on Salesforce App Cloud looks like and how you can benefit from it. Watch the video now: https://www.youtube.com/watch?v=a-wFfdfGgvM
Capgemini’s Data WARP: Accelerate your Journey to InsightsCapgemini
More data, more insights. Data is at the center of change and business value! There is no limit to volume, structure of timing. But more data brings more challenges, like cost of storage, increased complexity of the data architecture and a lack of agility. Many organizations are still faced with scattered data lying in silos across the organization. They often lack a clear business case for funding a transform of their data landscape. Or they suffer from ineffective co-ordination of Data and analytics initiatives. Finally, the dependency on legacy systems for data processing and management is still high. Data WARP (Wide Angle Rationalization Program) helps organizations improve the performance of their data & insights architecture landscape, by providing key deliverables like rationalization designs, business cases and transformation roadmap.
Presented at Informatica World 2016 by Jorgen Heizenberg, CTO- Netherlands, Capgemini Insights & Data
How to Convert Your SAP BusinessObjects Unused Licenses to SAP Analytics CloudWiiisdom
Discover SAP Analytics strategy and learn how you can easily find all your SAP BusinessObjects unused licenses to apply those resources to SAP Analytics Cloud and deliver greater agility to your organization thanks to hybrid analytics.
Oracle Eloqua Roadmap SoCal Marketing Cloud User Group February 2016Ron Corbisier
The document provides an overview of Oracle Eloqua's product roadmap for 2016. It summarizes recent innovations including a lighter user interface, A/B testing for emails, new reporting features, template management, and security enhancements. The roadmap highlights plans for improved navigation, analytics/reporting including dashboards and APIs, advanced data workflows, CRM integration, big data segmentation, and sales tools accessible from mobile devices and a Chrome extension. The document aims to outline Oracle's general product direction for information purposes only.
BIG Data & Hadoop Applications in FinanceSkillspeed
Explore the applications of BIG Data & Hadoop in Finance via Skillspeed.
BIG Data & Hadoop in Finance is a key differentiator, especially in terms of generating greater investment insights. They are used by companies & professionals for risk assessment, fraud detection & forecasting trends in financial markets.
To get more details regarding BIG Data & Hadoop, please visit - www.SkillSpeed.com
The document discusses how digital transformation presents opportunities for innovation and partnership. It outlines trends driving digital transformation in B2B, how companies can transform through both strategic and tactical changes, and examples of companies innovating their business models and customer interactions through digital initiatives. The presentation concludes by emphasizing opportunities for innovation in business structures, interactions and operations through digital transformation.
The document discusses SAP HANA Cloud Platform, which is SAP's platform-as-a-service offering. It provides everything needed to build enterprise applications in the cloud, including integration, APIs, analytics, user experience, IoT, security, collaboration, development and operations capabilities. It allows customers to increase business speed and agility by extending SAP solutions to hybrid landscapes. Example use cases and customer stories are also presented to illustrate how Walmart and EnterpriseJungle have leveraged SAP HANA Cloud Platform.
What are the options for sellers and buyers collaborating on catalog content? Join a panel of leading sellers and buyers as they discuss their catalog strategies and preference for hosted CIF versus PunchOut catalogs. Learn how leading sellers use the Ariba Network to drive exposure of their product content to procurement organizations and individuals at their key accounts, while leading procurement organizations use online catalogs to drive up compliance to contract terms and to improve the user experience.
How does SAP Analytics Cloud Drive Desired Business Outcomes_.pdfAnil
Did you know that despite rampant digitalization, many organizations still largely rely on spreadsheets to perform high-level financial planning and analysis?
Driving digital transformation in Automotive industryDebashis Majumder
This document discusses SAP's next-generation business suite, SAP S/4HANA. It highlights how SAP S/4HANA can help automotive companies simplify their technology landscape and business processes. Key benefits include reducing the total cost of ownership, increasing user productivity, and accelerating execution. The document also outlines SAP's roadmap for the automotive industry, including innovations in areas like material requirements planning, inventory management, and the universal journal. It positions SAP S/4HANA as enabling transformation for automotive companies by simplifying technology, transforming business processes, and empowering the business.
The document discusses the need for digital transformation in businesses. It notes that while most organizations see digital transformation as important, many are in denial about the need to transform. The document outlines key aspects of digital transformation like improving the customer experience, creating new markets, and improving operational efficiency. It also discusses inhibitors to digital transformation. The document advocates that both business and technology aspects must be addressed for successful digital transformation.
BIG Data & Hadoop Applications in E-CommerceSkillspeed
Explore the applications of BIG Data & Hadoop in eCommerce via Skillspeed.
BIG Data & Hadoop in eCommerce is a key differentiator, especially in terms of generating optimized customer & back-end experiences. They are used for tracking consumer behavior, optimizing logistics networks and forecasting demand - inventory cycles.
To get more details regarding BIG Data & Hadoop, please visit - www.SkillSpeed.com
This document outlines Ananth Bala's presentation on bridging data silos for business insights. The agenda includes discussing data silos, demonstrating how to derive intelligence from multiple data sources, interactive multi-device reporting, and an overview of the hybrid data pipeline. The presentation notes that data silos have grown due to different systems of record, rise of SaaS solutions, limited integration, and disparate teams. It introduces Progress Data Direct as a way to provide a single interface and standards-based connectivity to bridge these silos. Demos are shown using Data Direct to aggregate data from multiple sources and create reports accessible across devices.
Journey to Marketing Data Lake [BRK1098]
1. Journey to Marketing Data Lake (BRK1098)
Oracle Modern Marketing Experience | Las Vegas | Apr 26-28
Sumit Sarkar
Product Marketing
@SAsInSumit
linkedin.com/in/meetsumit
The challenge this session’s speaker and his colleagues faced in trying to learn more about customer experiences was that insights are fragmented across systems such as Oracle Eloqua, CRM, and web analytics. To better understand their contacts, they started with the corporate data warehouse, which was missing much of this lower-value, detailed data. When they considered expanding the data warehouse, it was difficult to define in advance what questions they wanted to answer, because the questions vary with each campaign they run. They therefore embarked on building a Hadoop-based data lake, for the flexibility to ask any question with an ad hoc, schema-on-read approach, against any customer data sets at varying levels of detail, to better understand what their visitors want to consume.
Breakout Session
Wednesday, Apr 26, 5:45 p.m. | Mandalay Bay D
Pillar: Marketing
Marketing Track: Data-Driven Marketing
Product: Oracle Data Management Platform (Oracle BlueKai)
Level: Intermediate
Session Type: Breakout Session
https://go.oracle.com/moderncx-speaker-information
https://oracle.rainfocus.com/scripts/catalog/oracleCx17.jsp?search=BRK1098
Data lakes are loaded with raw data (ELT with no upfront “T”) and apply a “schema on read” on business demand.
To really get big data value, you need to store all types of structured and semi-structured data in a data lake, from CRM data to social media posts.
You don’t have to have all the answers upfront, or even the questions. Lakes store raw data that can be transformed as questions arise.
Use a variety of tools based on what you’re asking.
Everyone talks about a single, unified view of data
Architecting Data Lakes, by Ben Sharma and Alice LaPlante: http://info.zaloni.com/hubfs/Architecting_Data_Lakes_Zaloni.pdf
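The schema-on-read idea above can be sketched in plain Python. This is a minimal illustration, not from the deck: the event records, field names, and helper function are all hypothetical, but they show how raw records land untransformed and each question projects its own ad hoc schema at read time.

```python
import json

# Raw events land in the lake as-is: no transformation upfront.
# Different sources carry different fields at different levels of detail.
raw_events = [
    '{"contact": "a@example.com", "source": "eloqua", "campaign": "Q2-launch", "opens": 3}',
    '{"contact": "b@example.com", "source": "web", "page": "/pricing", "seconds_on_page": 42}',
    '{"contact": "a@example.com", "source": "crm", "stage": "opportunity"}',
]

def read_with_schema(raw, fields):
    """Schema on read: project only the fields this question needs,
    tolerating records that do not carry them."""
    rows = []
    for line in raw:
        record = json.loads(line)
        rows.append({f: record.get(f) for f in fields})
    return rows

# Question 1: campaign engagement, with one ad hoc schema...
engagement = read_with_schema(raw_events, ["contact", "campaign", "opens"])

# Question 2: web behavior, with a different schema over the same raw data.
web = read_with_schema(raw_events, ["contact", "page", "seconds_on_page"])
```

No question had to be known when the events were stored; each new campaign simply defines another projection over the same raw records.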
The project is still run by marketing engineers, so we don’t get the perks of an IT-driven project. We expect to continue expanding usage and value, then elevate this to production and start looking to do more buying than building.
Statistical analysis of detailed data
Log data to study application characteristics
Performance lab data
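To make the "statistical analysis of log data" use case above concrete, here is a small sketch in plain Python. The log format, paths, and latency figures are purely illustrative assumptions, not the team's actual data; the point is that raw log lines can be parsed and summarized on demand.

```python
import re
from collections import Counter
from statistics import mean

# Hypothetical web access log lines; the deck only names the use case,
# so this format is an illustrative assumption.
log_lines = [
    "10.0.0.1 GET /pricing 200 120ms",
    "10.0.0.2 GET /docs/start 200 45ms",
    "10.0.0.1 GET /pricing 200 98ms",
    "10.0.0.3 GET /download 404 12ms",
]

pattern = re.compile(r"(\S+) (\S+) (\S+) (\d+) (\d+)ms")

hits = Counter()
latencies = []
for line in log_lines:
    m = pattern.match(line)
    if not m:
        continue  # raw data: tolerate lines that do not parse
    ip, method, path, status, ms = m.groups()
    hits[path] += 1
    latencies.append(int(ms))

# Simple descriptive statistics over the parsed lines.
top_page, top_count = hits.most_common(1)[0]
avg_latency = mean(latencies)
```

The same raw lines could later answer entirely different questions (error rates by client, traffic by hour) without re-ingesting anything.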