Transform Salesforce into the system of engagement for your big data. Discuss best practices and lessons learned in accessing external data sets in Hadoop or Spark using Salesforce Connect. Leave the big data sets behind the firewall and give your users on-demand access to big data insights through external objects with Salesforce Connect.
In this session we will cover:
Intro to Salesforce Connect
Intro to Big Data Landscape
How to connect Salesforce to Big Data using External Data Sources
Lessons Learned accessing Big Data using External Objects for native reporting, writes, lookups, search and more
Resources (How to learn more)
Salesforce shops, including ourselves, have been eagerly anticipating external object support with reports. Starting in Winter ’17, you can build native reports with on-demand access to external data sources such as Oracle, SQL Server, Greenplum, Amazon Redshift, IBM DB2 or Hadoop Big Data Platforms. External objects are powered by Salesforce Connect and provide clicks-not-code data access for admins, devs and general users. But is all of this too good to be true?
During this webinar, you’ll learn:
- External Objects and their new capabilities for reporting and Wave trending in Winter ’17
- How to set up Salesforce reports with external data sources
- How to produce OData from warehouses, marts, lakes, and other reporting systems
- Report considerations and limitations with Salesforce Connect
Completely transform the way Cloud apps access data
Progress® DataDirect® Hybrid Data Pipeline is the industry’s first hybrid data pipeline that can run independently and integrate with any single or multi-vendor technology stack connected by open standards for SQL and REST.
Building a Hybrid Data Pipeline for Salesforce and Hadoop, by Sumit Sarkar
My team embarked on building a data lake for our sales and marketing data to better understand customer journeys. This required building a hybrid data pipeline to connect our cloud CRM with the new Hadoop data lake. One challenge was that IT was not in a position to provide support until we proved value, and marketing did not have the experience, so we embarked on the journey ourselves within the product marketing team for our line of business at Progress. The key to delivering on this was connecting the systems through a bi-directional data pipeline built on standard interfaces. On the Salesforce side, we got frictionless, clicks-not-code access to the data lake via OData. On the Hadoop side, we ingested data from Salesforce using JDBC with Apache Sqoop. Join us to hear best practices and lessons learned.
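The Sqoop ingestion step described above can be sketched as a small command builder. The JDBC URL format and driver class shown are assumptions based on a DataDirect-style Salesforce driver; substitute your own driver's details:

```python
# Sketch: composing an Apache Sqoop import command to ingest a Salesforce
# object over JDBC into HDFS. The JDBC URL and driver class below are
# assumptions -- use whatever your Salesforce JDBC driver documents.
def sqoop_import_cmd(jdbc_url, driver, table, target_dir):
    """Build the argv list for a `sqoop import` invocation."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,        # JDBC endpoint for the CRM
        "--driver", driver,           # vendor JDBC driver class
        "--table", table,             # Salesforce object to ingest
        "--target-dir", target_dir,   # HDFS landing directory
        "-m", "1",                    # single mapper; no split column needed
    ]

cmd = sqoop_import_cmd(
    "jdbc:datadirect:sforce://login.salesforce.com",  # assumed URL format
    "com.ddtek.jdbc.sforce.SForceDriver",             # assumed driver class
    "OPPORTUNITY",
    "/data/lake/salesforce/opportunity",
)
print(" ".join(cmd))
```

From here the landed files can be queried with Hive or Spark like any other HDFS data set.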
Salesforce analytics and BI continue to be a hot topic as organizations implement new platforms to improve their customer intelligence. But what's the best way to access the data? SOQL is the popular query language for Salesforce; however, SQL may be better suited for accessing data for analytics. Join us in the great SOQL vs. SQL query debate to see which one is best for your analytics project.
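To make the debate concrete, here is the same aggregate question phrased both ways; the objects and fields are standard Salesforce ones, while the SQL flavor is an assumption (ANSI-style syntax as a relational driver might expose it):

```python
# The same analytics question in both languages. SOQL traverses the
# Account relationship with dot notation and uses date literals like
# THIS_YEAR; SQL expresses the relationship as an explicit JOIN.
soql = (
    "SELECT Account.Name, StageName, SUM(Amount) amt "
    "FROM Opportunity "
    "WHERE CloseDate = THIS_YEAR "
    "GROUP BY Account.Name, StageName"
)
sql = (
    "SELECT a.Name, o.StageName, SUM(o.Amount) AS amt "
    "FROM Opportunity o JOIN Account a ON o.AccountId = a.Id "
    "WHERE YEAR(o.CloseDate) = YEAR(CURRENT_DATE) "
    "GROUP BY a.Name, o.StageName"
)
print(soql)
print(sql)
```

The SOQL form is shorter because the platform knows the relationships; the SQL form is what most BI tools generate natively.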
REST API debate: OData vs GraphQL vs ORDS, by Sumit Sarkar
Learn the latest industry trends surrounding REST API standardization and what this means for your roadmap. OData is an OASIS standard REST API and has been established among tech companies such as Microsoft, SAP, CA, IBM and Salesforce. GraphQL was created by Facebook in 2015 and has already been deployed at tech companies such as Facebook, Shopify and Intuit. ORDS is the Oracle REST API and delivers similar standardization for Oracle-centric applications.
Data APIs Don't Discriminate [API World Stage Talk], by Sumit Sarkar
The exploding API economy, combined with an advanced analytics market projected to reach $30 billion by 2019, is driving market demand to expose more data through APIs. Business analysts, data engineers, and data scientists have been left behind in existing API strategies. This is because many APIs are designed to integrate with applications to extend functionality; however, these data workers are looking for APIs that facilitate direct data access to support analytics. Data APIs are specifically designed to provide that frictionless data access experience across standard, interoperable interfaces such as OData (REST) or ODBC/JDBC (SQL). Consider expanding your API strategy to serve the developers in this $30 billion market.
Journey to Marketing Data Lake [BRK1098], by Sumit Sarkar
The challenge this session's speaker and his colleagues faced in trying to learn more about customer experiences was that insights were fragmented across different systems such as Oracle Eloqua, CRM, and web analytics. To better understand their contacts, they started with the corporate data warehouse, which was missing much of this lower-value, detailed data. When they considered expanding the data warehouse, it was difficult to define in advance what questions they wanted to answer, because that varies for each campaign they run. So they embarked on building a Hadoop-based data lake, for the flexibility to ask any question with an ad hoc, schema-on-read approach against customer data sets at varying levels of detail, to better understand what their visitors want to consume.
Breakout Session
Wednesday, Apr 26, 5:45 p.m. | Mandalay Bay D
https://oracle.rainfocus.com/scripts/catalog/oracleCx17.jsp?search=BRK1098
Journey to SAS Analytics Grid with SAS, R, Python, by Sumit Sarkar
Big data, compliance and a highly skilled workforce are driving organizations to transform their current analytical infrastructure to deliver enterprise computing environments that can support the latest in data science and analytics practices. SAS remains a popular choice for statistical programming languages, but there is growing demand for R and Python. Data engineers are now being tasked to deliver scalable and highly available computing resources to support analytics for a growing number of users and increasing data volumes while maintaining security for their customers.
High Scale Relational Storage at Salesforce Built with Apache HBase and Apach..., by Salesforce Engineering
Apache HBase is an open source, non-relational, distributed datastore modeled after Google's Bigtable. It runs on top of the Apache Hadoop Distributed Filesystem and provides low-latency, random-access storage for HDFS-based compute platforms like Apache Hadoop and Apache Spark. Apache Phoenix is a high-performance relational database layer over HBase optimized for low-latency applications. This session will explore how the Data Platform and Services group at Salesforce.com supports teams of application developers accustomed to structured relational data access, while surfacing additional advantages of the underlying flexible scale-out datastore.
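As a sketch of the low-latency pattern described here, the helper below composes Phoenix DDL for a salted table over HBase; the table and column names are hypothetical, not Salesforce's actual schema:

```python
# Sketch: Phoenix DDL for a low-latency table over HBase. SALT_BUCKETS
# pre-splits the row-key space across regions to avoid hotspotting on
# monotonically increasing keys. Names below are illustrative only.
def salted_table_ddl(name, pk, columns, buckets=8):
    """Compose a CREATE TABLE statement with Phoenix's salting option."""
    cols = ", ".join(f"{c} {t}" for c, t in columns)
    return (
        f"CREATE TABLE IF NOT EXISTS {name} "
        f"({pk} VARCHAR PRIMARY KEY, {cols}) "
        f"SALT_BUCKETS = {buckets}"
    )

ddl = salted_table_ddl(
    "events", "event_id",
    [("org_id", "VARCHAR"), ("payload", "VARCHAR"), ("ts", "TIMESTAMP")],
)
print(ddl)
```

The resulting statement would be executed through Phoenix's JDBC driver; after that, ordinary SQL SELECTs and UPSERTs run against the HBase-backed table.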
GDPR represents a transformational shift for organizations that store or process data on EU residents. As companies scramble to comply, data governance will play a major role in managing risk and gaining competitive advantage. In this demo, learn how open source technologies, such as Apache Atlas and Apache Ranger, can be used to identify and classify personal data, understand its lineage, and centralize access for consent, erasure, and portability.
Speakers
Jamie Engesser, Senior Vice President Product Management, Hortonworks
Srikanth Venkat, Senior Director Product Management, Hortonworks
The Open Data Protocol, or OData for short, provides a RESTful interface for CRUD operations against data services. OData services such as Microsoft Azure, SAP, and WebSphere expose data and metadata as typed name/value pairs in JSON or XML, allowing off-the-shelf data consumers to integrate with services without custom code. This session gives an overview of OData and explains why salesforce.com selected it as a protocol for integrating with external data services.
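A minimal sketch of what such an OData request URI looks like: the service root and entity set below are hypothetical, while $filter, $select, and $top are standard OData system query options:

```python
from urllib.parse import urlencode, quote

# Sketch: composing an OData query URI. The service root and entity set
# are hypothetical; $filter/$select/$top are standard system options.
def odata_query(service_root, entity_set, filter_=None, select=None, top=None):
    """Build an OData GET URL with the common system query options."""
    opts = {}
    if filter_:
        opts["$filter"] = filter_
    if select:
        opts["$select"] = ",".join(select)
    if top:
        opts["$top"] = str(top)
    # Keep '$', ',' and quotes literal; percent-encode spaces as %20.
    qs = urlencode(opts, safe="$,'", quote_via=quote)
    return f"{service_root}/{entity_set}" + (f"?{qs}" if qs else "")

url = odata_query(
    "https://example.com/odata", "Orders",
    filter_="Amount gt 1000", select=["OrderId", "Amount"], top=10,
)
print(url)
```

Any HTTP client can issue a GET against a URL like this and receive typed JSON or XML back, which is exactly what makes generic consumers possible.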
Webcast slides for "Low Risk and High Reward in App Decomm with InfoArchive a...", by Tom Rieger
Platform 3 Solutions presented these slides on January 17, 2019 with OpenText to give everyone an opportunity to understand the value of removing systems from their operations.
A Right IT Services brown bag about Microsoft's SharePoint.
SharePoint combines various functions that are traditionally separate applications, also known as services.
From BI Developer to Data Engineer with Oracle Analytics Cloud, Data Lake, by Rittman Analytics
In this session, we'll look at the role of the data engineer in designing, provisioning, and enabling an Oracle Cloud data lake using Oracle Analytics Cloud Data Lake Edition. We’ll also examine the use of data flow and data pipeline authoring tools and how machine learning and AI can be applied to this task. Furthermore, we’ll explore connecting to database and SaaS sources along with sources of external data via Oracle Data-as-a-Service. Finally we’ll delve into how traditional Oracle Analytics developers can transition their skills into this role and start working as data engineers on Oracle Public Cloud data lake projects.
- Product Strategy update by Benjamin Arnulf
- Forrester Wave Leader by Isabelle Nuage
- Product Overview by Alex Toothman
- Viz best practices by Kate Strachnyi
- Quick viz tour by Barry Mostert
- Platform demo by Gabby Rubin
- Finance demo by Khanh Tran, CPA, MBA
- Healthcare demo by Nada Maguid
- James Bond Viz by Jamie Anderson, CPA
Register: Oracle.com/goto/OAwebinar
YouTube Video: https://youtu.be/yWfc0g2GxI4
Live & Webcam On!
Lightning Connect lets you seamlessly access data from external sources, side by side with your Salesforce data. You can pull data from legacy systems such as SAP, Microsoft, and Oracle in real time, without making a copy of the data in Salesforce. And it's all easily configured through a simple yet powerful point-and-click interface.
During this webinar, you will learn how Salesforce1 Lightning Connect helps Salesforce Admins and Business Analysts:
:: Connect and access data from external sources with point and click simplicity
:: Dramatically reduce integration time to unlock and modernize back-office systems such as SAP, Oracle, Microsoft and more
:: View external data side-by-side with existing Salesforce objects
Key Takeaways
:: Integrate external data into your Salesforce environment in real-time
:: Connect to any OData 2.0 data source as well as SAP, SharePoint and more
:: Data is accessed directly - no duplication or sync delays
ExactTarget's Fuel Platform opens the ExactTarget Marketing Cloud to third-party development, enabling you to create apps that fully integrate with the Marketing Cloud's common calendaring, campaign management, and analytics frameworks. Join us to learn how to build Marketing Cloud apps using the Fuel Platform and take those apps to market alongside ExactTarget in HubExchange, the Marketing Cloud's integrated app marketplace.
Salesforce Admin's Guide: the Data Loader from the command line, by Cyrille Coeurjoly
Hacks, Habits and Helpful Hints: the Salesforce Admin's reference guide. This short guide explains how to use the Salesforce Data Loader from the command line: no more clicks, no more errors.
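A sketch of what a scripted invocation might look like, assuming the classic process.bat entry point and a config directory containing process-conf.xml; the paths and the process name are illustrative:

```python
# Sketch: building the command line for a scheduled Data Loader run.
# The classic entry point is process.bat (Windows), pointed at a config
# directory holding process-conf.xml. Paths and bean name are examples.
def data_loader_cmd(loader_bin, config_dir, process_name):
    """Return the argv list to launch one named Data Loader process."""
    return [loader_bin, config_dir, process_name]

cmd = data_loader_cmd(
    r"C:\dataloader\bin\process.bat",   # assumed install location
    r"C:\dataloader\conf",              # directory with process-conf.xml
    "accountExtract",                   # process bean defined in process-conf.xml
)
print(" ".join(cmd))
```

In practice you would hand this list to a scheduler or `subprocess.run` so the extract or load runs unattended.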
In our expanding world of hyper-connectivity and massive data volumes, you need to be able to quickly and consistently track information in the App Cloud. Join us as we discuss events as a first-class Salesforce object, strategies for a high volume of data streams on the platform, and how we are building out messaging, starting with adding durability to the current Streaming API.
Apache Phoenix with Actor Model (Akka.io) for a real-time Big Data Programming Stack, by Trieu Nguyen
Why do we still need SQL for Big Data?
How can we make Big Data more responsive and faster?
Marketing, Technology, and the Empowered Customer, by Jay Henderson
In an increasingly connected world, consumers are in control. Connected customers demand connected marketers. Marketers must move beyond their silo and focus on business results. They will only accomplish this if they embrace a holistic approach of customer centricity beyond marketing, a long term plan for a system of engagement, and the right marketing culture. This combination creates customer loyalty, better marketing ROI and ultimately, better business results.
Today's customers demand applications which integrate intelligently with data from mobile, social media, and cloud sources. A system of engagement meets these expectations by applying data and analytics drawn from an array of master systems. Relational databases are overwhelmed by the enormous variety in data structures, scale and performance required, but in this webinar you'll learn how to use MongoDB to meet the challenge.
How Intuit Implemented Lightning Connect with Progress DataDirect, by Salesforce Developers
When Intuit switched to Salesforce, they had to leave behind essential data. Rather than have their users constantly "swivel" between two different systems, they were able to gain visibility into the Siebel data for over 6,000 users thanks to Lightning Connect and the OData service provided by Progress DataDirect Cloud. Join us to learn how they did it in just three months with no changes to their existing environment.
[2016 KMAC Channel Communication Conference] Customer Experience Management Strategy through Adoption of a Cloud-Based Contact Center, by i2max
This is the presentation from the Channel Communication Conference hosted by KMAC (Korea Management Association Consulting), Korea's top and largest conference of its kind. These are the shared slides from the talk "Customer Experience Management Strategy through Adoption of a Cloud-Based Contact Center" in the Seamless Channel Integration & New CX session.
- 2016 Contact Center Trends
- Why Customer Experience is Important
- The Smarter Way to Maximize Customer Experience Value
- Case Studies
The presentation can be downloaded at http://goo.gl/ZzCcpU.
This is the latest 2015 Salesforce presentation.
i2max, Salesforce's largest partner in Korea (sole Korean distributor), hosted the Salesforce1 Platform D-Day developer event, sponsored by Salesforce and D.camp.
Around 150 attendees took part in a hands-on program covering cloud trends and a range of Salesforce training, including building an app live and demonstrating it with mobile integration.
It should be a useful resource for those who could not attend the event or anyone interested in Salesforce.
Inquiries: i2max Marketing Team, soo@i2max.co.kr
How OData Opens Your Data to Enterprise Mobile Applications, by Progress
OData unlocks data and makes it easier to consume for both information workers and developers. Learn more about OData and find out how Progress DataDirect Cloud uses OData to unlock your data and simplify your interactions with data on the web.
The Open Data Protocol (OData) is an open protocol for sharing data. It provides a way to break down data silos and increase the shared value of data by creating an ecosystem in which data consumers can interoperate with data producers in a way that is far more powerful than currently possible, enabling more applications to make sense of a broader set of data. Every producer and consumer of data that participates in this ecosystem increases its overall value.
OData is consistent with the way the Web works – it makes a deep commitment to URIs for resource identification and commits to an HTTP-based, uniform interface for interacting with those resources (just like the Web). This commitment to core Web principles allows OData to enable a new level of data integration and interoperability across a broad range of clients, servers, services, and tools.
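That uniform interface boils down to a small mapping from CRUD operations to HTTP verbs against resource URIs (note that OData v2 historically used MERGE where later versions use PATCH):

```python
# OData's uniform interface: CRUD maps onto plain HTTP verbs against
# resource URIs, which is what lets generic clients work without custom
# code. Example URIs in the comments use an illustrative Orders set.
CRUD_TO_HTTP = {
    "create": "POST",    # POST   /Orders
    "read":   "GET",     # GET    /Orders(10248)
    "update": "PATCH",   # PATCH  /Orders(10248)  (MERGE in OData v2)
    "delete": "DELETE",  # DELETE /Orders(10248)
}

for op, verb in CRUD_TO_HTTP.items():
    print(f"{op:>6} -> {verb}")
```

Because these are the same verbs the web already uses, proxies, caches, and auth layers all work unchanged.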
Copy of the slides from the Advanced Web Development Workshop presented by Ed Bachta, Charlie Moad, and Robert Stein of the Indianapolis Museum of Art during the Museums and the Web 2008 conference in Montreal.
We're in a data-driven economy. Web API designers need to define what and how to expose data from a variety of apps, services, and stores. What are challenges of unlocking data and opening up access in a straightforward and standards-compliant manner? Is OData the right tool for the job?
Join Anant, Brian, and Greg for a discussion of OData, its API design implications, and the pros and cons of OData as an enabler of data integration and interoperability across Data APIs.
We Will Discuss »
- OData, SQL, and the "RESTification" of data - providing a uniform way to expose, structure, query and manipulate data using REST principles.
- Opportunity and challenges for OData.
- The questions of Web standards and proprietary versus open tools and protocols.
Slides from the Salesforce Bangalore developer group event organised at UrbanLadder on "Salesforce Connect".
Salesforce Connect is a framework that enables you to view, search, and modify data that’s stored outside your Salesforce org.
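A quick illustration: external objects carry the __x API-name suffix (custom objects use __c) and are queried with ordinary SOQL even though the rows live outside the org; the object name below is hypothetical:

```python
# External objects end in __x (custom objects end in __c) but are queried
# with ordinary SOQL; each query is resolved against the external system
# at request time. "Order_Detail" is an illustrative object name.
external_object = "Order_Detail__x"
soql = f"SELECT OrderId__c, Amount__c FROM {external_object} LIMIT 10"
print(soql)
```

To the report builder, list views, and Apex, this object looks native; only the __x suffix hints that the data never left the source system.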
Granite state #spug The #microsoftGraph and #SPFx on steroids with #AzureFunc..., by Vincent Biret
Slides supporting the session at the Granite State user group meeting of January 2019, also covering #Azure Active Directory and lots of other things.
With the new External Data Sources and External Objects feature, data repositories like SAP can be accessed in Salesforce by reference without the need for data replication. One of the standard connectors provided with External Data Sources is based on the Open Data protocol (OData) that provides for a standardized way of creating and consuming data APIs. Join us for an introduction to External Data Sources, External Objects, and the OData Connector. We'll cover both sides of data connectivity, including an overview of basic characteristics of External Objects, their setup and usage, as well as a survey of the OData ecosystem. You'll learn best practices for implementing OData-backed External Objects for integration solutions.
OData is an OASIS standard REST API and has been established among tech companies such as Microsoft, SAP, CA, IBM and Salesforce. In this presentation, Nishanth Kadiyala will speak about the following:
1. What is OData?
2. Why do we need OData?
3. Adopters of OData
4. Basics of OData
5. Evolution of OData and its limitations
6. How to produce OData?
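On the producing side, handling OData basics starts with picking the standard $-prefixed system query options out of a request's query string; a minimal sketch:

```python
from urllib.parse import parse_qs

# Sketch: a producer's first step -- separating OData's $-prefixed system
# query options ($top, $skip, $filter, ...) from other request parameters.
def parse_odata_options(query_string):
    """Return only the OData system query options as a flat dict."""
    raw = parse_qs(query_string)
    return {k: v[0] for k, v in raw.items() if k.startswith("$")}

opts = parse_odata_options("$top=5&$skip=10&$filter=Name eq 'Acme'&debug=1")
print(opts)
```

A real producer would translate these options into a SQL LIMIT/OFFSET/WHERE against the warehouse or mart, which is how the "produce OData" step in this list is typically implemented.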
You can create your own declarative developer frameworks for internal teams, partners, and customers. Rather than building apps from data, you can build apps that are defined and driven by their own types of metadata. Metadata is the information that describes the configuration of each customer’s organization.
Consuming Data From Many Platforms: The Benefits of OData - St. Louis Day of ..., by Eric D. Boyd
The amount of data stored today is growing at a rapid rate. However, data is only valuable if it is accessible and can be consumed by people and systems. OData is an open protocol for sharing data that is positioned to solve this problem. OData uses the standard HTTP protocol and REST principles to make data accessible, and has huge industry momentum with rapidly growing adoption. In this session, we will explore what OData is all about and how to expose relational and non-relational data as OData using WCF Data Services. We will then walk through developing apps that consume OData feeds from multiple clients, including mobile devices. Finally, we will take a look at how you can benefit from using Azure to publish your data with OData services.
Do you have a true Big Data Analytics platform? What is a true Big Data Analytics platform? How can it help you capitalize on big data? What's needed to build one? This short introductory presentation can help you understand what a true Big Data Analytics platform is and how it really helps in building Big Data Analytics applications.
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to release software to market, along with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work, along with a knack for helping others understand how things work. He brings around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses.
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Salesforce External Objects for Big Data
1. External Objects for External Big Data
Sumit Sarkar
Chief Data Evangelist
Progress
sumit.sarkar@progress.com
@SAsInSumit
2. Agenda
1. Intro to External Objects for Big Data
• What is Salesforce Connect?
• What is Big Data?
• What is OData?
2. How to access Big Data from Salesforce
3. Live Demo (fingers crossed)
4. Lessons Learned
External Objects for External Big Data
4. What is Salesforce Connect?
Salesforce Connect maps Salesforce external objects to data tables in external systems. Instead of copying the data into your organization, Salesforce Connect accesses the data on demand and in real time. The data is never stale, and we access only what you need.
Recommended when:
• You have a large amount of data that you don’t want to copy into your Salesforce organization.
• You need small amounts of data at any one time.
• You want real-time access to the latest data.
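Once the external data source is configured, an external object behaves like a custom object whose API name ends in `__x`, and it can be queried with ordinary SOQL. A minimal sketch of what such a query looks like at the REST API level; the instance URL, object name, and field names here are hypothetical:

```python
from urllib.parse import quote

def external_object_query_url(instance_url: str, api_version: str,
                              object_name: str, fields: list,
                              limit: int) -> str:
    """Build the REST API query URL for a SOQL query against an
    external object (API names of external objects end in __x)."""
    soql = f"SELECT {', '.join(fields)} FROM {object_name} LIMIT {limit}"
    return f"{instance_url}/services/data/{api_version}/query?q={quote(soql)}"

# Hypothetical external object mapped to a Hive table via OData
url = external_object_query_url(
    "https://example.my.salesforce.com", "v38.0",
    "Web_Clickstream__x", ["ExternalId", "Page_URL__c", "Visit_Date__c"], 10)
print(url)
```

The same SOQL can be issued from Apex or reports; under the hood each query triggers an on-demand callout to the external system rather than reading copied data.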
6. What is Big Data?
Salesforce uses Big Data technologies within its platform (HBase, Pig, Phoenix, etc.) to support features such as Einstein, or exposes them through different APIs such as BigObjects.
However, most organizations have their own Big Data technologies to analyze and expose data sets similar in scale or variety…
Don’t ETL Big Data sets into the platform!
7. How my team uses Big Data Technology
Enhance the Customer Journey:
• What support resources were consumed for each evaluation?
• What other product or solution pages were visited to create smarter targeted campaigns?
• What is the success score for leads and contacts evaluating products?
9. What is OData?
An open protocol that allows the creation and consumption of queryable and interoperable RESTful APIs in a simple and standard way.
• OASIS Standard REST API (“SQL for the web”)
• Ratified as an OASIS standard in February 2014
• Operations built on REST principles
• Uniform URL conventions
• Surfaces metadata in a standard way
First member to join the OData Technical Committee
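The “SQL for the web” analogy is concrete: OData’s uniform URL conventions map directly onto SELECT/WHERE/LIMIT. A small sketch of composing such a URL; the service root and entity set names are hypothetical:

```python
from urllib.parse import quote

def odata_query(service_root: str, entity_set: str, select=None,
                filter_expr=None, top=None) -> str:
    """Compose an OData query URL -- roughly SELECT/WHERE/LIMIT over REST."""
    params = {}
    if select:
        params["$select"] = ",".join(select)   # SELECT col1, col2
    if filter_expr:
        params["$filter"] = filter_expr        # WHERE ...
    if top is not None:
        params["$top"] = str(top)              # LIMIT n
    # Keep the $-prefixed system query option names literal, encode values
    query = "&".join(f"{k}={quote(v)}" for k, v in params.items())
    return f"{service_root}/{entity_set}" + (f"?{query}" if query else "")

url = odata_query("https://example.com/odata", "Leads",
                  select=["Name", "Score"],
                  filter_expr="Score gt 80", top=5)
print(url)
```

This is the style of URL Salesforce Connect issues behind the scenes when an external object is queried.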
12. How to produce OData from Big Data?
Use Salesforce Connect External Objects (introduced Winter ’15)
Options: Open Source Technologies | DataDirect Cloud | Apex Connector Framework
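Whichever route produces the feed, the wire format is simple: an OData v4 JSON response is essentially a `value` array of records plus a context annotation. A minimal sketch of the payload a hand-rolled service might emit; the service root and entity set are hypothetical, and a real service also needs `$metadata`, paging, and error handling:

```python
import json

def odata_response(service_root: str, entity_set: str, rows: list) -> str:
    """Wrap query results in a minimal OData v4 JSON envelope."""
    payload = {
        "@odata.context": f"{service_root}/$metadata#{entity_set}",
        "value": rows,  # the records themselves
    }
    return json.dumps(payload)

body = odata_response("https://example.com/odata", "Clickstream",
                      [{"Id": "1", "Page": "/products"},
                       {"Id": "2", "Page": "/pricing"}])
doc = json.loads(body)
print(doc["@odata.context"], len(doc["value"]))
```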
13. Connect Salesforce to Big Data
Success Scoring
Personalization
Archived Insight
360 Reporting
Corporate Firewall
17. Lessons Learned
1. Mapping OData entities to Big Data objects
2. Primary keys for Big Data entities
3. HiveServer1 vs HiveServer2 for concurrency
4. External Objects have limits and a 2-minute max timeout
5. Native Reporting support being added in Winter ‘17
6. Search considerations
7. Need agile OData service with Data Lake
8. Data Governance and Masking
9. CRM User Experience (strategies to improve performance)
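Several of these lessons (limits, the 2-minute timeout, Big Data result sizes) come down to never asking the external system for everything at once. A sketch of client-driven paging in the style of OData’s `$top`/`$skip`; `fetch_page` here is a stand-in for a real HTTP call:

```python
def fetch_page(dataset, top, skip):
    """Stand-in for an OData request like ...?$top={top}&$skip={skip}."""
    return dataset[skip:skip + top]

def paged(dataset, page_size):
    """Yield records page by page so no single request nears the timeout."""
    skip = 0
    while True:
        page = fetch_page(dataset, page_size, skip)
        if not page:
            return
        yield from page
        skip += page_size

rows = [{"id": i} for i in range(10)]
print(sum(1 for _ in paged(rows, 3)))  # all 10 rows, fetched 3 at a time
```

In practice Salesforce Connect handles paging for you, but the producing OData service must honor these options for it to work.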
Accessing external Big Data objects
18. Design Patterns for external objects
• Enable Separate Loading of Related Lists of External Objects
• Performance Tuning Tips for Related Lists in Account (article 000148978)
19. Decrease latency accessing Big Data over Hive
Stuff Salesforce devs don’t care about:
1. Use Apache Tez as the execution engine for Hive
2. Use ORCFile, the newer storage format
3. Use vectorized query execution (Hive 0.13)
4. Performance tuning (partitions, indexes, buckets, block sizes, etc.)
5. Consider another query interface (e.g., Apache HAWQ)
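Most of these Hive-side knobs are session settings issued before the query runs. A sketch of how a Python client might prepend them; the property names are typical for Hive 0.13-era deployments and should be verified against your Hive version, and the actual `cursor.execute` calls are left as a comment since they need a live HiveServer2:

```python
# Session-level Hive tuning a client might apply before each query.
HIVE_TUNING = [
    "SET hive.execution.engine=tez",               # 1. Tez instead of MapReduce
    "SET hive.vectorized.execution.enabled=true",  # 3. vectorized execution
]

def tuned_statements(query: str) -> list:
    """Return the tuning SETs followed by the actual query, in order."""
    return HIVE_TUNING + [query]

# Hypothetical query against a table stored as ORC (point 2)
stmts = tuned_statements("SELECT page, COUNT(*) FROM clicks_orc GROUP BY page")
for s in stmts:
    print(s)
# With a live HiveServer2 connection (e.g. via PyHive), each statement
# would be passed to cursor.execute(s) in turn.
```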