Three Steps to Accelerating Your Billing Reconciliation Process in Online Adv... – Connotate
Digital and mobile display companies in online advertising face significant challenges reconciling billing. The process of collecting usage statistics from disparate ad servers can be convoluted and prone to error.
Master Data Management - Aligning Data, Process and Governance – Precisely
Master Data Management (MDM) provides organizations with an accurate and comprehensive view of their business-critical data such as customers, products, vendors, and more. While mastering these key data areas can be a complex task, the value of doing so can be tremendous – from real-time operational integration to data warehousing and analytic reporting. This webinar will provide practical strategies for gaining value from your MDM initiative, while at the same time assuring a solid architectural and governance foundation that will ensure long-term, enterprise-wide success.
Amp Your Customer Service Statistics by Improving Data in Salesforce Service ... – Informatica Cloud
Your customers deserve great service, but nothing destroys goodwill more than long wait times, unanswered questions and being treated like a number. So how can you deploy CRM to fix these issues and keep delighting your best customers?
Salesforce is undoubtedly one of your most crucial CRM investments and one of today's most powerful cloud ecosystems — but poor deployment choices and massive app proliferation can introduce integration complexity that can dramatically impact the Salesforce data and ultimately drive down customer satisfaction scores. Join Clive Bearman, Director of Product Marketing at Informatica, Justin Donlon, Business Intelligence Solution Architect at Carbonite, and Mike McDermott, SVP of Business Development at Primitive Logic, as they discuss how to best tame the data integration complexity and amp your customer service.
Clive will discuss connectivity and integration scenarios across the Salesforce portfolio, not just Sales Cloud. Justin will explain how Carbonite attained an independent 9.5 out of 10 satisfaction score. Mike will conclude with practical implementation advice to enable you to deepen your relationships with your customers.
At the end of the webinar, you’ll understand where and when to use the most appropriate techniques, and how to score quick wins for better customer service.
Simply Business is a leading insurance provider for small businesses in the UK, and we are now expanding to the USA. In this presentation, I explain how our data platform is evolving to keep delivering value while adapting to a company that changes very fast.
Don't Be Afraid of the Dinosaur: Mainframe Integration and Offloading with Confl... – Precisely
Mainframes are still in widespread use, processing over 70 percent of the world's most important computing transactions every day. Very high costs, monolithic architectures, and a shortage of experts are the biggest challenges facing mainframe applications. It's time to become more innovative, even with the mainframe. Let's face the dinosaur together!
Mainframe offloading with Confluent, Apache Kafka, and the surrounding ecosystem can be used to keep modern data infrastructures synchronized with the mainframe in real time. Kafka enables both data processing and integration with systems such as data warehouses and analytics platforms. Via Change Data Capture (CDC), high-volume mainframe changes can be continuously pushed to Kafka.
In this on-demand presentation, Confluent and Precisely show how companies take this step toward legacy migration, save costs, create a scalable and open architecture, and thereby enable new services and applications.
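The CDC-to-Kafka flow described above can be sketched roughly as follows. This is a minimal illustration only: the change-record schema, field names, and in-memory "topic" are invented assumptions, not Confluent's or Precisely's actual formats, and a real pipeline would hand these messages to a Kafka producer.

```python
import json

def cdc_to_message(change):
    """Shape one CDC change record (illustrative schema) into a Kafka-style
    key/value pair: key = primary key (for partitioning), value = full event."""
    key = str(change["account_id"]).encode("utf-8")
    value = json.dumps({
        "op": change["op"],            # e.g. "INSERT", "UPDATE", "DELETE"
        "table": change["table"],
        "before": change.get("before"),
        "after": change.get("after"),
        "ts": change["ts"],
    }).encode("utf-8")
    return key, value

# Simulated stream of mainframe changes captured by a CDC tool
changes = [
    {"op": "UPDATE", "table": "ACCOUNTS", "account_id": 42,
     "before": {"balance": 100}, "after": {"balance": 80}, "ts": 1700000000},
]

# In production these messages would be sent to a Kafka topic; here we
# collect them in a list to show the shape of what gets produced.
topic = [cdc_to_message(c) for c in changes]
key, value = topic[0]
print(key, json.loads(value)["op"])
```

Keying each message by the source row's primary key means all changes to one record land in the same partition, preserving their order for downstream consumers.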
Kofax Connect: Making the First Mile of Business Smarter – Kofax
With the explosive growth of mobile devices, today's customers desire rapid response from the companies they do business with from the first moments of engagement. This presentation discusses how you can leverage mobile technology during these initial interactions – the First Mile – to gain competitive advantage, streamline business processes, and improve customer response time.
Hybrid Cloud Journey - Maximizing Private and Public Cloud – Ryan Lynn
This presentation walks through the elements of private and public cloud and how to start looking at use cases for hybrid cloud architectures. It covers benefits, statistics, trends and practical next steps for your hybrid cloud journey.
Live presentation of some of this content: https://www.youtube.com/watch?v=9_5yJr0HKw4&t=13s
Operational Process Analytics - Why traditional analytics and monitoring are ... – Elmar Weber
A talk from the Activiti Global User Day 2015 in Paris on operational intelligence: why it is an important topic, specifically for Business Process Management; why current BPM vendors don't cover it; and why the typical reaction of applying Business Intelligence methods is not enough. I then go into how Cupenya is solving this, and how easy it is to get started with the open-source Activiti process engine – one line of code provides real-time, predictive operational analytics to business users.
Modernize your Infrastructure and Mobilize Your Data – Precisely
Modernizing your infrastructure can get complicated really fast. The keys to success involve breaking down data silos and moving data to the cloud in real time. But building data pipelines to mobilize your data in the cloud can be time consuming. You need solutions that decrease bandwidth, ensure data consistency, and enable data migration and replication in real-time; solutions that help you build data pipelines in hours, not days.
Watch this on-demand webinar to learn about the trends and pitfalls related to modernizing your infrastructure to cloud, how the pace of on-prem data growth demands accelerating data streaming to analytics platforms, and why mobilizing your data for the cloud improves business outcomes.
You are not Facebook or Google? Why you should still care about Big Data and ... – Kai Wähner
Big data represents a significant paradigm shift in enterprise technology. Big data radically changes the nature of the data management profession as it introduces new concerns about the volume, velocity and variety of corporate data.
This session goes beyond the well-known examples of huge companies such as Facebook or Google with millions of users. Instead, it explains the "big" paradigm and technology shift for your company. See several use cases showing how big data enables small and medium-sized companies to gain insight into new business opportunities (and threats), and how big data stands to transform much of what the modern enterprise is today.
Learn about solving the unique challenges of big data without your own research lab or a team of big data experts. Learn how to implement the relevant use cases for your company at low cost and effort by using open source frameworks, which greatly simplify working with big data.
What if all members of your software development team – from Project Managers and Business Analysts to testing and documentation staff – could create and modify web applications and web services? With traditional SQL solutions this was difficult because of the need to convert web pages to objects and objects to tables, as well as the reverse. But now, with native XML databases and drag-and-drop forms builders, data can flow from the XML model of a web form to the database and back again without translation. This radically simpler process, combined with standardized query languages, makes it easier for non-programmers to build and maintain their own applications and web services.
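The "no translation" round trip described above can be illustrated with a short sketch. All names here are invented, and sqlite3 merely stands in for a native XML database: the point is that the form document is stored and retrieved whole, with no object or relational mapping in between.

```python
import sqlite3
import xml.etree.ElementTree as ET

# A web form submission already arrives as an XML document.
form_xml = "<contact><name>Ada</name><email>ada@example.com</email></contact>"

# Store the document whole, as a native XML database would.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE contacts (doc TEXT)")
db.execute("INSERT INTO contacts VALUES (?)", (form_xml,))

# Read it back and bind it straight into the form model again --
# no web-page-to-object or object-to-table conversion step.
(doc,) = db.execute("SELECT doc FROM contacts").fetchone()
contact = ET.fromstring(doc)
print(contact.findtext("name"), contact.findtext("email"))
```

Because the stored shape and the form's shape are the same document, a non-programmer editing the form does not force a schema migration on the storage side.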
Code Red for CODE-P: What’s a Customer Omnichannel Digital Experience Platfor... – Precisely
As customer communications management (CCM) continues to evolve rapidly toward a customer experience management (CXM) focus, we are seeing a new category emerge in the market. CODE-Ps, or Customer Omnichannel Digital Experience Platforms, enable organizations to take a holistic approach to managing all customer communications – transactional, servicing and marketing – while simultaneously creating a more seamless and cohesive end-customer experience. The impact of this shift is significant: 63% of bank customers, for example, say they would switch banks if communications don’t meet expectations. But combining legacy systems is not an easy task, as fragmented systems were not designed to evolve together, leaving you at risk of significant customer attrition.
With the recent announcement of the acquisition of CEDAR CX by Precisely, this new CODE-P category is taking shape, combining the expertise of hosted managed service platforms with legacy CCM service providers. And Aspire believes this cloud-based solution will help companies further fast-track changes and innovation in regulated communications. Join this Aspire-hosted webinar to hear from experts Kaspar Roos, founder & CEO of Aspire; Greg Van den Heuvel, EVP & GM of Precisely; and Richard Bishop, Sales EVP of Precisely, on the future of this space and the impact customer communications can make on the success of your CX.
This on-demand webinar presents real-world customer examples that illustrate the 4 key steps to close the gap between raw data, meaningful insights and real-time action, including:
- Collect and reconcile customer data about identities, profiles, purchase history, preferences, and transactions
- Transform and augment this data into a 360° view of the customer with context, intentions, relationships, and interactions
- Turn data into insights, with segments, scores, forecasts and recommendations
- Connect in real time to the customer touch-points and turn those insights into increased conversion rates and customer loyalty
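The first three steps above can be sketched in miniature. The records, field names, and scoring rule below are all invented for illustration and do not reflect any vendor's actual platform: a CRM row is reconciled with transaction history into a single profile, then turned into an actionable segment.

```python
# Hypothetical records for the same customer from two siloed systems
crm = {"customer_id": "C1", "name": "Ada Lovelace", "segment": None}
orders = [
    {"customer_id": "C1", "amount": 120.0},
    {"customer_id": "C1", "amount": 60.0},
]

def build_profile(crm_row, order_rows):
    """Steps 1-2: reconcile identity and augment the profile with
    transaction context to form a single 360-degree view."""
    total = sum(o["amount"] for o in order_rows)
    profile = dict(crm_row)
    profile["lifetime_value"] = total
    # Step 3: a toy scoring rule that turns the data into an insight
    profile["segment"] = "high-value" if total >= 100 else "standard"
    return profile

profile = build_profile(crm, orders)
print(profile["lifetime_value"], profile["segment"])
```

Step 4 (connecting in real time to touch-points) would then consume such profiles from whatever serving layer the organization uses.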
Liberate Your Data: Integrate Data From Traditional On-Prem Systems to Next-G... – Precisely
Hear from our data experts on how to modernize your data infrastructure and integrate siloed data into cloud-based enterprise data hubs. A modern data infrastructure helps to power high-performance analytics, AI and machine learning.
Watch this on-demand webinar to learn how to liberate your data from traditional on-premise data systems and migrate it to next-gen cloud platforms, such as Snowflake.
See a live demonstration of Connect which has already helped thousands of organisations to liberate their data in order to improve customer experience, reduce churn, lower risk, and meet regulatory requirements.
SOPRIS Technologies' David Stevenson (Chief Strategy Officer) presents Zenoss as Core Element for Video QOS.
Access the full presentation recordings for GalaxZ17 here: http://ow.ly/WyBu30cakk0
The Heart of Data Modeling: The Best Data Modeler is a Lazy Data Modeler – DATAVERSITY
We're under pressure to do more with fewer resources. And organizations are often short on experienced data modelers. So why should we spend time doing things that can be done by robots? Well, not robots, but automation.
In this month's webinar, Karen demonstrates the types of automation techniques available in leading data modeling tools such as ERwin, ER/Studio and PowerDesigner. She will also leave you with 10 tips on being lazier. When did a webinar last promise that?
Preparing for Major Disruptions in Digital Asset Management – Nuxeo
Nuxeo's guest speaker, Anjali Yakkundi of Forrester Research, Inc., discusses the latest trends in digital asset management (DAM) and how to select a DAM vendor.
How to Streamline Complex, Data-Intensive SAP Materials and Product Data Proc... – Precisely
Drive success in 2022 with automation
More than ever, product success relies on complex, data-intensive processes to compete in today’s highly dynamic marketplace. Driving this complexity is an increasing reliance on SAP product data and the processes that create, manage, and use materials, customer, and vendor data.
In this on-demand webinar, we will discuss how Winshuttle (now a part of Precisely), the leader in data integrity, can help you:
- Automate manual SAP tasks to deliver quick results for mass data management challenges
- Automate complex, SAP-centric product data processes to help you get products to market faster while achieving higher data quality and better process governance
Do You Trust Your Machine Learning Outcomes? – Precisely
How to improve trust in advanced analytics, AI, and machine learning
With the volume, velocity, and variety of data coming into the enterprise, IT teams are turning to artificial intelligence and machine learning to improve the efficiency and accuracy of their data management processes. But if you have underlying data integrity challenges, and you’re using that faulty data to train your machine learning algorithms, your machine learning is now fueled by faulty data. How does that impact your business decisions?
View this on-demand webinar with Dr. Tendü Yoğurtçu, Precisely CTO, for an informative discussion examining various use cases for machine learning and advanced analytics. We will also explore the root causes of data integrity challenges, including:
- Poor data quality
- Data silos
- Lack of context that enriches the understanding of your data
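One practical consequence of the three root causes above is that training data should be profiled before it ever reaches a model. Below is a minimal sketch of such a check; the field names, the `region` enrichment, and the report structure are all invented for illustration, not part of any vendor's product.

```python
def integrity_report(rows, required=("age", "income")):
    """Profile candidate training rows for the three issues named above:
    missing values (poor quality), fields absent from a silo,
    and rows lacking enriching context."""
    report = {"missing_values": 0, "missing_fields": 0, "no_context": 0}
    for row in rows:
        for field in required:
            if field not in row:
                report["missing_fields"] += 1   # silo never supplied it
            elif row[field] is None:
                report["missing_values"] += 1   # supplied but empty
        if not row.get("region"):               # enrichment, e.g. geography
            report["no_context"] += 1
    return report

rows = [
    {"age": 34, "income": 55000, "region": "EMEA"},
    {"age": None, "income": 48000},        # missing value + no context
    {"age": 29, "region": "APAC"},         # income absent in this silo
]
print(integrity_report(rows))
```

Gating model training on such a report (for example, rejecting a batch whose counts exceed a threshold) is one simple way to keep faulty data from silently fueling the algorithm.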
A quick presentation that explains Import.io. The presentation outlines the problem that Import.io solves using screenshots of the product and three real life use cases from different industries: health, retail and recruitment.
Power Up Competitive Price Intelligence with Web Data – Connotate
Unprecedented price transparency has shifted the balance of power to the consumer, compressing margins and shattering the strongholds of premium brands.
Netbiscuits recently made free versions of its Mobile Analytics and Device Detection tools available to businesses of all sizes. The following presentation provides information on the product, its place in the market and examples of specific use cases.
CGI probes are sent against web servers. This tool provides the ability to turn them off, and if the user is running an audit from a proxy server, they can configure the scanner to run CGI probes through that proxy.
Different hats, same needs: Marketing compliance from doctor’s office to boar... – Prolifiq Software
At the Marcus Evans 2011 Pharma Marketing Summit on May 4th, GPP blogger Jonathan Sackier presented a physician’s take on Good Promotional Practices for pharmaceutical and medical device sales, including four actionable strategies for marketing compliance.
From prep to racing, see how managing your business is a lot like driving a racecar.
- Check fuel, tire pressure, electrical, etc.
- Test brakes, transmission, steering, etc.
- Warm up the engine
- Review plan of attack
- Team review of plan, roles, responsibilities, and goals (metrics)
- Play position strategy
- Know when to draft and when to pass
- Anticipate turns – start high, move to low and tight
- Anticipate obstacles
- Watch the gauges – frequent fast reads
- Drive with “field awareness”
- Avoid desperate moves – leave that to the rookies
Using Web Data to Drive Revenue and Reduce Costs – Connotate
This presentation is designed to help companies strengthen their competitive advantage by leveraging publicly available Web sources.
Entrepreneurs, global industry leaders and enterprises of all sizes are turning Web data into lucrative opportunities – creating new revenue-generating products, reducing costs and re-engineering workflows to optimize pricing, streamline reporting, ensure compliance, engage interactively with clients and more.
This presentation uses a variety of success stories to illustrate ways in which businesses can use Web data to drive revenue and streamline operations.
Maximize ROI of Insurance Digital Transformation Initiatives with Proven Data... – Precisely
Many insurance carriers are transforming the way they do business by deploying new software technologies, migrating data and services to the cloud, and leveraging artificial intelligence (AI) to speed decision-making. Data is at the heart of all these initiatives, and it has a direct impact on success or failure. When that data is integrated into upstream or downstream processes, it can also have a broader impact on the operational, analytical, and compliance needs of the organization. The traditional, and often ad-hoc, tools and processes that organizations employ to support data quality, data integrity, transaction reconciliation, and exception management are often inadequate. They do not provide the speed, technical agility, and intelligence demanded by digital transformation initiatives.
Join us to explore proven methods of how insurance carriers are maximizing ROI and minimizing the time-to-value of digital transformation initiatives by:
• Aligning data governance with organizational and project objectives to reduce implementation effort and duration
• Leveraging automated controls for data quality, including balance and reconciliation of data in motion to avoid operational disruptions and maintain regulatory compliance
• Increasing efficiency and capability through a centralized data integrity solution
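The balance-and-reconciliation control mentioned above can be sketched roughly as follows. The record layout and the specific checks are illustrative assumptions, not Precisely's actual product behavior: a batch landing in the target system is compared against the source on record counts, monetary totals, and missing IDs, with mismatches surfaced as exceptions.

```python
def reconcile(source_batch, target_batch):
    """Balance-and-reconciliation control: compare record counts and
    monetary totals between source and target, and flag missing records
    for exception management."""
    src_total = round(sum(t["amount"] for t in source_batch), 2)
    tgt_total = round(sum(t["amount"] for t in target_batch), 2)
    src_ids = {t["id"] for t in source_batch}
    tgt_ids = {t["id"] for t in target_batch}
    return {
        "counts_match": len(source_batch) == len(target_batch),
        "totals_match": src_total == tgt_total,
        "missing_in_target": sorted(src_ids - tgt_ids),
    }

source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 25.5}]
target = [{"id": 1, "amount": 10.0}]   # record 2 was dropped in flight
print(reconcile(source, target))
```

Running such a check per batch (rather than end-of-day) is what makes the control suitable for data in motion: the dropped record is caught before downstream processes consume the incomplete batch.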
Power Up Your Competitive Price Intelligence With Web Data – Connotate
Unprecedented price transparency has shifted the balance of power to the consumer, compressing margins and shattering the strongholds of premium brands. If you are a manufacturer, distributor or retailer, how can you respond to this challenge?
This presentation will reveal strategies for regaining control and improving margins – with real-world success stories that clearly illustrate how manufacturers and retailers are gaining greater visibility into pricing and product positioning throughout the entire supply chain.
Foundational Strategies for Trust in Big Data Part 1: Getting Data to the Pla... – Precisely
Teams working on new business initiatives, whether for enhancing customer engagement, creating new value, or addressing compliance considerations, know that a successful strategy starts with the synchronization of operational and reporting data from across the organization into a centralized repository for use in advanced analytics and other projects. However, the range and complexity of data sources as well as the lack of specialized skills needed to extract data from critical legacy systems often causes inefficiencies and gaps in the data being used by the business.
The first part of our webcast series on Foundational Strategies for Trust in Big Data provides insight into how Syncsort Connect, with its "design once, deploy anywhere" approach, supports a repeatable pattern for data integration by enabling enterprise architects and developers to ensure data from ALL enterprise data sources – from mainframe to cloud – is available in downstream data lakes for use in these key business initiatives.
Increase Profits with Better Vehicle Listing DataConnotate
Auto dealers and providers of dealer support systems face big challenges obtaining vehicle listing data at an affordable price. Licensed data feeds are very expensive, and using internal IT resources to “do it yourself” creates operational headaches. Connotate’s Vehicle Listings webinar presents an innovative solution.
Predictive Conversion Modeling - Lifting Web Analytics to the next levelPetri Mertanen
Annalect presentation at Superweek 2017: Predictive Conversion Modeling - Lifting Web Analytics to the next level. Presented by Petri Mertanen, Director of Digital Analytics and Ron Luhtanen, Data Science Analyst. #SPWK
Analytics in the Cloud and the ROI for B2BVeronica Kirn
Veronica Kirn, Global Market Manager, presents the shift in analytics, with Jeannine Calandra providing in-depth product specifics for B2B Services Reporting & Analytics. This was presented at the InterConnect event in Las Vegas, NV for technology professionals interested in addressing their Business to Business (B2B) need for turning data into insight.
OpenWorld: 4 Real-world Cloud Migration Case StudiesDatavail
In this presentation, get answers to these questions and more by exploring four different successful real-world Oracle EPM Cloud migration and implementation case studies for Oracle Enterprise Planning and Budgeting Cloud Service, Oracle Financial Consolidation and Close Cloud Service, and Oracle Account Reconciliation Cloud Service. Attendees get a birds-eye view into the practicalities of moving to the cloud and making the business case for their own company.
ROI and Economic Value of Data VirtualizationDenodo
Watch full webinar here: https://bit.ly/3oaKSzu
Gartner has predicted that organizations using Data Virtualization will spend 40% less on data integration than those using traditional technologies. Denodo customers have experienced time-to-deliver improvements of up to 90% within their data provisioning processes and cost savings of 50% or more. Join us for this webinar to discover how Data Virtualization can help accelerate your time-to-value from data while reducing the costs at the same time. As Rod Tidwell (Cuba Gooding Jr.) said in the movie 'Jerry Maguire', "Show me the money!"
Register to attend and learn how Data Virtualization can:
- Accelerate the delivery of data to users
- Drive digital transformation initiatives
- Reduce project costs and timelines
- Quickly deliver value to your organization
Gain a Holistic View of your Customer's JourneyPlatfora
Today, companies are capturing information about customers at every touchpoint, but the reality is that most companies are working with siloed marketing data because they’re using disparate tools to track online, offline, web, social, mobile, and advertising data.
In this presentation, Rod Fontecilla, VP of Application Modernization at Unisys, explains how his team uses Platfora to analyze, interact and understand data to drive customer success at Unisys.
Rod will highlight three specific Unisys use cases of Platfora, one of which involved a timely text survey sentiment analysis that produced insights enabling a course correction in favor of improved customer satisfaction.
Making the Case for Legacy Data in Modern Data Analytics PlatformsPrecisely
Modern data analytics platforms that fuel enterprise-wide data hubs are critical for decision making and information sharing. The problem? Integrating legacy data stores into these hubs is just plain hard, and there is no magic bullet. However, the best data hubs include ALL enterprise data.
So how can you ensure that you are building the best modern data analytics platform possible?
Join this webinar to learn more on:
- Best practices for integrating legacy data sources, such as mainframe and IBM i, into modern data analytics platforms such as Cloudera, Databricks, and Snowflake
- How Syncsort Connect customers are incorporating legacy data sources into enterprise data hubs to inform strategic use cases such as claims, banking, and shipping experiences
Watch full webinar here: [https://buff.ly/2R4JjBX]
Organizations today are data rich and insights poor. There is data everywhere: ERP systems, CRM systems, external data, data lakes and ponds. The real question to ask is "Are users getting the insights they need, when they need them, where they need them, to drive successful business outcomes?" Data integration is a core pillar of the "data to value" journey. In this session you will hear how enterprises across industries are grappling with data and insights challenges, and how organizations have adopted data virtualization to accelerate their "data to value" journeys.
Watch this Denodo DataFest 2018 session to learn:
How to reduce effort to get from data to value
How to gain faster time to insights
How to reduce overall cost of ownership
How I developed an analytics story for services about four years back. Contains a
maturity model, business potential, services structures, and areas that analytics can be applied to
20/10 Vision: Building A 21st Century Market Research OrganizationGregory Weiss
A strategic vision to create a 21st century market research organization, leveraging technology to provide value-added services and get more return from staff research efforts
Digital World Class Performance of O2C Shared Services | Order To Cash (O2C) ...Emagia
Digital World Class Performance of O2C Shared Services | Order To Cash (O2C) Automation
https://www.emagia.com/resources/ebooks/digital-world-class-performance-of-o2c-shared-services/
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. However, fostering a culture of innovation takes real work. It takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
DevOps and Testing slides at DASA ConnectKari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We wrapped up with a lovely workshop in which the participants tried to find different ways to think about quality and testing in different parts of the DevOps infinity loop.
Key Trends Shaping the Future of Infrastructure.pdfCheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open source: exploring how these areas are likely to mature and develop over the short and long term, and considering how organisations can position themselves to adapt and thrive.
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation Introduction,
UI automation Sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024Tobias Schneck
As AI technology pushes into IT, I found myself wondering, as an "infrastructure container Kubernetes guy", how this fancy AI technology gets managed from an infrastructure operations view. Is it possible to apply our lovely cloud native principles as well? What benefits could both technologies bring to each other?
Let me take these questions and provide you a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need to apply to our own infrastructure to get it working from an enterprise perspective. I want to give an overview of infrastructure requirements and technologies, and of what could be beneficial to or limiting for your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I have already got working for real.
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
JMeter webinar - integration with InfluxDB and GrafanaRTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
Using Web Data to Drive Revenue and Reduce Costs
1. Using Web Data to Drive Revenue and Reduce Costs
Presenters: Keith Cooper, CEO, Connotate; Christian Giarretta, VP of Sales Engineering, Connotate
Moderator: Gina Cerami, VP of Marketing, Connotate
Date: March 12, 2013
3. Today’s Discussion
• Why Web Data?
  • Drive revenue
  • Reduce costs and streamline processes
• Automation Options
• Scoping Your Project
  • Five steps to success
• Evaluating Providers
  • Five questions to ask
• Q&A
4. The Web Provides the Largest Source of Data Ever Assembled…
News sites, government sites, job boards, financial sites, corporate sites, eCommerce sites, regulatory sites, retail sites, social forums, healthcare sites, state and local court sites
5. …and the Data Continues to Grow and Change at Unprecedented Rates
• 1.2 zettabytes of new digital content created in 2011* (zettabyte = 1B terabytes)
• The Internet will double in size every 5.32 years**
* IDC’s The Digital Universe
** PhysOrg.com
6. In Order to Use All of That Information…You Need to Find It, Filter It and Format It…
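The find/filter/format pipeline the deck describes can be sketched with nothing but the Python standard library. In this toy version the embedded HTML stands in for a fetched page, and all product names and prices are invented:

```python
import csv
import io
from html.parser import HTMLParser

# Stands in for HTML fetched from a target site ("find it").
PAGE = """
<table>
  <tr><td>Camera A</td><td>$199.00</td></tr>
  <tr><td>Camera B</td><td>$249.00</td></tr>
</table>
"""

class CellExtractor(HTMLParser):
    """Filter: keep only the text inside <td> cells."""
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False

    def handle_data(self, data):
        if self.in_cell:
            self.cells.append(data.strip())

parser = CellExtractor()
parser.feed(PAGE)

# Format: pair the cells into (product, price) rows and emit CSV.
rows = list(zip(parser.cells[::2], parser.cells[1::2]))
out = io.StringIO()
csv.writer(out).writerows([("product", "price")] + rows)
print(out.getvalue())
```

Production-grade extraction has to survive page redesigns, logins, and pagination, which is exactly why the deck contrasts brittle scrapers with more robust automation later on; this sketch only illustrates the shape of the problem.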
7. …Then You Can Turn Web Data into Profits
Sample Use Cases:
Competitive intelligence
News aggregation
Background check
Price optimization
Investment research
Online ad usage reports
Market research
Regulatory updates
Sales intelligence
Business risk assessment
Data directories
Aggregate construction bids
Supply chain monitoring
Brand monitoring
Voice of the Customer
Social media monitoring
9. Deliver High-Value Directories: HG Data
• Challenge/Opportunity
  • Build the largest, most accurate database of B2B tech customer intelligence
  • Combine public and private content in unique ways to reveal new insights
• Solution: Use automation to cost-effectively extract business intelligence from millions of Web documents
  • 10,000+ agents built to date
  • Highly granular database of 1M+ profiles of enterprise technology users
• Business Benefit
  • Successful go-to-market: disruptive technology replaces manual process
  • Extracting new value in a business area long hobbled by stale sources
10. Reveals Customer Relationships Between Business Entities: HG Data
[Slide diagram: a relationship map connecting business entities – e.g., Qualcomm, AT&T, Verizon, Vodafone, Swisscom, Kyocera, Fujitsu, Panasonic, Hitachi, Canon Inc., Nalco, Avnet Technology Solutions, Imation – by relationship type: supplier to customer, value-added reseller, distributor, strategic alliance]
11. Gain Transparency Mid-Quarter to Better Predict Company Performance: Financial Firm
• Challenge/Opportunity
  • Gain daily/weekly/monthly visibility into inventory/sales of companies and market segments where data is made public only on a quarterly basis
• Solution: Continually monitor available inventory and other data posted on websites in those markets
  • Use automation to capture precise indicators on an ongoing basis
  • Analyze trends and make predictions
• Business Benefit
  • Transparency supports more accurate predictions of financial results to support smarter investment decisions
12. Gain Transparency Mid-Quarter: Web Page
Camera sales:
• Check camera prices daily
• Full sweep of camera inventory weekly
• Map trends, spot anomalies
• Compare one or two targeted suppliers to overall segment averages
14. Enhance Risk Assessment: Business Information Provider
• Challenge/Opportunity
  • Deliver updates/alerts on changes in the assigned risk status of counterparties to a financial transaction instead of just producing a static report
• Solution: Use automation to monitor websites for updates
  • Monitors sites for changes that affect a business entity’s assigned risk status – mergers, acquisitions, bankruptcies, de-listings, regulatory changes, sanctions
• Business Benefit
  • First to market with a risk assessment service offering continual monitoring
  • Fresh Web data is integrated into customer (financial institution) workflow – enhancing customer “stickiness”
  • Automated Web data extraction solution delivered a 6-month payback
17. Price Optimization: Sigma-Aldrich
• Challenge/Opportunity
  • Optimize product positioning in a B2B market where buying decisions can be motivated by a few dollars or cents
  • Competitors’ prices are changing constantly
• Solution: Replace manual spot checking of prices with precise automated Web data extraction
  • Continually extracts sizing/pricing on more than 150,000 products
  • Acquired usable data for historical trend analysis
• Business Benefit
  • Optimizes prices to improve profit margins
  • Reduced manpower devoted to data collection by 50%
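As an illustration of what a continuous competitor-price sweep feeds into, here is a toy sketch that flags day-over-day price moves large enough to warrant repositioning. The SKUs, prices, and threshold are invented for illustration; this is not Sigma-Aldrich's actual process:

```python
def price_moves(yesterday, today, threshold=0.01):
    """Flag products whose competitor price moved by more than
    `threshold` (as a fraction), so our own price can be repositioned."""
    moves = {}
    for sku, old in yesterday.items():
        new = today.get(sku)
        if new is None:
            continue  # product no longer listed; handle separately
        change = (new - old) / old
        if abs(change) >= threshold:
            moves[sku] = round(change, 4)
    return moves

# Hypothetical extracted prices from two consecutive daily sweeps.
yesterday = {"ACS-100": 52.00, "ACS-200": 18.40, "ACS-300": 7.95}
today     = {"ACS-100": 49.40, "ACS-200": 18.40, "ACS-300": 8.15}
print(price_moves(yesterday, today))
```

Accumulating these snapshots over time is what makes the historical trend analysis mentioned above possible: the same extracted data serves both the daily repricing decision and the longer-term model.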
18. Price Optimization Pays Off
Increase revenue 2%–4% ($8.75M – $16B for a Fortune 500 company)
2012 study: Companies that successfully implement price optimization will realize a 2 to 4% improvement in total revenue
20. Automated Records Check Improves Speed and Accuracy: Tandem Select
• Challenge/Opportunity
  • Criminal records are highly structured; accuracy and reliability are key for people making hiring decisions
  • Deliver guaranteed turnaround time on accurate checks without adding staff
• Solution: Replace manual processes to extract records directly from court websites on demand
• Business Benefit
  • Average background check time reduced from hours/days to minutes
  • Much better quality – far fewer errors – guaranteed turnaround time
  • In 12 months, order fulfillment increased 62% while operating expenses decreased $150,000
21. Automated Records Check (FetchCheck): Tandem Select
1. Standard customer order at Tandem site
2. Tandem’s application calls Connotate/FetchCheck with a Web services request
3. Agent extracts, transforms and normalizes data
4. Information is returned to the calling application
The process takes between 6 and 20 seconds to complete.
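The extract/transform/normalize round trip in that flow can be sketched as a small function pipeline. Every field name and value below is invented for illustration and is not taken from FetchCheck or any court system:

```python
def extract(raw_html):
    """Stand-in for agent extraction from a court site's HTML.
    A real agent would parse the page; here we return a fixed record."""
    return {"NAME": "DOE, JOHN", "DISP": "dismissed", "DT": "03/12/2013"}

def normalize(record):
    """Transform site-specific fields into the schema the calling
    application expects (hypothetical schema)."""
    month, day, year = record["DT"].split("/")
    last, first = [p.strip().title() for p in record["NAME"].split(",")]
    return {
        "first_name": first,
        "last_name": last,
        "disposition": record["DISP"].capitalize(),
        "date": f"{year}-{month}-{day}",   # ISO 8601
    }

def records_check(raw_html):
    # Full round trip: extract, normalize, return to the caller.
    return normalize(extract(raw_html))

print(records_check("<html>…</html>"))
```

Normalization is the step that buys the accuracy gains the case study cites: every court site formats names and dates differently, and mapping them into one schema is what lets the calling application consume results without manual cleanup.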
22. Improve Revenue Collection Processes with Accurate Reporting: Interactive Advertising
• Challenge/Opportunity
  • Billing reconciliation was taking weeks/months (14 people overseeing daily data collection, 5 days/week)
  • Usage data posted on multiple password-protected Web sites (portals)
• Solution: Automated Web data collection accesses portals for highly accurate reporting and billing
  • Reported data is 100% error-free; data is collected 365 days/year
• Business Benefit
  • Quality data supports timely, accurate billing (reconciliation in days)
  • Aggregated views enable ad placement optimization, increasing customer ad revenue 30–300%
23. Web Page is Transformed into Usable Data
1. Navigates the portal
2. Precisely captures statistics
3. Turns data into Excel
27. A Closer Look at Different Approaches
• Manual offshore: No economies of scale; human error compromises quality.
• Crowdsourcing: A viable approach for complex tasks like product matching of apparel for one-shot projects; may be less reliable for ongoing monitoring and long-term projects.
• In-house or low-cost Web scrapers: Not resilient; scrapers break when Web page HTML changes, and expensive programmers must fix scripts, increasing total cost of ownership (TCO).
• Robust automation installed on-premise: High degree of control; better resiliency to change – reduced TCO; however, project complexity and future needs may indicate a hosted solution is better.
• Robust solution hosted by vendor: Highest resiliency; no maintenance burden – reduced TCO; 24/7 follow-the-sun support; infinitely scalable, with no capital expenditures for hardware or IT resources.
28. Manual versus Automated Approaches
Your data needs – to automate or not?
• High-volume data monitoring: Automate
• Variety of sources: Automate
• Frequent updates and/or monitoring: Automate
• Need for data post-processing: Automate
• Small amount of data required just a few times a year from very simple sites: A manual approach may be adequate
• One-time feed of very specific data: Purchase data from a 3rd party
• Product matching applications where unique identifiers are not available: May want to consider crowdsourcing
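The decision table above boils down to a handful of rules. A minimal sketch (the flag names and return labels are mine, not Connotate's):

```python
def recommend(high_volume=False, many_sources=False, frequent_updates=False,
              needs_post_processing=False, one_time_feed=False,
              fuzzy_product_matching=False):
    """Map data-need characteristics to a collection approach,
    mirroring the manual-versus-automated decision table."""
    if one_time_feed:
        return "purchase from third party"
    if fuzzy_product_matching:
        return "consider crowdsourcing"
    if (high_volume or many_sources or frequent_updates
            or needs_post_processing):
        return "automate"
    # Small, infrequent pulls from simple sites fall through here.
    return "manual may be adequate"

print(recommend(high_volume=True, frequent_updates=True))
```

The rule ordering matters: a one-time feed or a fuzzy-matching task overrides the automation triggers, just as the table treats those rows as exceptions to the default "automate" answer.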
29. Polling Question: Web Data Collection
Are you currently collecting data from the Web?
Yes – we are doing this using an automated process
Yes – we are collecting Web data using a manual process
Yes – we are using BOTH manual and automated approaches
No – we are not collecting Web data
31. Scoping Your Project: 5 Steps to Success
1. Clarify what you want to do with the data
2. Look at what’s happening manually today – find out how users are accessing the Web – these are targets for automation
3. Identify the sources you need
4. Narrow your scope…you may not need “everything”
5. Anticipate future requirements
33. Evaluating Providers: 5 Questions to Ask
• Can it scale up easily and quickly? Look for a proven ability to handle high-volume, high-frequency applications without draining your IT resources.
• Is it resilient? Can it withstand website formatting changes – or will it “break,” requiring code fixes?
• How does it detect/deliver updates? You’ll save time and money with change detection with highlighting – the ability to detect and deliver “just the changes.”
• Does it support my operational workflows? Built-in job scheduling, shared resource access and other features can increase efficiencies and coordinate workflow.
• What are the deployment options? Flexible options for on-premise and hosted solutions should adapt to your needs.
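“Detect and deliver just the changes” is, at its core, a diff between successive captures of the same source. A minimal sketch using Python's standard library, with invented sample content:

```python
import difflib

def just_the_changes(previous, current):
    """Return only the lines added or removed between two captures
    of the same source, rather than re-delivering the whole page."""
    diff = difflib.unified_diff(
        previous.splitlines(), current.splitlines(), lineterm="", n=0
    )
    # Keep additions/removals; drop the ---/+++/@@ bookkeeping lines.
    return [line for line in diff
            if line[:1] in "+-" and line[:3] not in ("+++", "---")]

# Hypothetical successive captures of a monitored risk-status page.
before = "ACME Corp\nRisk rating: A\nStatus: listed"
after  = "ACME Corp\nRisk rating: B\nStatus: listed"
print(just_the_changes(before, after))
```

A production change-detection service would diff structured extracted fields rather than raw text, so that cosmetic page reformatting does not register as a change; the principle of delivering only the delta is the same.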
34. Polling Question: The Value of Automated Web Data Collection
Do you believe automating Web data monitoring and extraction could add value to your business?
Yes – we are doing this now
Yes – we are planning a project in the near future
No – not at this time
I need more information before deciding
35. Here’s What Success Looks Like…
• Create new and enhanced products and services faster
• Predict company and market performance faster, better: gain transparency into non-transparent markets
• Monitor competitor prices to optimize product positioning
• Automate reporting for timely, accurate revenue collection
…Connotate’s experts are ready to take you there
36. Q & A
Connotate will email a link to this presentation as well as a copy of the slides to you within 2 business days.
If you have an immediate need and would like us to contact you about a forthcoming project, please check the appropriate box in the last polling question or call (+1) 732-296-8844.
For more information, you may also visit www.connotate.com or www.connotate.co.uk.
37. Thank You
If you have an immediate need and would like us to contact you about a forthcoming project, please check the appropriate box in the last polling question or call (+1) 732-296-8844.
For more information, visit www.connotate.com or www.connotate.co.uk.
Editor's Notes
<Gina> Welcome to today’s presentation, “Using Web Data to Drive Revenue and Reduce Costs.” My name is Gina Cerami, and I’m the Vice President of Marketing here at Connotate. This presentation seeks to explore how companies can strengthen their competitive advantage by leveraging publicly available Web sources. This presentation will last approximately 30 minutes, followed by a live question and answer session. You may submit your questions anytime during the session using the Chat feature. Also, during the presentation, we will pose several survey questions which you may answer using the Polling feature that will appear when we open the survey.
<Gina> We still have some folks logging in. So, while we’re waiting, I’d like to take a moment to provide a bit of background about Connotate. Our heritage is in leading-edge research conducted at Rutgers University, funded in part by DARPA. For over a decade, our focus has been to discover the most efficient ways to extract value from Web data. Connotate is an expert in this field. Since 2000, we have been helping global clients like the Associated Press, Thomson Reuters, Dow Jones and many others leverage Web data for strategic advantage. Today, we will share best practices that we’ve developed over the years. We hope this information will help you get more value out of any Web data project you may attempt, either now or in the future. Our presenters today are Keith Cooper, CEO of Connotate, and Chris Giarretta, Vice President of Sales Engineering at Connotate.
<Gina> To start off our discussion today, Keith Cooper will share use cases illustrating the variety of ways in which organizations are using Web data to make money and save money. These use cases span a wide spectrum of vertical industries including financial services, biochemicals, background screening, online advertising and the information industry. Chris Giarretta will then delve into some of the technical aspects of using Web data, including options for automating Web data collection processes, questions you should ask your vendor and best practices for tackling a Web data extraction project. Throughout the presentation, we will be conducting a series of polling questions to ask you how you are using Web data today and what you are planning to do in the future. Before I turn the presentation over to Keith, let me remind the audience that there will be live Q&A at the end of the presentation. If you would like to submit a question, please use the CHAT feature provided by Webex. You can submit your questions at any time during the presentation. Now I will turn the presentation over to Keith Cooper.
<Keith>
<Keith> (Some suggested talking points – Keith, please modify as you see fit) – “90% of the data in the world today has been created in just the past two years.” Gina got the number from an IBM citation: http://www-01.ibm.com/software/info/rte/bdig/index-pre.html?S_TACT=101MY87W
<Keith> Here’s a quick overview to set the stage. We start with a Web site, on the left, and go through a process that we call Data Extraction. The data may initially be in HTML, PDF or images – not usable by a computer. Once we extract the data, we transform it into something usable such as XML or Excel files. At that point, it’s ready to be used by your applications to generate revenue or reduce costs, as you’ll see in the next few examples. We’ll come back and revisit this process in more detail after we talk about some real world case studies.
<Keith> There are many use cases…today we’ll touch on just a few of them.
<Keith> The first cases will focus on companies who are experiencing revenue growth derived from products and services that incorporate Web data.
<Keith> Generating sales leads is a big business. The higher quality the lead, the more it is worth. HG Data is building a business around this concept. Let’s say a plain vanilla B2B sales lead is worth $1 – just contact information for Joe (name, email, title). Now, append distinct attributes to that lead such as: --Joe’s years of job experience managing Oracle deployments and his background in retail --Joe’s connection to a CRM reseller who publishes a customer win announcement mentioning Joe’s employer --Joe’s employer posting job openings in his department From this, one can deduce that Joe may be heading up a long-term CRM project and is a likely candidate for software or services that support Oracle-based CRM projects. The value of that lead increases 5- or 10-fold by appending sales intelligence attributes to it. That’s what HG Data does, but at a huge scale. The result is a database of millions of profiles of enterprise technology usage, a highly granular resource for technology vendors.
<Keith> To produce this high-value directory of sales leads, HG Data leverages millions of sources – including corporate postings, press releases, articles etc. They are using publicly available Web data, along with licensed content. By applying artificial intelligence to map the connections and by continually refreshing the model with updated Web data, HG Data creates a dynamic information service that it can sell to other businesses who find it extremely valuable.
<Keith> This next example is for the investors in the audience. Everyone is trying to outsmart the market. The trick is to do this with publicly available data. Smart investors are doing this today. Organizations and government agencies continually publish data on the Web that provide enough information to accurately predict company or sector performance well in advance of published numbers. However, it takes a lot of time to manually grab that data. An automated process makes it feasible.
<Keith> Let’s say you want to track camera sales. You can capture pricing data on a daily or weekly basis; depending on your analysis model you may want to do a full sweep of inventory or just samples.
<Keith> …Using automation, the specific data you wanted from the Web page is now in an Excel file that can be charted and trended over time. With this, you can:
--Build unique time-series data sets for predictive analysis
--Feed the data into your proprietary modeling and analysis tools
--Outsmart the market!
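A time series of that sort could be assembled along these lines; the dates, products, and prices below are made up for the example:

```python
# Toy illustration of building a price time series from periodic sweeps.
# Dates, products, and prices are invented for the example.
from statistics import mean

# Each sweep: (date, {product: price}) as captured from retailer pages.
sweeps = [
    ("2013-01-07", {"Camera A": 199.00, "Camera B": 249.50}),
    ("2013-01-14", {"Camera A": 189.00, "Camera B": 244.00}),
    ("2013-01-21", {"Camera A": 185.00, "Camera B": 239.00}),
]

def average_price_series(sweeps):
    """Return [(date, mean price across sampled products)] for trending."""
    return [(date, round(mean(prices.values()), 2))
            for date, prices in sweeps]
```

Whether you sweep full inventory or just samples, the output is the same kind of date-indexed table, ready to chart or to feed into a proprietary model.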
<Keith> This next example focuses on a financial/business information provider. This company has rolled out several highly successful information services that leverage Web data – including one that provides accurate background data on the parties to a financial transaction. The company sought to enhance this service with real-time data that would make it much more valuable to its subscribers and command a premium price. They were able to leverage automation to roll out a high-end risk-assessment service that continually monitors data sources to provide up-to-date information on a company’s assigned risk status. This is important not only to help banks and other parties put a dollar value on a deal and weigh the risks involved, but also to avoid compliance penalties for doing business with “bad guys”. At one point in product development, the team reached a critical “build versus buy” decision. They were more than halfway to getting the product out the door when the process got bogged down. They turned to Connotate, and we were able to help them get the product to market faster – ahead of the competition.
<Keith> Now I’ll explain the process of collecting and transforming the data. There are thousands of regulatory bodies posting judgments, actions and notices about companies and persons with whom you might be conducting business. These sites are updated daily. On your screen are three examples: the Dubai Financial Services Authority, the Netherlands Authority for Financial Markets (AFM) and the U.S. Federal Trade Commission (FTC). You’ll notice that the AFM site is password-protected. It is easy for an automated solution to supply logins and passwords to extract data, as long as you are already a legitimate subscriber to the service. On the next slide, you’ll see what happens when we pull out the data and structure it in a usable form.
<Keith> The data is now in XML format. This particular file provides information about a person on the Board of Governors of a bank; the extracted data shows that he is a Politically Exposed Person (PEP), which impacts the bank’s risk status – a very valuable piece of information. The level of risk can change daily, hence the high value of automating access to the websites that keep tabs on this type of information.
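Once the data is structured as XML, pulling out a flag like PEP status is straightforward. The element names below are hypothetical, not the actual feed schema:

```python
# Sketch of reading a risk attribute out of extracted XML.
# The <person>/<pep> schema is invented for illustration.
import xml.etree.ElementTree as ET

SAMPLE_XML = """
<person>
  <name>J. Doe</name>
  <role>Board of Governors</role>
  <pep>true</pep>
</person>
"""

def is_politically_exposed(xml_text):
    """Return True if the extracted record flags the person as a PEP."""
    root = ET.fromstring(xml_text)
    pep = root.findtext("pep", default="false")
    return pep.strip().lower() == "true"
```

A downstream risk engine would consume fields like this to update a counterparty’s risk status as the monitored sites change.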
<Keith> Now let’s look at a completely different use case – price optimization. Sigma-Aldrich is a top-tier life science company specializing in biochemicals, with over 7,600 employees and operations in 40 countries. In the world of chemical and biochemical manufacturing, margins are key. All things being equal, the decision of where to buy can be motivated by just a few dollars and cents. Sigma recognized that its customers were becoming more and more educated, and it wanted to track pricing for over 150,000 products across 40+ competitor sites. The company was using some automated tools along with manual processes to collect pricing data, but these processes were introducing errors and yielding inaccurate results. Automation helped them not only reduce labor costs but also optimize their prices faster and more accurately to improve margins.
<Keith> Let’s take a step back and look at what price optimization might mean for your business. Results will vary widely across different industry segments. Here is one fact to consider. According to a July 2012 Gartner report, “companies that have implemented price optimization successfully have realized improvements of two to four percent of total revenue or more.” So let’s put that into perspective: for a Fortune 500 company, this translates to an uptick of anywhere from $8.75 million to $16 billion – based on a 2% increase in annual revenue for the smallest company on the Fortune 500 list up to 4% of revenues for Walmart, typically at the top of the list.
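The back-of-the-envelope math for Gartner’s 2–4% range is simple to compute directly; the $1B revenue figure below is a hypothetical example, not from the report:

```python
def revenue_uplift(annual_revenue, uplift_pct):
    """Dollar impact of a price-optimization uplift of uplift_pct percent."""
    return annual_revenue * uplift_pct / 100.0

# Hypothetical company with $1B annual revenue, at the low (2%) and
# high (4%) ends of Gartner's reported range.
low = revenue_uplift(1_000_000_000, 2)   # 20,000,000.0
high = revenue_uplift(1_000_000_000, 4)  # 40,000,000.0
```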
<Keith> Now, let’s turn our attention to cost reduction.
<Keith> Tandem Select is a mid-size credit reporting agency (CRA) performing criminal records-checking as part of its background screening services. Previously, the company relied on manual processes to obtain criminal records from various courthouses. By using automation to collect criminal records from websites across hundreds of jurisdictions, the company was able to reduce background check time from hours to minutes and offer its clients a guaranteed turnaround time – all without increasing staff. The volume of business shot up 62% while expenses went down.
<Keith> Here is how the solution works. When a background check is requested, the company’s internal software application uses a Web services request to kick off the process – so this is a request-driven action. The Web data extraction piece (the “Agent”) goes to the appropriate court sites, looks for criminal records, traffic violations, etc., and returns the data in a spreadsheet-type format. The entire roundtrip to the Web sites and back takes between 6 and 20 seconds to complete – a dramatic reduction in turnaround time.
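In miniature, that request-driven flow might look like the sketch below. The jurisdictions, record layout, and in-memory “sites” are all invented stand-ins for the real authenticated Web queries an agent would make:

```python
# Sketch of the request-driven flow: an internal app issues a request,
# an "agent" fans out to court sites, and matches come back as flat rows.
# COURT_SITES stubs out what would really be authenticated HTTP queries.
COURT_SITES = {
    "county_a": [{"name": "J. Doe", "type": "traffic violation"}],
    "county_b": [],
}

def run_background_check(subject_name, sites=COURT_SITES):
    """Query each jurisdiction and return matching records as flat rows."""
    rows = []
    for site, records in sites.items():
        for rec in records:
            if rec["name"] == subject_name:
                rows.append({"site": site, **rec})
    return rows
```

The internal application would call something like this via a Web service and receive the spreadsheet-style rows back within seconds.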
<Keith> Our last case study is quite compelling: interactive advertising. Online advertising is a huge industry – PricewaterhouseCoopers estimated its market size at $17B in revenue in just the first half of 2012. In the world of online advertising, billing is complex, with multiple layers of providers, middlemen and services, each taking their cut of the pie when an ad is served. Usage statistics are published as Web data across hundreds of different ad network portals, so revenue collection can be a nightmare for supply-side platforms, which aggregate click and impression data for advertisers:
--Data collection processes were error-prone
--Revenue collection lagged due to the need for extensive error-checking and correction
Now, automation supports accurate data collection from hundreds of password-protected sites throughout the day. In addition, supply-side platform providers can continually display aggregated, dynamic ad exchange data, letting advertisers see real-time, side-by-side comparisons of online ad traffic – and instantaneously optimize ad placement.
<Keith> The automated solution is simple: the input parameters are logins and passwords. There are also filters, easily configured to precisely capture the statistics needed, such as the date/time of the ad campaign, the number of impressions, and the revenue generated. All of this unstructured data is transformed into an Excel spreadsheet for fast, accurate billing. The basics of this solution are applicable in any situation where usage data is retrievable from multiple Web portals. Using automation to streamline the process of capturing and structuring Web data not only saves time and money, it reduces errors to ensure timely, accurate reporting and revenue collection.
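The filter-and-aggregate step could be sketched as follows; the portal record layout and field names are invented for illustration:

```python
# Sketch of filtering raw portal statistics down to billing fields and
# totaling revenue. The record layout here is invented for the example.
raw_stats = [
    {"campaign": "spring_promo", "date": "2013-03-01",
     "impressions": 120000, "revenue": 340.00, "notes": "ignored"},
    {"campaign": "spring_promo", "date": "2013-03-02",
     "impressions": 98000, "revenue": 276.50, "notes": "ignored"},
]

# The configurable "filter": which fields actually matter for billing.
BILLING_FIELDS = ("campaign", "date", "impressions", "revenue")

def to_billing_rows(records):
    """Keep only the configured billing fields from each portal record."""
    return [{k: r[k] for k in BILLING_FIELDS} for r in records]

def total_revenue(rows):
    """Sum revenue across billing rows for the invoice."""
    return round(sum(r["revenue"] for r in rows), 2)
```

The resulting rows map directly onto a spreadsheet, and the same shape works for any portal that publishes per-campaign usage statistics.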
<Keith> In a number of these use cases, I mentioned the use of automation. So let’s take a look at exactly what that means, when it comes into play and how it affects your use of Web data. At this point, I’m going to turn the discussion over to Chris Giarretta who will take a look at automation and other aspects of the technology behind these success stories.
<Chris> The first thing to consider is, “what kind of data is on the Web” and what do I want to retrieve? The universe of data is quite large…most of our customers focus on text data.
<Chris> …We can also extract images from the page, or capture a snapshot of the page in HTML or PDF format.
<Chris> Let’s take a closer look at the pros and cons of various approaches. Some applications lend themselves to manual approaches and crowdsourcing, but there is always the risk of introducing human error. A bigger concern is the fragile nature of the Web – it changes all the time. Many projects require continual monitoring for changes, plus change detection with highlighting to support workflow productivity. A robust automated solution such as Connotate’s will provide that. More importantly, will the solution “break” when the HTML on the page changes? Connotate’s patented visual abstraction solution is designed to be resilient to certain changes; if the page is completely changed, customers using our hosted solution don’t have to worry – we get the automation up and running quickly, which isn’t the case for scrapers and data providers. Without a true monitoring service and a dynamic platform, a single-pull or fragile system delivers only a fragment of the value and doesn’t allow for the time-series analytics that organizations need today.
<Chris> All of the case studies that Keith mentioned achieved results by following a fully automated approach. Scenarios that warrant an automated solution include situations where a lot of internal and external data needs to be aggregated and/or you need to monitor a variety of sources. If you are dealing with high volumes of data – or Web sites that change frequently – it quickly gets very expensive to have your staff continually check sites and look for changes. Automation is also required when you need frequent updates, such as news aggregation or price optimization in retail. At Connotate, we hear a lot of different data needs from all kinds of companies, and we understand that an automated solution is not always the answer. For example, when we see a company that needs to do a lot of complex product matching – let’s say, for apparel – we may recommend crowdsourcing as a viable approach. Or, if you have a small amount of data that you need only a few times a year, you may not need a scalable, automated approach. Now I’m going to turn it over to Gina, who will invite you to answer a polling question.
<Gina> Let’s take a brief pause to ask our audience about their experiences collecting data from the Web. Is it an automated process? Are you doing it manually? Or are you not collecting Web data at all? I’d also like to remind the audience that there will be a live Q&A session at the end of the presentation, but you can submit your questions anytime using the Chat feature on your screen. Now I’m going to hand the presentation back to Chris Giarretta so he can give you some tips on evaluating Web data extraction solution providers.
<Chris> If you are thinking about a Web data extraction project, I’d like to share some best practices we’ve learned over the years to help you get started.
<Chris> At this stage, if you still need to narrow down your options, it may be possible to apply automation to leverage Google and other search engines to refine the scope of your project. Once you have the list of URLs, we can help you identify the sites that are easy to access versus those that aren’t. (Chris, can you give some examples?) Next, you need to think about scoping the project. How many sites? How often do you need to monitor and/or collect data? It’s important to be flexible here and to work with someone who will take the time to understand your needs and adjust the scope and direction of the project, if needed, to deliver the most value. Finally, you’ll want to look at the long term and consider the maintenance costs of your project, and how to minimize them. Deploying software on-site gives you the most control, but you’re carrying the ball when it comes to maintaining the solution and expanding its scope quickly if need be. A hosted deployment eliminates those headaches and can be more cost-effective in the long run.
<Chris> Let’s take a look at questions to consider before you choose a solution.
<Chris> Here are five useful questions to ask when evaluating Web data extraction solutions.
<Gina> Let’s take a moment to ask our audience a question about the value of automation. Based on your experience and what you’ve heard here today, do you believe that automating the collection process could add value to your business? I’d also like to remind the audience that they can use the Chat feature on their screen now to submit questions to the presenters for our Q&A session at the end.
<Gina> We covered a number of use cases for collecting Web data, and there are many other examples as well. You may be thinking of other strategic initiatives in your own organization. If so, we hope you have found today’s presentation helpful in discovering some of the aspects you need to consider as you decide on the next steps in your project. At this point, we will be posting a final polling question. Please take a moment to respond before you leave the webinar.
<Gina> Now, for your questions. Several of you have asked about obtaining a copy of today’s presentation. We will send you a link to the archived presentation within 2 business days. We also invite you to answer our last poll, which appears on the right of your screen.
Thank you for attending today’s Webinar. Please visit our Web site for information about our products, services, and future Webinars. This concludes our presentation.