This document summarizes a presentation about how Entravision, a traditional media company, embraced big data. Entravision implemented a big data solution to become more data-driven, gain insight into underrepresented Hispanic consumers, create synergies across its business units, and develop a new revenue stream. It created a new business unit called Luminar to offer analytics services and insights to other companies, and worked with a technology partner, Impetus Technologies, to design and deploy the big data platform through a multi-phase approach.
US Hispanics: Content Producers and Marketers who want to attract US Latinos should consider mobile as the first screen and, in many cases, as the only one
Break Through the Traditional Advertisement Services with Big Data and Apache... - Hortonworks
Entravision Communications Corporation (NYSE: EVC) is a diversified Spanish-language media company with a unique group of media assets including television stations, radio stations and digital platforms. In 2011, it made the strategic decision to build a data analytics, modeling and insights division to expand the value of its traditional advertisement services. Join us in this session with Franklin Rios, President of Luminar (an Entravision company), and Oscar Padilla, VP of Strategy, Luminar, along with Impetus and Hortonworks, as we discuss key implementations, results and lessons learned from their big data services operations.
Hispanic Digital and Print Media Conference 2012 - Oscar Padilla - Portada
Unlocking Key Insights to Reach the Hispanic Consumer by Oscar Padilla, Vice President of Strategy at Luminar. Presentation for Portada's 6th Annual Hispanic Digital and Print Media Conference in New York City.
Attend Portada's Latin Content Marketing Forum in Miami on June 4th, 2013.
Learn more at: http://www.portada-online.com/conferences
The Brave New World of Universal Analytics - SMX London 2014 - Martijn
Measuring the multi-platform world: my talk from the Brave New World of Universal Analytics session at SMX London 2014. It provides an overview of the UK/US digital landscape, best practices in multi-platform analytics, and the mobile path to purchase in retail.
Artificial Intelligence, The Rise of Agents and The Death of Choice - Michael Nicholas
These are the slides I stood in front of during a presentation I gave at the Modern Marketing Summit at The Village in San Francisco on November 1, 2016.
The presentation was a high-level, 25-minute attempt to explain the potential and power of artificial intelligence for marketers.
The discussion included:
Bot hype
Artificial Intelligence + Empathy
Natural Language Understanding + Trust
The A.I. Choice Paradox
Challenges for CMOs & Agencies
The Rise of A.I. Agencies
Born
I used to be able to export from Keynote with a full-size slide and the presenter notes.
Unfortunately I can't seem to do that now, so most of the discussion isn't actually here. I'll see if I can get the notes in here somehow so this makes a bit more sense.
Maximize Performance of Your Campaigns with Sponsored Updates Partners - LinkedIn
Wondering how you can improve the ROI of your Sponsored Updates? LinkedIn advertisers using a certified partner to manage and track their campaigns are seeing results that are 20% above benchmark performance.
Through LinkedIn’s Certified Sponsored Update Partners, marketers can tap into a selection of third-party services and technology that can unlock greater campaign expertise, effectiveness, and efficiency.
Join LinkedIn’s Partner Enablement Manager, Jonathan Young to learn about the Certified Sponsored Updates Partner program, the benefits to marketers, and an understanding of what each partner offers.
Also hear from AdStage’s CEO and Co-Founder, Sahil Jain. AdStage provides powerful campaign tools to help you manage your Sponsored Updates and Direct Sponsored Content. Sahil will share client case studies, provide a live demo of the platform, and answer questions from the audience.
Statistics
Spear Marketing Group and TechValidate by SurveyMonkey recently worked together, using the TechValidate Market Research platform, to understand more about the content, tactics, and technologies that are most valuable for today’s B2B marketers. More than 200 marketing VPs, directors, and managers responded to the research study and weighed in on crucial factors that drive demand generation ROI today.
South Big Data Hub: Text Data Analysis Panel - Trey Grainger
Slides from Trey's opening presentation for the South Big Data Hub's Text Data Analysis Panel on December 8th, 2016. Trey provided a quick introduction to Apache Solr, described how companies are using Solr to power relevant search in industry, and provided a glimpse on where the industry is heading with regard to implementing more intelligent and relevant semantic search.
Reflected Intelligence: Lucene/Solr as a self-learning data system - Trey Grainger
What if your search engine could automatically tune its own domain-specific relevancy model? What if it could learn the important phrases and topics within your domain, automatically identify alternate spellings (synonyms, acronyms, and related phrases) and disambiguate multiple meanings of those phrases, learn the conceptual relationships embedded within your documents, and even use machine-learned ranking to discover the relative importance of different features and then automatically optimize its own ranking algorithms for your domain?
In this presentation, you’ll learn how to do just that: how to evolve Lucene/Solr implementations into self-learning data systems that accept user queries, deliver relevance-ranked results, and automatically learn from your users’ subsequent interactions to continually deliver a more relevant experience for each keyword, category, and group of users.
Such a self-learning system leverages reflected intelligence to consistently improve its understanding of the content (documents and queries), the context of specific users, and the relevance signals present in the collective feedback from every prior user interaction with the system. Come learn how to move beyond manual relevancy tuning and toward a closed-loop system leveraging both the embedded meaning within your content and the wisdom of the crowds to automatically generate search relevancy algorithms optimized for your domain.
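The closed-loop idea above can be made concrete with a small sketch. The function names, signal format, and weighting scheme below are illustrative assumptions, not the talk's actual implementation: the sketch aggregates raw click signals from a search log into per-query boost weights, then renders them as a Solr edismax `bq` (boost query) parameter value.

```python
from collections import defaultdict


def click_boosts(signals, min_clicks=2):
    """Aggregate raw click signals (query, doc_id) into per-query boost weights.

    `signals` is an iterable of (query, doc_id) pairs, e.g. parsed from a
    search log. Documents clicked at least `min_clicks` times for a query
    receive a boost proportional to their share of that query's clicks.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for query, doc_id in signals:
        counts[query.lower().strip()][doc_id] += 1

    boosts = {}
    for query, docs in counts.items():
        total = sum(docs.values())
        boosts[query] = {
            doc: round(n / total, 3)
            for doc, n in docs.items() if n >= min_clicks
        }
    return boosts


def solr_boost_param(boosts_for_query):
    """Render one query's boosts as a Solr edismax `bq` parameter value."""
    return " ".join(f"id:{doc}^{w}" for doc, w in sorted(boosts_for_query.items()))
```

A production system would recompute these boosts on a schedule and layer them on top of the base relevancy model, rather than replacing it.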
Building a real-time big data analytics platform with Solr - Trey Grainger
Having “big data” is great, but turning that data into actionable intelligence is where the real value lies. This talk will demonstrate how you can use Solr to build a highly scalable data analytics engine to enable customers to engage in lightning fast, real-time knowledge discovery.
At CareerBuilder, we utilize these techniques to report the supply and demand of the labor force, compensation trends, customer performance metrics, and many live internal platform analytics. You will walk away from this talk with an advanced understanding of faceting, including pivot-faceting, geo/radius faceting, time-series faceting, function faceting, and multi-select faceting. You’ll also get a sneak peek at some new faceting capabilities just wrapping up development including distributed pivot facets and percentile/stats faceting, which will be open-sourced.
The presentation will be a technical tutorial, along with real-world use-cases and data visualizations. After this talk, you'll never see Solr as just a text search engine again.
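As a minimal illustration of the faceting styles mentioned above, the helper below builds the query parameters for a Solr pivot-facet request. The field names are hypothetical, and a real request would of course need a running Solr endpoint to send them to:

```python
def pivot_facet_params(fields, rows=0):
    """Build query parameters for a Solr pivot-facet request.

    `fields` is an ordered list of field names to pivot on, e.g.
    ["state", "city", "job_title"] (hypothetical field names). Returns a
    dict suitable for passing as query params to a /select handler.
    """
    return {
        "q": "*:*",
        "rows": rows,          # facet counts only, no documents
        "facet": "true",
        "facet.pivot": ",".join(fields),
    }
```

The response then contains a nested tree of counts (state, then city within state, and so on), which is what makes pivot facets useful as a lightweight analytics primitive.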
This Big Data case study outlines the Hadoop infrastructure deployment for a Fortune 100 media and telecommunications company.
Hadoop adoption in this company had grown organically across multiple different teams, starting with “science projects” and lab initiatives that quickly grew and expanded. Going forward, some of the options they considered for their Big Data deployment included expanding their on-premises infrastructure and using a Hadoop-as-a-Service cloud offering.
Fortunately, they realized that there is a third option: providing the benefits of Hadoop-as-a-Service with on-premises infrastructure. They selected the BlueData EPIC software platform to virtualize their Hadoop infrastructure and provide on-demand access to virtual Hadoop clusters in a secure, multi-tenant model.
Learn more about this case study in the blog post at: http://www.bluedata.com/blog/2015/05/big-data-case-study-hadoop-infrastructure
eSports: The Biggest Sport You've Probably Never Heard Of - sparks & honey
With millions of people already playing video games and the popularity of video game competition rising, gamers developed an interest in watching others play for fun, learning tips to improve their own play, and seeing pro-gamers showcase their skills.
eSports organizations recognized this trend and created platforms for people to participate and watch. The dramatic rise of game streaming services like Twitch, ESL, and MLG created communities between players and fans.
Then came the big prize money. The professional game casters. Video games broadcast on major networks. Huge, sold-out crowds. Brand sponsorships. And from the beginning, unrelentingly passionate fans.
A perfect sport that fulfills the cultural need to be the hero, to be part of a community as both participant and spectator, and to experience the thrill of victory.
Big Data and advanced analytics are critical topics for executives today. But many still aren't sure how to turn that promise into value. This presentation provides an overview of 16 examples and use cases that lay out the different ways companies have approached the issue and found value: everything from pricing flexibility to customer preference management to credit risk analysis to fraud protection and discount targeting. For the latest on Big Data & Advanced Analytics: http://mckinseyonmarketingandsales.com/topics/big-data
The high-level product journey in the mind of PMs.
* Understanding the scope of the area and strategy pillars
* Approach to stakeholder management and governance
* Building the digital product roadmap
* Launching MVP
* Approach to optimize the product
* Measuring ROI
* Problem solved?
* What to build next
“The intern program in the Americas is a two-way street where both the students and Avnet benefit tremendously. Students get the opportunity to apply and test their knowledge on Avnet business initiatives, and Avnet has the opportunity to engage incredibly talented individuals who bring fresh and unbiased perspectives that help us grow as a company.”
— Aaron Dean, Vice President, Global Talent Programs
Overcoming Obstacles to Success with Microservices - Perficient, Inc.
Microservices are the next evolution in the enterprise integration landscape, allowing organizations to continually adapt to the new demands of the digital marketplace. But where do you start? How do you evolve legacy architecture and IT processes in areas like Agile and DevOps to support microservices?
Our webinar covered the benefits and challenges of microservices and the steps to build a practical, successful microservices strategy and roadmap.
Manisha Datye, VP of the Integration Center of Excellence at Perficient client TCF Bank, discussed TCF’s experience with microservices including the business drivers and benefits.
Discussion centered on:
-An understanding of the microservices evolution
-Insight into the constraints and benefits of microservices architecture
-Steps to building a microservices migration strategy and roadmap
-A look at how to jump start microservices with Perficient’s Innovation Lab
How RRD Approaches Continuous Value Flow in its Digital Transformation Journe... - AppDynamics
In this session, we will highlight how some business teams at RR Donnelley (RRD) are reaping the benefits of continuous value flow and continuous improvement practices—paired with analytics and APM—to rapidly experiment, deploy, and measure customer value. Hear an overview of how we are leveraging Agile, lean, CI/CD, analytics, and AppDynamics to do so.
RRD is a global, integrated communications provider enabling organizations to create, manage, deliver, and optimize their multi-channel marketing and business communication solutions. Founded in 1864, RRD serves large, fragmented markets experiencing significant changes in how businesses are communicating with their audiences using both print and digital channels. RRD is uniquely positioned with an extensive customer base and wide portfolio of capabilities to continuously evolve our digital transformation story to help our customers achieve their goals.
Key takeaways:
• How AppDynamics is used to track key business transactions release-to-release to build confidence, trust, and partnership with business teams
• How RRD leverages analytics and AppDynamics to facilitate a rapid experimentation approach
• High-level approach RRD uses to evolve existing software architecture to better align to the digital transformation journey
For more information go to: www.appdynamics.com
Driving End-to-End Procurement Excellence by Integrating SAP and Ariba (Custo... - SAP Ariba
SAP and Ariba together are offering solutions under one roof that cover the procurement process end-to-end and support all categories of spend. Our solutions portfolio allows SAP customers to quickly and easily benefit from the Ariba Network, and deploy Ariba’s cloud-based sourcing and procurement solutions, connected to their existing SAP infrastructure. Join this session to hear how our customers have rapidly implemented the Ariba Procure-to-Pay solution, including integration with SAP ERP and the Ariba Network, in order to gain maximum compliance and efficiency, and accelerate their journey to procurement excellence.
Almost every project management book introduces the project management triangle, and almost every certified project manager thinks he or she correctly understands the relationships between its elements: "The larger the scope, the more cost and time needed." However, especially in the ICT industry, the majority of projects overrun both budget and schedule, and deliver less functionality than expected. In this presentation we take another look at the project management triangle, to learn how to get more outcomes while spending less money and time.
Customer Success Story: Interact Everywhere with IBM Active Reports - CCG
Hillsborough County Public Schools needed to create an interactive Strategic Plan Scorecard that summarized metrics at a high level, but also provided some detailed information to the users similar to a drill-through report. To meet this need, an Active Report was created that impressed business users as well as solved the delivery and security concerns from an I.S. perspective. Although Active Reports are not exactly new to IBM Cognos, there have been significant improvements in regards to performance and file compression. This session explains how the Active Report was created at an introductory level, including a demonstration of the product.
How to Improve Performance with Next-Gen Sales Enablement Technology in Finan... - Perficient, Inc.
How are next-generation knowledge management solutions changing the pace of sales and marketing efficiency in financial services?
IT departments for financial firms are investing in enterprise solutions like cloud and mobile technology to improve efficiency, and sales and marketing are no exception. Information is often scattered, compliance approvals for collateral are a burden, and real-time access to the right materials and data is a challenge for most financial services companies.
A lot has changed with knowledge management solutions in financial services. From mobile access and cloud-based platforms to automation, analytics and seamless integration to CRM – you can’t afford to be left in the dust.
Join us as our team of industry experts explore:
Technology trends redefining your processes and IT systems
Scalable, dynamic solutions to empower sales and marketing teams
Best practices in implementing enterprise platforms
Client success stories in asset and wealth management
Future-Proof Your Streaming Analytics Architecture - StreamAnalytix Webinar - Impetus Technologies
View the webcast on http://bit.ly/1HFD8YR
The speakers from Forrester and Impetus discuss the options and the optimal architecture for incorporating real-time insights into your apps, while also positioning you to benefit from future innovation.
Impetus White Paper - Handling Data Corruption in Elasticsearch - Impetus Technologies
This white paper focuses on handling data corruption in Elasticsearch. It describes how to recover data from corrupted Elasticsearch indices and re-index that data into a new index. The paper also introduces Lucene’s index terminology.
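As a rough sketch of the recovery flow described (scan the corrupted index, skip unreadable documents, re-index the rest), the function below separates readable hits from read errors. The target index name is a made-up placeholder, and in practice the resulting actions would be fed to `elasticsearch.helpers.bulk()` against a live cluster:

```python
def salvage_docs(scan_results):
    """Separate readable documents from unreadable ones.

    `scan_results` is an iterable yielding either a hit dict (as returned by
    a scroll/scan over the corrupted index) or an Exception raised while
    reading a batch. Returns (bulk_actions_for_new_index, error_count).
    """
    actions, errors = [], 0
    for item in scan_results:
        if isinstance(item, Exception):
            # a corrupted segment made this batch unreadable; count and move on
            errors += 1
            continue
        actions.append({
            "_index": "recovered_index",  # hypothetical target index name
            "_id": item["_id"],
            "_source": item["_source"],
        })
    return actions, errors
```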
Deep Learning: Evolution of ML from Statistical to Brain-like Computing - Data... - Impetus Technologies
Presentation on 'Deep Learning: Evolution of ML from Statistical to Brain-like Computing'
Speaker- Dr. Vijay Srinivas Agneeswaran,Director, Big Data Labs, Impetus
The main objective of the presentation is to give an overview of our cutting edge work on realizing distributed deep learning networks over GraphLab. The objectives can be summarized as below:
- First-hand experience and insights into implementation of distributed deep learning networks.
- Thorough view of GraphLab (including descriptions of code) and the extensions required to implement these networks.
- Details of how the extensions were realized/implemented in GraphLab source – they have been submitted to the community for evaluation.
- Arrhythmia detection use case as an application of the large scale distributed deep learning network.
SPARK USE CASE - Distributed Reinforcement Learning for Electricity Market Bi... - Impetus Technologies
SPARK SUMMIT SESSION -
A majority of the electricity in the U.S. is traded in independent system operator (ISO) based wholesale markets. ISO-based markets typically function in a two-step settlement process with day-ahead (DA) financial settlements followed by physical real-time (spot) market settlements for electricity. In this work, we focus on obtaining equilibrium bidding strategies for electricity generators in DA markets. Electricity prices in DA markets are determined by the ISO, which matches competing supply offers from power generators with demand bids from load serving entities. Since there are multiple generators competing with one another to supply power, this can be modeled as a competitive Markov decision problem, which we solve using a reinforcement learning approach. For power networks of realistic sizes, the state-action space could explode, making the RL procedure computationally intensive. This has motivated us to solve the above problem over Spark. The talk provides the following takeaways:
1. Modeling the day-ahead market as a Markov decision process
2. Code sketches showing the Markov decision process solution over Spark and over Mahout on Apache Tez
3. Performance results comparing Mahout over Apache Tez and Spark.
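For readers unfamiliar with the RL machinery, here is a deliberately tiny, single-agent, single-state sketch of the Q-learning update at the heart of the approach. The bid levels and payoffs are invented; the actual work models a competitive Markov game over realistic power networks and solves it at scale on Spark:

```python
import random


def learn_bidding_policy(steps=5000, alpha=0.1, gamma=0.9, eps=0.2, seed=42):
    """Tabular Q-learning for a toy one-state day-ahead bidding problem.

    Actions are bid levels; the (hypothetical) market clearing rewards a
    mid-level bid most. Because there is only one state, the "next state"
    in the Q-update is always the current state.
    """
    rng = random.Random(seed)
    payoff = {"low": 1.0, "mid": 3.0, "high": 2.0}  # assumed clearing payoffs
    q = {a: 0.0 for a in payoff}
    for _ in range(steps):
        # epsilon-greedy action selection: explore sometimes, else exploit
        a = rng.choice(list(q)) if rng.random() < eps else max(q, key=q.get)
        r = payoff[a]
        # standard Q-learning update
        q[a] += alpha * (r + gamma * max(q.values()) - q[a])
    return q
```

After training, `max(q, key=q.get)` recovers the best bid level. The distributed version faces the same update rule, but over a state-action space large enough that running it on a single machine becomes impractical.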
Real-time Streaming Analytics: Business Value, Use Cases and Architectural Co... - Impetus Technologies
Impetus webcast ‘Real-time Streaming Analytics: Business Value, Use Cases and Architectural Considerations’ available at http://bit.ly/1i6OrwR
The webinar talks about-
• How business value is preserved and enhanced using Real-time Streaming Analytics with numerous use-cases in different industry verticals
• Technical considerations for IT leaders and implementation teams looking to integrate Real-time Streaming Analytics into enterprise architecture roadmap
• Recommendations for making Real-time Streaming Analytics – real – in your enterprise
• Impetus StreamAnalytix – an enterprise ready platform for Real-time Streaming Analytics
Leveraging NoSQL Database Technology to Implement Real-time Data Architecture... - Impetus Technologies
Impetus webcast "Leveraging NoSQL Database Technology to Implement Real-time Data Architectures” available at http://bit.ly/1g6Eaj4
This webcast:
• Presents trade-offs of using different approaches to achieve a real-time architecture
• Closely examines an implementation of a NoSQL based real-time architecture
• Shares specific capabilities offered by NoSQL Databases that enable cost and reliability advantages over other techniques
Maturity of Mobile Test Automation: Approaches and Future Trends - Impetus We... - Impetus Technologies
Impetus webcast " Maturity of Mobile Test Automation: Approaches and Future Trends " available at http://lf1.me/Pxb/
This Impetus webcast talks about:
• Mobile test automation challenges
• Evolution of test automation challenges from Unit tests to image based and object comparison methods
• What next?
• Impetus solution approach for comprehensive mobile testing automation
The Shared Elephant - Hadoop as a Shared Service for Multiple Departments - I... - Impetus Technologies
For Impetus’ White Papers archive, visit- http://lf1.me/drb/
This white paper talks about the design considerations for enterprises to run Hadoop as a shared service for multiple departments.
As Hadoop becomes more mainstream and indispensable to enterprises, it is imperative that they build, operate and scale shared Hadoop clusters. The design considerations discussed in this paper will help enterprises accomplish the essential mission of running multi-tenant, multi-use Hadoop clusters at scale.
The white paper talks about Identity, Security, Resource Sharing, Monitoring and Operations on the Central Service.
Performance Testing of Big Data Applications - Impetus WebcastImpetus Technologies
Impetus webcast "Performance Testing of Big Data Applications" available at http://lf1.me/cqb/
This Impetus webcast talks about:
• A solution approach to measure performance and throughput of Big Data applications
• Insights into areas to focus for increasing the effectiveness of Big Data performance testing
• Tools available to address Big Data specific performance related challenges
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
DevOps and Testing slides at DASA ConnectKari Kakkonen
Slides by me and Rik Marselis at the 30.5.2024 DASA Connect conference. We discuss what testing is, then what agile testing is, and finally what Testing in DevOps looks like. We also ran a lovely workshop in which participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Connector Corner: Automate dynamic content and events by pushing a buttonDianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation Introduction,
UI automation Sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
How a Media Company Embraced Big Data- Impetus & Entravision @Strata Conference 2012
1. How a Traditional Media
Company Embraced Big Data
Presented by:
Oscar Padilla, Luminar, an Entravision Company
Franklin Rios, Luminar, an Entravision Company
Vineet Tyagi, Impetus Technologies
2. Key Points We Want to Make Today
● Big Data requires top-down executive sponsorship
● There has to be a synergistic need in your business to successfully implement a big data solution
● Keep a flexible and open approach
● Retain the best and brightest talent, both in-house and through your partners
3. Who is Entravision?
● We’re a diversified media company targeting US Latinos
● We have a unique group of media assets including television stations, radio
stations and online, mobile and social media platforms
- We own and/or operate 53 television stations
- Radio group consists of 48 radio stations
- Our television stations are in 19 of the top 50 U.S. Hispanic markets
- 109 local web properties with millions of visitors
● EVC is strategically located across the U.S. in fast-growing and high-density
U.S. Hispanic markets
4. National Cross-Media Footprint
Entravision delivers TV, radio, Internet and mobile across the top 50 U.S. Hispanic markets
6. Understanding Why Entravision Decided to
Make a Big Data Play
Four main factors influenced this decision:
1. Become a data-driven organization
2. Hispanic consumers are underrepresented
3. Synergistic opportunity
4. New revenue stream
7. Underserved Market – What We Saw
in the Marketplace
● Brands are making marketing investment decisions on limited information
● No real insight into the true performance of programs
● Targeting assumptions based mostly on survey or sample methods (i.e. “Latinos over-index on mobile usage”)
● Campaigns based mostly on ethnically-coded data alone
● Stereotyped approach: they speak Spanish, consume Spanish media, and are heavy online users…therefore, a good target
● Little or no cultural relevancy
8. Actionable Insights is an Evolving Process
Evolution of a Marketer into Hispanic Share of Wallet
9. How is Big Data Synergistic to Entravision?
● As a media company with a national presence in major markets, data and
analytics is a core component of EVC’s operations
● EVC uses both quantitative and qualitative data to support internal and client
performance analytics needs
- Campaign response analysis
- Segmentation analysis
- Market analysis
- Marketing and editorial tone
- Digital channel measurements: online display, mobile
10. Big Data Brings a High-Value Offering to Entravision
● Ability to more precisely support customers across the entire marketing value
chain:
- Move from a media & communications discussion to a business challenge
discussion
- Help identify growth opportunity within the Hispanic market
- Improve measurement of Hispanic market investments
- Demonstrate ROI
- Help accelerate growth through empirical data insights
● Transformative in the way we approached business and marketing needs
● Leverage big data environment and 3rd party data sources across business units
11. Winning Executive Buy-in Was Critical
● It was a significant investment and commitment that required CEO vision and support
● Developed detailed roadmap for success:
- Prepared comprehensive plan detailing operations, resources, level of
investment and implementation path
- We weighed the need for big data as a new revenue source for EVC
- We identified “packaged solutions” for a big data offering
- And, we clearly defined how big data fulfilled an underserved market and
provided a shift from sample-based research to empirical analytics
12. Result – Luminar Was Created as a New
Entravision Business Unit
A new business unit was created, dedicated to serving Hispanic-focused analytics and insights
14. Luminar Big Data Would Need to Support these Needs
● Analytics-as-a-Service platform
● Aggregate data from multiple, diverse sources
- Licensed data
- EVC data
- Unstructured social data
- Client data
● Offer an advanced, uniquely focused analytics service
- Provide insights into Hispanic consumer behavior
- Target customers in the retail, financial services, insurance and auto segments
● Future offerings
- Platform as a Service
- White Label Services
15. Importance of Aligning our Vision with the
Right Technology Partner
● Proven track record – vendor had to have a demonstrable experience in the
implementation of big data solutions
● Technology agnostic – We needed a technology partner that could help plan
and deploy a solution architecture that was not married to any one vendor
● Experience with multiple technology providers/suppliers – We needed a partner that could understand the big data landscape now, in 6 months and in 18 months from today
● Blended team approach – Our ideal partner had to clearly understand that
they would be operating in a blended client/vendor team environment
16. Deployment Objectives
● Build a best-of-breed model based on Luminar requirements
- Take a vendor neutral approach
- Lowest Total Cost of Ownership
- No requirement to integrate with legacy systems beyond SQL data migration
● Cloud based architecture
● Maximize “re-use” of vendor experience in Big Data
● Scalability for future data requirements
● Data security requirements
● Visualization
● Start with a “shoestring” approach
17. Build the Right Foundation for Growth
● Impetus led the solution architecture and vendor selection process
● We established a solution framework that delivers four client offerings
● We architected a solution that defined all major technology Key Performance Indicators (KPIs) and single points of failure (SPOFs)
18. Solution Architecture Phased Approach
Phase 1: Architecture and design consulting
● Blueprint architecture for a big data analytics solution covering the roadmap for 12
months and 24 months.
- Provide list of candidate solutions and vendors
- Re-use Impetus’ Big Data experience, such as its iLaDaP framework
- Assess building new solution if necessary
● Provide deployment options – Public vs Private Cloud, Vendors
● Duration: 3-4 weeks
● Prepare detailed project plan and proposal for implementation:
- Phase 2 - Detailed POC benchmarking
- Phase 3 - Implementation of Big Data Solution
19. Solution Creation Approach - Steps
1: Initial Phase
• Understand data, ETL and analytical/reporting & roadmap requirements
• Prepare a comprehensive long list of candidates
• Finalize assessment criteria and weightage factors
2: Finalize Candidates
• Compare and recommend a short list of candidates after detailed POC evaluation, including vendor meetings
3: POC
• Implement, execute and benchmark critical use cases
• Execute POC candidates in parallel if possible
4: Final Phase
• Assessment report
• Recommend best solution fit
20. Short-list Creation Process
● Input to process – Long list of options
- Comprehensive high level evaluation criteria established
● Drill down high-level criteria into sub-factors, and assign scores
- Interview vendors on specific capabilities as needed
- At this level scores are not weighted
● Create final weighted cumulative score for each option
- Multiply weights and scores for each detailed criterion and add up
● Recommendation of final short-list to proceed with POC
- Add narrative and detailed description of comparison and results
- Provide Pros and Cons of each option
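The weighted-score calculation described above can be sketched in a few lines of Python. This is an illustrative reconstruction only; the criteria, weights, and vendor scores below are hypothetical, not Luminar’s actual evaluation data.

```python
# Hypothetical weighted scoring for a vendor short-list.
# Each criterion carries a weight; each vendor gets an unweighted score per criterion.
weights = {"scalability": 0.30, "cost": 0.25, "ecosystem": 0.25, "support": 0.20}

vendor_scores = {
    "Vendor A": {"scalability": 8, "cost": 6, "ecosystem": 9, "support": 7},
    "Vendor B": {"scalability": 7, "cost": 9, "ecosystem": 6, "support": 8},
}

def weighted_total(scores: dict, weights: dict) -> float:
    # Multiply each criterion's score by its weight and add up.
    return sum(weights[c] * s for c, s in scores.items())

# Rank vendors by cumulative weighted score, highest first.
ranking = sorted(
    ((name, weighted_total(s, weights)) for name, s in vendor_scores.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for name, total in ranking:
    print(f"{name}: {total:.2f}")
```

The same structure extends naturally to sub-factors: drill each high-level criterion down into weighted sub-scores and sum them before applying the top-level weights.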
21. Internal Weighted Evaluation Helped with Vendor
Selection Process
We created a custom scoring matrix for evaluating vendors’ pros and cons, defining requirements, and weighting against Luminar’s objectives
22. Final Result Creation
● Input to process
- Bake-off results
● Document findings and select winner
● Discuss next steps and additional value-adds
- Additional findings discussion
- Data model modifications if any required
- Preparation for production readiness
- Others as discovered during the project execution
● After a brief break period – submit final documented reports
23. Defined Performance Metrics Across the Entire
Technology Platform
● Database
- compute (CPU utilization) & memory used
- storage capacity utilization
- I/O activity
- DB instance connections
● BI/Visualization
- compute (CPU utilization)
- memory used
- layout computations
- number of reports processed
● Hadoop
- file system counters
- map-reduce framework counters
- sort buffer
● ETL/ELT
- completed/queued/failed/running tasks
- CPU utilized
- memory used
- job start and end time
● Various counters
- Total Memory (RAM)
- Number of CPU cores
- CPU Idle Percentage
- Free Memory, Cache Memory, Swap Memory used
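Platform-wide KPIs like those above ultimately reduce to collecting per-component counters and rolling them up. A minimal sketch, with hypothetical counter names and values rather than the deck’s actual monitoring stack:

```python
# Roll up hypothetical per-node counters into platform-level KPIs.
from statistics import mean

# Sample counters per Hadoop worker node (illustrative values only).
node_counters = [
    {"cpu_idle_pct": 35.0, "free_mem_mb": 2048, "failed_tasks": 1},
    {"cpu_idle_pct": 20.0, "free_mem_mb": 1024, "failed_tasks": 0},
    {"cpu_idle_pct": 50.0, "free_mem_mb": 4096, "failed_tasks": 2},
]

def rollup(nodes):
    # Aggregate: average CPU utilization, total free memory, total task failures.
    return {
        "avg_cpu_util_pct": round(100 - mean(n["cpu_idle_pct"] for n in nodes), 1),
        "total_free_mem_mb": sum(n["free_mem_mb"] for n in nodes),
        "total_failed_tasks": sum(n["failed_tasks"] for n in nodes),
    }

print(rollup(node_counters))
```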
25. Implemented Solution Overview
● Hortonworks as technology integrator
● Hadoop cluster provisioned on Amazon EC2 in under four hours
● Original data sets imported from MySQL to HDFS/Hive using Sqoop and Talend
● Existing R scripts were modified to work with Hive for data analysis; minimal code modification required
● Tableau workbooks modified to connect to Hive via Hortonworks’ ODBC driver
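The MySQL-to-Hive load described above is driven by Sqoop from the command line. The sketch below assembles a representative `sqoop import` invocation in Python; the JDBC URL, table names, and mapper count are hypothetical, and the flags shown are a small subset of what Sqoop supports.

```python
# Assemble a hypothetical `sqoop import` command for a MySQL -> Hive load.
def sqoop_import_cmd(jdbc_url: str, table: str, hive_table: str, mappers: int = 4):
    # Returns an argv list; run it with subprocess.run() on a node with Sqoop installed.
    return [
        "sqoop", "import",
        "--connect", jdbc_url,          # e.g. jdbc:mysql://host:3306/dbname
        "--table", table,               # source MySQL table
        "--hive-import",                # load straight into a Hive table
        "--hive-table", hive_table,     # destination Hive table
        "--num-mappers", str(mappers),  # parallelism of the import
    ]

cmd = sqoop_import_cmd("jdbc:mysql://dbhost:3306/analytics", "campaigns", "campaigns")
print(" ".join(cmd))
```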
30. Luminar Rolled Out Four Key Solution Offerings
Business Data, Modeling,
and Analytics solutions for:
● Growth
● Acquisition
● Profitability
● Retention
31. Lessons Learned
● Having a flexible technology approach helped define the optimum architecture to support our needs
● You cannot do this alone; it’s too complex. Having the right partner was paramount
● It’s hard to find talent – don’t be geographically limited
● The big data market is still in flux; we opted for a best-of-breed solution to support the industry shifts we anticipate in the next 12-18 months
32. Closing Remarks…Four Key Takeaways
1. You need to have executive believers in the transformative benefits of Big Data
2. You must make a “synergistic” connection to your business
3. Big data can be big headaches…don’t do it alone
4. Have a flexible approach to your roll-out strategy
“Office Hour” with Oscar Padilla, Franklin Rios & Vineet Tyagi
This Thursday 3:10pm - 4:10pm EDT
Room: Rhinelander North (Table B)