Trust & Predictive Technologies 2016
A data-driven study into privacy, prediction and personalisation by Edelman & The University of Cambridge Psychometrics Centre
Contents
Executive Summary
Introduction
The Trust Divide - Business & The Public
The Trust Divide - Marketers & The Public
Emerging Best Practice
Ambient Intelligence - The Next Level
Conclusion
Executive Summary

Edelman, in conjunction with The University of Cambridge Psychometrics Centre, embarked on a research study to examine and monitor public attitudes towards Big Data and predictive technologies. Building a data-driven platform that enabled individual engagement and contextualised feedback, the study explored the diverse opinions and psychological attributes of both global consumers and key corporate decision makers in this area, namely marketing and communications professionals.

Consolidating the views of more than 34,000 individuals from around the world, the study provides an up-to-date view of the pivotal role of trust in the development and adoption of predictive technologies. Its findings lend some objectivity to several rapidly changing areas of technology, business and society, with the goal of stimulating further research and an informed dialogue on what Big Data can do, and more importantly, what it ought to be used for.

Privacy concerns operate across age, gender, country and personality: psycho-demographic variables explain less than two percent of the variance between yes and no answers to questions about uses of Big Data. Privacy is therefore a universal concern, not an esoteric interest.
Key Findings

1. Privacy concerns operate across age, gender, country and personality. Psycho-demographic variables explain less than two percent of the variance between yes and no answers to questions about uses of Big Data. Privacy is a universal concern, not an esoteric interest.

2. There is a deep lack of trust in data-driven businesses and in government. 71 percent of people thought most companies with access to their personal data did not use it ethically; only 26 percent of people trust the government not to sell their electoral roll and demographic data without their consent.

3. Marketers are at risk of overestimating consumers' willingness to adopt predictive technologies. Across sectors, marketing professionals were consistently more knowledgeable and more open to sharing data for prediction than the general public. This reveals a clear need for companies to better communicate their data practices, or face potentially dire consequences.

4. There is a clear gap between consumer desire and the reality of how predictive technology is used. 84 percent of people thought predictive tech should be used to improve the quality of healthcare, but only 47 percent thought it should be used to determine the price of their car insurance.

5. People want personalisation. 66 percent of people would prefer to see personalised advertising, assuming they have to see some advertising. Privacy, transparency and relevance are the building blocks of effective Big Data-based marketing.

6. 'Pay for Privacy' is a real opportunity for traditionally data-dependent businesses. 27 percent would pay $3 a month to use Facebook without their behaviour being recorded. Offering paid options helps remind consumers that their data has value, and that even if they use a service for free, they are still effectively paying for it.

7. Businesses are investing in smarter Big Data. 77 percent of marketing professionals thought their organisation ought to invest in predictive data, and 94 percent said it was important for them to understand the psychological attributes of their customers.

8. Demand for secondary services from IoT data is soaring. 57 percent of people thought that, for example, smart fridge data should be used to recommend groceries to them when they go shopping; 58 percent would like to be automatically warned of unhealthy dietary habits.

9. There is demand for personalised finance, yet distrust of actuarial prediction. The majority think predictive tech should not be used to assess mortgage eligibility or likelihood of default (62 percent and 67 percent respectively), but were open to its use for better account management and advice.
Introduction

1.1 The Power of Predictive Data

Big data and predictive technologies are among the most powerful tools for positive change in business and society. The term 'predictive technologies' is used in this report, and was used in the consumer survey, to refer to services, analytical techniques, machine-learning algorithms and other tools capable of discovering and analysing patterns in data to predict future behaviour on the basis of past behaviour.

Each of us is generating more data than ever before, with at least 2.5 quintillion bytes' worth of human behaviour being digitised every day. This rapidly expanding and searchable landscape of information is being utilised by organisations in practically every sector, from FitBit in wearable technology and Samsung in Internet of Things (IoT) devices, to Barclays in consumer finance and indeed the UK Government. Furthermore, every organisation from Uber to Homeland Security is getting better and better at using it. But how good are we really at understanding the individuals behind all this data? Does control rest with the organisation, the government or the citizen? And how can Big Data become personalised, inclusive and relevant for everyone, rather than merely being 'big'?

The University of Cambridge Psychometrics Centre (hereafter UCPC) is attempting to tackle these questions head on. Through multidisciplinary academic research and product development, the UCPC hopes to make Big Data not just machine-readable, but more human-interpretable. Its team has created a trait prediction engine called Apply Magic Sauce that translates digital footprints of behaviour (such as the Pages you Like on Facebook) into detailed psycho-demographic profiles. These predictions enable citizens to see how others see them, bringing to life the ways in which our digital personas are monitored and perceived. UCPC's research is underpinned by the largest and richest social science database in history, called myPersonality, comprising the psychometric test scores and social media profiles of over six million volunteers globally. This resource provides a deep psychological perspective on the audience attributes of over 200,000 brands. Edelman supports this research and brings the impressive predictive potential of the UCPC's tools to its clients.

With a sample of only 10 Likes, the Psychometrics Centre's platform can predict an individual's personality profile more accurately than their colleagues can. Not only does it surpass 360-degree feedback, but given a sufficient tranche of digital behaviour (around 300 Likes), the computer models can in fact detect personality traits better than a person's husband or wife1. Combined with predictions of age, gender, relationship status, intelligence, life satisfaction, leadership potential, political views and a host of other variables, predictive technologies such as these promise to bring greater control and personalisation to all aspects of the internet. From travel and retail to healthcare and education, building psychological sensitivity into machine intelligence is arguably the next frontier for Big Data. But with that comes a pressing need for service providers to evolve with consumers at front of mind.

The Psychometrics Centre's research was recently cited by the European Data Protection Supervisor in Opinion 7/20152 to illustrate that 'the more powerful computers become, the more acute is the challenge [of deciding] what is fair and lawful and what is not when it comes to big data analytics'. This exercise requires a comprehensive and multidisciplinary approach to balance the legal, ethical, commercial and personal interests at stake.

1 www.pnas.org/content/112/4/1036.abstract
2 https://secure.edps.europa.eu/EDPSWEB/webdav/site/mySite/shared/Documents/Consultation/Opinions/2015/15-11-19_Big_Data_EN.pdf
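To make the general trait-prediction approach described above more concrete, the sketch below shows one common way such models are built in research settings: factor a sparse user-Like matrix into latent dimensions, then fit a linear model to a psychometric trait score. This is a minimal illustration on synthetic data, not the Apply Magic Sauce implementation; every name, parameter and data value here is an assumption for demonstration only.

```python
# Illustrative sketch only: predicting a personality trait from Facebook Likes
# in the spirit of the user-Like-matrix approach. Synthetic data; the held-out
# score is meaningless here because the random "Likes" carry no real signal.
import numpy as np
from scipy.sparse import csr_matrix
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical user x Like matrix (1 = user Liked that page) and trait scores.
n_users, n_likes = 2000, 5000
likes = csr_matrix(rng.random((n_users, n_likes)) < 0.01, dtype=np.float32)
openness = rng.normal(size=n_users)  # stand-in psychometric scores

X_train, X_test, y_train, y_test = train_test_split(likes, openness, random_state=0)

# Reduce the sparse Like matrix to latent dimensions, then fit a linear model.
svd = TruncatedSVD(n_components=100, random_state=0)
model = Ridge(alpha=1.0)
model.fit(svd.fit_transform(X_train), y_train)

print("held-out R^2:", r2_score(y_test, model.predict(svd.transform(X_test))))
```

On real Like data, accuracy is usually reported per trait as the correlation or R-squared between predicted and questionnaire-measured scores, which is the sense in which the comparisons with colleagues and spouses above are made.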
As Edelman's most recent Trust Barometer revealed, there is a significant divide between the beliefs of business and government, and the beliefs of the general public, with regard to how personal data is being used. The optimism of the informed elite contrasts with a deep mistrust from consumers in many sectors. This report illuminates several aspects of this conflict through the lens of the individual traits and desires underlying the public's beliefs. It provides a summary of initial findings from what promises to be an ongoing, knowledge-seeking collaboration between Edelman and the University of Cambridge Psychometrics Centre.

1.2 Research Objectives

The aim of this research project was to understand and monitor the public's attitude towards predictive technologies. The assessment platform developed for this purpose will continue operating at www.predictivedataproject.com as a shared research asset, enabling objective measurement of attitudes towards the many possible applications of predictive technologies, both now and in the future.

Another aim was to contrast the opinions of the general public with those of data professionals, uncovering the extent to which specialist knowledge might engender different ideas on how predictive technologies ought to be used. For this purpose, part of the questionnaire targeted marketing and communications professionals, whose rapid adoption of digital solutions represents one of the most advanced practical applications of these technologies. Questions spanned the usage domains of advertising, politics, quantified self and wearables, data protection, cloud storage and IoT.

It is hoped that this research will help improve the business world's understanding of consumers' views of predictive technologies, thus helping to outline inclusive strategies and best practice for any organisation facing challenges related to Big Data.
1.3 Research Methodology and Sample

This research represents the largest audit of public opinion on Big Data and predictive technologies in history. It was conducted by Vesselin Popov, Business Development Director for the University of Cambridge Psychometrics Centre, and Sandra Matz, a PhD candidate investigating psychological fit in marketing, finance and other business contexts.

Delivered openly to the public via UCPC's online Predictive Data Project assessment platform, the study attracted 34,267 participants from around the world, who provided detailed opinions on Big Data and self-assessed their personality traits. Of this group, 4,454 (13 percent) were marketing and communications professionals, with at least 1,868 of these respondents answering all seven 'CMO-focused' questions. We can therefore be confident that the study is a statistically robust indicator of the views of both C-suite and less senior marketing staff.

The research also benefited from strong levels of engagement in all other areas of the study. 10,411 people answered all 27 questions relating to predictive technologies, and 8,877 people answered all the survey questions in total, which included 20 questions focused purely on personality assessment. Participants were not financially incentivised, but were rewarded for their honesty and participation by being offered instant feedback for every answer. This was done using an iterative calculator of cohort similarity, which displayed the percentage similarity of an individual's yes/no response pattern to the response pattern of thousands of previous test-takers. Participants could therefore see instantly how similar their views were to others'. Personality was also calculated in real time to provide a summary of each individual's openness, conscientiousness, extraversion, agreeableness and neuroticism, following the popular Big Five model.

Aside from size, the sample was truly global in nature. 43 percent of participants were from Europe, 27 percent from North America and 15 percent from South America. In national terms, the countries with the highest proportion of respondents were the USA with 23 percent of participants, the UK with eight percent, Brazil with seven percent, and France with five percent. Self-reported demographic data showed an average age of 30 (with ages ranging from 10 to 78), and a 54:46 split between male and female respondents. The sample was sufficiently large and diverse to be representative of the general public whilst permitting more focused group comparisons in future. Finally, 16 percent of respondents worked in technology, 11 percent in education, five percent in advertising, and four percent in financial services.
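As a side note on the feedback mechanism just described, the snippet below is a minimal sketch of one way a 'percentage similarity' figure could be computed from yes/no response patterns. It is an assumption-laden illustration, not the platform's actual calculator; the question keys and responses are invented.

```python
# Minimal sketch (not the study's code) of cohort-similarity feedback:
# "similarity" is the average share of earlier respondents who gave the same
# answer to each question the current respondent has answered so far.
from typing import Dict, List

def cohort_similarity(my_answers: Dict[str, bool],
                      previous: List[Dict[str, bool]]) -> float:
    """Return percentage similarity of my_answers to earlier response patterns."""
    shares = []
    for question, answer in my_answers.items():
        votes = [p[question] for p in previous if question in p]
        if votes:  # share of earlier respondents who answered the same way
            shares.append(sum(v == answer for v in votes) / len(votes))
    return 100.0 * sum(shares) / len(shares) if shares else 0.0

previous_respondents = [
    {"q1_companies_use_data_ethically": False, "q2_personalised_ads": True},
    {"q1_companies_use_data_ethically": False, "q2_personalised_ads": False},
    {"q1_companies_use_data_ethically": True,  "q2_personalised_ads": True},
]
print(cohort_similarity({"q1_companies_use_data_ethically": False}, previous_respondents))
```

An iterative version would simply re-run this calculation after each new answer, which matches the instant-feedback behaviour described above.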
The Trust Divide
Business & The Public

2.1 The Public's View

The 2016 Trust & Predictive Technologies study found that only 29 percent of people believe most companies with access to their personal data use it ethically. Furthermore, only 22 percent of respondents thought the threat of bad press was a sufficient deterrent to prevent businesses from misusing personal data. This is a severe indictment of the current levels of consumer trust regarding corporate data practices, and a strong indication that consumers are looking for more regulation from government. 78 percent of the public specifically thought that legal restrictions were more important and effective than public relations concerns at preventing misuse of their personal data.

Yet, in many cases, it is likely to be the PR and marketing department rather than the legal team making the final decision on whether a technology, once approved, ought in fact to be deployed. It is in this grey area that considerations of law and ethics interact with those of revenue targets, competition, media, brand strategy and customer service. Where new regulation is implemented, its application is often hindered by legal complexity on jurisdictional issues, place of breach, place of business and other technical aspects, such as the ubiquity of cloud computing3. All in all, the fast pace of technological change in Big Data and prediction makes the regulatory landscape very difficult to navigate, especially for smaller enterprises.

The study also found a low level of consumer trust in government. While 55 percent of those surveyed thought the government should use predictive technologies when designing new policies, at least in some areas, only 26 percent of respondents trusted the government not to sell their electoral roll or demographic data to companies without their consent. When it came to their private financial information, however, a much higher 72 percent of people said they trusted the government not to sell it on without consent. Contrasting statistics such as these emphasise the fact that context matters, and that one must consider the specific data sharing being proposed even when the parties are the same. There are different expectations of privacy in every situation, and different opportunities to miss the mark as a business. Recognising the areas where consumer desire does not align with current practices should therefore be a priority for data-driven, consumer-oriented organisations in future.

84 percent of people thought that predictive technologies should be used to improve the quality of healthcare, for example by helping doctors recommend personalised nutrition plans. But only 47 percent thought they should be used to determine the price of their car insurance. Contrasting the prevalence of the latter practice with the slow adoption of the former raises an important point: the rapid pursuit and adoption of Big Data techniques in the last few years has led to many assumptions being made about what consumers really want. Another study, by Padrez et al. (BMJ Qual Saf, 2015), found that 71 percent of 1,432 patients surveyed were happy to share social media information with their doctor, providing further evidence of demand and potential for Big Data in health4.

3 Companies, digital transformation and information privacy: the next steps, a 2016 report from The Economist Intelligence Unit http://www.eiuperspectives.economist.com/sites/default/files/EIU_Companiesdigitaltransformation_PDF_1.pdf
4 Linking social media and medical record data: a study of adults presenting to an academic, urban emergency department, Padrez et al, British Medical Journal Quality & Safety 2015 http://qualitysafety.bmj.com/content/early/2015/10/09/bmjqs-2015-004489.abstract
2.2 Privacy Is Highly Valued

Low levels of consumer trust stem from wider privacy concerns that touch all areas of life, not only those within the remit of this study. But it is still remarkable, given how much of our behaviour and social interaction is digitally mediated, that the majority of respondents (58 percent) reported having decided not to use a digital service due to privacy concerns. These are not inconsequential fears, but strong emotional reactions that are undoubtedly driving decision-making around which apps to download, which email addresses to share, which social networking site to log in with, and more. This is further supported by 26 percent of respondents in the IoTPI 2015 report5 mentioning security or privacy of data collection as a reason why they did not currently own a smart device.

While there are, without question, very high levels of mistrust around personal data usage, only 27 percent would pay $3 a month to use Facebook without their behaviour being recorded. Bearing in mind Facebook's average revenue per user is estimated to be around $1.33 per month, it is perhaps surprising that more would not find this option appealing. Yet it would be foolish to think of the 73 percent as being tight-fisted, or as only paying lip service to the value of privacy. Many people consider privacy a right rather than a service to be paid for, either not appreciating Facebook's business model or simply not believing that a do-not-track option would ever be feasible, let alone genuine. On the other hand, 27 percent of the population, translated to Facebook's user base, would represent a potential demand for anonymous usage from over 429 million users, a number that is surely sufficient to justify dedicated product lines.

Aside from the economic arguments for business model innovation in how user data is treated and monetised, there are other factors that could make 'pay for privacy' a good idea for many businesses. One is that it would help consumers realise that digital services cost money to design and deliver, so if they are not paying money to use them, then they must be paying with their data. Even if the user still decides to continue with the free option, at least they will appreciate they are giving something up and are therefore more likely to weigh up the benefits and disadvantages intelligently. If the user makes an informed decision to share that data, they are likely to be much more comfortable with its use by the company, and the data is more likely to be accurate, creating a mutual benefit. This creates a beneficial data-for-insight partnership between the user and platform that is the hallmark of a mature and successful data-driven business.

In cases where people do choose to pay for a do-not-track option, they can come to appreciate how sharing parts of their data can actually improve their experience of the service. For example, while a news site might be ad-free upon subscription, recommendations for which articles to read cannot be personalised without some data being shared. However, the crucial difference in this situation would be that the service provider might generate revenue from increasing subscriptions for the private, personalised version of the platform, rather than by trying to record as much sensitive behavioural data as possible. 'Featurisation' of data protection may therefore become a more common characteristic of online platforms in the near future.

The opportunity for industry to better align with people's desires is now clearer than ever. Desire for more control, customisability and service quality is there and it is growing, but businesses cannot assume that every sector or market will have reached the same level. Clearly, with 16 percent of people still believing that predictive technologies should not be used in healthcare, providers cannot assume all of their stakeholders to be equally keen, and so should evaluate and tailor their communications accordingly.

Overall, the paucity of consumer trust regarding data is in keeping with the larger trends currently at play, as revealed by the 2016 Edelman Trust Barometer. Now in its 16th year, this independent survey found that trust in business and government is below 50 percent for the mass population in over 60 percent of the 28 countries surveyed. In other words, the public's trust in business and government combined has barely increased since the days of the Great Recession (2007-09).

5 2015 US IoT Privacy Index from TRUSTe https://www.truste.com/resources/privacy-research/us-internet-of-things-index-2015/
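For reference, the arithmetic behind the 'pay for privacy' figures above can be reproduced as a quick back-of-envelope calculation. The monthly active user count used here is an assumption chosen to be consistent with the 429 million figure; it is not a number taken from the report itself.

```python
# Back-of-envelope arithmetic behind the figures quoted above.
facebook_maus = 1.59e9        # assumed monthly active users, late 2015
share_willing_to_pay = 0.27   # would pay $3/month not to be tracked
monthly_fee = 3.00            # proposed 'pay for privacy' price, USD
avg_revenue_per_user = 1.33   # approximate monthly ad revenue per user cited above, USD

potential_subscribers = facebook_maus * share_willing_to_pay
print(f"potential do-not-track subscribers: {potential_subscribers / 1e6:.0f} million")
print(f"fee vs ad revenue per user: ${monthly_fee:.2f} vs ${avg_revenue_per_user:.2f}")
```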
2.3 Big Data Issues Transcend Psycho-demographics

It is vital that organisations appreciate the consistency with which many consumers mistrust the use of their personal data. Around the world, people are deeply concerned. The findings show that personality differences accounted for only two percent of the variance between opinions on Big Data. This means that privacy concerns are more universal than previously thought: they are not merely characteristic of guarded or unadventurous users, but widespread across the population. The age, gender and location of participants were also not predictive of whether they would accept or reject a proposed use of predictive technologies.

The Predictive Data Project is collecting longitudinal data to investigate these effects in greater detail, and it may be that the public's understanding of the technology is still too nascent to bring out strong effects mediated by individual differences. For the moment at least, it would appear that considerations of trust and privacy are so pervasive in consumers' minds that they cross international, cultural and psychological boundaries.

One possible advantage for organisations, given these findings, may be that a clear and consistent stance on privacy issues could help to inform large segments of the public without necessarily alienating people of certain psychological profiles. The flip side is that this universality may not last for very long, and a broad-brush communications approach would still not recognise the personal nature of sharing one's unique digital footprint with a third party. Businesses will ultimately need to decide upon a strategy that instils confidence in the public and trust in the individual consumer. Recognition of universal concerns must be balanced with the desire for personalisation if businesses are to show leadership in this area and remain sensitive towards the consumer mindset.

While it could be said that 'the EU is the highest common denominator when it comes to privacy issues' (EIU Report 2016), there is also a strong impetus for legislative reform in the United States6. Calls for advancement of the Consumer Privacy Bill of Rights, establishment of a national data breach standard or greater oversight of data brokers are principles not far removed from those underpinning the Digital Single Market. The outcomes of these debates are global in scale but individual in effect.

It is therefore essential to recognise the small but significant role of personality in shaping our views on Big Data. This role was elucidated by the Predictive Data Project in findings that are largely in line with prior research in this area7. For example, this study found that people who are more accepting of predictive technologies tended to also be more open-minded, cooperative and extroverted than those who are more reserved on the topic. Agreeable people were more likely than average to share their data for new services, but less likely to admit that those with access to their data could learn something about them. Liberal and artistic people were more trusting of companies and the government in general, but less comfortable with predictions for financial purposes, including eligibility for car insurance, default risk on loans and mortgage assessments. Finally, extroverts had greater belief in the threat of bad press as a powerful deterrent against corporate misuse of personal data, possibly as a function of their greater engagement with the outside world.

"Privacy concerns span the full spectrum of personality, but consumers still exercise their rights as individuals. Communications need to be both transparent and tailored to inspire long-term trust in the business of Big Data"
Vesselin Popov, Business Development Director, The Psychometrics Centre, University of Cambridge

6 e.g. Big Data: Seizing Opportunities, Preserving Values. Interim Progress Report. February 2015. Executive Office of the President. www.whitehouse.gov/sites/default/files/docs/20150204_Big_Data_Seizing_Opportunities_Preserving_Values_Memo.pdf
7 Zhou, T., & Lu, Y. (2011). The effects of personality traits on user acceptance of mobile commerce. International Journal of Human-Computer Interaction, 27(6), 545-561; Korukonda, A. R. (2007). Differences that do matter: A dialectic analysis of individual characteristics and personality dimensions contributing to computer anxiety. Computers in Human Behavior, 23(4), 1921-1942.
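To make the 'two percent of variance' statistic above more concrete, the sketch below shows the kind of calculation such a figure implies: regress a single yes/no answer on psycho-demographic predictors and report a pseudo R-squared. The data are synthetic and the model choice is an assumption; this is an illustration, not the study's own analysis code.

```python
# Illustrative sketch of how a 'variance explained' figure might be computed:
# logistic regression of one yes/no survey answer on psycho-demographic
# predictors, summarised with McFadden's pseudo R-squared. Synthetic data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.normal(size=(n, 5)),        # Big Five trait scores
    rng.integers(18, 70, size=n),   # age
    rng.integers(0, 2, size=n),     # gender (coded 0/1)
])
y = rng.integers(0, 2, size=n)      # yes/no answer to one survey question

model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
print(f"McFadden pseudo R^2: {model.prsquared:.3f}")  # near zero on random data
```

A value below 0.02 on real responses would correspond to the 'less than two percent' finding reported above, i.e. psycho-demographic variables carry little information about whether a given person accepts or rejects a proposed use of Big Data.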
The Trust Divide
Marketers & The Public

3.1 The Marketing Professional's View

Across the board, the Predictive Data Project found that marketing and communications professionals are more accepting of the use of predictive technologies, and view their adoption as more desirable, than people who do not work in marketing. This supports our hypothesis that the opinion of marketing professionals towards predictive technologies is running significantly ahead of that of the general public, to the point where businesses are at risk of gravely overestimating consumer willingness to adopt new predictive technologies.

For example, 50 percent of CMOs are aware that our future behaviour can be accurately predicted from our personal data, compared to 36 percent of non-CMOs. Also, while 62 percent of CMOs have heard of "the right to be forgotten", that figure drops to less than half (47 percent) for non-CMOs. Additionally, 94 percent of marketing professionals said predictive technology is important to help them understand the psychological attributes of their customers, and 78 percent said their organisation needs to invest in predictive technologies. This highlights the cross-sector momentum in marketing departments to utilise the latest technology and integrate analytics into sales, media planning and new product development.

A key point here is that issues such as the right to be forgotten are no longer just part of in-house professional conversations, but are present in the public discourse as well. The legal complexities of putting prediction into practice are to be elucidated, not shielded from consumers, and further education is needed on both sides of the 'share' button. This study argues that businesses ought to be more proactive in explaining changes in the consumer data landscape to the public. Information should not be withheld in order to preserve speculative knowledge advantages over competitors, but open-sourced and shared for discussion. Too often a change in rules manifests in a new clause buried in privacy policies or compliance procedures, rather than an opportunity for business-to-consumer dialogue, wherein lies the greater competitive advantage.

Consumers are both more concerned and more curious than ever, and businesses can foster greater respect by educating them and shaping simple conversations about privacy through their products and services. This is the view adopted by the European Data Protection Supervisor in arguing that 'as a society, we must be able to look into the "black box" of big data analytics' and that privacy policies should 'genuinely serve to safeguard the interests of the individual...not merely to shield the controller from legal liability'. Without serious efforts to develop, adopt and popularise ethically sound data practices, consumers' lack of trust could become a real problem for organisations that strive to be data-driven. The complexity of the subject matter means that those marketers who get ahead of the curve in communicating with consumers will succeed. Those who waver risk user withdrawal and irreparable damage to their brand.
3.2 Personal, Just Not Personal Enough

Personalisation is arguably the most immediate benefit of predictive technologies, a realm where the use of personal data can deliver a more personal experience instantly. One might therefore expect the views of marketers and of the public to be more closely aligned. The Predictive Data Project confirms this, but with some caveats. 64 percent of CMOs thought the brand, products and/or services they offered were highly personalised to individual customers, while 55 percent of non-CMOs thought this was the case, nine percentage points less. As was the case with the relatively high use of prediction in insurance compared to healthcare, such gaps should be viewed as major opportunities for improvement.

With a reported 144 million people8 now using ad blocking software when browsing the web, and Apple's recent addition of ad blocking to iPhone and iPad software, the future of advertising is far from certain. The study found that 66 percent of people prefer to see personalised advertising, on the assumption that they do have to see some advertising, showing a clear demand for more relevance in future. 56 percent of respondents also agreed with the more general suggestion that predictive technologies should be used to personalise their online and mobile experiences.

8 PageFair, 2014
Emerging Best Practice

4.1 Business Must Lead

Companies should bear in mind that business is increasingly expected to take on the responsibility of solving the challenges facing society. The 2016 Edelman Trust Barometer found that 80 percent of people (up from 74 percent in 2015) think that 'a company can take specific actions that both increase profits and improve the economic and social conditions in the community where it operates.' This shows that the vast majority of the public is not anti-profit, or indeed anti-progress; quite the opposite, in fact. There could therefore be an opportunity for companies to harness greater support for predictive technologies, if they can demonstrate how these technologies help address wider challenges too.

As Jonathan Hargreaves, Vice Chair of Edelman's Global Technology Practice, says: "The ability to see around the corner is a super power that these new predictive technologies enable. Everyone in a leadership position has a responsibility to ensure this new power is used for the right purpose and good outcomes."
4.2 How Media Providers are Getting It Right

Three words to remember: privacy, transparency and relevance. These are at the core of making Big Data work for everyone. This means focusing on offering accurate, detailed information about how customers' information is being captured, stored, used and shared. It is also important to celebrate and raise awareness of the effective and ethical uses of data that benefit consumers. The average customer is not an expert in predictive technologies, so business has a responsibility to educate them and to lead best practice.

Some of the best current examples of predictive technology are theory-driven recommendation systems based on psychological insights about the customer, such as Netflix's showcasing of content it predicts to be of most interest, based on the individual user profile and previous viewing behaviour. Not only does this enable the business to better understand its customers, it enables it to prove relevance and to explain, in individual terms, why a particular recommendation has been made. The benefit is clear, and accepted, when strategy follows this format: "We recommend X, because you watched Y." A minimal sketch of this pattern follows at the end of this section.

As well as Netflix, many other media organisations are among the most focused on privacy, transparency and relevance. The Financial Times, for example, gathers data in return for access to the site, making the connection between insight gathering and reward clear, as long as the questions asked are deemed reasonable, of course. Again, each sector will have its own context: the questions people would be willing to answer on a site selling consumer dental equipment could be very different to what they would be willing to tell a newspaper.

4.3 A Two-way Conversation

As the research findings show, creating a two-way conversation with consumers is of the utmost importance. Organisations have a responsibility to empower and engage people. Predictive technology, and the issues surrounding it, are complex, as is the public's view, so only a sustained dialogue will prove effective. For example, Edelman's 2015 Earned Brand research found that 66 percent of consumers believe brands are innovating simply to make more money for the company, rather than, for example, having a positive societal impact or solving bigger environmental problems. The key insight here is that while the promise of innovation can inspire people, they first need to be reassured. Specifically, consumers are twice as likely to want to be reassured by a brand as to be inspired by it.

Organisations should also realise that while the data belongs to the customer, the power often lies with the business, as it is businesses who own, operate and understand the predictive technology, as well as the benefits it makes possible. With this power comes the responsibility to help the general public understand too.
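The following snippet is a minimal, hypothetical illustration of the 'recommend X, because you watched Y' pattern described in section 4.2: a simple item-to-item co-viewing count paired with a human-readable explanation. It is not any provider's actual recommendation system, and the titles and viewing histories are invented.

```python
# Sketch of an explainable recommendation: count which titles are co-viewed
# with each title the user has watched, then phrase the top suggestion as
# "We recommend X, because you watched Y." Hypothetical data throughout.
from collections import Counter
from typing import Dict, List, Set

def recommend_with_reason(user_history: Set[str],
                          all_histories: List[Set[str]]) -> List[str]:
    co_viewed: Dict[str, Counter] = {}
    for history in all_histories:
        for seen in history:
            co_viewed.setdefault(seen, Counter()).update(history - {seen})

    suggestions = []
    for watched in user_history:
        for candidate, _count in co_viewed.get(watched, Counter()).most_common():
            if candidate not in user_history:
                suggestions.append(f"We recommend {candidate}, because you watched {watched}.")
                break
    return suggestions

histories = [{"Show A", "Show B"}, {"Show A", "Show B", "Show C"}, {"Show B", "Show C"}]
print(recommend_with_reason({"Show A"}, histories))
```

The design point is the explanation string itself: because the recommendation is derived from the user's own behaviour, the business can state the reason in individual terms rather than asking the customer to trust an opaque model.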
Ambient Intelligence
The Next Level

5.1 A Call to Action

For those service providers using predictive technologies who are already following best practice with regard to privacy and transparency, and demonstrating its relevance to the consumer, the next step is what we call Ambient Intelligence. Ambient Intelligence is what emerges when organisations use customer data and predictive technology in a way that is both immediate and ubiquitous. Immediate, in that a relevant consumer benefit is delivered speedily following the rapid collection and predictive analysis of the data. And ubiquitous, in that the technology is deployed and benefits delivered in both static environments, such as the home and office, and mobile environments, such as via a smartphone or tablet.

To be clear, this is an opportunity that exists right now. The research found that 52 percent of people thought the benefits of smart home appliances, such as a fridge that automatically pre-orders groceries or a thermostat that pre-adjusts room temperature, outweighed the privacy risks related to generating such data about the patterns of their home life. This is a staggeringly high percentage, considering how recently the Internet of Things has become part of the ordinary citizen's knowledge and vocabulary. Furthermore, 57 percent of people would like such smart fridge data to be used to recommend groceries to them when they go out shopping. So, in spite of consumers' concerns, if executed properly, the market demand for Ambient Intelligence is already there.

5.2 Taking Advantage of Ambient Intelligence

To truly take advantage of the opportunity Ambient Intelligence offers, however, the data has to be sound, the recommendations correct, and consumers' consent explicit. Fortunately, the research finds that some of these are areas where CMOs are in a good position already. 88 percent of CMOs said it was important to always back up their decisions with statistically valid data. However, 32 percent of CMOs admitted their company stores data on sensitive personal attributes, such as ethnicity, sexual orientation and intelligence. Explicit consent is required under the Data Protection Act 1998 for such data to be collected, and even then it must be for a limited purpose and amount of time. This leaves a possible 68 percent of companies that are storing email addresses, census information, social media data or other digital records of behaviour (which 48 percent of respondents explicitly said they do). If, as psychological research has shown, it is possible to predict sensitive attributes from this data as well, then companies could be unwittingly storing data 'relating to sensitive attributes', and so be liable under the Data Protection Act 1998. Both rigour and caution are therefore needed for businesses to ensure they remain compliant; see the sketch at the end of this section.

Organisations should also be aware of what is driving the emergence of Ambient Intelligence. Broadly, technological progress is driving changes in people's behaviour, which is changing how businesses are operating.
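As a purely illustrative companion to the compliance point in section 5.2, the sketch below flags stored customer fields that match the report's examples of sensitive attributes. The field names and category list are hypothetical, follow the examples given above rather than any statutory definition, and do not constitute a legal test under the Data Protection Act 1998.

```python
# Hypothetical data-audit sketch: flag stored fields matching the sensitive
# attributes named in the report (not an exhaustive or legal definition).
SENSITIVE_EXAMPLES = {"ethnicity", "sexual_orientation", "intelligence"}

stored_fields = ["email", "postcode", "page_likes", "ethnicity", "purchase_history"]

flagged = [field for field in stored_fields if field in SENSITIVE_EXAMPLES]
print("fields likely to need explicit consent and limited retention:", flagged)
```

In practice, as the text notes, the harder problem is that seemingly innocuous behavioural fields (such as page Likes) may allow sensitive attributes to be inferred, which a simple field-name audit like this cannot detect on its own.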
Conclusion

6.1 Enabling & Realising Value

The limits to how much value organisations can create using data, predictive technology and Ambient Intelligence are not defined by a company's hardware or data processing capabilities. Rather, the value is limited only by its creativity, vision and the design quality of its relevant systems and processes. The new value chain of predictive technologies, where incentives align for business and the consumer, is as follows: envision the potential value, identify what data is required to make it happen, ensure data is collected and analysed appropriately, and finally deploy the solution in a personalised manner.

6.2 A Virtuous Circle

As the diagram below shows, getting the approach to data and predictive technologies right means organisations can initiate a virtuous circle of improvement. An ethical approach delivers more consistent data, and over a longer period of time. Better data enables a company to enhance its offering, building a stronger business, which leads to a stronger brand from an ethical perspective too. Of course, should best practice not be followed, the reverse cycle is equally possible. The stakes are high because the opportunities are so great, and consumers' concerns so deep.

Fundamentally, Big Data will not be the final frontier. Organisations still have a way to go in building resilience to the shifts in consumer mindset that accompany technological progress. Undoubtedly, the information age ecosystem can now make or break a business faster than ever.

[Diagram: a virtuous circle of Ethics, Better Data, Better Offering, Better Business and Trust]
