Message Mapping Explained

Most research companies rely on voters to guess at what motivates them.

What we know about how human beings evaluate information, make choices, and respond to survey questions tells us that the traditional approach to message testing—asking people how effective each message would be—is not reliable. The problem: voters (or people in general) just aren’t good at understanding the reasons they do things.

Message Mapping is the only research technique proven to solve this problem. If you're relying on traditional message testing (more/less likely or vote for/against scales), you might as well be guessing; those scales simply don't work.

1. Message Maps: The Most Effective Technology for Assessing Message Effectiveness
2. Research shows that voters are unable to judge which messages actually motivate them during a telephone survey, so WPA uses a methodology that measures the actual effectiveness of each message without having to rely on a respondent’s guesses.

   The problem: traditional research does not identify messages that actually change opinions.

   Most research companies rely on voters to guess at what motivates them. What we know about how human beings evaluate information, make choices, and respond to survey questions tells us that the traditional approach to message testing—asking people how effective each message would be—is not reliable. There are several reasons for this, including:

   • Voters (or people in general) just aren’t good at understanding the reasons they do things.
     >> There’s a reason that psychology and psychiatry are burgeoning industries—most people act without fully understanding why they act, and often act in ways that are contrary to what they believe are their preferences and motivations.
     >> Voters just can’t differentiate among the importance of as many as a dozen distinct messages, so they wind up rating one high and the rest low, or all of them high, or another simple strategy. All of these can lead to us reaching the wrong conclusion when we rely only on respondent ratings to assess messages.

   • Voters want to be liked by the interviewer.
     >> The foundation of telephone polling is the social exchange between interviewer and interviewee.
     >> While this is what allows us to ask 20-minute surveys, it creates bias in message assessments.
     >> People will say one thing and do another on socially controversial topics such as race, class, honesty questions, and others.

   • People tend to give greatest weight to what they’ve been exposed to recently in the news (hot topics).
     >> While these messages sound familiar at the time, they may not have any impact at all on their vote.
     >> For this reason voters will say a message matters a lot when it is already driving their choice on the ballot—if we repeat that message in an ad we won’t gain any ground, because everyone already knows about it.
3. The Solution

   WPA’s Message Mapping methodology measures actual change in opinion.

   The solution to this problem is to measure the actual effect of hearing each message on a respondent’s vote choice. We still ask respondents to rate messages because it gives them a cognitive task that causes them to listen to each message. We evaluate effectiveness, however, not by their responses, but by using observed changes from the pre-ballot to the post-ballot. The way this works is as follows:

   • Each respondent is asked to rate a random selection of messages and then the ballot is retested.
   • We record which messages each respondent did and did not hear.
   • We measure the actual behavioral response—the difference between their initial ballot vote and the informed ballot vote.
   • We build a regression model with that response for each respondent as the dependent variable and a series of indicator variables for heard/did not hear each message as the independent variables.
   • The coefficient on each heard/did-not-hear indicator is the measurement of the effectiveness of that message in changing votes.

   We then use the actual effect of each message on the X axis of our Message Map™, and combine it with scales of stickiness (the ability to recall the message later in the survey) and believability of the message. What our Message Maps reveal is the latent values that voters bring to elections—and the messages that appeal to them—that we would not get looking just at message ratings.
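   The regression step described above can be sketched in a few lines of code. The example below is a minimal illustration rather than WPA’s implementation: it assumes a flat respondent-level table with 0/1 indicator columns (msg_1 through msg_8) for which messages each respondent heard and a ballot_shift column measuring movement from the pre-ballot to the post-ballot, and it simulates that data instead of reading a real survey file.

```python
# Minimal sketch of the heard/did-not-hear regression described on this slide.
# Assumed layout (not from the deck): one row per respondent, indicator columns
# msg_1..msg_8 set to 1 if that message was read to the respondent, and a
# ballot_shift outcome measuring movement from the pre-ballot to the post-ballot.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_respondents, n_messages = 800, 8

# Simulated stand-in for a real survey file: random message assignment plus a
# noisy ballot shift driven by hidden "true" message effects.
heard = pd.DataFrame(
    rng.integers(0, 2, size=(n_respondents, n_messages)),
    columns=[f"msg_{i + 1}" for i in range(n_messages)],
)
true_effects = np.array([0.02, 0.30, 0.05, 0.01, 0.25, 0.10, 0.08, 0.00])
ballot_shift = heard.to_numpy() @ true_effects + rng.normal(0, 0.5, n_respondents)

# Regress pre-to-post ballot movement on the heard/did-not-hear indicators.
X = sm.add_constant(heard)   # intercept absorbs drift unrelated to any message
model = sm.OLS(ballot_shift, X).fit()

# Each msg_* coefficient estimates how much hearing that message moved the ballot.
effectiveness = model.params.drop("const")
print(effectiveness.sort_values(ascending=False))
```

   Because each respondent hears a random selection of messages, the coefficient on each indicator can be read as that message’s average effect on ballot movement; in practice a logit or multinomial model of vote choice could stand in for plain OLS, but the indicator-variable design is the same.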
4. Examples

   WPA plots each message on a chart, showing the actual effectiveness on the X axis, the stickiness of the message on the Y axis, and the believability of a message represented by the size of the bubble. The best messages are large bubbles in the green area, balancing effectiveness, stickiness, and believability.

   • In a recent Texas legislative primary we found that voters who said they cared most about border security and illegal immigration really responded best to a message about life issues. Illegal immigration was a hot topic at the time, but Message Maps revealed that the enduring issue of protecting life really mattered more and helped our candidate win.

   • In a competitive Congressional general election in Kansas last cycle, voters rated a message about balanced budgets most highly. But Message Maps revealed that a more aggressive message about fighting against the Governor’s tax hike proposal won more votes. Voters wanted to believe they wouldn’t respond to a “combative” message, but in reality they did.

   [Message Map chart: X axis “Increasing Effectiveness of a Message,” Y axis “Increasing Memorability of a Message,” Bubble Size: Believability. Messages plotted: 1. Will fight for a balanced budget amendment; 2. Will cut Dept of Energy and Dept of Education funding; 3. Cut corporate taxes to spark economy; 4. Supports repeal of Obamacare; 5. Wants to privatize Social Security; 6. Pro-life champion; 7. Wants to greatly expand domestic oil drilling; 8. Will end pork spending.]

   • On this Message Map, the most effective messages are numbers 2 and 5: cutting the funding of the Departments of Energy and Education, and privatizing Social Security.[1]

   • Respondents’ self-reporting gave the strongest ratings to messages 4 and 8, repealing Obamacare and ending pork spending. Messages 2 and 5 were self-reported as two of the lowest-rated messages.
     >> But our analysis shows that while respondents said statements 4 and 8 would motivate them, those messages ultimately had very little effect on their vote choice.[2]

   • This example illustrates how respondents gave the “expected” conservative responses—ending pork spending and repealing Obamacare—to the interviewer while avoiding more controversial responses regarding eliminating the Departments of Energy and Education and privatizing Social Security.
     >> But in reality, these controversial topics were the winning messages for this particular campaign.[3]
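   A chart in this style can be drawn with a standard plotting library. The sketch below is illustrative only: the effectiveness, memorability, and believability scores attached to each message are made-up placeholders, not the values behind WPA’s chart, and the matplotlib bubble plot is simply one way to put actual effect on the X axis, stickiness on the Y axis, and believability into the bubble size.

```python
# Illustrative Message Map bubble chart (made-up scores, not WPA's data).
import matplotlib.pyplot as plt

messages = {
    # label: (effectiveness, memorability, believability)
    "1. Balanced budget amendment": (0.20, 1.05, 0.6),
    "2. Cut Dept of Energy / Education funding": (0.80, 1.00, 0.5),
    "3. Cut corporate taxes": (0.30, 0.15, 0.7),
    "4. Repeal Obamacare": (0.45, 0.35, 0.8),
    "5. Privatize Social Security": (0.75, 0.55, 0.4),
    "6. Pro-life champion": (0.55, 0.80, 0.7),
    "7. Expand domestic oil drilling": (0.65, 0.75, 0.6),
    "8. End pork spending": (0.50, 0.45, 0.9),
}

fig, ax = plt.subplots(figsize=(8, 6))
for label, (effect, memorable, believable) in messages.items():
    # Bubble area scales with believability; position shows effect and recall.
    ax.scatter(effect, memorable, s=2000 * believable, alpha=0.5)
    ax.annotate(label, (effect, memorable), ha="center", va="center", fontsize=7)

ax.set_xlabel("Increasing Effectiveness of a Message")
ax.set_ylabel("Increasing Memorability of a Message")
ax.set_title("Message Map (bubble size = believability)")
plt.tight_layout()
plt.show()
```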
5. References and Contact

   1. Krosnick, J.A., S. Narayan, and W.R. Smith (1996). Satisficing in Surveys: Initial Evidence. New Directions for Evaluation 70: 29-44.
   2. Fisher, R.J., and J.E. Katz (2000). Social-desirability bias and the validity of self-reported values. Psychology and Marketing 17(2): 105-120.
   3. Ashton, R.H., and J. Kennedy (2002). Eliminating recency with self-review. Behavioral Decision Making 15(3): 221-231.

   WPA · 324 Second Street, SE, Washington, DC 20003 · 1319 Classen Dr, Oklahoma City, OK 73103 · 1005 Congress, Suite 495, Austin, TX 78701 · www.WPAresearch.com
