'Tools For Radio Content Regulation #1: Playlist Diversity Analysis' by Grant Goddard


A research paper proposing that commercial radio regulation in the United Kingdom should undertake radio station music playlist analysis in order to ensure diversity of content within the radio marketplace, written by Grant Goddard in January 2003.


“It is worrying that many local radio stations play the same music all day every day.”
– Rosemary McKenna MP, House of Commons Communications Bill debate.[1]

“If a format is good enough for one owner, it should be good enough for another.”
– Commercial Radio Companies Association response to the Draft Communications Bill.[2]

“Examining performance of media industries ought to be the ultimate step in media economics analysis… We need to foster a connection between media economics and the longtime problems of how to best promote diversity, how best to promote localism, or how best to promote some other value we might hold dear.”
– Professor Douglas Gomery, University of Maryland.[3]

“One possibility which suggests itself is whether there already exists a body of data within the industry, which might be made available to Ofcom/the Radio Authority, in order to make develop [sic] a more useful data base, which can in turn be used to inform regulation of the industry… quite a lot would be gained simply by flushing information about station performances into the open.”
– Professor Ian Hargreaves, Radio Authority Strategy Conference.[4]

[1] Hansard, House of Commons Debates, 3 December 2002, Column 843
[2] “Commercial Radio Companies Association Response To The Draft Communications Bill,” CRCA, 8 February 2002
[3] Gomery, Douglas, “Ownership Policies, Diversity & Localism,” photocopied working paper. Douglas Gomery is a Professor of Media Economics and History at the University of Maryland, and author of eleven books, two of which earned national book awards. “Who Owns The Media?” (with Benjamin Compaine) recently won the Association for Education in Journalism and Mass Communication’s Picard prize for best media economics book of the year 2000, while “Shared Pleasures” won Lincoln Center’s best television and film book of 1992. Gomery’s numerous articles have appeared in journals around the world, and for five years he wrote the column “The Economics of Television” for the “American Journalism Review.”
[4] Hargreaves, Ian, “Radio And The Quality Test,” confidential paper presented at the Radio Authority Strategy Conference, 11-12 April 2002

Tools For Radio Content Regulation #1: Playlist Diversity Analysis © 2003 Grant Goddard
BACKGROUND

The Radio Authority uses ‘formats’ as a regulatory proxy for content diversity in local markets. These are not ‘formats’ in the American sense of the word, which merely describes a radio station by its dominant programming content. For example, Capital FM London’s ‘format’, as mandated by the Radio Authority, is:

CAPITAL FM IS A CONTEMPORARY/CHART MUSIC-LED SERVICE FOR UNDER 40'S IN LONDON. Music programming will be predominantly (up to 100%) current chart hits, new releases, and hits less than 10 years old. No more than 20% will be hits over ten years old. Specialist programmes for the target audience, which complement the main music mix, may be broadcast in non-daytime for up to 15 hours a week. Speech must account for at least 15% daytime weekdays (10% at weekends) or 5% non-daytime. News bulletins containing local/regional news must be broadcast at least during peaktime weekdays and breakfast at weekends. National news will feature at other times. Other information, including entertainment news, travel news, whats-ons, leisure activities and so on should be balanced across each day.[5]

But such format definitions, however detailed, can serve only as proxies for content diversity in a radio marketplace. Empirical tools are needed that can measure the actual diversity achieved in a specific market comprising any number of competing or complementary radio stations.

Economic analysis of product diversity in radio markets was pioneered in 1952 by American economist Peter Steiner.[6] He assumed a thoroughly active listenership in which audience size was determined by the availability of a preferred programme type. In Steiner's model, when your favourite programme type is not on, neither is your radio.[7] If there is a large enough audience for a particular programme type, then two or more competing stations will split that audience by offering similar programmes of that type. This process will continue until the audience for that programme type has been divided into segments small enough that it makes more sense for the next competitor to counter-programme with a completely different type of programme. Consequently, where only a few competitors exist (as is the case in most UK local markets), stations tend to offer similar programmes. Only as the number of competitors increases does stations' programming become more differentiated.

[5] from Radio Authority website http://www.radioauthority.org.uk
[6] Steiner, Peter O, “Program Patterns & The Workability Of Competition In Radio Broadcasting,” Quarterly Journal Of Economics, 1952, #66 (2)
[7] Webster, James G, Phalen, Patricia F & Lichty, Lawrence W, “Ratings Analysis,” Lawrence Erlbaum Associates, New Jersey, 2000, p 165
Steiner asserted that a monopolist station owner would maximise product diversity and economic welfare in a broadcast market because a monopolist would want to capture every single listener and would therefore not duplicate programming across its co-owned stations. Or, in plain English, “some of the time you really get more diversity, more variety, if somebody controls more than one channel.”[8]

But Steiner’s findings rely on strict assumptions about consumer preferences that may not always hold up in the real world. If listeners pay for programming, even implicitly by sitting through advertisements, Steiner’s conclusions may no longer hold. Or if programmes of the same type, broadcast by competing stations within a radio market, are perfect substitutes for listeners, then one of those competitors may eventually move to broadcast a different type of programming, so as not to compete for advertising revenue solely on the basis of price.[9]

Because this economic theory does not provide us with a clear relationship between diversity of station ownership and diversity of programming within a market, the issue of content diversity analysis has increasingly attracted the attention of empirical study in the US.

[8] Federal Communications Commission, “Roundtable Discussion On Media Ownership Policies,” 29 October 2001, MM Docket Nos. 01-235, 96-197, 92-264, 94-150, 87-514 & CS Docket Nos. 98-92 and 96-85, page 47
[9] Becker, Gary & Murphy, Kevin, “A Simple Theory Of Advertising As A Good Or A Bad,” Quarterly Journal Of Economics, 1993, #108 (4); Gabszewicz, J, Laussel, Didier & Sonnac, Nathalie, “TV Broadcasting Competition & Advertising,” CORE Discussion Paper 2000/6, 1999; Anderson, Simon & Coate, Stephen, “Market Provision Of Public Goods: The Case For Broadcasting,” NBER Working Paper #W7513, 2000
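Steiner's audience-splitting logic can be illustrated with a toy sketch (this is not Steiner's formal model, and the audience figures are invented): each competitive entrant picks whichever programme type yields it the largest audience share, duplicating the popular type until its split share falls below a niche type's whole audience.

```python
def steiner_entry(type_audiences, entrants):
    """Each competitive entrant picks the programme type giving it the
    largest share: audience / (stations already offering it + 1)."""
    counts = {t: 0 for t in type_audiences}
    choices = []
    for _ in range(entrants):
        # Share each type would yield the next entrant.
        t = max(counts, key=lambda t: type_audiences[t] / (counts[t] + 1))
        counts[t] += 1
        choices.append(t)
    return choices

# Hypothetical type audiences: pop dwarfs the niches.
audiences = {"pop": 900, "talk": 300, "jazz": 100}
print(steiner_entry(audiences, 3))  # ['pop', 'pop', 'pop']
print(steiner_entry(audiences, 4))  # fourth entrant counter-programmes: talk
```

With three entrants all duplicate pop (shares 900, 450, 300); only the fourth finds counter-programming with talk more attractive, echoing the paper's point that differentiation emerges only as competitor numbers grow. A monopolist owning three stations would instead offer pop, talk and jazz, since duplicating pop adds nothing to its combined audience.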
PLAYLIST DIVERSITY ANALYSIS IN THE U.S.

“There certainly is a lot of format diversity across the US.”
– Conclusion of Wall Street report on US radio industry.[10]

A recent study commissioned by the FCC examined and compared the song playlists of the 288 highest-rated radio stations in the largest US markets, as reported by Radio & Records magazine. The analysis uses a technique proposed by Peter Alexander[11] that breaks down products into a bundle of characteristics, each given a numerical value of one if that characteristic is present or zero if it is not. Each song played by a radio station is treated as an individual product characteristic. The songs played on two stations are compared and a measure of diversity is calculated according to how many of those songs are identical.[12] If two stations each play 10 songs and none of those songs are the same, the measure of diversity equals 10. If two stations play exactly the same songs, the measure of diversity is 0. If two stations have only one song in common, the measure of diversity equals 9.

The study analysed the top ten playlisted songs of stations within the same format, but in different markets, in March 2001 and produced the resulting measures of diversity:

TABLE ONE: AVERAGE DISTANCE WITHIN SAME FORMAT OF US RADIO STATIONS[13]

Format                        Distance
Adult Contemporary            5.53
Active Rock                   6.02
Adult Alternative             7.06
Alternative                   6.68
CHR Pop                       6.00
CHR Rhythm                    6.79
Country                       5.94
Hot Adult Contemporary        6.49
Jazz                          6.39
Rock                          6.94
Urban Adult Contemporary      7.21
Urban                         6.68

where distance = 10, no songs are the same; where distance = 0, all songs are the same (from the 10 most played songs on a national sample of 288 US radio stations).

To the UK observer, what might seem remarkable is the homogeneity of songs between hundreds of radio stations operating the same format, but located in markets separated by thousands of kilometres within the continental US.
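The distance measure described above is simple to compute. A minimal sketch (the station playlists here are invented for illustration): each song is a binary characteristic, and the distance between two 10-song playlists is the number of songs on one list that are absent from the other.

```python
def playlist_distance(playlist_a, playlist_b):
    """Alexander-style distance between two equal-length playlists.

    Each song is a product characteristic scored 1 if present, 0 if not;
    the distance counts songs on one playlist absent from the other.
    With 10-song playlists: 10 = no overlap, 0 = identical."""
    shared = set(playlist_a) & set(playlist_b)
    return len(playlist_a) - len(shared)

# Hypothetical top-10 lists for two same-format stations:
station_a = [f"song_{i}" for i in range(10)]     # song_0 .. song_9
station_b = [f"song_{i}" for i in range(6, 16)]  # song_6 .. song_15

print(playlist_distance(station_a, station_b))   # 4 songs shared -> 6
```

A Table One figure of 5.53 for Adult Contemporary thus means that, on average, roughly 4.5 of any two stations' ten most played songs coincide.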
The evidence that more than four out of the top ten songs are exactly the same on adult contemporary stations in, say, New York, Nashville and Los Angeles seems to give the lie to the notion that commercial music radio in the US is locally orientated. Much of this homogeneity is due to the power that the record companies and music trade magazines wield through their huge promotion and advertising budgets. Stations tend to slavishly follow the playlists of similarly formatted, larger stations in bigger markets. The smaller the market, the less likely a station can afford to commission its own regular music research, and therefore the more likely it is simply to follow the music playlists of bigger-city stations within the same format. (National, cable-delivered music television channels such as MTV, BET and CMT also have an important impact on tastes in local markets.)

In order to determine whether the huge amount of ownership consolidation that has taken place in the US radio industry since the 1996 Telecommunications Act has affected the level of format homogeneity, the researchers compiled the same data for March 1996. They found that station playlists had become very slightly less diverse between March 1996 and March 2001, but only by an almost insignificant factor of 2.4%. Playlist diversity showed a marginal decline in nine of the twelve formats analysed, but it is impossible to attribute such changes solely to radio industry consolidation. The declining fortunes of the music industry have caused major record companies to release significantly less product than they did in 1996, so there are now fewer newly issued music titles from which stations can select their playlists.[14]

In addition to their analysis of stations within exactly the same format, the US researchers also compared playlists between stations in pairs of different, but closely related, formats. Again, the data was based upon the ten most played songs listed in Radio & Records magazine in March 2001.

[10] Miller, Victor B, Ensley, Christopher B & Young, Tracy B, “Format Diversity: More Or Less?”, Bear Stearns & Co Inc, research paper, 4 November 2002
[11] Alexander, Peter, “Product Variety & Market Structure: A New Measure & Simple Test,” Journal Of Economic Behavior & Organization, 1997, #32
[12] Williams, George, Brown, Keith & Alexander, Peter, “Radio Market Structure & Music Diversity,” FCC Media Bureau Staff Research Paper #2002-9, September 2002
[13] ibid., Table 2
The results were:

TABLE TWO: AVERAGE DISTANCE ACROSS PAIRS OF SELECTED FORMATS OF US RADIO STATIONS[15]

Format One                Format Two                  Distance
Adult Contemporary        Hot Adult Contemporary      9.06
Active Rock               Alternative                 7.40
Active Rock               Rock                        7.30
Adult Alternative         Alternative                 9.37
Adult Alternative         CHR Pop                     9.47
Adult Alternative         Hot Adult Contemporary      8.96
Alternative               CHR Pop                     9.44
Alternative               Rock                        8.33
CHR Pop                   CHR Rhythm                  8.49
CHR Pop                   Hot Adult Contemporary      8.11
CHR Rhythm                Urban AC                    9.41
CHR Rhythm                Urban                       7.85
Urban AC                  Urban                       8.61

where distance = 10, no songs are the same; where distance = 0, all songs are the same (from the 10 most played songs on a national sample of 288 US radio stations).

For those of us (anoraks) in the UK who have long tried to discern the subtle differences between the 254 radio formats[16] that exist in the US [see Appendix A], these results are interesting. If, on average, two out of the ten most played songs on CHR [Contemporary Hit Radio] Rhythm stations and Urban stations are exactly the same, how significant are the differences between these two formats to the average radio listener?

In the US, niche media markets are segregated by race as much as they are by age or gender. The greatest difference between CHR Rhythm stations and Urban stations is that the former target white listeners whilst the latter target black listeners.[17] It is likely that the former stations employ white DJs and target mainstream advertisers, whilst the latter employ black DJs and target black and minority advertisers. What is not so obvious to their segregated audiences is that the stations in these two formats overlap musically more than their two sets of listeners realise. The two stations could even be co-owned, since most black-targeted stations in the US are white-owned.[18]

In a wider context, these results throw significant light on a number of other recent research studies in the US that argue, to differing degrees, that consolidation either has not changed or has increased the number of different formats available in local markets since industry consolidation started in 1996.[19] If many of the 254 radio formats designated in the US have some similarities in musical content to each other, any research that simply uses format names (rather than playlists) to demonstrate an increase in the number of different formats available in each local market becomes a rather academic exercise. In this post-consolidation era, it is natural that an owner of multiple stations in the same market will move to adopt different format names, in order to attract different listeners to each station and to sell advertising to different sets of buyers.

[14] ibid., page 11
[15] ibid., Table 3
[16] BIA – Investing In Radio, Fall 2001
But the results outlined above suggest that some of the songs on stations' playlists will be similar, even if one station labels itself "CHR Rhythm" and the other "Urban."

Another recent research report, by the Future of Music Coalition,[20] analysed playlist data from US trade magazines Radio & Records and Billboard and concluded that “formats with different names have similar playlists,” with evidence of as much as a 76% overlap. Its assertion that there exists “considerable format homogeneity” within local markets in the US elicited a frosty response from the radio industry. A spokesperson for Clear Channel, which owns 1,200 US radio stations, argued that radio stations offer different listening experiences even if they share the same songs. Diane Warren, Clear Channel VP Communications, explained: “The music might cross over, but the combination of the information, the community service, the commercials, the promotions, all these elements make up the total radio station.”[21]

In a world where stations’ own adopted format labels may not reveal any significant information about their musical content (relative to their competitors'), broadcast regulators need to exercise extreme caution and not take format information at face value. In the UK context, the Radio Authority's legislative ability to prescribe, for example, a CHR station, an AC [Adult Contemporary] station and a dance music station within the same local radio market is no guarantee that real content diversity will exist within that market for listeners.

Analysis tools need to be designed and implemented in radio regulation that measure ‘exposure diversity’ as much as ‘content diversity.’ ‘Content diversity’ measures the diversity of viewpoints, diversity of programme types (formats) and demographic diversity that exist in media content made available within a given broadcast market. ‘Exposure diversity,’ by contrast, refers to the diversity of sources that are actually consumed by the audience, which may be very different from the diversity of content or sources that are made available to them. The ‘free market’ model of broadcast regulation assumes that audiences who are provided with a diversity of content options automatically consume a diversity of content. This is not necessarily the case. The mere provision of a radio service within an area is no guarantee that it will attract an audience.[22]

A local radio station might choose to air a minority-interest programme at a broadcast hour it knows has minimal ratings potential. The station can (justifiably) claim to have made an effort to achieve ‘content diversity’ in its output. But an analysis in more critical terms of ‘exposure diversity’ would award it ‘nul points’ if the programme was hidden away in a quiet corner of the schedule and never actively promoted. Similarly with music content: a radio station might (justifiably) claim to achieve ‘content diversity’ by playing a wide variety of songs from an extensive music library. But if, in its peak hours, that station is simply playing the same current chart hits over and over again, such diversity will be lost on the average listener. Offering such diversity at times when very few people are available to listen does not achieve ‘exposure diversity.’

[17] The word “urban” is a US radio industry euphemism for “black”
[18] Waldfogel, Joel, “Comments On Consolidation & Localism: Prepared For FCC Roundtable: Oct 29 2001,” 2001, photocopied working paper
[19] Miller, Victor B, Ensley, Christopher B & Young, Tracy B, “Format Diversity: More Or Less?”, Bear Stearns & Co Inc, research paper, 4 November 2002; “Has Format Diversity Continued To Increase?,” report by BIA Financial Network, 5 June 2002; Fratrik, Mark R, “State Of The Radio Industry: What Is Going On With Radio Formats?,” report by BIA Financial Network, 2001
[20] DiCola, Peter & Thomson, Kristin, “Radio Deregulation: Has It Served Citizens And Musicians?,” report by the Future of Music Coalition, 18 November 2002
[21] Sullivan, Andy, “UPDATE 1 – Radio Regulation No Hit For Listeners – Report,” Reuters/Variety news story accessed on the internet at Yahoo! News, 19 November 2002
[22] Napoli, Philip M, “Diversity & Localism: A Policy Analysis Perspective,” Graduate School of Business Administration, Fordham University, photocopied working paper
PLAYLIST DIVERSITY ANALYSIS IN THE UK

"While there can be difficulty in practice in establishing and maintaining distinctions between various mainstream popular music formats, the [Radio] Authority believes that there is a continuing need to regulate for diversity in some way, lest all services tend towards a bland centre and a limited focus."
– Radio Authority document ‘Radio Regulation For The 21st Century.’[23]

Music radio stations publish playlists on a weekly basis comprising the new releases that will be played on that station in the forthcoming week. Often these playlists are divided into categories. ‘A-list’ songs receive the most regular and high-profile exposure on a station and represent the station’s most popular songs and/or the biggest current hits. ‘B-list’ songs receive less exposure and represent up-and-coming artists and songs. ‘C-list’ songs receive irregular exposure in non-peak periods.

However, a station’s published playlist may not be synchronous with the songs actually played on that station during the same week. This can be for a number of reasons. A station may add a particularly ‘hot’ song (for example, a new Robbie Williams single) in the middle of a week, as soon as it is supplied to the station. Or the parameters set within the station’s music scheduling software may force the play of some B-list songs on more occasions in a week than some songs on the A-list. Or a station may add a song to the playlist under pressure from the promotion department of a major record company, but then not actually play the song on-air.[24]

Because of the ‘reality gap’ known to exist between the list of songs that a station says it is playing (the playlist) and the songs it actually plays, the music industry has long subscribed to monitoring services that provide song-by-song lists of every programme broadcast on the largest UK radio stations. Initially, this was achieved by employing a team of staff (mainly women) in a London office who listened to stations’ output in real time or on time-shifted recordings and wrote down the song details.[25] A decade ago, electronic ‘fingerprinting’ was introduced, which allows a computer to monitor a station’s output and recognise a song by analysing a tiny sample of music within it, which is then matched against one of many thousands of song samples digitally stored in its memory.

[23] "Radio Regulation For The 21st Century – Submission To The DCMS/DTI," Radio Authority, June 2000, section 7.11, p 23
[24] An aside: in 1980 I was responsible for compiling the playlist at Metro Radio. When I refused to add the Queen song “Flash Gordon” to the playlist, EMI Records boycotted the radio station and refused to supply it with free promotional copies of its new releases. The boycott lasted several months, during which time, when I needed copies of other EMI Records releases to add to the playlist, I had to drive to the nearest record shop and purchase copies over the counter. Life would have been easier if I had simply added the Queen song to the published playlist but not scheduled it within programmes. EMI Records, at the other end of the country, would never have known that the song was not being played.
[25] Sham Tracking was the most successful company in London doing this painstaking monitoring work.
In the UK, a company called Music Control[26] monitors a growing number of radio stations across the country and analyses the information for paid subscribers to its service. The data is also used in music industry charts published weekly in the trade magazine ‘Music Week.’

Music Control goes much further than merely providing a list of songs played by a particular radio station in any given week. Its data is weighted according to the RAJAR ratings achieved by that radio station, both in absolute terms (does the station have two million listeners rather than two thousand?) and in relative terms (does the station’s breakfast show have twenty times the number of listeners of its late-night show?). Music Control compiles a weekly airplay chart for each of the stations it electronically monitors, with songs ranked by their total number of ‘audience impressions,’ rather than simply by the number of times they were played. This is similar to the system used in radio advertising, where the success of a campaign can be quantified according to the number of ‘gross impressions’ – the theoretical number of listeners reached across the whole of an advertisement’s life, taking account of scheduling across different dayparts and different programmes.

The weekly airplay charts produced by Music Control provide excellent data for examining the issue of ‘exposure diversity,’ because they already incorporate audience weighting using the most recent RAJAR survey data. For example, a particular song would have to be played more than 100 times during a station’s overnight programmes to achieve equivalent weight within that station’s airplay chart to a single play in its breakfast show (if the ratio of breakfast to overnight listeners were 100 to 1). Such weightings produce an airplay chart that more accurately reflects the impression that the ‘average’ listener to a station will come away with of that station’s music content. If a station schedules a narrower selection of songs in its highly rated breakfast show than it plays during the remainder of the day, then that narrow selection of songs will tend to dominate the station’s Music Control airplay chart.

As an example, the top five songs in BBC Radio 1’s Music Control airplay chart for week 48, from 24 to 30 November 2002 inclusive, were:

TABLE THREE: BBC RADIO 1 AIRPLAY CHART FOR WEEK 48 OF 2002[27]

ranking  artist – title                            no. of plays  no. of points
1        Jennifer Lopez – Jenny From The Block               29         25.243
2        Christina Aguilera – Dirrty                         31         25.190
3        DJ Sammy & Yanou – Heaven                           30         24.843
4        Ms Dynamite – Put Him Out                           31         24.689
5        Liam Lynch – United States Of Whatever              31         24.591

[26] "Music Control monitors music broadcast on over 700 radio and television stations across 18 countries worldwide. The music industry standard, we provide a wide range of reports and charts, from bespoke analyses of individual titles to radio station profiles to company market share performance. Music Control uses a unique patented electronic fingerprinting technology 'Medicor' developed for the sole use of Music Control and the direct specific needs of the music industry. Reports are available via our closed user client website in real time, daily, weekly or any other period requested. Over 1000 record companies worldwide use the Music Control services. We are IFPI recognised and are a partner company of BDS the North American Music Monitors. Music Control is part of the Media Control Group, Europe's leading music monitors for over 20 years."
[27] 2002 © Music Control, proprietary research
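Music Control's exact weighting method is proprietary, but the principle of audience-weighted plays can be sketched as follows (the daypart names and audience figures are invented for illustration): each play is multiplied by the RAJAR-style audience of the daypart in which it occurred, and the products are summed.

```python
# Hypothetical RAJAR-style average audience per daypart (listeners).
daypart_audience = {"breakfast": 2_000_000, "daytime": 800_000, "overnight": 20_000}

def audience_impressions(plays_by_daypart):
    """Weight each play by its daypart's audience and sum: the 'gross
    impressions' a song accumulates across the week."""
    return sum(daypart_audience[dp] * n for dp, n in plays_by_daypart.items())

# One breakfast play equals 100 overnight plays at a 100:1 audience ratio:
print(audience_impressions({"breakfast": 1}))    # 2,000,000 impressions
print(audience_impressions({"overnight": 100}))  # 2,000,000 impressions
```

Ranking songs by this total, rather than by raw play counts, is what lets a chart like Table Three place a 29-play song above a 31-play song.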
This table demonstrates the ranking by Music Control’s own system of ‘points’ rather than simply by the total number of plays achieved throughout the station’s output during the whole week. The top song was played fewer times than other songs, but its plays were in hours with higher audiences. This makes the Music Control airplay charts a powerful tool for comparative analysis of stations' music formats, since the charts accurately reflect:

- the songs actually played on-air, rather than the songs the station includes in its published playlist, or the songs produced by the station’s scheduling software (for example, its RCS Selector log);
- the ratings skew of a particular station, so that if its evening show is the most popular segment of its broadcast day (as sometimes happens with dance music formats), the airplay chart accurately factors in that characteristic;
- the relative weight of stations’ audience sizes within the whole UK radio market, since the points allocated to songs in each station’s chart reflect the relative size of each station’s listenership.

The Music Control airplay charts of two different radio stations can be compared quantitatively in different ways:

METHOD ONE. A simple method of comparison is to count the number of songs that are the same within two stations’ playlists. This is the basis of the American research referred to above. However, this methodology does not weight the data according to a particular song’s prominence within a station’s playlist. So, for example, two stations may have the same song on both their playlists but, on Station A, that song might be ranked #1 whilst, on Station B, it might be ranked #20. The average listener would be unlikely to hear the similarity between those two stations, as the song would be much less frequently played on Station B than on Station A. Whereas a listener would definitely notice the similarity if a song were ranked #1 on both stations. In the American research, both these situations would be analysed in the same way, with the same result.

METHOD TWO. Additional data can be compiled that shows the difference between two or more stations’ playlist systems and music scheduling. So, for example, an analysis of airplay charts from Week 48 (24 to 30 November 2002 inclusive) for music stations heard in the London market reveals the following characteristics [calculations in Appendix E]:
TABLE FOUR: ANALYSIS OF AIRPLAY CHARTS FOR RADIO STATIONS MONITORED IN LONDON RADIO MARKET FOR WEEK 48 2002[28]

                                           BBC R1  BBC R2  Capital FM  KISS FM  Heart 106.2  Magic 105.4  Choice FM  Virgin  XFM  BBC London
aggregate no. of plays/week of Top 20 songs   474     251         690      800          449          162        300     373  540          53
highest no. of plays/week of any song          31      21          59       54           42           26         22      27   41           9
lowest no. of plays/week of any song           16       6          17       26            8            6          7      13   19           1
mean plays/week of all Top 20 songs            24      13          35       40           22            8         15      19   27           3
median plays/week of all Top 20 songs          22      11          34       40           16            7         16      18   25           2
range of plays/week within Top 20 songs        15      15          42       28           34           20         15      14   22           8
standard deviation of plays/week               5       5          13       10           13            4          4       4    7           2

The huge differences demonstrated here between stations’ top twenty airplay charts can be the result of many contributory factors:

- some stations have no commercials (BBC);
- some stations have more speech content than music (BBC London);
- some stations operate a playlist during the day but have specialist music shows at night that include almost no playlisted songs (Choice FM, BBC Radio One);
- some stations operate short playlists whose songs are played very frequently (KISS FM);
- some stations operate long playlists whose songs are played relatively infrequently (Magic 105.4).

As a result of these varying station characteristics, it is difficult to execute any useful analysis from the number of plays attributed to each song within a station’s playlist alone. The data serves only to offer a vague impression of the different playlist regimes and programming policies adopted by different stations.

METHOD THREE. A system of playlist comparative analysis is required that will examine the number of identical songs within the playlists of two or more radio stations, but will weight those similarities to the comparable degree that they impact on the typical listener.
When a listener makes the oft-heard comment that “radio station X always play the same songs,” they are (unknowingly) referring to those songs at the very top of that station’s airplay list, because these are the very songs to which they are most exposed.

Appendix B contains an Excel spreadsheet that demonstrates the detailed mathematical basis of a new index created to measure the degree of diversity/homogeneity between the airplay charts of two radio stations. What follows is a shorthand explanation of how that index is calculated from airplay chart data collected and published by Music Control. (The mathematically minded reader will benefit from consulting Appendix B whilst reading the following account.)

A radio station’s twenty most played songs are assigned points according to their ranking within its airplay chart. The #1 song is allocated 20 points, the #2 song 19 points, the #3 song 18 points, in sequence down to the #20 song, allocated only 1 point. Where two or more songs are tied at an equal placing within the chart, each is allocated an equal share of their total points. For example, taking the BBC Radio One data from Table Three, points are allocated thus:

TABLE FIVE: POINTS ALLOCATED TO RANKINGS WITHIN BBC RADIO 1 AIRPLAY CHART FOR WEEK 48 OF 2002[29]

ranking  artist – title                            no. of points
1        Jennifer Lopez – Jenny From The Block                20
2        Christina Aguilera – Dirrty                          19
3        DJ Sammy & Yanou – Heaven                            18
4        Ms Dynamite – Put Him Out                            17
5        Liam Lynch – United States Of Whatever               16

A quantitative comparison can now be made between two radio stations’ airplay charts according to the points allocated to each song in each station’s Top Twenty. Where a particular song appears in both stations' playlists, the points allocated to the song for Station B are subtracted from the points allocated to the same song for Station A. Any resulting minus sign is ignored (in mathematical terms, the result is expressed as an absolute value), so that it does not matter which station is selected as Station A and which as Station B.

[28] Appendix E, 2002 © Music Control, proprietary research
There are three possible outcomes when comparing the points accrued by a song within two airplay charts:

- a song is played on Station A but not on Station B: the resultant points will be the points allocated to the song on Station A (= points for Station A minus zero);
- a song is played on Station B but not on Station A: the resultant points will be the points allocated to the song on Station B (= points for Station B minus zero);
- a song is played on both Station A and Station B: the resultant points will be that song’s points on Station A minus that song’s points on Station B (or vice versa), expressed as an absolute value.

[29] 2002 © Music Control, proprietary research
Proceed to sum the total number of resultant points. If Station A and Station B have absolutely identical playlists, the total number of points will be 0. If there are no identical songs on Station A and Station B, the total number of points will be 210. Stations that share some songs will accrue a number of points between 0 and 210, according to the degree of similarity of their playlists. The number of points depends not only upon a song being on both stations’ playlists but, importantly, upon the relative proximity of that song’s positions within the two playlists.

Thus, in economic terms, there are three distinct ‘distance measures of diversity’ [30] simultaneously incorporated into a single ‘score’:

- the number of identical songs that appear in both stations’ playlists;
- the ranking of a song within a station’s airplay chart;
- the distance between the rankings of the same song within two stations’ playlists.

The diversity measure score (with a value between 0 and 210) is better understood by conversion to a user-friendly diversity/homogeneity INDEX (see Appendix A for the conversion process) that takes a value from 0 to 100, which signifies:

- value 0 if no songs on the two stations’ playlists are the same;
- value 100 if all songs on the two stations’ playlists are identical in both name and rank;
- a value between 0 and 100 if some songs appear on both stations’ playlists.

[30] Alexander, Peter, “Product Variety & Market Structure: A New Measure & Simple Test,” Journal of Economic Behavior & Organization, 1997, #32
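The score-to-INDEX conversion itself lives in Appendix A, which is not reproduced in this excerpt. The sketch below therefore assumes a simple linear rescaling that honours the stated endpoints — 100 for identical charts, 0 for charts with no songs in common; the function name, the dictionary layout and that normalisation are all illustrative assumptions rather than the paper’s exact formula.

```python
def diversity_index(points_a, points_b):
    """Convert two stations' Top Twenty point allocations (song -> points,
    e.g. 20 for the #1 song down to 1 for #20) into a 0-100 index.

    For every song on either chart, the absolute difference between its
    points on Station A and on Station B is taken (zero where a song is
    absent from one chart) and the differences are summed; the sum is
    then rescaled so that identical charts score 100 and fully disjoint
    charts score 0.
    """
    songs = set(points_a) | set(points_b)
    score = sum(abs(points_a.get(s, 0) - points_b.get(s, 0)) for s in songs)
    worst = sum(points_a.values()) + sum(points_b.values())  # disjoint case
    return round(100 * (1 - score / worst))


# A full Top Twenty compared with itself yields the maximum index of 100.
chart = {f"song {i}": 20 - i for i in range(20)}
assert diversity_index(chart, chart) == 100
```

Two charts sharing no songs at all would return 0, and partially overlapping charts fall in between, exactly as the three bullet cases above describe.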
THE PLAYLIST DIVERSITY INDEX

This INDEX can be used to examine and quantify radio station playlists in a number of different situations:

1. Playlist Diversity Analysis – within a local market

As an example of how this tool can be used, the airplay charts of music stations available on AM/FM radio in the London market for week 48 of 2002 (24 to 30 November inclusive) were analysed. [BBC London is excluded from all subsequent comparisons because it plays such a minimal amount of music that analysis yields no useful information.] [Calculations are detailed in Appendix C.]

TABLE SIX: PLAYLIST DIVERSITY INDICES FOR LONDON MUSIC RADIO STATIONS [31]

              BBC R1  BBC R2  Capital  KISS  Heart  Magic  Choice  Virgin  XFM
                               FM       FM   106.2  105.4    FM
BBC R1           —      11      37      46     6      0      23      16     18
BBC R2          11       —      29      12     9      0       0       9      0
Capital FM      37      29       —      39     7      0      13      11      0
KISS FM         46      12      39       —    12      0      20       0      1
Heart 106.2      6       9       7      12     —      4       0       0      0
Magic 105.4      0       0       0       0     4      —       0       0      0
Choice FM       23       0      13      20     0      0       —       0      3
Virgin          16       9      11       0     0      0       0       —      5
XFM             18       0       0       1     0      0       3       5      —

where 0 means no identical songs and 100 means all songs and playlist rankings are identical. (The table is identical on either side of the diagonal.)

The two stations that demonstrate the greatest similarity in playlists are BBC Radio One and KISS FM, with an index of 46. It can be inferred that the average listener to both stations will probably come away with the impression that about half of the music they hear is identical on both stations. This is confirmed by a less mathematical look at their respective playlists:

- 11 songs are common to both stations;
- both stations have an identical #1 song (Jennifer Lopez);
- 7 of the top 8 songs on KISS FM appear on the Radio One playlist;
- 4 of the top 8 songs on Radio One appear on the KISS FM playlist.
Pairs of stations that have absolutely no similarity in their playlists (index = 0) are:

BBC Radio One and:  Magic 105.4
BBC Radio Two and:  Magic 105.4, Choice FM, XFM
Capital FM and:     Magic 105.4, XFM (co-owned)
KISS FM and:        Magic 105.4 (co-owned), Virgin
Heart 106.2 and:    Choice FM, Virgin, XFM
Magic 105.4 and:    BBC Radio One, BBC Radio Two, Capital FM, KISS FM (co-owned), Choice FM, Virgin, XFM
Virgin and:         KISS FM, Heart 106.2, Magic 105.4, Choice FM
Choice FM and:      BBC Radio Two, Heart 106.2, Magic 105.4, Virgin
XFM and:            BBC Radio Two, Capital FM (co-owned), Heart 106.2, Magic 105.4

[31] Appendix D
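A market-wide table such as Table Six is simply the pairwise index computed over every combination of stations in the market. A sketch follows, reusing the same illustrative index function as above (the 0–100 linear normalisation remains an assumption, since the paper’s exact conversion is in Appendix A):

```python
from itertools import combinations


def diversity_index(points_a, points_b):
    # 0 = no songs shared; 100 = identical songs and rankings.
    songs = set(points_a) | set(points_b)
    score = sum(abs(points_a.get(s, 0) - points_b.get(s, 0)) for s in songs)
    worst = sum(points_a.values()) + sum(points_b.values())
    return round(100 * (1 - score / worst))


def market_index_table(charts):
    """Build the symmetric station-by-station table (Table Six style)
    from a dict mapping station name -> {song: points} for its chart."""
    # A station compared with itself is identical by definition.
    table = {a: {b: 100 for b in charts} for a in charts}
    for a, b in combinations(charts, 2):
        # The index is symmetric, so fill both cells at once.
        table[a][b] = table[b][a] = diversity_index(charts[a], charts[b])
    return table
```

Each off-diagonal cell needs computing only once, which is why the table is identical on either side of the diagonal.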
2. Playlist Diversity Analysis – between individual stations

The playlists and Table Six above can also be examined to analyse the performance of individual stations:

CAPITAL FM
Roughly one-third similarity with BBC Radio One, BBC Radio Two and KISS FM. These indices (37, 29 and 39 respectively) are so close to each other that the inference must be that the station’s music is programmed defensively against its three main competitors in the London market. This might also explain why the station can seem to lack an effective ‘station sound’, since it is attempting to be a hybrid of three other stations.

KISS FM
The index of 46 with BBC Radio One speaks volumes about the latter’s desperation to be a dance-orientated station, rather than a youth station embracing multiple music genres. The index of 12 with BBC Radio Two is more surprising. The appearance of Sugababes, Blue, Sophie Ellis Bextor and Dannii Minogue within KISS FM’s Top 20 playlist demonstrates its determination to be more than a dance music station.

HEART 106.2
Heart 106.2’s greatest similarity is with KISS FM (though the index is only 12)! A look at Heart’s playlist reveals the reason. The ten most played songs (in descending order) were by: Westlife, Nelly, Liberty X, Blue, Atomic Kitten, Atomic Kitten (again), Liberty X, Kylie Minogue, Darren Hayes and Anastacia. This is somewhat surprising, given that the station’s ‘format’ as agreed with the Radio Authority states:

“The music will be melodic or soft adult contemporary and will exclude the extremes of dance, rap, teenage pop, indie and heavy rock.” [emphasis added] [32]

Heart 106.2 is moving towards teenage pop from one direction, just as KISS FM is edging towards teenage pop from the opposite direction.

MAGIC 105.4
The analysis shows that Magic is by far the most distinctive music station heard in London, playing songs not duplicated on other stations.

[32] from Radio Authority website http://www.radioauthority.org.uk
To ensure that this was not simply a one-week aberration, data for previous weeks was checked. The ten most played records (in descending order) on Heart 106.2 for previous weeks were by these artists:

Week 47 (17 to 23 November 2002) – Liberty X, Nelly, Westlife, Blue, Atomic Kitten, Liberty X, Atomic Kitten, Shaggy, Kylie Minogue and Emma Bunton;
Week 46 (10 to 16 November 2002) – Nelly, Blue, Liberty X, Atomic Kitten, Westlife, Atomic Kitten, Liberty X, Emma Bunton, Enrique Iglesias and Nelly Furtado;
Week 45 (3 to 9 November 2002) – Atomic Kitten, Blue, Liberty X, Nelly, Westlife, Liberty X, Atomic Kitten, Darren Hayes, Anastacia and Shaggy;
Week 44 (27 October to 2 November 2002) – Liberty X, Blue, Atomic Kitten, Nelly, Sophie Ellis Bextor, Westlife, Sugababes, Liberty X, Darren Hayes and Kylie Minogue;
Week 43 (20 to 26 October 2002) – Atomic Kitten, Liberty X, Blue, Liberty X, Sugababes, Westlife, Darren Hayes, Nelly Furtado, Shaggy and Anastacia;
Week 42 (13 to 19 October 2002) – Sugababes, Liberty X, Blue, Westlife, Liberty X, Atomic Kitten, Darren Hayes, Anastacia, Shaggy and S Club 7.
CHOICE FM
Choice’s similarity with KISS FM (index = 20) is not unexpected, given that both major in black music. Choice’s similarity with BBC Radio One (index = 23) is another reflection of the latter’s current preoccupation with black music and dance music.

VIRGIN and XFM
These two rock stations prove to be very distinctive from each other (which is how it should be) and equally distinctive from the other commercial stations in the market.

3. Playlist Diversity Analysis – within radio ownership groups

“[The Radio Authority] also believe it is important to take into account the impact of concentration of ownership on the recorded music industry, with central playlists reducing the opportunities for a diversity of music to find broadcast time, and the risk of undue influence being exercised by a very limited number of people.”
Radio Authority document ‘Radio Regulation For The 21st Century.’ [33]

“Red Dragon FM in Wales and its sister station in London, Capital FM, are virtually identical in terms of their play list…. So is it not essential to make sure that we jig the market a little bit? It will not do it all on its own.”
Chris Bryant MP for Rhondda, House of Commons Standing Committee on Communications Bill. [34]

The same analysis can be made of stations with common ownership. This helps to illuminate the issue of whether a radio group is scheduling its music locally or whether a ‘network playlist’ is in operation. Examining the data for stations owned by Capital Radio for week 48 of 2002 (24 to 30 November inclusive) gives the following results [calculations are detailed in Appendix F]:

[33] “Radio Regulation For The 21st Century – Submission To The DCMS/DTI,” Radio Authority, June 2000, section 7.13, p 24
[34] Communications Bill, Standing Committee E, 28 January 2003, Column Number 739
TABLE SEVEN: PLAYLIST DIVERSITY INDICES FOR CAPITAL RADIO GROUP RADIO STATIONS [35]

                 Century  Century  Capital  Beat  Invicta  Power  Southern  BRMB  Fox  Ocean  Red Dragon
                 100-102  105.4    FM       106   FM       FM     FM        FM    FM   FM     FM
Century 100-102     —       81       28      30     27       20      17       27    29    56      25
Century 105.4      81        —       26      27     25       18      17       24    27    52      23
Capital FM         28       26        —      31     80       80      76       90    82    22      84
Beat 106           30       27       31       —     39       38      34       33    36    27      38
Invicta FM         27       25       80      39      —       82      81       84    88    21      90
Power FM           20       18       80      38     82        —      82       87    82    18      85
Southern FM        17       17       76      34     81       82       —       77    80    14      82
BRMB FM            27       24       90      33     84       87      77        —    84    21      88
Fox FM             29       27       82      36     88       82      80       84     —    24      85
Ocean FM           56       52       22      27     21       18      14       21    24     —      21
Red Dragon FM      25       23       84      38     90       85      82       88    85    21       —

where 0 means no identical songs and 100 means all songs and playlist rankings are identical. (The table is identical on either side of the diagonal.)

The conclusions that can be drawn from these data are:

- Century 100-102 (North East England) and Century 105.4 (North West England) operate with almost identical music scheduling;
- Capital FM (London), Invicta FM (Kent), Power FM (Hampshire), Southern FM (Sussex), BRMB FM (Birmingham), Fox FM (Oxfordshire) and Red Dragon FM (Cardiff & Newport) all operate with almost identical music scheduling;
- Power FM and Ocean FM, co-owned stations in the same local market, score a common index of only 18, which makes them quite distinct from each other in music terms;
- Beat 106 appears to be a hybrid of Century and of Capital FM in equal proportions (indices of 30 and 31 respectively). The remaining third, one hopes, comprises the “fresh dynamic mix of new rock and dance music” defined in its Radio Authority ‘format.’ [36]

An analysis of stations across one radio group would also reveal whether that group was exerting undue influence in the UK market. For example, it could show whether Capital Radio was using its stations to prominently promote releases on its Wildstar record label.
[35] Appendix G
[36] from Radio Authority website http://www.radioauthority.org.uk

Or it could show whether the Chrysalis Group was using its stations to excessively promote its own music copyrights. In such cases, Music Control’s airplay charts would easily reveal the impact not just of a station playing a particular song a number of times per week, but
whether it is scheduling those plays in peak hours, which would make a much more substantial impression on the audience.

4. Playlist Diversity Analysis – across a timeline

Comparisons can be made between the same stations in the same market at different points in time. This can help illuminate such issues as whether:

- a station has made a sudden music policy change;
- a station has slowly moved its format toward, or away from, that of other radio stations;
- ownership changes in a market have effectively widened listener choice over time, or narrowed it;
- a station has responded effectively to instructions from the regulatory authority to amend its music programming policies;
- a station is simply trying to clone an existing competitor within the same market.

For example, a comparison of the music played in the London market in 2002 (week 48: 24 to 30 November inclusive) and four years earlier (7 June 1998) [calculations in Appendix H] produces these two tables:
TABLE EIGHT: COMPARISON OF PLAYLIST DIVERSITY INDICES FOR LONDON RADIO STATIONS IN 2002 AND IN 1998 [37]

summary table of indices: 24-30 November 2002

              BBC R1  BBC R2  Capital  KISS  Heart  Magic  Choice  Virgin  XFM
                               FM       FM   106.2  105.4    FM
BBC R1           —      11      37      46     6      0      23      16     18
BBC R2          11       —      29      12     9      0       0       9      0
Capital FM      37      29       —      39     7      0      13      11      0
KISS FM         46      12      39       —    12      0      20       0      1
Heart 106.2      6       9       7      12     —      4       0       0      0
Magic 105.4      0       0       0       0     4      —       0       0      0
Choice FM       23       0      13      20     0      0       —       0      3
Virgin          16       9      11       0     0      0       0       —      5
XFM             18       0       0       1     0      0       3       5      —

summary table of indices: 7 June 1998 [38]

              BBC R1  BBC R2  Capital  KISS  Heart  Choice  Virgin  XFM
                               FM       FM   106.2    FM
BBC R1           —       2      24      22     0      8      26      9
BBC R2           2       —      14       0    19      3      16      0
Capital FM      24      14       —      18    21      5      15      0
KISS FM         22       0      18       —     4     29       0      0
Heart 106.2      0      19      21       4     —      0      20      0
Choice FM        8       3       5      29     0      —       0      0
Virgin          26      16      15       0    20      0       —      0
XFM              9       0       0       0     0      0       0      —

where 0 means no identical songs and 100 means all songs and playlist rankings are identical. (The tables are identical on either side of the diagonal.)

A comparison of these two tables shows evidence that:

- the highest index number in 1998 was 29 (between KISS FM and Choice FM), compared with an index of 46 (between KISS FM and Radio One) in 2002;
- XFM’s playlist is as distinctive from its competitors in 2002 as it was in 1998;
- KISS FM is far less distinctive in 2002 than it was in 1998 (particularly from Capital FM and Radio One);
- Capital FM is less distinctive in 2002 than it was in 1998 (particularly from KISS FM and, surprisingly, Radio Two);
- Virgin Radio is far more distinctive in 2002 than it was in 1998 from all competitors.

[37] Magic 105.4 was not monitored by Music Control in 1998.
[38] Appendix I
It can be deduced that:

- Radio One, Capital FM and KISS FM are significantly closer to each other’s music format than they were only four years ago;
- Magic 105.4, XFM and Virgin extend listener choice significantly in the London radio market by adopting music policies distinct from all their competitors;
- KISS FM’s and Radio One’s music playlists already overlap to a significant extent, and the two stations are moving closer to one another.
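Tracking movement across a timeline reduces to differencing the two tables. The sketch below does this for four station pairs, using the index values as tabulated in Table Eight (a positive change means the pair’s playlists have converged since 1998):

```python
# Index values for selected London station pairs, read from Table Eight.
indices_1998 = {
    ("BBC R1", "KISS FM"): 22,
    ("BBC R1", "Capital FM"): 24,
    ("Capital FM", "KISS FM"): 18,
    ("KISS FM", "Choice FM"): 29,
}
indices_2002 = {
    ("BBC R1", "KISS FM"): 46,
    ("BBC R1", "Capital FM"): 37,
    ("Capital FM", "KISS FM"): 39,
    ("KISS FM", "Choice FM"): 20,
}

# Positive delta = playlists converging; negative delta = diverging.
change = {pair: indices_2002[pair] - indices_1998[pair] for pair in indices_1998}

# List the fastest-converging pairs first.
for pair, delta in sorted(change.items(), key=lambda kv: -kv[1]):
    print(f"{pair[0]} / {pair[1]}: {delta:+d}")
```

On these figures, Radio One and KISS FM show the largest convergence, while KISS FM and Choice FM are the only pair to have moved apart — consistent with the deductions listed above.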
CONCLUSIONS

Playlist diversity analysis provides us with useful information about the music content available from radio stations in a specific market. Because the system of analysis detailed in this paper is based upon empirical data of the actual songs played by a radio station (rather than upon data supplied by the station itself about its playlist), it offers us a regulatory tool that views the market from the perspective of the average listener.

The factoring of the latest RAJAR ratings into the playlist data compiled by Music Control makes playlist diversity analysis an even more powerful tool, because no additional interpretation is required as to a particular station’s listenership patterns. No knowledge is required as to whether a station’s peak audience is at 8am or at 8pm, because Music Control has already incorporated this variable into its playlist chart for each station.

Mere words are becoming increasingly redundant in trying to describe one radio station’s music policy in relation to another’s. Questions such as “What is the difference between the music played on easy listening formatted Station A and middle-of-the-road formatted Station B?” are not easy to answer in purely descriptive terms. Playlist diversity analysis offers a tool that can objectively quantify the similarities and differences between radio stations in our increasingly complex radio markets.

The new regulator Ofcom will assume a much more active role in monitoring, analysing and evaluating commercial radio content. In order to represent consumers’ interests effectively in its policy and strategy processes, the regulator needs to look at content diversity more directly from the viewpoint of the radio consumer. Empirical data derived from content monitoring can prove far more revealing about the state of diversity achieved in a marketplace than any amount of interpretation of format names or programming descriptions.
As part of its knowledge management remit, and in order to acquire a deeper understanding of the content dynamics that operate within the radio industry, Ofcom should consider:

- subscribing to Music Control data;
- instituting a rolling schedule of market analyses to help build time series for individual markets;
- preparing individual analyses for areas where new or re-advertised licences are pending;
- preparing individual analyses relating to stations and markets affected by pending mergers and acquisitions;
- ongoing evaluation of radio groups’ performances in achieving music policy distinctiveness and ‘localness.’
Grant Goddard is a media analyst / radio specialist / radio consultant with thirty years of experience in the broadcasting industry, having held senior management and consultancy roles within the commercial media sector in the United Kingdom, Europe and Asia. Details at http://www.grantgoddard.co.uk