How long does it take to find out if country radio listeners dislike a song? How many weeks of airplay does it take to make ‘em tired of a song? Is the fact that two or three country stations per market all play almost the same songs in nearly identical rotations hurting the appeal of country music on the radio?
To try to learn the answers to those questions, A&O tracks audience response to every current hit tested by our client stations year-round. Most weeks, that's between 35 and 45 titles.
Lately, I have been entering all of them into a spreadsheet, organized by the number of weeks a song has been played nationally (per A&O AccuTest recommendations), and tracking how the average scores change with each week of airplay.
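For anyone who wants to replicate the exercise with their own callout data, here is a minimal sketch of that kind of aggregation, assuming a hypothetical spreadsheet export with columns named song, weeks_on, positive, burn, dislike and unfamiliar (those column names are my illustration, not A&O's actual file layout):

    import pandas as pd

    # Hypothetical weekly test-score export; column names are illustrative only.
    scores = pd.read_csv("accutest_scores.csv")  # song, weeks_on, positive, burn, dislike, unfamiliar

    # Average each measure across all tested titles, grouped by weeks of national airplay.
    by_week = scores.groupby("weeks_on")[["positive", "burn", "dislike", "unfamiliar"]].mean().round(2)
    print(by_week)  # one row per week of airplay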
Here is how the audience felt about - not the best, nor the worst, but - the average of all of the country songs played during the summer and fall of 2011:
% Positive   % Burn   % Dislike   % Unfamiliar
After four weeks: 51.95 2.58 7.03 20.35
In most cases, this was the first week the song was tested. The tune had received 14 plays in its first week on the radio, 20 plays in its second, 27 plays in the third week and 29 plays in the week the song was initially researched. Half of the national sample (52%) was already starting to like it, while one-fifth was unfamiliar with it (based on a :07-:10 hook).
The acceptance ratio (positive to negative) was 5.41:1. In other words, the odds were 5.4 to one that by the time the unfamiliar 20.35% of the sample became familiar with it, they would also like it.
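That ratio is simply the positive score divided by the combined negatives (burn plus dislike). As a quick check against the week-four averages:

    positive, burn, dislike = 51.95, 2.58, 7.03
    acceptance_ratio = positive / (burn + dislike)  # 51.95 / 9.61
    print(round(acceptance_ratio, 2))  # 5.41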
After seven weeks: 46.9 2.7 7.32 23.9
I don't have an explanation, just an observation. After three additional weeks of play - on average in secondary rotation - the changes in acceptance and rejection are statistically significant.
Unfamiliarity grew by 17.4% and positives dropped by 9.7%.
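Those figures are relative changes against the week-four averages, not percentage-point moves. A quick check:

    # Relative change from week 4 to week 7 (not percentage points).
    unfamiliar_growth = (23.9 - 20.35) / 20.35 * 100   # about +17.4%
    positive_drop = (46.9 - 51.95) / 51.95 * 100       # about -9.7%
    print(round(unfamiliar_growth, 1), round(positive_drop, 1))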
Could it be that the passionate fans - the heavy users who are first to hear and recognize new music - feel most positive about it right away? And that their negatives then begin to grow, after this early positive response, as they become more discerning?
Almost all songs in the sample received at least seven weeks' airplay before being dropped (some drops were due to poor research results, but most came as a result of slow chart momentum rather than weak test scores). A small number of songs were dropped after three weeks' play. The majority of songs tested in week four were still on the playlist by week eleven.
A dirty little "consultant secret": our first national indication that perhaps a song isn't going to make it comes not from poor research scores, but when a number of influential clients simply stop testing it.
Eleven weeks: 65.98 3.65 8.75 6.08
The 41% increase in acceptance at this point was due to at least three possible factors: familiarity with the song had risen to almost 94% with (by now) an average of 35 plays per week; the lower-testing titles had been dropped from play by this point (which in and of itself improved the average score); and combined negatives had only increased from 10.02% to 12.4%, a 23.8% hike.
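For the record, the negative and acceptance changes quoted there work out as follows, again as relative changes computed from the averages above:

    # Combined negatives = burn + dislike.
    negatives_week7 = 2.7 + 7.32     # 10.02
    negatives_week11 = 3.65 + 8.75   # 12.40
    negatives_hike = (negatives_week11 - negatives_week7) / negatives_week7 * 100   # about +23.8%
    acceptance_gain = (65.98 - 46.9) / 46.9 * 100                                   # about +40.7%, the "41% increase"
    print(round(negatives_hike, 1), round(acceptance_gain, 1))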
Positives seem to grow faster than negatives.
Fifteen weeks: 68.1 3.6 8.18 6.8
Now, familiarity has peaked and stabilized. Negatives are steady. Yet, at this point, almost all reporting stations are beginning to move the song from a 38+ play rotation to an average rotation of 22 plays, due to pressure from the songs below it on the chart to move on to other power records.
And, why not?
It begins to seem that if a PD or MD kept a song in power rotation until burn increased to extreme levels, almost nothing would ever go off that station:
Nineteen weeks: 72.75 5.2 5.53 4.2
Twenty-one weeks: 73.68 5.0 6.25 2.65
Normally, by that time, the song is starting to go off of all current trade charts - yet it's clear that increased play does NOT cause burn or dislike to grow.
Meanwhile, unfamiliarity drops sharply in weeks 15 through 21. Light and medium users have now become familiar with the song as well, and P-1 listeners are not yet growing tired of it.
Interestingly enough, all of the markets included in these averages have two, and in some cases, three country stations.
It certainly does not seem like fatigue with over-exposed music, or growing dislike for the burnt-out songs, is a problem for country radio today.
However: it must be remembered that callout/online test research respondents are screened to make sure they listen to country radio regularly. It may be that the opinions of non-listeners, or of those who have been listening less lately, are simply not reflected here due to our testing methodology.
Have you noticed something I may have missed?
2 comments:
Thanks Jaye!
Great information. Do you have an idea what the average approximate spin count would be on a song when it goes from 38 to 22 spins?
Thanks,
Dewey Boynton
Of course, your music scheduling software gives you the total number of spins on any song you want to see, but as tempting as it is to come up with a "total spins and off" policy, I think that is as big a mistake as having a "number of weeks and off" policy.
Just like human beings, each song has a unique life of its own, and I'd suggest using local research scores to determine the strongest songs to play - as judged by your listeners right now - rather than following any specific general rule.
Thanks for the comment/question/observation!