Almost everyone in the radio and records industries decries the need for trade charts as a means of determining the difference between a "hit" and a "STIFF."
If you have local research, you've no doubt grown accustomed to finding a surprising number of tunes that "lost their bullet and stiffed" yet continue to TEST very well in that research.
The listeners who love these songs simply don't know about trade charts. However, since it's difficult to test things that people are unfamiliar with, at some point we need to make educated guesses as to which of the many songs that come in each week will be hits. Good charts can be helpful in deciding which songs and artists have the highest consensus probability of becoming a hit nationally.
Personally, I define a "hit" as a song that gets sufficient play on a national basis to chart at least top 15 and achieves a minimum of 65% positive and less than 20% negative research scores in auditorium testing and callout. Releases that end up at the lower end of those metrics may never make it to my recurrent or gold categories, even after fairly substantial "current" spins.
I think of that airplay as a risky investment that went badly.
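To make those thresholds concrete, here is a minimal sketch (in Python, with hypothetical field names) of how a song could be scored against that definition. The numbers come straight from the paragraph above; the function name and inputs are invented for illustration, not anyone's actual research system.

```python
# Illustrative only: applies the "hit" thresholds described above -
# a peak chart position of at least top 15, a minimum of 65% positive,
# and less than 20% negative research scores. All names are hypothetical.

def is_hit(peak_chart_position: int, positive_pct: float, negative_pct: float) -> bool:
    """Return True if a song meets the national 'hit' definition above."""
    charted_top_15 = peak_chart_position <= 15  # lower number = higher rank
    tested_well = positive_pct >= 65.0 and negative_pct < 20.0
    return charted_top_15 and tested_well

# A song that peaked at #12 with 68% positive / 15% negative callout:
print(is_hit(12, 68.0, 15.0))   # True
# A song that "lost its bullet" at #22 but still tests well locally:
print(is_hit(22, 71.0, 12.0))   # False by this national definition
```

Note that the second example is exactly the kind of record described earlier: it fails the national chart test while its local research remains strong.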
When charts accurately reflect what selective, local market-oriented stations that actually listen to the new music are playing, those charts can, most of the time, point to songs that will test well many weeks before research scores become available.
What if a trade publication picked which stations report to its chart - NOT based on accurate reporting of real airplay - BUT on which songs a station did or did not report? Would you consider that chart helpful or harmful to your station's efforts to determine what the REAL hits are going to be?
With stories circulating now that Clear Channel's identical Premium Choice playlists and Cumulus' Atlanta-recommended lists of 13-20 approved currents are being mandated at more and more major stations, it seems sensible for trade chart editors to adjust their reporting panels.
Hopefully, any changes will still reflect what have effectively become "syndicated/network" spins, of course, but will also open reporter or monitored status to many more radio stations.
Today, nine "networks" plus the 236 stations Mediabase monitors and the 132 monitored by BDS have become the primary source of chart information for the nearly 3,000 country stations in North America.
Though errors and glitches do occur, monitored airplay is generally acknowledged as unimpeachable in accurately reporting what those country radio stations in 125 markets actually play.
As A&O sees it, if you want to know what a hit WAS last week, check Billboard's chart. Review Country Aircheck each week for a good read on which of the many current hits being promoted to most of those same radio stations THIS week are ranked highest. Watch it occur in real time, day to day, on BDS and Mediabase.
Looking at both can be an excellent guide to what radio is doing in the largest U.S. and Canadian markets with music.
Since most new music today breaks from the largest markets DOWN to the smaller ones - because label promotion and artist visits to radio occur only in those places - one might ask: do I really need any other charts?
Music Row hopes that you do. By selecting as its reporters only stations that agree to expose more new music than the typical monitored station in a competitive situation does, it hopes to create a definable difference. The problem, of course, is that many songs and artists chart in the 30s and 40s of that chart and never go anywhere else.
This era of smaller reporting panels for the BDS and Mediabase charts should be a major opportunity for a chart built on a larger yet reliable sample of stations - one the industry could look to as THE chart.
In the following post, I'll ruminate more about this and remind you of some painful lessons from the past.