Saturday, July 09, 2011

"Research IQ"

The best time of year to field perceptual studies, callout, focus groups, music tests? In the constant change that is today's radio universe, the answer is usually "yesterday."

And, as a result, most research companies now work non-stop, year-round, where once upon a time they centered their activities in the first and third quarters - when many stations get set for fall and spring surveys.

Consolidation cut back on the amount of marketing research done by radio at first, back in the late '90s and early 2000s. But now, thanks to internet surveys and social networking tools, there's no reason any radio manager can't do as much listener research and data crunching as was done forty years ago, or even more - back when only a few very innovative companies began to recognize that it was possible to learn more about a competitor than the competitor knew about itself.

Ultimately, perceptual studies, focus groups and music testing became ubiquitous, until it became obvious that if three stations in the same format all targeted the same demographic, they'd all end up positioning themselves as "less talk, more music" and playing the same 250-300 songs.

Today, it takes a much more sophisticated approach: digging deeper into tastes and lifestyles to fully understand which coalitions drive loyalty and usage, and knowing when you can "do it yourself" and when the findings you get from online tactics require greater expertise from one of the excellent radio research companies, which - given the complexity of our multi-platform world - are busier than ever.

Stay in close touch with your consultant as you execute research strategies.

We work with all of the vendors and can help you decide what to trend and track in-house and when it's more cost- and time-efficient to bring in a pro (today's margin for error is smaller than ever, and the price of errors is larger than ever!).
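Since margin of error comes up every time a sample size gets debated, it's worth keeping the back-of-envelope formula handy. The sketch below is an illustrative Python calculation of the textbook simple-random-sample margin of error - not any vendor's actual methodology, and the sample sizes shown are just examples:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate margin of error for a simple random sample.

    n: sample size
    p: assumed proportion (0.5 is the worst case)
    z: z-score for the confidence level (1.96 is roughly 95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A 400-person sample carries roughly a +/- 5 point margin at 95% confidence;
# note that quadrupling the sample only cuts the margin in half.
for n in (100, 400, 1600):
    print(f"n={n}: +/- {margin_of_error(n):.1%}")
```

The practical takeaway: precision gets expensive fast, which is exactly why acting on a too-small or badly drawn sample costs more than the research itself.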

To start with, if you have the choice and your market is stable enough to permit you the luxury of not having to panic-schedule research in reaction to competitive surprises, A&O likes:

1. Full market strategic or (if you only own one station and do not plan a format switch under ANY circumstances) competitive face-off perceptual study. Field this when the weather is at its worst and respondents will be home - January or February are NOT the months to do focus groups or an auditorium music test unless you live in Florida, Hawaii or a place whose residents are so hardy (can you spell Alaska? Wisconsin? Minnesota?) that bad weather won't stop them from going out. Insist on having the responses to this 15-20 minute questionnaire six to eight weeks prior to the start of the spring book. Get all major decision-makers together, including your consultant, for a day away from the station to create an immediate action plan based upon it. A&O does an annual "Roadmap" online study for all of our clients, so that we can track key metrics nationally and benchmark locally. For many of our clients, this is enough to stay competitive as judged by your core, but when you see something you don't understand or know what to do about, that's a great time to talk to a researcher you trust to target a random sample, replicating the ratings methodology. It's not cheap, but it's a lot cheaper than taking action on bad data.

2. Music testing. Ideal: four 400-700 song tests annually, in the month prior to the start of each survey, so that you freshen your entire library each book. And, if that is what your competition does, you'd better as well. Many stations do one annually and, in that case, I'd schedule it exactly six months away from the perceptual study (July-August), include written mini-perceptual questions during the breaks and invite participants to stay after for an informal focus group discussion on promotion and programming issues. A&O tracks a total of almost a thousand gold and recurrent titles annually in four quarterly online music tests which clients are encouraged to participate in. However, as helpful as this info is for trending the evolving tastes of core listeners, there's only one way to find out about the music preferences of non-core listeners, and that involves more than just using your loyal listener database to test music.

3. Focus groups/listener advisory panels can be used to probe issues that you may want to test more formally in the perceptual study. Or, as a qualitative follow-up when the data tells you that listeners are behaving in a way you do not fully understand. The key point with focus groups: know three to five action-based questions that you want answered. I would plan them in October-November and/or April-May to learn if listeners are aware of your marketing efforts and how your product is being perceived. Best: Tue-Wed-Thu (be done by 10 pm). Or, Saturday midday. Again, with today's online tools, there's no excuse for not doing this modern-day equivalent of hanging out in a bar where your listeners go, asking questions and listening critically. However, again, it's crucial to understand the limitations of this and to know when to ask for a second opinion from someone who does this for a living and can take a more objective view on your behalf.

4. Weekly/biweekly online/callout/listener advisory panels. I also like a monthly 400-person in-house 'mini-rating' that trends station preference and cume, as well as tracking key strategic issues defined in the perceptual study and testing current/recurrent music two to four times per month. Don't waste your time doing it, however, if you are going to ignore the results or only call for help when your results don't track with ratings. Draw your music test sample from a mix of core listeners and cumers to your station, balanced based on the needs of your strategy.

5. Marketing/management by wandering around. Monthly or at least quarterly, invite two groups of 15 people chosen at random from the callout/online testing panel to meet at 6 and 8 pm with a member of station management who isn't known publicly - or your consultant - to discuss what they're hearing and reacting to. Mail out/email/place on your website weekly current music rating sheets that encourage at least 100 request line callers/contest players/at-work database members to 'listen and rate the music.' The larger this sample, the better. Balance it to be sure that males and younger demos are represented proportionally. It's not statistically valid, of course, and there's always a response bias to beware of with groups like this. However, it's amazing how candid even the folks who love your station the most can be.
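"Balance it to be sure that males and younger demos are represented proportionally" is, in statistical terms, post-stratification weighting. Here's a minimal sketch of the arithmetic, assuming you know (or estimate) each demographic cell's share of the market; the cell names and every number below are hypothetical examples, not real panel data:

```python
# Post-stratification sketch: weight panel responses so each demographic
# cell counts in proportion to the market, not to who happened to respond.
# All cell names and proportions here are invented for illustration.

panel_counts = {"M18-34": 10, "M35-54": 15, "F18-34": 40, "F35-54": 35}
market_share = {"M18-34": 0.25, "M35-54": 0.25, "F18-34": 0.25, "F35-54": 0.25}

total = sum(panel_counts.values())

# weight = (cell's share of the market) / (cell's share of the panel)
weights = {cell: market_share[cell] / (panel_counts[cell] / total)
           for cell in panel_counts}

# A cell that is under-represented on the panel gets a weight above 1.0;
# an over-represented cell gets a weight below 1.0.
for cell, w in sorted(weights.items()):
    print(f"{cell}: weight {w:.2f}")
```

Multiply each respondent's answers by their cell's weight before tallying. The caution in the text still applies: weighting can stretch a handful of young male respondents a long way, so a badly skewed panel stays unreliable no matter how it's weighted.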

6. Diary reviews/mechanical diary analysis. Every radio station should have a postal code map in the promotions office with color-coded ARB/BBM returns by zip-plus-four (and, when available, even QR code) for both your station and meaningful competition. Collect zip code data in callout and database marketing as well. Focus your loyalty marketing efforts on the geographic areas where you gain the highest AQH contributions, and your conversion tactics on your competition's strongest locales.
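The geographic roll-up behind that map is simple once the zip data is collected: total the quarter-hours credited in each zip and rank zips by share. The sketch below uses invented records and field names purely to illustrate the aggregation; real ARB/BBM extracts are formatted differently:

```python
from collections import defaultdict

# Hypothetical diary-return records: (zip_code, quarter_hours_credited).
# These rows are made up to show the roll-up, not real diary data.
records = [
    ("53202", 12), ("53202", 9), ("53211", 7),
    ("53211", 14), ("53095", 3),
]

# Total quarter-hours credited per zip code.
aqh_by_zip = defaultdict(int)
for zip_code, quarter_hours in records:
    aqh_by_zip[zip_code] += quarter_hours

total = sum(aqh_by_zip.values())

# Rank zips by share of total AQH contribution, highest first -
# the top of this list is where loyalty marketing dollars go.
for zip_code, qh in sorted(aqh_by_zip.items(), key=lambda kv: -kv[1]):
    print(f"{zip_code}: {qh / total:.0%} of AQH contribution")
```

Run the same roll-up on your competitor's returns and the bottom of your list versus the top of theirs marks the conversion battleground.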

Note: ARB mailed the 2011 Spring Diary Review brochures to all subscribers in early June, and the first-choice dates were allocated on June 23. A&O advises you to return the form, requesting a diary review EACH book. You can always cancel the request if you decide, when the results come out, that you won't need to do a diary review. BBM returns ballots to its regional offices a few weeks after each diary survey is published, and now both ARB and BBM have terrific online and desktop software tools that enable you to see, from your own office, much of what you once could only get from a diary review.

As with everything else mentioned above, A&O will be happy to share our perspective as the next book comes out, show you how much you can still learn from a diary review that you can't get anywhere else, and refer you to reasonably-priced experts who tab the data and create strategies from it every day.

With so many low-cost/no-cost ways to stay in touch with your listeners today, there's simply no excuse for not knowing what your heaviest users think about your programming efforts - and your competition's as well.
