Jason Calacanis is having proper conniptions over the comScore marketing study on blogs released this week. Fred Wilson and Heather Green have their moments of doubt as well. Krucoff, says Jason, nails it, quizzing Rick Bruner, who helped on the study, as Rick tries to answer questions about the methodology on his blog. I leave it to you to follow the links above to the specifics.
My first reaction is that all this shows how messed up panel research is. This is the method used by Nielsen et al. to measure TV and radio and print readership — affecting billions of ad dollars — and it is and always has been relative bullshit. That's why advertisers buy it, though — because it is relative, because they can compare this magazine to that magazine on the same sheet. But it's all based on a small and only allegedly representative sample of people. It's meaningless. When I worked on magazines that allegedly had eight readers per copy — damned dogeared, they were — we benefited from this relative bullshit. But when I came online, we could measure the bull ourselves: we compared our server and cookie stats with what the panel research told us, and we could tell when they didn't have a single panel member in entire states. Panel research is a novel in numbers.
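To put rough numbers behind the small-sample complaint: here's a back-of-the-envelope sketch (my own illustration, nothing to do with comScore's actual method) of how the textbook margin of error on a reach estimate balloons as a panel shrinks.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a proportion p estimated from a simple
    random sample of size n (the textbook normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Say the panel pegs a blog's reach at 2% of the online population.
p = 0.02
for n in (1_000_000, 100_000, 10_000, 1_000):
    print(f"panel of {n:>9,}: 2.0% +/- {margin_of_error(p, n) * 100:.2f} points")
```

On a panel of a thousand, a blog "measured" at 2 percent reach could really be anywhere from roughly 1.1 to 2.9 percent. And that's the best case: the formula assumes a true random sample, which is exactly the "allegedly representative" part.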
My second reaction is, however, that blogs need some sort of numbers advertisers will buy. I thought it was a good idea to try to get that research and, as I read that link, I see that this is partly my fault: At the blogging business session I emceed at Bloggercon II, I emphasized the need to feed advertisers their metrics. So it’s a damned shame that this research is raising such eyebrows.
My third reaction is that we should be creating our own meaningful metrics. Bloggers who care about making a business of blogging (and let's remember: that's only some of us) should be agreeing on cookies and also on new means of measurement and new things to measure.
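For what agreeing on cookies might look like in practice, here's a minimal sketch (the cookie name and log format are mine, purely illustrative): every blog issues the same kind of first-party cookie, and uniques are tallied as distinct IDs in the server logs instead of being guessed at from a panel.

```python
import uuid
from http.cookies import SimpleCookie

COOKIE_NAME = "visitor_id"  # hypothetical shared convention, not any standard

def ensure_visitor_id(cookie_header):
    """Return (visitor_id, is_new): reuse the ID from the request's cookie
    if present, otherwise mint a fresh one to set on the response."""
    jar = SimpleCookie(cookie_header or "")
    if COOKIE_NAME in jar:
        return jar[COOKIE_NAME].value, False
    return uuid.uuid4().hex, True

def count_uniques(log_records):
    """Count distinct visitor IDs across parsed server-log records."""
    return len({rec["visitor_id"] for rec in log_records})

# Three hits, two people:
hits = [{"visitor_id": "a"}, {"visitor_id": "a"}, {"visitor_id": "b"}]
print(count_uniques(hits))  # 2
```

The code is trivial; the value is in the shared convention, so that one blog's "unique visitor" means the same thing as another's.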
This isn’t as simple and stupid as an a-list or a panel or page-views or eyeballs. This is a much richer thing, this unmedium of ours, and it needs much smarter measurement. See Mary Hodder’s napkin notes for just some of the means of measuring blogs’ popularity and appeal; I can think of many more.
Advertisers are screaming for proof of “engagement” these days and while silly, inky magazines are trying to “engage” with flashing ads on paper, we flash without trying. We engage or die. We live by relationships and trust — more fave ad words. We have influence — yet another fave word. We need to measure and report all that.
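To make that concrete: here's a toy sketch of the kind of metric I mean, reporting conversation and citation per post instead of raw eyeballs (the weights are invented, just to show the shape of the thing).

```python
def engagement_score(comments, inbound_links, posts):
    """Toy engagement metric: how much readers talk back (comments) and
    how much other bloggers cite you (inbound links), per post.
    The 2x/3x weights are arbitrary placeholders."""
    if posts == 0:
        return 0.0
    conversation = comments / posts      # readers talking back
    citation = inbound_links / posts     # other blogs linking in
    return 2.0 * conversation + 3.0 * citation

# A small blog with a loyal, chatty audience outscores a big, passive one:
print(engagement_score(comments=120, inbound_links=40, posts=20))   # 18.0
print(engagement_score(comments=50, inbound_links=10, posts=100))   # 1.3
```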
Instead, we’re futzing and fussing and fuming over the few numbers we have and giving advertisers another excuse to ignore us. Arrgh.
comScore should reveal much more about its study so that bloggers can poke at it and so the crowd — the wise crowd — can help improve the methodology and ferret out what makes sense and what doesn’t — and let’s remember that these numbers do show that blogs are a thing that should not be ignored. If something looks odd, explain it or explain why you can’t.
And we should start finding new ways to measure our real value — and that’s not about continuing to chase the big numbers that are so old-media and it’s not about continuing to value relative bullshit. We need to find the numbers that count.
Oh, and while we're at it, can somebody point me to exactly what the oft-quoted and bragged-about Alexa numbers are really based on? Does their data still come from the toolbar? Do you know a single soul who actually uses that toolbar? How big is their sample? How representative?
Every one of these services that now tout numbers should be transparent about methodology and sources and scale. It’s late, so I’m not going to go looking now. But if you can point me to such disclosure at Blogpulse, Technorati, Bloglines, et al, please leave a link in the comments and let’s start by finding the best of breed in transparency.