In the recent history of technology, success is not often related to superior performance. Take MS-DOS, for instance: it dominated the operating systems of personal computing because it was the only one available at a key moment in the evolution of the PC, and because it was secured by an exclusive license between Bill Gates and IBM. (Had Darwin been at work, we probably would have ended up with something better, but a clever entrepreneur, son of a prominent Seattle attorney, and his own lawyers were running the show at the time.) There are other examples, like the stupid keyboard I'm using to write the Monday Note. The positions of the keys descend from a layout designed to actually slow down typing on mechanical machines: at the time, typewriters were unable to keep up with the dexterity of typists. Now we have spell-checking software correcting mistakes as we type, but we're stuck with the impractical keyboard. In that instance, no contract is responsible, simply the weight of habit, and the equally heavy burden of the backward compatibility of education (I don't see a company suddenly buying Dvorak keyboards for its people, even though any PC or Mac can handle them).

Let’s come back to this century and talk about measuring website audiences. We are witnessing the same growing dominance of a system without regard to its performance. In this case, the system is Nielsen. A truly imperfect technology, to put it nicely, but widely adopted by the advertising community.

Two systems: To measure the audience of a website, you have user-centric and site-centric systems. Nielsen falls into the first category. It relies on the old, unreliable process long in use for television (that's Nielsen's DNA, actually): a panel of people periodically queried on their viewing habits. Perfect for assessing the number of Joe Six-Packs on their couches watching a football game, but totally unfit for the Internet. It would be like a biology lab asking its mice how they feel, instead of counting cell divisions in a Petri dish. Site-centric measurement is the Petri dish and the real "quant" analysis: basically, software that follows users once they land on your site, recording what they see and for how long, all with increasingly sophisticated data reduction and display interfaces. There are many vendors, and competition is fierce. In the early days of the Internet, these systems were prone to some amount of cheating, but order prevailed quickly. Now, most are certified, and no serious player would dare tamper with its stats system.
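The log-crunching core of site-centric measurement can be sketched in a few lines. This is a toy illustration under my own assumptions, not any vendor's actual pipeline: each hit a visitor generates is logged with a cookie identifier and a URL, then the log is reduced to page views and unique visitors.

```python
from collections import Counter

# Hypothetical log entries: (visitor_cookie_id, page_url).
# Real trackers also record timestamps, referrers, time on page, etc.
hits = [
    ("cookie-a", "/home"),
    ("cookie-a", "/article-1"),
    ("cookie-b", "/home"),
    ("cookie-c", "/article-1"),
    ("cookie-a", "/home"),
]

# Page views: count every hit, per URL.
page_views = Counter(url for _, url in hits)

# Unique visitors: count distinct cookies, however many pages each saw.
unique_visitors = len({cookie for cookie, _ in hits})

print(page_views["/home"])   # 3
print(unique_visitors)       # 3
```

The point of the sketch is the contrast with a panel: nothing here is asked of the user; the counts fall out of the server's own records, which is why they can be produced in real time rather than monthly.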

Now, let's compare performance. Where site-centric measurements happen in real time, the relevant data released by Nielsen are published monthly. Yep, monthly. Like in the good old days of broadcast radio. On top of that, Nielsen's websites look like a social security database from the Soviet era: rows and columns, no comparison tools, clunky features, bugged like a Louisiana swamp. You have to perform tedious exports to Excel files (friends even sent me screen captures!) to run the analysis that a good Ajax-based website would deliver in a split second, with graphics ready to be exported into any document or presentation.

As a result, website operators are like sweatshop owners in the garment district: they keep two sets of books. The official one, in this instance the approximate Nielsen monthly data used to feed the ad market, and the unofficial, precise one. Because, to add insult to injury, the results of the two systems differ widely: when Nielsen shows 1 million unique visitors, your internal stats tools will probably yield 1.5m or 2.3m, depending on where your site tends to be visited the most (at work or at home).
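One mechanism behind that gap is worth spelling out. A panel counts people; a site-centric tool counts cookies, and the same person browsing from work and from home carries two different cookies. A toy illustration, with hypothetical numbers of my own choosing:

```python
# Hypothetical mapping of real people to the browser cookies they carry.
# Alice visits the site from both her office and her home machine.
people_to_cookies = {
    "alice": ["work-cookie-1", "home-cookie-1"],
    "bob": ["home-cookie-2"],
}

# A panel-style count sees 2 people.
panel_uniques = len(people_to_cookies)

# A cookie-based count sees 3 "unique visitors".
cookie_uniques = sum(len(cookies) for cookies in people_to_cookies.values())

print(cookie_uniques / panel_uniques)  # 1.5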

How did such a lousy system become the standard in the advertising business? Two possible answers. First, the site-centric players were a bit slow to organize themselves, and when they did, it was too late. Second: it's a question of culture. The advertising sector is (still) dominated by the TV/radio mentality, a Nielsen fiefdom. Therefore, when Nielsen came saying "let us bring order to Web stats, it'll be a piece of cake", nobody questioned that statement. Now, because Nielsen is a powerhouse in the media-buying milieu, we are stuck with that company. (Other reasons combine the usual laziness and conservatism.)

How will it evolve? Again, two (tentative) answers. The first is generational. I would bet that the upcoming generation of Web publishers will be more committed to transparency in terms of audience and basic data. (That's not the case for print media executives: they were unbeatable at constantly misleading the ad market with bogus audience figures, and now, facing the Internet era, they are paying the hard price.) The second factor is called Google. A year ago, it introduced Google Analytics, a great site-centric stats system, available for free at a click near you, that an increasing number of sites are using in tandem with their usual tool set. It did the same with MeasureMap, a tool dedicated to blogs. Now, Google is developing new services (also free) designed to help advertisers plan their campaigns. (When such a plan was announced last week, the stock of ComScore, a publicly traded firm that tracks Internet usage, fell by 23%.) Is this new foray by Google a problem? Yes and no. Yes, because the domination is somewhat worrisome, but by now it is a given fact. And no, because if we consider (as I do) that the Web has to be transparent about its basic audience data, then relying on Google is not such a bad thing, since it already knows everything about our sites. And that will leave plenty of room for highly specialized firms delivering customized, value-added audience analysis worth the price. –FF
