The Long Tail is a beautiful intellectual construct. Beautiful, therefore right. Who wouldn’t want to see it succeed? Chris Anderson coined the term back in 2004, in a Wired magazine article. A skillfully marketed book followed, which turned out to be a bestseller (i.e. the Tail’s profitable head). When the concept began to gain currency, we all experienced an epiphany: visions of soon-to-be-revealed bonanzas lying in our stashes of books, music, or, for us journalists, news material buried deep in the bowels of our websites.



Five years later, doubt is setting in. The fact is, very few businesses have been able to extract money from the Long Tail. Of course, as Anderson predicted, when entire inventories become accessible online, some of the lowest-selling items in catalogs do get their day in the sun. But when it comes to converting exposure into cash, the result is a pitiful rounding error. Last week in Oslo, friends and I were discussing the Long Tail theory’s impact on the news business. It turned out everyone around the table shared the same suspicion. One such doubter directed me to a recently released research paper by two Wharton scholars. To challenge Anderson’s theory, Professor Serguei Netessine and his student Tom F. Tan pored over Netflix data.

For Monday Note readers outside of the US, Netflix is a (some say The) DVD rental company deploying a huge physical delivery system (2 million DVDs shipped each day, $300m a year in postage fees). For Anderson, Netflix is the Long Tail’s poster child: a vast inventory made easily accessible thanks to the internet, with users smartly rating forgotten gems. Three years ago, Netflix launched the Netflix Prize, a crowd-powered contest aimed at improving its recommendation algorithm by 10% (quite a leap, actually). $1 million would go to the winner. To feed the math freaks, Netflix opened its data vault, a boon to the Wharton scholars, who hungrily dug into the 2000-2005 numbers. Their study is called "Is Tom Cruise Threatened? Using Netflix Prize Data to Examine the Long Tail of Electronic Commerce" (full text here, presentation here). The key finding:

"The Wharton researchers disagree with Anderson's theory and its implicit challenge to the Pareto principle, or so-called 80-20 rule, which in this case would state that 20% of the movie titles generate 80% of sales. Anderson argues that as demand shifts down the tail, the effect would diminish. Using Netflix data, Netessine and Tan show the opposite -- an even stronger effect, with demand for the top 20% of movies increasing from 86% in 2000 to 90% in 2005".

This isn’t the first challenge to Anderson’s glowing theory. Three years ago, two Harvard professors, Anita Elberse and Felix Oberholzer-Gee, published a study (see: Will the "Long Tail" Work for Hollywood?) showing an actual decline of the lower-selling videos between 2000 and 2005. At the same time, they said, the market became more concentrated as the number of titles making up the top 10% of sales dropped by half. Many similar papers show the same concentration in local internet searches or website visits (36% of users' time is spent on the top 20 sites, and that percentage is increasing, according to ComScore).

Of course, Anderson disputes Wharton's findings (see his blog, The Long Tail). But his defense of his speaking-fee-producing theory fails to convince. Basically, Wired's editor objects to the use of percentages rather than raw volumes; in the case of unlimited inventories, he says, percentages can be misleading. What he means is that a very small percentage, say 0.01%, of a very large number, say 100 million, is still substantial: 10,000 units. Perhaps. But that misses the point about money: units no one knows or cares about generate little revenue.
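Anderson's defense boils down to simple arithmetic, sketched here with his illustrative figures (the 100 million total is his hypothetical scale, not real sales data):

```python
# Anderson's point: a tiny share of a huge inventory is still many units.
total_units = 100_000_000           # hypothetical catalog-wide unit sales
tail_units = total_units // 10_000  # 0.01% expressed as 1/10,000
print(tail_units)  # substantial in volume, a rounding error in share
```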

The point is that the Long Tail doesn't work as far as revenue generation is concerned, for two main reasons:

#1: The Novelty factor. This key question is addressed on page 12 of the Wharton study. In short: the number of new items is very high (1,000 new movies released every year, plus 3,000 older ones newly rated by users in the Netflix system). As a result, users don't have the time or attention to discover them.

Interestingly enough, this is exactly what takes place on the biggest news websites. If we do a search on those sites and narrow it down to the last 24 hours, we get an enormous number of new URLs:

- 45,800 for the New York Times (out of 12m total URLs)
- 37,400 for The Guardian (out of 20m)
- 21,900 for Le Monde (out of 1.05m)
- 69,300 for Aftonbladet.se -- the #1 site in Sweden -- (out of 1.2m)


Of course, this includes all types of URLs, from new articles to the smallest items added (comments, blog entries, graphs, automatically generated stuff...). But it has been estimated that just one week of the New York Times is equivalent to the amount of information an 18th-century human being would absorb during his or her entire life.

The new-releases syndrome also applies to books. In a country like France, where writers are prolific, no less than 659 books were released this Fall alone (worldwide, 3,000 books are published every day). Beyond managing physical inventories, this raises the question of audiences’ ability simply to internalize such volumes of information and to derive some awareness of their preferences. (For a very contemporary example, ask iPhone users and developers how they feel about the sudden explosion of applications, more than 75,000 as of this writing. How do you navigate such a boiling ocean? With a recommendation engine.)

Hence a very distinct possibility: the next killer apps could be information-magma navigation tools: filters of all sorts, recommendation engines, social or human-powered algorithms. Indeed, these tools might lift some long tail items, but the head of novelty is growing so fast that unearthing long tail items won't make much of a difference.

#2: The Dry Hole factor. Granted, both as a tool for access to catalogs and as a medium, the internet flattened the music industry’s landscape. In the '80s, an artist had to sell 15 million albums to be N°1 in the US market. In 2005, that number fell to 4.5m, and in 2008, the N°1 on the US charts sold only 2.7m albums. Does that mean unknown artists emerged in the meantime, thanks to newly accessible catalogs? Perhaps. But over the last ten years, the music industry lost 40% to 50% of its revenue. As for legal downloads -- in theory the Long Tail diver's tool of choice -- those don't account for more than 5% of an album’s P&L. (Illegal downloads are 10 times higher in volume than their legal counterparts.)

This also applies to the economics of the news business. Just consider the delta between the CPM for premium space (home page, heads of news sections) and for the bottom of the inventory: we are talking about a 10-to-1 ratio here. For news, the head pays much more than the tail. Having said that, it doesn't mean that we should not develop better recommendation engines: as I explained in an earlier article (see Recommendation Engines: A Must for News Sites), I think recommendation engines hold great potential to increase the number of pageviews for the recent part of the inventory.
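That CPM gap translates directly into revenue. A quick sketch: the $20 and $2 CPM figures below are assumptions for illustration; only the 10-to-1 ratio comes from the text above.

```python
def ad_revenue(pageviews, cpm):
    """Display-ad revenue in dollars; CPM is the price per 1,000 impressions."""
    return pageviews / 1000 * cpm

PREMIUM_CPM = 20.0  # assumed home-page CPM, in dollars
TAIL_CPM = 2.0      # assumed deep-archive CPM: the 10-to-1 gap

# Even five times more tail pageviews earn only half the premium revenue.
print(ad_revenue(1_000_000, PREMIUM_CPM))  # 20000.0
print(ad_revenue(5_000_000, TAIL_CPM))     # 10000.0
```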

Let's add three other considerations. First, physical delivery of Long Tail items costs much more than it does for the top of the catalog (think of the parcel versus the pallet). Second, for books or news, only the heaviest users are likely to drill into the depths of a catalog, and they represent a very small percentage of the user base. Third, the internet has done miracles, but it has had no effect on production costs. Whether for a new drug or a strong original news piece, pharmaceutical companies or publishers will continue to be more inclined to promote the recent, unamortized goods rather than the older, low-margin ones.

To sum up: go ahead, pour time and resources into boosting the Long Tail -- as long as you enjoy a big and profitable head.

frederic.filloux@mondaynote.com
