It’s high time to fight the theft of news content, really. A couple of weeks ago, Attributor, a US company, released the conclusions of a five-month study covering the use of unauthorized content on the internet. The project, called the Graduated Response Trial for News, relied on one strong core idea: once a significant breach is established, instead of an all-out legal offensive, a “friendly email”, in Attributor’s parlance, kindly asks the perpetrator to remove the illegal content. Absent a response within 14 days, a second email arrives. As a second step, Attributor warns it will contact search engines and advertising networks: the first will be asked to suppress links and indexation for the offending pages; the second will be requested to remove ads, thus killing the monetization of the illegal content. After another 14 days, the misbehaving site receives a “cease and desist” notice and faces full-blown legal action (see details on the Fair Syndication Consortium blog). Attributor and the FSC pride themselves on a 75% compliance rate from negligent websites taking action after step 2. In other words, once kindly warned, looters change their minds and behave nicely. Cool.

To put numbers on this, the Graduated Response Trial for News spotted 400,000 unlicensed cloned items on 45,000 sites. That is roughly 9 illegal uses per site. As reported in a February 2010 Monday Note (see Cashing in on stolen contents), a previous analysis conducted by Attributor pointed to 112,000 unlicensed copies of US newspaper articles found on 75,000 sites, a rate of about 1.5 stolen articles per site. Granted, we can’t jump to the conclusion of a 6x increase; the two studies were not designed to be comparable, Attributor’s tracking power is growing fast, the perimeter was different, etc. Still. When, last Friday, I asked Attributor’s CEO Jim Pitkow how he felt about those numbers, he acknowledged that the use of stolen content on the internet is indeed on the rise.
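The per-site arithmetic behind the two studies can be checked in a few lines (the raw figures are the ones quoted above; the rounding is mine):

```python
# Figures quoted from the two Attributor studies cited above.
trial_items, trial_sites = 400_000, 45_000   # Graduated Response Trial for News
prior_items, prior_sites = 112_000, 75_000   # earlier US-newspaper study

trial_rate = trial_items / trial_sites       # unlicensed items per site, new study
prior_rate = prior_items / prior_sites       # unlicensed items per site, old study

print(round(trial_rate, 1))                  # 8.9 -> "roughly 9 per site"
print(round(prior_rate, 1))                  # 1.5
print(round(trial_rate / prior_rate))        # 6  -> a ~6x jump between studies
```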

No doubt: the technology and the deals Attributor has organized with content providers and search engines are steps in the right direction. But let’s face it: so far, this is a drop in the ocean.
First, the nice “Graduated Response” tested by the San Mateo company and its partners needs time to produce its effects. Two 14-day notices before rolling out the legal howitzer don’t make much sense given the duration of the news cycle: the value of a news item decays by 80% in about 48 hours. Warning shots spaced 14 days apart are hardly a deterrent for those who make a business of stealing content.
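To see how little value survives those waiting periods, take the 80%-in-48-hours figure quoted above and treat it as a steady exponential decay (that model is my assumption, not Attributor’s):

```python
import math

# Assumed exponential decay: an item retains 20% of its value every 48 hours.
RETAIN = 0.20
PERIOD_H = 48.0

def remaining_value(hours: float) -> float:
    """Fraction of the item's initial value left after `hours`."""
    return RETAIN ** (hours / PERIOD_H)

# Half-life implied by the 80%/48h figure.
half_life = PERIOD_H * math.log(2) / math.log(1 / RETAIN)

print(f"half-life: {half_life:.1f} hours")             # 20.7 hours
print(f"after 14 days: {remaining_value(14*24):.6f}")  # 0.000013 of initial value
print(f"after 28 days: {remaining_value(28*24):.2e}")  # effectively zero
```

Under that assumption, by the time the first 14-day notice expires, the story is worth about a hundred-thousandth of its launch value; the second notice protects nothing at all.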
Second, the tactics described above rely too much on manual operations: assessing the scope of the infringement, determining the response, notifying, monitoring, re-notifying, etc. That runs counter, to say the least, to the nature of an internet of 23 billion pages.

You get my point. The problem requires a much more decisive and scalable response involving all the players: content providers, aggregators, search engines, advertising networks and sales houses. Here is a possible outline:

1/ Attributor needs to be acquired. The company is simply too small for the scope of the work. A few days of Google’s revenue ($68m per 24 hours), or less than a month of Bing’s, would do the job. Even smarter: a group of American newspapers and book publishers gathered in an ad hoc consortium could be a perfect fit.

2/ Let’s say Google or Bing buys Attributor’s core engineering know-how. It then becomes feasible to adapt and expand its crawling algorithm so it runs against the entire world wide web, in real time. Two hours after a piece of news is “borrowed” from a publisher, it is flagged and the site receives a pointed notification. This could be an email, or an automatically generated comment below the article, re-posted every few hours. Or, even better, a well-placed sponsored link like the fictitious one below:

Inevitably, ads dry up. First, ad networks affiliated with the system stop serving display ads. Second, since the search engine has severed the hyperlinks, ads on the orphaned pages become worthless. Every step is automated.
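At web scale, the flagging step amounts to automated near-duplicate detection. One classic way to do it is shingling: break each page into overlapping word n-grams and compare the two sets with a Jaccard similarity. The sketch below is illustrative only; the n-gram size and the 0.8 threshold are my assumptions, not Attributor’s actual algorithm:

```python
def shingles(text: str, n: int = 4) -> set:
    """Set of overlapping n-word shingles, case-folded."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str, n: int = 4) -> float:
    """Jaccard similarity of the two texts' shingle sets, in [0, 1]."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "the attributor study found hundreds of thousands of unlicensed copies online"
copy = "the attributor study found hundreds of thousands of unlicensed copies online today"
unrelated = "astronomy bloggers publish original charts of the northern night sky"

# A near-verbatim copy scores close to 1, unrelated text close to 0;
# a crawler would flag any pair above some threshold, say 0.8.
print(jaccard(original, copy))       # ~0.89 -> flagged
print(jaccard(original, unrelated))  # 0.0   -> ignored
```

In production one would hash the shingles (e.g. with MinHash) rather than compare raw sets, but the principle is the same: no human reads anything until a page has already crossed the similarity threshold.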

Think of the benefits for the search engine: instead of appearing as the internet’s biggest looter, it becomes the copyright white knight.

3/ The business model. Publishers pay a fee to have their content tracked. For the biggest ones, the cost/benefit is obvious, as the fee is adjusted to the advertising revenue the looters were generating from the abused property. Even better, Google can position itself, once again, as teaming up with the little guy: now the tiniest hardworking blogger, specialized in astronomy or sociology, can fight unauthorized use of their work, for a small monthly fee. Alternatively, if Bing chooses to become the white knight, such a move could become a powerful differentiator, in both reality and perception, against Google.

Of course, merely beefing up algorithm-based detection of unlicensed content won’t be enough to solve the ruinous information-piracy problem. Many sites do business by rewriting, processing existing stories into compact, stand-alone texts that are entirely self-sufficient, generating loads of page views at the expense of the original publishers (see a recent Monday Note, Aggregators: the Good Ones and the Looters). This battle will be a long one. But it has to be fought. Serious, legitimate money is at stake.

frederic.filloux@mondaynote.com
