Every media gathering these days includes one subject: how to deal with the increasing traffic derived from search engines. Should our sites be "optimized", merely "compliant", or "aggressively attractive" to search? Of course, Google is at the epicenter of the debate since it commands a search market share ranging from 60% to 90% (depending on the country).

Search crawlers have given our business a peculiar dimension. This fall, managers will experience a queasy feeling as they fire up the spreadsheets and begin work on their '09 budgets. Search engines bring in 40% to 60% of the traffic they must convert into revenue. How can they make sensible bets on revenue streams and forecast their expenditures when they depend so much on the opaque workings of search engines? How to deal with such a level of unpredictability?

At stake is the ranking of a site when a user performs a search. The higher the site shows up on the results page, the likelier it is to be clicked on. And, in this matter, the winner takes (almost) all since users generally don't look beyond the second page of results.

How to get to the top of the page? The first parameter is depth of content; search crawlers give a better spot to rich sites than to shallow ones. Second is content optimization; SEO (Search Engine Optimization) has become a (black) science; most of the work is done on structural aspects: internal linking, related content, archives accessibility. Third is a more controversial approach: SEM, or Search Engine Marketing, which involves the acquisition of the keywords users are most likely to search for. If I'm a lawn-mower manufacturer, I'll do my best (meaning: pay a lot) to secure keywords associated with my business; if I'm running a news website, I'll try to react swiftly enough to spot when a news item is likely to be hot, in order to capture the biggest possible chunk of search requests on a particular event or developing story.
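To make the keyword-acquisition mechanics concrete, here is a toy sketch of a second-price keyword auction, the model Google's AdWords is broadly built on (the real system also weights bids by quality scores, which this sketch ignores; the bidders and dollar amounts below are invented for illustration):

```python
# Toy sketch of a second-price keyword auction (simplified AdWords model).
# Bidders and bid amounts are invented for illustration.

bids = {
    "lawnmower-maker.com": 2.40,   # dollars offered per click on "lawn mower"
    "garden-retailer.com": 1.90,
    "news-site.com": 0.75,
}

# Rank bidders by their bid, highest first.
ranking = sorted(bids.items(), key=lambda item: item[1], reverse=True)

winner, _ = ranking[0]
runner_up_bid = ranking[1][1]

# In a second-price auction, the winner pays (just above) the runner-up's bid,
# not its own -- which is why hot keywords get so expensive so fast.
print(f"'{winner}' wins the keyword and pays about ${runner_up_bid:.2f} per click")
```

The point of the design: each bidder's best strategy is simply to bid what a click is truly worth to them, which is precisely what drives prices up on contested keywords.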

Let's review and assess the three approaches.

First, let's not forget there is some fairness in this increasing dependency on search: the better a site is in terms of content and user-friendliness, the better it will naturally perform. The search system uses a ranking algorithm, PageRank, based on the popularity of a site: roughly, the more other sites link to it, the higher it ranks.
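For the curious, here is a minimal sketch of the idea behind PageRank: a page's score is fed by the scores of the pages linking to it, computed iteratively. The three-site link graph is invented for illustration, and the real algorithm involves many refinements this sketch omits:

```python
# Minimal sketch of the PageRank iteration: a page's rank is fed by the
# ranks of the pages linking to it. The link graph is invented.

links = {
    "A": ["B", "C"],  # site A links to B and C
    "B": ["C"],
    "C": ["A"],
}

damping = 0.85  # classic damping factor from the original PageRank paper
ranks = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # iterate until the scores settle
    new_ranks = {}
    for page in links:
        # Sum the contributions of every page that links to this one,
        # each divided by how many outbound links the linker has.
        incoming = sum(
            ranks[src] / len(outgoing)
            for src, outgoing in links.items()
            if page in outgoing
        )
        new_ranks[page] = (1 - damping) / len(links) + damping * incoming
    ranks = new_ranks

print(ranks)  # the most-linked-to pages end up with the highest scores
```

This is why depth of content pays off organically: every article that earns an inbound link lifts the whole site's standing.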

Second question: should sites spend time and money to be optimized for search? Of course they should. In my view, there are at least two good reasons to do so:
a) Traffic is traffic. Media buying agencies tend to look at the bottom line: how many page views and unique visitors per month a site harvests. When a visitor lands on your site through a search engine, he or she is likely to stay for a very short time (one or two pages, no more) before bouncing elsewhere in cyberspace. And this behavior is on the rise, as shown by a recent study: in four years, the proportion of users accessing a site via its home page dropped from 40% to 25%. That's the way it works; if you don't like it, try newspapers. Then again, if the content is interesting, if an article is surrounded by cleverly arranged related links, the visitor will tend to stay on the site and perhaps bookmark it -- then, bingo.

b) Reviving the long tail. One of the nicest things about the Internet is its ability to revive its inventory. Undoubtedly, an optimized site helps extract value from old stories. That leads to deep-linking policies. A growing number of English-language websites are becoming more open to deep linking, even into their paid areas. For them, a visitor searching for a 2006 article is a potential subscriber rather than the mere purchaser of a $2.00 article. That reasoning escapes many French news sites, which remain stubbornly opposed to any open deep-linking policy.

Third big question: what about Search Engine Marketing, i.e. keyword acquisition? The answer depends upon your taste for mind-altering chemicals. SEM is a good way to improve your stats. But: a) it is very expensive, and b) the persistence of purchased traffic is low. You get a high, but a fleeting one. A major French newspaper is said to have spent €100,000 a month on keyword acquisition, a ridiculously expensive habit. Of course, from time to time, when you feel able to outsmart the market, you go for a one-shot deal. But doing so as a regular operational matter makes no sense. Plus, gravity always prevails in the end. One day, media buyers will take the time to look more closely at a client's statistics and will be able to tell the strong, sustainable, solid traffic from the elusive SEM-generated flow.

Last question -- the key one, in my view. Should editors and publishers remain stoic in the face of their increasing dependency on search engines? Of course not. We all know that a slight alteration of Google's search algorithm has pushed many companies out of business. Applied to the news sector, it means that a 45% share of traffic coming from Google can plunge overnight because of the mathematical tweak of a geek in Mountain View, California. The only way to mitigate such a risk is actually to talk to the geeks. Media execs should send the following message to Google: OK guys, we've been able to develop a great business together -- you bring us traffic, we send you back advertising revenue -- but in order to preserve this great relationship, we should better manage our dependency on your clever algorithms. In a nutshell, let's find a way to work more closely together so that our business isn't hit every time you fine-tune your search process. We won't interfere with your math wizards, but we need some guarantee that they are not going to mess with our business. Of course, it won't be that simple: SEO/SEM wizards try their best to game the Google system, which, in response, adjusts its algorithm to preserve some fairness in search results. The idea is simply to avoid collateral damage.

No doubt that, for the sake of preserving its "Don't Be Evil" motto, Google will listen. Even in its position, the search giant cannot afford a PR offensive from desperate media companies.
