Every media gathering these days includes one subject: how to deal with the growing traffic derived from search engines. Should our sites be “optimized,” merely “compliant,” or “aggressively attractive” to search? Google is, of course, at the epicenter of the debate since it commands a search market share ranging from 60% to 90%, depending on the country.
Search crawlers have given our business a peculiar dimension. This Fall, managers will experience a queasy feeling as they fire up their spreadsheets and begin work on their ‘09 budgets. Search engines bring 40% to 60% of the traffic they must convert into revenue. How can they make sensible bets on revenue streams and forecast their expenditures while depending so much on the opaque workings of search engines? How should they deal with such unpredictability?
At stake is the ranking of a site when a user performs a search. The higher a site shows up on the results page, the likelier it is to be clicked on. And, in this matter, the winner takes (almost) all, since users generally don’t look beyond the second page of results.
How to get to the top of the page? The first parameter is depth of content; search crawlers give a better spot to rich sites than to shallow ones. Second is content optimization; SEO (Search Engine Optimization) has become a (black) science, with most of the work done on structural aspects: internal linking, recurring content, archive accessibility. Third is a more controversial approach: SEM, for Search Engine Marketing, which involves the acquisition of keywords likely to be searched by users. If I’m a lawn-mower manufacturer, I’ll do my best (meaning: pay a lot) to secure the keywords associated with my business; if I’m running a news website, I’ll try to react swiftly enough to spot when a news item is likely to be hot, in order to capture the biggest possible chunk of search requests on a particular event or developing story.
Let’s review and assess the three approaches.
First, let’s not forget there is some fairness in this growing dependency on search: the better a site is in terms of content and user-friendliness, the better it will naturally perform. Google’s ranking algorithm, PageRank, is based on the popularity of a site, as measured by the links pointing to it.
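To see how such popularity-based ranking rewards well-linked sites, here is a minimal sketch of the PageRank idea. The toy link graph and the 0.85 damping factor below are illustrative assumptions of mine, not Google’s actual parameters or data:

```python
# Toy sketch of the PageRank idea: a page's score depends on the
# scores of the pages linking to it. The graph and damping factor
# are illustrative assumptions, not Google's real parameters.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform score
    for _ in range(iterations):
        # every page gets a small baseline, plus shares from inbound links
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

# A site that many others link to accumulates rank; a site with
# few inbound links does not.
graph = {
    "rich_site": ["shallow_site"],
    "shallow_site": ["rich_site"],
    "blog_a": ["rich_site"],
    "blog_b": ["rich_site"],
}
scores = pagerank(graph)
```

Run on this toy graph, `rich_site` ends up with the highest score simply because three other pages point at it, which is the “fairness” argument in a nutshell.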
Second question: should sites spend time and money to be optimized for search? Of course they should. In my view, there are at least two good reasons to do so:
a) Traffic is traffic. Media buying agencies tend to look at the bottom line: how many page views and unique visitors a month a site is harvesting. When visitors land on your site through a search engine, they are likely to stay for a very short time (one or two pages, no more) before bouncing elsewhere in cyberspace. This behavior is actually on the rise, as shown by a recent study: in four years, the proportion of users accessing a site via its home page dropped from 40% to 25%. That’s the way it works; if you don’t like it, try newspapers. Once again, if the content is interesting, if an article is surrounded by cleverly arranged related links, then visitors will tend to stay on the site and perhaps bookmark it — then, bingo.
b) Reviving the long tail. One of the nicest things about the Internet is its ability to revive its inventory. Undoubtedly, an optimized site helps give value to old stories. Which leads to deep-linking policies. An increasing number of English-language websites are becoming more open to deep linking, even in their paid areas. For them, a visitor searching for a 2006 article is a potential subscriber rather than the mere purchaser of a $2.00 article. The reasoning escapes many French news sites, which remain stubbornly opposed to any open deep-linking policy.
Third big question: what about Search Engine Marketing, i.e. keyword acquisition? The answer depends upon your taste for mind-altering chemicals. SEM is a good way to improve your stats. But: a) it is very expensive, and b) the persistence of purchased traffic is low. You get a high, but a fleeting one. A major French newspaper is said to have spent €100,000 a month on keyword acquisition, a ridiculously expensive habit. Of course, from time to time, when you feel able to outsmart the market, you go for a one-shot deal. But doing so as a regular operational matter makes no sense. Plus, gravity always prevails in the end. One day, media buyers will take the time to have a closer look at their clients’ statistics and will be able to tell strong, sustainable, solid traffic from the elusive SEM-generated flow.
Last question, which is key in my view: should editors and publishers remain stoic in the face of their increasing dependency on search engines? Of course not. We all know that slight alterations of Google’s search algorithm have pushed many companies out of business. Applied to the news sector, it means that the 45% of traffic coming from Google can plunge overnight because of the mathematical tweak of a geek in Mountain View, California. The only way to mitigate such a risk is actually to talk to the geeks. Media execs should send the following message to Google: OK guys, we’ve been able to develop a great business together (you bring us traffic, we send you back advertising revenue), but in order to preserve this great relationship, we need to better manage our dependency on your clever algorithms. In a nutshell, let’s find a way to work more closely so that Google doesn’t hurt our business when you are fine-tuning your search process. We won’t interfere with your math wizards, but we need some guarantee that they are not going to mess up our business. Of course, it won’t be that simple: SEO/SEM wizards try their best to game the Google system, which, in response, adjusts its algorithm to preserve some fairness in search results. The idea is simply to avoid collateral damage.
No doubt that, for the sake of preserving its “Don’t Be Evil” motto, Google will listen. Even in its position, the search giant cannot afford a PR offensive from desperate media companies.