About Jean-Louis Gassée


Posts by Jean-Louis Gassée:

Google Chrome: a new OS War

Not browser, OS.  More about that in a moment.
But, first, our kind, venture capitalists, love disruption. When the established companies take too much room on the Petri dish, there is no way for a new bacterium to prosper.  When a Microsoft dominates a market, to pick a random example, launching a competitor becomes prohibitively expensive.  We love to see the economy move to virgin territories or to watch technology (or the law) weaken dominant players.
So, what’s not to like about Google’s new browser possibly weakening Microsoft’s position?  Then again, possibly, we could be trading one Microsoft for another: Google, a new black hole of a company sucking in all the business models coming into its orbit.

With this out of the way, let’s take a closer look at Chrome.
If you have the time and inclination, you might want to read Steven Levy’s story in Wired, or CNET’s shorter but insightful article, Why Google Chrome?  Fast browsing = $$$$.  I also like Niall Kennedy’s blog post: The story behind Google Chrome and, lastly, a refutation of the unavoidable conspiracy theories: When does Google Chrome talk to google.com? As I write this, a Google Chrome search returns close to 13 million results…

Back to the OS question. As early as 1994, Marc Andreessen, of Netscape fame, said The browser is the OS.  Many, yours truly included, thought the statement was both technically flawed and self-serving: Marc was one of the authors of the Navigator browser.  In 2008, Sergey Brin repeats the mantra.  Like Marc, he’s technically wrong but existentially correct in the most important of ways, the ways of business wars.  Like Marc, Sergey knows the role, the power, the weight of the (now) underlying OS.  The operating system juggles tasks, manages hardware and software resources such as memory and input/output devices.  With processors executing one instruction at any given instant, the operating system manages the illusion of many concurrent activities, downloading videos, getting email, Instant Messaging, playing music and getting pictures out of a digital camera.  For the applications programmer, the OS is the genie right under the water’s surface.  Wherever the coder sets foot, the genie is right under there, making sure the techie walks on water.

And ask Microsoft, not whether the OS matters, but what happens when OS trouble strikes, when Vista misfires.
But Marc and Sergey are right: we have entered a new era, Cloud Computing, and yet the lessons of the desktop age are not forgotten. Going back to the application programmer’s feet staying dry, Microsoft played and won the game of tying the OS and the applications.  Windows programmers make sure Microsoft Office programmers have what they need.  Sometimes, this happens at the expense of competitors who can’t always have access to the same technical information, either at all, or in a timely fashion.  At the very start of the Internet era, Microsoft sees what they need to do, again, tie the browser and the OS.  This gives Microsoft control of Internet applications because these need to comply with the dominant browser from the dominant OS and office applications supplier.  Internet Explorer, free and tied, kills Netscape Navigator.  Microsoft spends time and money in various courts around the world but appears to have won that battle.
But, in September 1998, Google starts and quickly rises to its dominant position in search and advertising. In parallel, a non-profit foundation, Mozilla, resurrects Navigator as the Open Source Firefox browser.  Most of us like Firefox: free, good and getting better with every version, available on Windows, Linux and Macs.  Not tied to Microsoft or Apple.  In our happiness, we paid little attention to Mozilla’s ties to Google, financial ties, millions of dollars, $66.8 million in 2006, to be exact.  A 26 percent increase over the previous year, with little reason to think the progression stopped in 2007.  That revenue is mostly referral money generated each time we use the Google search box in Firefox.  In other words, Google cleverly financed a Microsoft (Explorer) and Apple (Safari) competitor.  A successful one: recent browser statistics credit Firefox with 43.7% share versus Explorer versions totaling 50.6%.  Too successful, perhaps.  Assuming more than $80 million paid to Mozilla for “traffic acquisition costs”, a fraction of that easily pays for the engineers and parasites needed to write decent browser code.  That would be a make vs. buy argument.  And that would be the wrong one.
Google’s decision to ‘roll its own’ is based on the strategic requirement to provide its Cloud Computing applications with their own, controlled, under the water genie. Cynics will say Google is playing the Microsoft game of exacting monopoly profits by tying the new OS, the browser, with the new era applications.  But, there are several twists to that analogy.
First, Chrome is an Open Source browser. Chrome is based on the Open Source WebKit engine also used by Apple’s Safari.  One significant improvement brought by Chrome is the V8 JavaScript engine.  Anyone can inspect the code and use it in their own work – as long as the Open Source licensing terms are respected.  Will this cause Apple or Microsoft to Open Source their browsers?
Second, focusing on Javascript, Google makes another strategic decision, a good one in my view. Over time, browsers have become more complex as they need to deliver richer, livelier applications ranging from spreadsheets to games, from video to music or PDF documents.  Adobe now promotes a platform called AIR, working ‘above’ all desktop OS and purporting to be the engine of choice to deliver ‘Rich Internet Applications’, their words for Cloud Computing.
Not to be left behind, Microsoft comes up with their own ‘cross-platform platform’, Silverlight, for the same new era target. There’s even a third-party Silverlight version for Linux being developed, with some difficulties, by a Linux advocate no less.  Why would Novell’s VP of Engineering, Miguel de Icaza, help Microsoft?  I forgot, Microsoft just bought another $100 million of Linux ‘support vouchers’ from Novell.
Now, if you are Google, will you let Adobe or Microsoft design and constantly modify the genie under the water for your Cloud Computing applications?  Not if you want to control your destiny, not if that destiny is to ‘lead’, to stay Number One.
JavaScript it is, and we have our own V8 engine for it.
Today’s beta version looks good to some, and is panned by others. As the new fashion of perpetual betas dictates, see Gmail, we can expect a steady stream of improvements.  More interesting will be watching if and how Google plays the tying game, how it uses Chrome to give its email or photo editing programs features not available on other browsers or speed they can’t match.  And whether, and how, Google one day manages to make money with these applications, the old-fashioned way, by charging real money for their use.  We VCs would like to see that.  For us, ‘free’ is a four-letter word. — JLG

The Valley loves Obama

by Jean-Louis Gassée
Well, not everyone, we have our contingent of Republican believers who still think Obama is a socialist.
Which reminds me of the way we, the French and the Americans, are on occasion equally knee-jerk bone-headed.  In my country of birth, painful reforms are tarred as “libéral”.  There, the label means right wing free-market ultra-conservative.  Here, in my adopted country, painful reforms are called “liberal”, meaning left wing, bleeding heart, big government tax and spend socialist.  Logomachy.  Why think when you can maim an idea with a label?
We’ll see a lot more of that in the two months remaining before the November 4th vote, one many of us here think will go Obama’s way.  Why?
In the first place, we despise the Bush administration. Never in the Valley’s history have we seen an administration so anti-scientific, anti-liberties, xenophobic, intrusive, profligate, dishonest, harmful to America’s standing in the world and in many ways an obstacle to what we do, a counter-example of what we stand for.
Yes, we’re capitalists, we like to make money. But, with few unfortunate exceptions, we do it because we help entrepreneurs realize their dreams, because we’re behind Google, Cisco, Yahoo!, Apple, Juniper, BEA, Facebook and many, many others.  We don’t strip people of their home ownership with trick subprime loans, throwing the country’s financial system into a spin it hasn’t yet recovered from.  Yes, there was the Internet Bubble and, like the current crisis, it was aided and abetted by Wall Street con artists while Washington looked the other way, or took from the other hand.  To do what we do, to continue helping innovative companies start and grow, we need a stable financial system, not the biggest deficit this country has ever dug itself into.
This administration condones the re-invasion of religion into public education: some schools in the South now teach creationism, holding the Bible’s account as a factual description of the beginnings of the Universe.  Not poetry, symbolism or a meditation on the mystery of our origins, no, fact.  The same intellectual honesty presides over discussions of climate change.
Here, we live in a nice oasis: the color of your passport, of your skin, the thickness of your accent, the way you pray or roll in the hay, none of that matters.  What can you do?  How can you help?  Those are the questions we ask.  As a result, entrepreneurs love to come here from all over the world, Russian programmers, Chinese Ph.D.s, even French Polytechniciens.  I remember the July 2001 day when I became a US citizen.  There were 996 of us in the San Jose Civic Auditorium.  The federal judge who administered the swearing in told us there were 80 nations in the room.  Tiny Chinese grandmothers, Hispanics, Slavs, Swedes, Indians, Iranians…  And, with tears in my eyes, tears that come back as I write this, I thought: This is how my dear Silicon Valley will continue to be this oasis of meritocracy and entrepreneurship.  The same judge kept telling us to use our new civic rights, to register to vote.  The ceremony came to an end and, as we exited the auditorium, we saw a big table and volunteers ready to help with the registration paperwork – for the Republican Party.  The Democrats were at the beach.  That’s how I became a registered Republican –  soon to re-register as an Independent and thus able to vote either way.
Back to the Bush administration, what does it do to help Silicon Valley continue to attract entrepreneurs from all over the world? Getting work visas becomes much harder.  This in a country where 25% of high-school “students” quit before graduation, when graduating is so easy all you have to do, in some of the worst schools, is fog the proverbial mirror.  In all fairness, that very problem, the state of high school education, the resulting lack of qualified “intellectual manpower”, pardon the oxymoron, and the ensuing need to import it, that situation is not Bush’s fault.  We blame his cavalier indifference to it.  But it predated him and secondary schools are but an example of a more general case of systems so entrenched, so powerful they can’t be reformed with politics as usual.
Let’s face it, it’s our problem.  We keep electing solons who, once in Washington, run into the arms and wallets of lobbyists and sell us down the river to telecom, Big Pharma, healthcare and Wall Street interests.  The executive, Bush, McCain or Obama can’t win against Congress and lobbyists.
Obama, once elected, could display the charisma and willpower to connect with the electorate over the heads of Congress. In other words, we need a President who gets our support, channels our willpower.  Then, together, we put legislators into a vise and squeeze them into working for us instead of being on the payroll of lobbyists and their clients.
In his column, Frédéric explains how Obama used the lessons and the people from Howard Dean’s successful Internet operation.  Obama has shown the will and skill to use technology to empower voters like no one before him.  That’s how he won against the “inevitable” Hillary.  Too bad for her supporters if they stay angry at Obama for beating their champion, they should be furious at her for her entitled behavior and for not paying attention to what the “inexperienced” competitor was building.
This is dangerous, of course.  Political scientists will rightly remind us of the dangers of direct democracy. It can lead to dictatorship, to a rump parliament, to the disappearance of checks and balances.  But this is a democratic 50-50 country and I don’t see a dictatorship happening here.  Unless, of course, we look at the Stalinist labeling of human beings as “enemy combatants” in order to torture them, to deprive them of the right to habeas corpus and to a fair trial.  A French communist once lectured me on the constitution of the Soviet Union: it guaranteed civil rights, personal liberties.  Unless, of course, you were an “Enemy of the People”.  No rights for you, then.  Off to the gulag.
With this in mind, for many of us here, Obama looks safer than playing the same Washington game with barely different players. We could be naïve, we know there is the “small matter of implementation”, of the ugly reality of governing once you’ve won the contest.  Still, we hope this mestizo of John Kennedy and Martin Luther King (minus the women and the pharmacy) will restore faith in our government. — JLG

DIS: a view from the Valley

Modest and proud of it, that’s us. Our perch at a center of innovation gives us the “right” to opine about almost anything, from biotech to movies, Net politics, wireless carriers and operating systems. So, why not mull over the future of newspapers?

Let’s deal quickly with the formula: I agree with Frédéric’s prescription for the DIS. As described in last week’s Monday Note, new newspaper, laptop, smartphone, each medium, each prong of the integrated DIS has its features, its “rules of the genre”, its specific use and business model. Business model is a little abstract for me, let’s say money pump, the pockets we pick, advertisers, readers, and how.

Case closed, it’s a mere matter of implementation, right?
In the Valley, “a mere matter of implementation” is a code phrase, a tongue-in-cheek way to say we think we know the What but not the How. As in: to lose weight, all you need to do is eat less and exercise more – forever. With the DIS, I see the question morphing into Who will do it? Fresh new money for an ab ovo entrant, an existing newspaper empire such as the New York Times or Rupert’s, or an existing enterprise outside of the newspaper world, Google, Tata or the Quandt family (they control BMW), as examples, realistic or not.

Let’s pause for a detour in the past: Exxon Information Systems.
In the seventies, the Big Oil company charged the hypnotists at the Boston Consulting Group with designing a diversification strategy. Oil is running out, OPEC is out of control, Exxon needs an alternative future. Information is the oil of the 21st century, chanted the Boston marabouts. (The Robber Baron from Redmond hadn’t emerged yet, but the BCG sees into the future.) So, Exxon started collecting little or not-so-little information systems companies, ranging from Intecom to Qwix, Qwip, Vydec and Zilog. The kommentariat bought it, Fortune Magazine sagely praised the diversification, the cover of Business Week asked: Exxon’s Next Prey, IBM or Xerox?

It all ended up in a $4 billion hole. I know: I, too, bought the story and briefly ran their French subsidiary. And, less than six months into the job, I decided I needed out. Right idea, wrong culture. We forgot Culture Eats Strategy For Breakfast. This was evident at Exxon, a well-managed company with no cultural clue (and no clue about lacking a clue) about the alien ways of computer people and technology.

Back to the DIS: fear for someone with the right idea, armed with the right strategy but clueless about the people and the technology.
In the Valley, experienced, successful executives and entrepreneurs open a winery or buy a restaurant. You see, we know restaurants, we’re wine connoisseurs, we’ve been to the best ones around the world, we’ve swilled the grandest vintages. Wags call these pursuits buying oneself a phallic extender – these deluded individuals are all male, women are more sensible. These guys truly know how to be diners and wine tasters, but they know worse than nothing about the tough, thankless restaurateur trade or the bottomless vintner métier.

We need not look further than my country of birth to see other examples of Gallic phallic pride, of talented industrialists buying themselves an “organe de presse”. The malady is widespread and tells us big enterprises with big wallets probably won’t succeed in bringing a DIS to the world, try as they might.

In the Valley, we have this well-known, sunny view of entrepreneurs.
As a result, we could be tempted to think a totally fresh start will do it for the DIS. An experienced team of media and technology entrepreneurs with gobs of patient money from the likes of Kleiner Perkins, Sequoia or NEA, to name the firms ready to place big bets.

There is a small problem with the big idea: the business model doesn’t work like a venture investment, the rewards are too small for the risk.
As previous Monday Notes have pointed out, advertising revenue sharply declines when moving from paper to the Web. And there is Google whose riches come from pimping, sorry, selling advertising on, other media, not from being itself a new medium. So, we’re left with existing media groups. One gives us hope: Rupert Murdoch’s News Corp. He’s not exactly a kid fresh out of college who doesn’t know the word impossible. In an apparent paradox, his age, 77, is an advantage. He is, so to speak, not afraid to die, he’s repeatedly succeeded against the advice of the wise. Murdoch managed to take over choice properties such as the Times of London and, damn the Cassandras, improved them. Too early to say for the WSJ and no such luck for MySpace yet. The latter could be a case of cultural deafness. Still, my hope lies with a media group finding the will or the enlightened dictator to “cannibalize” its existing business rather than silently capitulating to its fate. This excludes most publicly traded groups, Wall Street hates cannibalism. As a result, the first step in the conversion to the DIS is a leveraged buyout, the group becomes private so the surgery takes place behind the curtain. –JLG

Fiction: How Steve Jobs Cuckolds AT&T

Steve shimmers into a bar, materializes next to Dan Hesse, Sprint’s CEO, crying in his mojito and whispers: I can fulfill your fondest dream. You’re the Devil, go away! No, I’m merely Steve Jobs and I want nothing to do with your soul or your chiseled body. Relax, it’s just about money.

A little bit of context before we move to the How of Steve’s bargain.

In the US, we have three main carriers (sorry, T-Mobile), AT&T, Verizon and Sprint. Verizon appears to have the better, more modern (EVDO) network.
AT&T is rapidly upgrading to what is known as 3G, a world standard, competitive but not compatible with EVDO. Sprint, the smaller one, has EVDO, almost identical to Verizon’s; it is losing ground to the two big ones. The Sprint-Nextel merger is a disaster, to the point where Sprint wants to get rid of the company it acquired for $35 billion in 2005. Sprint’s revenue is falling: -11% when compared to the same second quarter last year, this in spite of introducing a $99 Everything plan, unlimited voice, data, music, video. “Some restrictions apply”: look at the minuscule print here, at the bottom of the screen, tiny white characters on a black background. In the almost illegible but instructive gibberish, they have the nerve to add: “Other restrictions apply. See store or sprint.com for details”. But I am on the Details Page on sprint.com!
(Intrigued, I checked: Verizon does a better job of spelling out its conditions and AT&T has the best organized one of all three.)

And, for the first six months of 2008, Sprint has lost 2 million subscribers, not unrelated to the reality and the perception of Apple smartphone sales: probably more than 10 million units in 2008, a majority of them in the US.
Now we understand why the CEO is in his cups.

Steve whispers: Dan, look at the iPod Touch here. We’ve added a microphone, already available from third parties, and we grafted a Sprint radio, liberated from Jeff’s Kindle. It’s not a telephone. No, we have this exclusivity agreement with Ma Bell. In 2007, we let them say it was for five years. Now, with our 3G product, it’s been “extended” to 2010. Who knows, next year we’ll extend it to 2009.

Offer this iPod Touch with one of your All You Can Packetize plans. I’m sure the iPhone developers will put one or more Skype-like applications on it, VoIP software. You won’t mind, right? You’re not as uptight as AT&T outlawyering the use of an iPhone as a 3G laptop modem. This iPod is not a phone, it’s an Internet device, you’ll sell millions of them, your errant subscribers will return to Sprint’s fold. And you’ll keep your job. What do you say?

Awright, stop drinking that stuff and sign here. –JLG

Launchpad Chicken: MobileMe and Sync Trouble

by Jean-Louis Gassée

Simple is hard. Easy is harder. Invisible is hardest. So goes one of the many proverbs of our computer lore. As Apple found out last month with the MobileMe launch misfires, the lofty promise of “Exchange for the rest of us” translated into a user experience that was neither simple nor easy — in a highly visible way. Four weeks later, the service appears stable but doubts linger: Is Apple able to run a worldwide wireless data synchronization service for tens of millions of users?

What happened and what does it mean for MobileMe’s future?

Let’s start by decoding the “Launchpad Chicken” phrase. The game of Chicken is one in which two young males test their virility in the following way: from opposite directions, two cars speed towards each other on the same lane of a country road. The one who steers away first obviously lacks cojones and is derisively called chicken. You might ask about brains versus testes but here we are, the chicken is the one who “blinks first”. Now, let’s turn to the launchpad. Picture the NASA control room before the launch of an expedition to the Moon. Hundreds of (mostly) men in white short-sleeved shirts, pocket protectors and eyeglasses, hunched before screens, keyboards and telephones. Each one monitors a subsystem: left liquid hydrogen tank, backup gyroscopes, main engine telemetry… In the huge air-conditioned control room, five of these men are sweating, something’s not quite right with their baby. The temperature keeps rising, the pressure is falling, the telemetry link is weakening. Almost but not quite in the red zone. If the parameters keep drifting like this, they’ll have to pick up the red phone. But who wants to be the one who aborts the launch? So, they sweat some more and hope someone else blinks first. There you have it: Launchpad Chicken.

Now, move the imagery to projects with complicated subsystems. You see how the NASA metaphor made its way to Silicon Valley. There is always hope some other engineer will raise a hand and spare me the embarrassment of admitting my part of the project could crash the launch. This is what happened for MobileMe, with a twist on the cojones, so to speak. No one had enough brains and guts to risk humiliation, to raise a hand and say: Chief, we’re not ready here, let’s stop everything. As a result, MobileMe badly crashed on launch. A couple of weeks later, we have a leak: an “internal” memo from Steve Jobs. The email states the retroactively obvious, the project should have been delayed or at least launched in stages. No less obviously, a new leader is appointed, Eddy Cue, he’ll continue to run the iTunes systems as well. Charitably, the deposed MobileMe boss is granted anonymity, he might have been misinformed by his charges, or he might not have asked the right questions at the right times, it doesn’t matter anymore.

But, you’ll ask, that doesn’t tell us what went wrong, which liquid hydrogen tank sprung a leak. This now gets us into two more topics: sync and size. Sync here means keeping information identical, consistent over two or more devices. Less abstractly, for a simple example, I have a phone and a computer, I want their address books to be identical or, at least, consistent. On simple cell phones, I use a cable (or a Bluetooth wireless connection) plus software to copy (parts of) my computer address book to the phone. But, wait a minute, I entered numbers on the phone that are not on my computer; I don’t want the copy from the computer to wipe out those new numbers. Trouble starts, as if connecting the cell phone to the computer and running the program wasn’t buggy enough. You want the software to compare the two address books, the phone’s and the laptop’s, and decide what to keep and what to change, on both devices. But what about homonyms, or different numbers for the same person’s home? The program, hopefully, raises those “exceptions” and lets a human arbitrate.
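As a rough sketch of the two-way comparison just described (all names and data invented for illustration, not anyone’s actual sync code), the logic might look like this: union the two address books, keep one-sided additions, and flag “exceptions” for a human to arbitrate.

```python
# Toy two-way address book merge: keep one-sided additions,
# flag disagreements as "exceptions" for a human to resolve.
def merge_address_books(phone: dict, computer: dict):
    merged = {}
    conflicts = []
    for name in phone.keys() | computer.keys():
        on_phone, on_computer = phone.get(name), computer.get(name)
        if on_phone is None:
            merged[name] = on_computer   # new on the computer only
        elif on_computer is None:
            merged[name] = on_phone      # entered on the phone only
        elif on_phone == on_computer:
            merged[name] = on_phone      # already in sync
        else:
            conflicts.append(name)       # exception: let a human arbitrate
    return merged, conflicts

phone = {"Ann": "555-0101", "Bob": "555-0199"}
computer = {"Ann": "555-0101", "Bob": "555-0200", "Carol": "555-0123"}
merged, conflicts = merge_address_books(phone, computer)
# Carol is kept, Ann is untouched, Bob is flagged for arbitration.
```

Even this toy version shows why the problem is harder than it looks: the moment both sides can change, a blind copy in either direction loses data.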

We’re just warming up. Now picture a more real-life situation. One traveling consultant with one laptop, one smartphone, both carrying mail, address books and calendars and one assistant in the office with a desktop computer. In Microsoft Exchange’s lingo, the assistant is a “delegate”, has access, including modifications and new entries, to the traveling consultant’s data. Everything must be kept identical, consistent, in sync. How is this done?

Using the Exchange server as an example: it keeps the “true” data. And the “clients”, meaning the smartphone, the laptop, the assistant’s PC, submit changes, new mail, an updated appointment, a new contact home phone to the Exchange server. In turn, the server propagates changes to the clients. We say the updates are “pushed” to the smartphone or the laptop, just as they “push” new mail or a new calendar item to the server. You can easily imagine conflict situations: the same appointment changed by the consultant and the assistant, address updates and the like. By now, at least on Exchange, these “exceptions” are well understood and generally well-handled. But it took years of practice. Just as it has taken years for RIM (founded in 1984), the creators of the Blackberry (launched in 1999), to polish what is the best-selling synchronized smartphone. Details, details and more subtle mistakes and special cases found and fixed. The Blackberry got its stardom from truly delivering the Simple, Easy, Invisible proposition referred to in the beginning of this essay.
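The server-holds-the-truth model described above can be sketched in a few lines; this is a minimal illustration of the idea, with invented class and method names, not Exchange’s or RIM’s actual protocol:

```python
# Minimal sketch of the "push" sync model: the server keeps the
# authoritative copy; clients submit changes, and the server pushes
# each accepted change to every other client.
class SyncServer:
    def __init__(self):
        self.truth = {}    # the server's "true" data
        self.clients = []

    def register(self, client):
        self.clients.append(client)

    def submit(self, sender, key, value):
        self.truth[key] = value              # accept the change
        for client in self.clients:
            if client is not sender:
                client.receive(key, value)   # push to the other clients

class Client:
    def __init__(self, name, server):
        self.name, self.server, self.local = name, server, {}
        server.register(self)

    def change(self, key, value):
        self.local[key] = value
        self.server.submit(self, key, value)  # send the update upstream

    def receive(self, key, value):
        self.local[key] = value               # a pushed update arrives

server = SyncServer()
laptop = Client("laptop", server)
smartphone = Client("smartphone", server)
assistant = Client("assistant PC", server)

assistant.change("meeting", "Tuesday 10am")
# The laptop and smartphone now show Tuesday 10am without asking for it.
```

The hard part, of course, is everything this sketch leaves out: the same key changed on two clients before either reaches the server, devices that are offline for days, partial failures. Those are the “exceptions” that took Exchange and RIM years to get right.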

MobileMe aspires to deliver a similarly invisible level of synchronization for people who don’t have an Exchange server, hence the “Exchange for the rest of us” slogan. But seeing the launch glitches, I wonder how many people at Apple stooped to using a Blackberry with an Exchange account. Doing this would have sobered them a little in advance of the launch, or delayed the whole thing, or tempered the boasts. Shortly after MobileMe’s first missteps, Apple publicly and smartly retracted its use of “Push” to describe MobileMe’s synchronization and the “Exchange for the rest of us” motto is no longer seen on the company’s Web site.

Moving to size: quantity begets nature. At some (often mysterious) point, more of the same becomes something different. One server, ten servers, more of the same. One thousand servers or, in Google’s case, running one million servers is of a different nature. Meaning different people with different knowledge and appetites than the ones needed to run a company’s email server. If every other iPhone customer wants to sync a PC or Mac with the newly (or old, with the 2.0 software update) purchased iPhone, MobileMe will soon serve millions and, in a not too distant future, tens of millions of iPhones. Besides knowing or not knowing the Buddha of sync, did the MobileMe team have the experience, the knowledge, the appreciation of the “size” problem before them? Very few people in our industry do. Ask Google’s rivals why they were trounced by someone coming late to the game but with a better handle on the “size” or “scale” problem. (See this paper from UC Berkeley, where ultra-large scale computing is actively researched, with private industry subsidies.)
In passing, 10 million MobileMe subscriptions at $100/year is a nice piece of change, one billion dollars, worth the trouble.

Let’s step back a little. Apple “pushes” somewhere between 100 and 200 megabytes of updates per month to each Mac user. Last week, the iPhone 2.0.1 update was announced, I connected two iPhones within minutes, the 200MB files were downloaded and installed without a hitch and I haven’t heard any blogosphere complaints on the matter. iTunes has sold billions of songs, serves tens of millions of customers every day and everything works with very few exceptions. In other words, some very large scale Apple systems do work. As discussed above, the iTunes boss (some say slave driver, a meliorative term in context) is now also in charge of MobileMe.

And, last week, parts of the Gmail service were down for 15 hours or so. Last month, Amazon’s respected Web Services went down. And, last year, RIM’s servers went down for about half a day in the Western Hemisphere, freaking out Wall Street investment bankers and management consultants. Even the best players must endure their share of false notes.

Back to MobileMe today: if you ask subscribers who’ve never experienced a Blackberry’s smooth delivery of sync, they love MobileMe. It works, it’s easy to set up and in the simple (most frequent) case of a PC/Mac with an iPhone, it does the wireless (OTA, Over The Air) sync job as now advertised. We’ll see how this scales once iPhones are sold in 21 more countries, 43 total starting August 22nd.


iPhone 3G — One Week Later

Contrary to what I expected, the dust hasn’t settled yet. A week later, people still queue, two and a half hours Friday morning before being admitted to the sanctum sanctorum in San Francisco. Besides the long lines, there were glitches: activation problems, trouble with the new MobileMe service, with getting access to software updates for the “old” iPhones. Apple claims 1 million phones sold worldwide for the first weekend, probably 400,000 in the US alone. The latter number could explain the activation servers overload: in more normal times, AT&T must activate “only” 25,000 phones a day. Apple apologized for MobileMe problems and even conceded they should suspend some of the verbiage used to promote the service. Calling “Push” the way email and other information is coordinated between computers and the iPhone was found a little “anticipatory”, meaning promises made couldn’t yet be fulfilled. ["Push" means your phone or your computer will receive information without asking for it, without "Pulling". The Blackberry is still the king of "Push".]

But this is mostly folklore, fun but transitory. Something more important is taking place: the advent of the App Store. On iTunes, the App Store is a section where you find new applications for the iPhone. On the iPhone, the App Store is an icon that enables the one-click purchase and wireless download of new applications, just like a song and often costing the same, 99 cents, or less. In about the same time it took Apple to sell 1 million phones, users (this includes the updated first generation iPhones) downloaded 10 million applications. Half of these were free. For the paid-for ones, about half were games, the rest range from software for general aviation pilots, medical students, bloggers, to light sabers, yes, you read it right, translation with voicing of phrases, nice when you go to China, subway maps, newsreaders, CRM, social networking, instant messaging and music streaming. Apple chimed in with a nice, free flourish: a program transforms your iPhone into a remote control for iTunes or AppleTV, works anywhere in the house through your WiFi network. And on and on… I was going to forget the Chanel Haute Couture Show. Free. Highest Karl Lagerfeld quality. How did this get in? Let me guess, friends in a common advertising agency? Is this one of the new business models discussed below?

When the App Store opened a week ago, the catalog featured 27 pages; we’re now at 42. It’s fair to say some applications are silly, useless or unstable. The user review system in the App Store is merciless and deals harshly with stupidity, bad code or a dysfunctional UI (User Interface). Also, there is an automatic update mechanism, and applications such as Facebook have already been improved. The bad ones will die quickly.

The BFD, as in Big Fundable (or other F words) Deal here is the Great American Instant Gratification. The mental transaction cost of getting an application is very low: lots of choices, small price, one-click transaction. This is the magic of using the existing iTunes infrastructure and existing customer behavior. I can’t help but wonder when Apple (or its competitors) will also use the model for desktop applications, Cloud Computing notwithstanding. I buy iTunes music for my personal computer, why not buy applications for my Mac or my PC from the same store?

Wait, as we say in America, there is more: business models. We’re beginning to see ads on the iPhone, with photos, music or the New York Times. We VCs will be watching carefully as we wonder whether advertising on such small screens will work, will generate real money. Another form of advertising looks more promising: free music channels on the Pandora application. You first set “channels” on Pandora.com from your PC, say Mozart, Bach, Miles Davis and Dave Brubeck. On your iPhone, you click Miles Davis and you get either Miles Davis works or music deemed to belong to the same genre, with a nice note explaining why the piece was put on this channel. And…, if you like it, one click buys it from iTunes. Clever, and clever a second time because it isn’t convoluted.

Lastly, content presented as, wrapped in, applications. For 99 cents you buy and load an application called The Art of War. You’ve recognized Sun Tzu’s book. But, instead of having a separate book reader and content purchased for it, with the risk of “unwanted duplication”, content and reader are now bundled as one application for each book. When I pitch my next book to the publisher, I’ll make sure to mention the 45 million iPhones to be sold next year. This number is an admittedly wildly optimistic (and widely criticized) forecast by Gene Munster of Piper Jaffray. That is, unless RIM (Blackberry), Nokia and Google fight back, which is very likely; they don’t like Steve Jobs wiping his Birkenstocks on their backs. —JLG

The Next Googlitzer Prizes

Let me build on my boss Frédéric Filloux’s point about bloggers. And, to do this, let me start with a quick linguistics lemma about California-speak.

In France, when two engineers review a project, the first one energetically “offers” (that’s an example of California-speak), hammers his views thusly: The only way to solve the problem is… And he expresses an opinion couched in Truth terms. The other techie retorts: You’re an idiot, this is brain-dead, the only way to solve the problem is… And another opinion follows, no less forceful. They’re just bantering, nothing personal and, soon, they get into the collaboration part of the review, give and take, get to a resolution and leave the meeting happy with themselves, the other person and the to-do list.

I tried this in Cupertino, when given charge of Apple’s engineers in 1985. They smiled politely: Thank you for sharing. But I sensed a transparent steel curtain descending between us and no actual communication took place after what I thought was just a manly opening. I knew that hypocrisy is the lubricant of social intercourse, I just forgot that it applied to conversations with techies. I had to learn to speak Californian: a set of euphemisms, mannerisms designed to equivocate and, as a result, to avoid giving offence. This is great, fantastic, I like what you do… All mean nothing, just filler speech designed to move the conversation forward without taking risk. Thank you for sharing means “I hate what you just said, asshole!” This is, as you well know, the land of neologism. Add the politics of large organizations and you get “grinfu–ing”, screwing someone with a big smile. Don’t say But, say And…

Back to the opening salvo above, in California-speak, Let me build on that point is what the French engineer must say to his California colleague in order to be heard. Actually, a gentler view of the deflection is that it encourages collaboration, let me use what you just said as a foundation, rather than excite confrontation.

With this in mind, allow me to register mild disagreement with Frédéric’s view of bloggers. I won’t fall for the easy characterization: the professional journalist versus the interlopers. I don’t write a blog, for reasons I don’t fully understand, but I read lots of them. Naively, I bought several newsreader applications and found out that the free Google Reader did the job very nicely. I can subscribe to and unsubscribe from hundreds of blogs, ranging from the sublime to the sordid. (Try “Quantum Physics” and “Zoophilia” in the Reader’s search engine for blogs.) You can even “share” (that word again) items and stories with friends, or export your entire set of subscriptions and give it to a friend or family member as a way to let them see the blogosphere through your eyes.

I agree with FF, the bad news abounds. There is a lot of garbage, nonsense, paid-for people and content parading as impartial views, bloggers echoing each other to the point where ten blogs spreading the same story could trick one into thinking: This must be true, there are ten sources for that story. No, it’s one unsubstantiated rumor repeated ten times over. We’re told there are 17 million blogs and growing; this is a gigantic garbage heap even Wall-E can’t mine for the gems. [I just saw the movie and can't comprehend the quasi-universal praise.]

All true, and yet enough cream manages to ascend to the surface to make blogs and bloggers an alternative to the conventional newspaper. Experts and perverts of every stripe, yes, and when I’m burned a couple of times, the subscription dies. Speaking of subscriptions dying, I wonder how long I’ll keep longing for the thud of the newspaper landing on my doorstep in the wee hours. Between blogs and newspaper Web sites, when I open the paper in the morning, I often feel I’ve seen the news the night before. If I want a knowledgeable discussion of the Microhoo saga, there are two or three bloggers, starting with the almost eponymous Blodget, Peter Kafka (I’m not making this up) and Michael Arrington, who’ll give me better/faster food for thought than the Wall Street Journal or the Grey Lady’s Joe Nocera.

As we mention existing newspapers, for all their wrapping themselves in the mantle of professionalism, how often are they guilty of the sins of cronyism, of re-writing stories seen elsewhere, when it’s not making them up altogether? Numerous New York Times accidents come to mind: Judith Miller’s “coverage” of the Iraq War build-up, Jayson Blair’s fabrications, the scurrilous John McCain sex story and too many more.

Back to the excess(es) argument, there is no good culture without bad taste, without people “going too far”. How do we innovate without breaking things, making mistakes, giving people legitimate reasons to be upset? Yes, legitimate reasons to be upset, but ones that miss the larger point. There are plenty of good reasons to take a dim view of technology, it does facilitate the expression of our lowest instincts. At the same time, the Internet is a true revolution for freedom of expression. New genres are emerging and will continue to do so as bandwidth increases change the gamut (and location, think mobility) of available media. As the eternal optimist, I welcome the excesses of bloggers, they’re stimulating, helpful, irritating and fun. And, some day not far in the future, we’ll crown a few of them with something like a Googlitzer Prize. Who knows, a few of today’s journalists might be among them.

Technology / Multicore Processors: More is Better, Right?

Lies, damned lies and benchmarks. So goes an old industry joke setting up an ascending order of offenses to the truth. Old joke but alive and well in the latest industry trend: the recourse to multicore processors in our PCs.

Here, multicore means several processor modules (cores) on the same CPU (Central Processing Unit) chip, as opposed to multiprocessors, several separate chips inside the same computer. This means more computing power inside our computers, so this must be good.
Not so fast. Yes, more raw power but do we know how much extra performance percolates to the surface of our user experience? Not as much as we’re led to believe.

Why this sudden conversion to multicores? The simple answer: Moore’s Law stopped working the way it did for almost 40 years, predicting a doubling every 18 months in the price/performance ratio of silicon chips. As expected, in about twenty years we went from 1 MHz (the frequency at which the CPU processes instructions) for the Apple II to 3 GHz (3,000 times faster) Intel chips, for about the same price. But, in the last few years, something happened: the clock frequency of top-of-the-line chips got stuck around 3 GHz. This didn’t happen because silicon technology stopped improving; we now speak of silicon building blocks as small as 35 nanometers (a nanometer is a billionth of a meter), or even smaller in pre-production labs. A few years ago, we were happy with 120 nm or larger. So, on the surface, things look good: we still know how to cram more and more logic elements onto a chip. But we have trouble making them run faster. Why?

Here is where basic physics comes in. Let’s say I want to move a one-gram mass up and down once, and that this requires a small amount of energy, say one Joule. If I repeat this once per second, we have one Joule per second, known as one Watt. Moving to 1,000 times a second, we’re now dealing with a kilowatt. If the frequency moves to 1 GHz, one billion times per second, we need one gigawatt. Going back to chips, they move electrons back and forth as the processor clock ticks. You see where I’m going: the electric power consumed by a chip climbs with the clock frequency. At the same time, the basic silicon elements kept shrinking. More and more electric power in smaller and smaller devices. One Intel scientist only half-joked that processors could become as hot as the inside of a nuclear reactor.
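The back-of-the-envelope arithmetic above can be checked in a few lines of Python. This is just a sketch of the toy model; the one-Joule-per-operation figure is illustrative, not a real chip’s energy budget:

```python
# Toy model: power grows linearly with clock frequency when the
# energy spent per operation stays constant.
def power_watts(joules_per_op, ops_per_second):
    # Power (Watts) = energy per operation (Joules) x operations per second
    return joules_per_op * ops_per_second

print(power_watts(1, 1))        # 1 Joule once a second: 1 Watt
print(power_watts(1, 1_000))    # a thousand times a second: 1 kilowatt
print(power_watts(1, 10**9))    # 1 GHz: 1 gigawatt
```

Real processors spend far less than a Joule per operation, of course, but the linear relationship is the point: double the clock, double the power, with no room left to dissipate the heat.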
Back to our machines, we have desktop processors that dissipate as much as 150 Watts and require a liquid cooling element right on top of the chip. And we all complain our laptops are too hot for our… laps.

But now, imagine the computer industry calmly folding its arms and telling us: That’s all folks, this is as good as it gets. This after decades of more/faster/cheaper? No. That’s why our Valley is now peddling multicores. We can’t have faster processors (this is mostly left unsaid), let’s have more of them. And look at the benchmarks, more power than ever. This is where the question of performance delivered to the user versus raw power comes in.

First, 1+1 doesn’t equal 2. Simply because the two processors sometimes have to contend for a single resource such as memory. One processor must wait for the other to finish before proceeding. More cores, more such losses.

Second, and much more serious, most of today’s software has been written with a single processor in mind. There is no easy mechanism, either in the processors themselves, in the operating system, or in the program itself, to split code modules off and direct them to one processor or another. The situation is getting better as operating systems learn, at least, to dispatch ancillary, housekeeping functions to another core, leaving more computing power available to a program that only knows how to work on a single processor. And programs themselves are slowly but surely being updated to split off modules that work independently. Sometimes this requires much programmer intervention, read: time and money. In other cases, automated tools restructure some or most of the code. Still, today’s PC software is far from taking advantage of multicores. Hence the reference to benchmarks painting an unrealistic picture of multicore performance in the real application software world.
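To make “splitting modules off” concrete, here is a minimal sketch, in Python rather than the C/C++ most PC software is written in. The thumbnail task and function names are hypothetical; note also that CPython’s threads share a global interpreter lock, so real CPU-bound code would use processes or native threads:

```python
# Independent pieces of work (here, one photo each) can be handed to a
# pool of workers that the OS is free to schedule across cores.
from concurrent.futures import ThreadPoolExecutor

def render(photo_name):
    # Stand-in for real, independent per-item processing.
    return photo_name.lower()

def render_all(photo_names, workers=4):
    # The pool dispatches each item to whichever worker is free.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render, photo_names))

print(render_all(["IMG_1.JPG", "IMG_2.JPG"]))  # ['img_1.jpg', 'img_2.jpg']
```

The hard part, as the text says, is not the dispatching but carving a monolithic program into pieces like `render` that genuinely don’t depend on one another.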

And, third, there is yet another fly in the benchmark ointment. Some activities are inherently parallelizable: ten people will look on ten library shelves for a single book faster (statistically) than a single person. Four people will definitely paint four walls faster than a lone painter (assuming no contention for a single paint bucket, see above). But other activities are inherently sequential: you must wait for the result of the previous operation before proceeding with the next. Think of spreadsheets, where a complex, real-world financial model cannot be computed in independent parts: each operation feeds the next until all the formulae have been computed and, in some cases, iterated. There are many such applications, weather simulation being one, because it relies on a type of equation that cannot be made to compute in parallel. As you can imagine, there is a whole body of computer science dedicated to parallelism. Let’s just say there is no real substitute for Gigahertz, for faster chips. That’s one of the reasons why weather forecasting hasn’t made much progress recently.
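That body of computer science has a classic formula for this limit, Amdahl’s law: if a fraction p of a program can run in parallel, n cores speed it up by at most 1 / ((1 − p) + p / n), so the sequential fraction caps the gain no matter how many cores you add. A small sketch:

```python
# Amdahl's law: with a fraction p of the work parallelizable,
# n cores give at most 1 / ((1 - p) + p / n) overall speedup.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A half-sequential program barely doubles, even on 100 cores:
print(round(amdahl_speedup(0.5, 2), 2))    # 1.33
print(round(amdahl_speedup(0.5, 100), 2))  # 1.98
```

This is why benchmarks built from perfectly parallel workloads flatter multicore chips: the spreadsheet and the weather model sit near the low-p end of the curve.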

Multicores are nice, they do add some performance, but they’re only a band-aid until we find a way to make faster chips. — JLG

Nokia makes Symbian Open Source: Declaring Victory?

When a $oftware company experiences a sudden fit of generosity and donates its first-born to the world of Open Source, what are we to think? They made so much money it was embarrassing? Or it’s an act of desperation: We can’t sell it, maybe they’ll use it if we give it away. Uncharitable minds add: And then we’ll make money telling others how to decipher inscrutable code and by explaining away bugs — not to be confused with fixing them. More politely: Give away the code and sell services around it. It can work, ask IBM and Red Hat. Or look at Google: it wouldn’t exist without the Open Source movement and its star, Linux, powering its servers, one million of them and counting.

Back to Symbian, what’s the real story? Admitting defeat or, finally, having found a way to make money with the OS? Knowing Nokia, certainly not the former. It is today the number one smartphone maker, ahead of RIM (Blackberry) and Apple, and it has no intention of ceding the throne. But it’s not about making money with the Symbian OS either; that’s impossible. Let me explain.

Once upon a time, before Newton, Palm and Pocket PC, Psion, a British company, was the king of “organizers”, later called PDAs, Personal Digital Assistants. Through the twists and turns of the genre’s history, perhaps a topic for another column, Psion lost its crown and went out of the PDA business. But the OS inside the Psion was a gem (this is an ex-user speaking), it multi-tasked without crashing. More twists and turns and a joint venture was born, led by Nokia and Motorola, with followers such as Sony Ericsson and Samsung. Called Symbian, the company got the Psion OS. Symbian was to develop software for smartphones and make money licensing it to its partners.

Bad business model, bad timing, bad structure. Bad business model because handset makers don’t (or didn’t) actually care for software and don’t want to pay anything of significance for it. They (and their masters, the carriers) spend much more money on the nicely printed cardboard box than on the software inside. Bad timing because the smartphone market wasn’t really there when Symbian was born 10 years ago. The smartphone market only woke up around 2005, when Nokia, RIM and Palm totaled a few million units shipped that year.

Lastly, bad structure. No one was really in charge, the owners/competitors each wanted different features, a different user interface, application compatibility was nonexistent, unwanted even in many cases and development tools weren’t up to the power and quality PC developers enjoyed. Symbian kept losing money and Nokia, viewed as the main beneficiary of the messy joint venture, kept pouring cash in.

Today, we see that the smartphone market did more than wake up. RIM’s business grows by more than 100% a year; Apple, while number three worldwide, manages to shake up the industry and to look bigger than it is — or to project an accurate picture of its future, we’ll see; Google announces its Open Source smartphone OS, Android; Microsoft acquires Danger, the maker of an interesting smartphone, the Sidekick, and proclaims its intent to “own” 40% of the market by 2012.

All this, in my view mostly Apple and Android, pushed Nokia to try and regain control of its OS future. To do so, Nokia buys out its partners and becomes the sole owner of Symbian, now called the Symbian Foundation, which sounds very non-profit.

Good, you’ll say, they want to be in the driver’s seat (unintended obscure geek pun here…), but why go Open Source then? My guess is that it was a condition of buying the partners out. Nokia: You have access to the source code, my dear friends, you have total freedom. My other hunch is that the license won’t be the most constraining of the Open Source variants. By this I mean there is the GPL license, which obligates you to share every improvement (or bug) you make and also forces you to put into the Open Source domain any code that uses or connects to the GPL software you’re enjoying. Everything must become Open Source. Other licensing arrangements let you make contributions to the public Open Source domain but let you keep a wall between your private code and the public code. This, “true” Open Source or not, is the topic of heated arguments hopelessly mixing principle and money. Type “Open Source arguments” into Google for a sample.

I doubt Motorola, Samsung and Sony Ericsson will keep using Symbian Open Source code for long; they’re likely to go to one of several mobile Linux vendors, which is better than developing their own OS code and safer than hoping Nokia will give away improved Symbian code. Just last week, LiPS, the Linux Phone Standards group, decided to merge into LiMo, the Linux Mobile Foundation.

This looks like a smart move by Nokia: regain control of its OS future, look politically correct and throw its competitors into the jungle of platforms (more than 60 worldwide, I’m told) out there. A beautiful mess, opportunities galore, like microcomputers before Microsoft and Apple made them PCs.

Nokia: control like Apple, sound like Google. –JLG

Wait, Wait, This Is My Stuff!

Social networks and the PC becoming an arranged knowledge network

Let me start with an example; hopefully, the concept will emerge. Facebook. The latest fracas is its conflict with Google’s Friend Connect, a technology that gives any web site simple tools to acquire social networking features.

As a result, users of my organic gardening site connect and share ideas, recipes and pictures with their friends on other participating sites, such as Facebook, hi5, Orkut and many others (social networking or not). The point of Friend Connect is sharing without being forced to become a member of other sites. A side-effect is that it becomes easier to take my personal data from Facebook and move my information elsewhere.
No, no, says Facebook. After initially agreeing to the Friend Connect interchange, it blocked access.

This raises the question in the title: Is my Facebook information mine or not? The company has spent upwards of two hundred million dollars building a “free” service. The value Facebook counts on to generate advertising revenue is what it felicitously calls the social graph. As the name suggests, this is information about me and about the people I connect to: what we like, pictures we share, music recommendations, games we play, purchases we make, invitations to events.

Everything about everyone, arranged in a knowledge network. A slight exaggeration, but you see the idea. Not just tons of details about me but a web of such details. This leads to the advertiser’s wet dream: ads focused on one individual, at the right time. Gee, Joe just told his friends he’s got a new job, let’s see if he’s in the mood for a new car, a new suit, or for inviting his best friends to a celebratory dinner. For you, special prrrrice today!

Facebook is currently under investigation by Canadian authorities for its ways with user privacy, and we’ll recall last Fall’s stumble with Beacon. Users weren’t pleased to discover Facebook passed information to merchants without their knowledge and consent. The plan was creepy: even when users weren’t logged on to Facebook, some of their moves were recorded and passed on to “partners”. There is a pattern here: Facebook thinks it owns my data. This is the gold mine it wants to exploit, and it doesn’t like the idea of the data flowing somewhere else (read: Google).

They are not alone. Many suppliers in our PC/Internet life clearly think they have extensive rights over our machines and our data. I recall the incessant Orwellian demands to download Windows Genuine Advantage (a nice bit of newspeak) to enable operating system and Office updates. But I already proved last week I have a genuine copy of Windows! Never mind, do it again. In ironic ways, it gets worse with companies such as Symantec and their security products. Once installed, they are exceedingly difficult to remove. This is for your safety, you see. We conceal key bits so the virus bad guys can’t remove them. Well, no; keep insisting and Symantec will reluctantly tell you where to download a removal tool the bad guys can use as well. –JLG