No predictions, no forecasts; that's above my pay grade. Just sifting through this coming year's most interesting trends. The Chinese curse, May you live in interesting times, being upon us, we might as well try to make the best of this New Year.
Simple is hard. Easy is harder. Invisible is hardest. So goes one of the many proverbs of our computer lore. As Apple found out last month with the MobileMe launch misfires, the lofty promise of “Exchange for the rest of us” translated into a user experience that was neither simple nor easy — in a highly visible way. Four weeks later, the service appears stable but doubts linger: Is Apple able to run a worldwide wireless data synchronization service for tens of millions of users?
Contrary to what I expected, the dust hasn’t settled yet. A week later, people still queue, two and a half hours on Friday morning, before being admitted to the sanctum sanctorum in San Francisco. Besides the long lines, there were glitches: activation problems, trouble with the new MobileMe service, trouble getting access to software updates for the “old” iPhones. Apple claims 1 million phones sold worldwide for the first weekend, probably 400,000 in the US alone. The latter number could explain the overload of the activation servers: in more normal times, AT&T must activate “only” 25,000 phones a day. Apple apologized for the MobileMe problems and even conceded it should suspend some of the verbiage used to promote the service. Calling the way email and other information is coordinated between computers and the iPhone “Push” was found a little “anticipatory”, meaning promises made couldn’t yet be fulfilled. [“Push” means your phone or your computer receives information without asking for it, without “Pulling”. The Blackberry is still the king of “Push”.]
But this is mostly folklore, fun but transitory. Something more important is taking place: the advent of the App Store. On iTunes, the App Store is a section where you find new applications for the iPhone. On the iPhone, the App Store is an icon that enables the one-click purchase and wireless download of new applications, just like a song and often costing the same, 99 cents, or less. In about the same time it took Apple to sell 1 million phones, users (this includes updated first-generation iPhones) downloaded 10 million applications. Half of these were free. Of the paid-for ones, about half were games; the rest range from software for general aviation pilots, medical students and bloggers, to light sabers (yes, you read that right), translation with voicing of phrases, nice when you go to China, subway maps, newsreaders, CRM, social networking, instant messaging and music streaming. Apple chimed in with a nice, free flourish: a program that transforms your iPhone into a remote control for iTunes or AppleTV and works anywhere in the house through your WiFi network. And on and on… I was going to forget the Chanel Haute Couture Show. Free. Highest Karl Lagerfeld quality. How did this get in? Let me guess: friends at a common advertising agency? Is this one of the new business models discussed below?
When the App Store opened a week ago, the catalog featured 27 pages, we’re now at 42. It’s fair to say some applications are silly, useless or unstable. The user review system in the App Store is merciless and deals harshly with stupidity, bad code or dysfunctional UI (User Interface). Also, there is an automatic update mechanism and applications such as Facebook have already been improved. The bad ones will die quickly.
The BFD, as in Big Fundable (or other F words) Deal here is the Great American Instant Gratification. The mental transaction cost of getting an application is very low: lots of choices, small price, one-click transaction. This is the magic of using the existing iTunes infrastructure and existing customer behavior. I can’t help but wonder when Apple (or its competitors) will also use the model for desktop applications, Cloud Computing notwithstanding. I buy iTunes music for my personal computer, why not buy applications for my Mac or my PC from the same store?
Wait, as we say in America, there is more: business models. We’re beginning to see ads on the iPhone, with photos, music or the New York Times. We VCs will be watching carefully as we wonder if advertising on such small screens will work, will generate real money. Another form of advertising looks more promising: free music channels on the Pandora application. You first set “channels” on Pandora.com from your PC, say Mozart, Bach, Miles Davis and Dave Brubeck. On your iPhone, you click Miles Davis and you either get Miles Davis works or music deemed to belong to the same genre, with a nice note explaining why the piece was put on this channel. And…, if you like it, one click buys it from iTunes. Clever, and clever a second time because it isn’t convoluted.
Lastly, content presented as, wrapped in, applications. For 99 cents you buy and load an application called The Art of War. You’ve recognized Sun Tzu’s book. But, instead of having a separate book reader and content purchased for it, with the risk of “unwanted duplication”, content and reader are now bundled as one application for each book. When I pitch my next book to the publisher, I’ll make sure to mention the 45 million iPhones to be sold next year. This number is an admittedly wildly optimistic (and widely criticized) forecast by Gene Munster of Piper Jaffray. Unless RIM (Blackberry), Nokia and Google fight back, which is very likely, they don’t like Steve Jobs wiping his Birkenstocks on their back. —JLG
Lies, damned lies and benchmarks. So goes an old industry joke setting up an ascending order of offenses to the truth. Old joke but alive and well in the latest industry trend: the recourse to multicore processors in our PCs.
Here, multicore means several processor modules (cores) on the same CPU (Central Processing Unit) chip, as opposed to multiprocessors, several separate chips inside the same computer. This means more computing power inside our computers; this must be good.
Not so fast. Yes, more raw power but do we know how much extra performance percolates to the surface of our user experience? Not as much as we’re led to believe.
Why this sudden conversion to multicores? The simple answer is Moore’s Law stopped working the way it had for almost 40 years. It used to predict a doubling every 18 months in the price/performance ratio of silicon chips. As expected, in about twenty years, we went from 1 MHz (the frequency at which the CPU processes instructions) for the Apple II, to 3 GHz (3,000 times faster) Intel chips — for about the same price. But, in the last few years, something happened: the clock frequency of top-of-the-line chips got stuck around 3 GHz. This didn’t happen because silicon technology stopped improving; we now speak of silicon building blocks as small as 35 nanometers (billionths of a meter), or even smaller in pre-production labs. A few years ago, we were happy with 120 nm or larger. So, the surface of things looks good: we still know how to cram more and more logic elements on a chip. But we have trouble making them run faster. Why?
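As a sanity check on the numbers above, the 18-month doubling rule can be run backward; here are a few lines of Python (purely illustrative arithmetic, not from the original column):

```python
import math

# From 1 MHz (Apple II era) to 3 GHz: a 3,000x clock frequency increase.
speedup = 3_000_000_000 / 1_000_000

# How many doublings does that take, and how long at 18 months per doubling?
doublings = math.log2(speedup)
years = doublings * 1.5

print(round(doublings, 1))  # ≈ 11.6 doublings
print(round(years, 1))      # ≈ 17.3 years, close to the "about twenty years" in the text
```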
Here, easy basic physics comes in. Let’s say I want to move a one-gram mass up and down once; this requires a small amount of energy, say one Joule. If I repeat this once per second, we have one Joule per second, known as one Watt. Moving to 1,000 times a second, we’re now dealing with a Kilowatt. If the frequency climbs to 1 GHz, one billion times per second, we need one Gigawatt. Going back to chips, they move electrons back and forth as the processor clock ticks. You see where I’m going: the electric power consumed by a chip climbs with the clock frequency. At the same time, the basic silicon elements kept shrinking. More and more electric power in smaller and smaller devices. One Intel scientist only half-joked that processors could become as hot as the inside of a nuclear reactor.
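The one-Joule-per-operation figure above is the column’s illustrative number, not a real chip’s energy budget; still, the Watt-to-Gigawatt progression is simple multiplication, sketched here:

```python
# Power (Watts) = energy per operation (Joules) x operations per second.
def power_watts(energy_per_op_joules, ops_per_second):
    return energy_per_op_joules * ops_per_second

assert power_watts(1.0, 1) == 1.0                # once a second: one Watt
assert power_watts(1.0, 1_000) == 1_000.0        # a Kilowatt
assert power_watts(1.0, 1_000_000_000) == 1e9    # a Gigawatt at 1 GHz
```

(Real chips spend tiny fractions of a Joule per operation, which is why they draw Watts rather than Gigawatts; the linear climb with clock frequency is the point.)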
Back to our machines, we have desktop processors that dissipate as much as 150 Watts and require a liquid cooling element right on top of the chip. And we all complain our laptops are too hot for our… laps.
But now, imagine the computer industry calmly folding its arms and telling us: That’s all folks, this is as good as it gets. This after decades of more/faster/cheaper? No. That’s why our Valley is now peddling multicores. We can’t have faster processors (this is mostly left unsaid), let’s have more of them. And look at the benchmarks, more power than ever. This is where the question of performance delivered to the user versus raw power comes in.
First, 1+1 doesn’t equal 2. Simply because the two processors sometimes have to contend for a single resource such as memory. One processor must wait for the other to finish before proceeding. More cores, more such losses.
Second, and much more serious, most of today’s software has been written with a single processor in mind. There is no easy mechanism, either in the processors themselves, or the operating system, or the program itself, to split code modules off and direct them to one processor or another. The situation is getting better as operating systems learn, at least, to dispatch ancillary housekeeping functions to another module, leaving more computing power available to a program that only knows how to work on a single processor. And programs themselves are slowly but surely being updated to split off modules that work independently. Sometimes this requires much programmer intervention, read: time and money. In other cases, automated tools restructure some or most of the code. Still, today’s PC software is far from taking advantage of multicores. Hence benchmarks painting an unrealistic picture of multicore performance in the real application software world.
And, third, there is yet another fly in the benchmark ointment. Some activities are inherently parallelizable: ten people will look on ten library shelves for a single book faster (statistically) than a single person. Four people will definitely paint four walls faster than a lone painter (assuming no contention for a single paint bucket, see above). But other activities are inherently sequential: you must wait for the result of the previous operation before proceeding with the next. Think of spreadsheets, where a complex, real-world financial model cannot be computed in independent parts: each operation feeds the next until all the formulae have been computed and, in some cases, iterated. There are many such applications, weather simulation being one, because it relies on a type of equation that cannot be made to compute in parallel. As you can imagine, there is a whole body of computer science dedicated to parallelism. Let’s just say there is no real substitute for Gigahertz, for faster chips. That’s one of the reasons why weather forecasting hasn’t made much progress recently.
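The library-shelf and spreadsheet examples are the two extremes of what computer science captures in Amdahl’s Law: overall speedup is capped by the fraction of work that must stay sequential. A minimal sketch (the fractions below are made-up illustrations, not measurements):

```python
# Amdahl's Law: if a fraction p of the work can be parallelized,
# n cores yield a speedup of 1 / ((1 - p) + p / n).
def amdahl_speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Ten people searching ten library shelves: almost entirely parallel work.
print(round(amdahl_speedup(0.99, 10), 2))  # 9.17, close to the ideal 10x
# A spreadsheet where each formula feeds the next: mostly sequential work.
print(round(amdahl_speedup(0.10, 10), 2))  # 1.1, barely faster than one core
```

Even with infinitely many cores, the second case can never beat 1/0.9 ≈ 1.11x, which is why, as the column says, there is no real substitute for faster clocks.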
Multicores are nice, they do add some performance, but they’re only a band-aid until we find a way to make faster chips. — JLG
When a $oftware company experiences a sudden access of generosity and donates its first born to the world of Open Source, what are we to think? They made so much money it was embarrassing? Or, it’s an act of desperation: We can’t sell it, maybe they’ll use it if we give it away. Uncharitable minds add: And then we’ll make money telling others how to decipher inscrutable code and by explaining away bugs — not to be confused with fixing them. More politely: Give away the code and sell services around it. It can work, ask IBM and Red Hat. Or look at Google: it wouldn’t exist without the Open Source movement and its star, Linux, powering its servers, one million of them and counting.
Back to Symbian, what’s the real story? Admitting defeat or, having found a way to make money with the OS — finally? Knowing Nokia, certainly not the former. It is today the number one smartphone maker, ahead of RIM (Blackberry) and Apple. Nokia has no intention of ceding the throne. But it’s not about making money with the Symbian OS either; that’s impossible. Let me explain.
Once upon a time, before Newton, Palm and the Pocket PC, Psion, a British company, was the king of “organizers”, later called PDAs, Personal Digital Assistants. Through the twists and turns of the genre’s history, perhaps a topic for another column, Psion lost its crown and went out of the PDA business. But the OS inside the Psion was a gem (this is an ex-user speaking): it multi-tasked without crashing. More twists and turns and a joint venture was born, led by Nokia and Motorola, with followers such as Sony Ericsson and Samsung. Called Symbian, the company got the Psion OS. Symbian was to develop software for smartphones and make money licensing it to its partners.
Bad business model, bad timing, bad structure. Bad business model because handset makers don’t (or didn’t) actually care for software and don’t want to pay anything of significance for it. They (and their masters, the carriers) spend much more money on the nicely printed cardboard box than on the software inside. Bad timing because the smartphone market wasn’t really there when Symbian was born 10 years ago. It only woke up around 2005, when Nokia, RIM and Palm totaled a few million units shipped that year.
Lastly, bad structure. No one was really in charge, the owners/competitors each wanted different features, a different user interface, application compatibility was nonexistent, unwanted even in many cases and development tools weren’t up to the power and quality PC developers enjoyed. Symbian kept losing money and Nokia, viewed as the main beneficiary of the messy joint venture, kept pouring cash in.
Today, we see that the smartphone market did more than wake up. RIM’s business grows by more than 100% a year; Apple, while number three worldwide, manages to shake up the industry and to look bigger than it is — or to project an accurate picture of its future, we’ll see; Google announces its Open Source smartphone OS, Android; Microsoft acquires Danger, the maker of an interesting smartphone, the Sidekick, and proclaims its intent to “own” 40% of the market by 2012.
All this, in my view mostly Apple and Android, pushed Nokia to try to regain control of its OS future. To do so, it buys out its partners and becomes the sole owner of Symbian, now called the Symbian Foundation, which sounds very non-profit.
Good, you’ll say, they want to be in the driver’s seat (unintended obscure geek pun here…) but why go Open Source then? My guess is that this was a condition of buying the partners out. Nokia: You have access to the source code, my dear friends, you have total freedom. My other hunch is that the license won’t be the most constraining of the Open Source variants. By this I mean there is the GPL license, which obligates you to share every improvement (or bug) you make and also forces you to put in the Open Source domain any code that uses or connects to the GPL software you’re enjoying. Everything must become Open Source. Other licensing arrangements let you make contributions to the public Open Source domain but let you keep a wall between your private code and the public one. This, “true” Open Source or not, is the topic of heated arguments hopelessly mixing principle and money. Type “Open Source arguments” into Google for a sample.
I doubt Motorola, Samsung and Sony Ericsson will keep using Symbian Open Source code for long; they’re likely to go to one of several mobile Linux vendors, better than developing their own OS code and safer than hoping Nokia will give away improved Symbian code. Just last week, LiPS, the Linux Phone Standards forum, decided to merge into LiMo, the Linux Mobile Foundation.
This looks like a smart move by Nokia: Regain control of its OS future, look politically correct and throw its competitors into the jungle of platforms (more than 60 worldwide, I’m told) out there. A beautiful mess, opportunities galore, like microcomputers before Microsoft and Apple made them PCs.
Nokia: control like Apple, sound like Google. –JLG
Social networks and the PC becoming an arranged knowledge network
Let me start with an example. Hopefully, the concept will emerge.
Facebook. The latest fracas is their conflict with Google’s Friend Connect, a technology that gives any web site simple tools to acquire social networking features.
As a result, users of my organic gardening site connect and share ideas, recipes and pictures with their friends on other participating sites, such as Facebook, hi5, Orkut and many others (social networking or not). The point of Friend Connect: not being forced to become a member of other sites, just sharing. A side-effect is that it becomes easier to take my personal data from Facebook and move my information elsewhere.
No, no, says Facebook. After initially agreeing to the Friend Connect interchange, it blocked access.
This raises the question in the title: Is my Facebook information mine or not? The company has spent upwards of two hundred million dollars building a “free” service. The value Facebook counts on to generate advertising revenue is what they felicitously call the social graph. As the name suggests, this is information about me and about the people I connect to: what we like, pictures we share, music recommendations, games we play, purchases we make, invitations to events.
Everything about everyone, arranged in a knowledge network. Slight exaggeration, but you see the idea. Not just tons of details about me but a web of such details. This leads to the advertiser’s wet dream: ads focused on one individual, at the right time. Gee, Joe just told his friends he’s got a new job, let’s see if he’s in the mood for a new car or a new suit, or inviting his best friends to a celebratory dinner. For you, special prrrrice today!
Facebook is currently being investigated by Canadian authorities for its ways with user privacy, and we’ll recall last Fall’s stumble with Beacon. Users weren’t pleased to discover Facebook passed information to merchants without their knowledge and consent. The plan was creepy: even when users weren’t logged on to Facebook, some of their moves were recorded and passed on to “partners”. There is a pattern here: Facebook thinks it owns my data. This is the gold mine it wants to exploit, and it doesn’t like the idea of the data flowing somewhere else (read Google).
They are not alone. Many suppliers in our PC/Internet life clearly think they have extensive rights over our machines and our data. I recall the incessant Orwellian demands to download Windows Genuine Advantage (nice bit of newspeak) to enable operating system and Office updates. But I already proved last week I have a genuine copy of Windows! Never mind, do it again. In ironic ways, it gets worse with companies such as Symantec and their security products. Once installed, they are exceedingly difficult to remove. This is for your safety, you see: We conceal key bits so the virus bad guys can’t remove them. Well, no: keep insisting and Symantec will reluctantly tell you where to download a removal tool, one the bad guys can use as well. –JLG
No, no, not Steve Jobs but an even higher entity smiling upon the company. As I hope to show, Apple’s hard work years ago is now about to pay huge unexpected dividends on the iPhone. When the iPhone first came out of Steve Jobs’ quasi-divine hands in January 2007, it was a hack, the result of clever handcrafting by Apple engineers, a crazed last-minute rush to the show deadline. As such, it lacked the basics of what we call a platform, an industry term of art – or BS. Here, a platform means a combination of software, hardware, or both on which software developers build applications. A platform requires documentation: where the building blocks are, what they do, how to use them. The platform also comes with tools, software to build and test the applications. Last but not least, a platform implies some stability, meaning it works often enough, and it’s predictable, it doesn’t take brutal turns that undo the work of developers.
In early 2007, the iPhone had none of these attributes. So, Steve resorted to a proven industry maneuver: If you can’t fix it, feature it. No need for “native” (meaning running on the iPhone itself) applications. This is the New World of Web 2.0, bleated the propagandastaffel. Use the iPhone’s browser (the best in the business, it helped immensely) to run server-based applications. No need to download anything, centralized maintenance, easy updates… The faithful heretics would have none of that and a new game started. One week the hackers managed to break Apple’s barriers preventing the installation of native applications. A few days later Apple issued an update to the iPhone firmware that broke the hacks.
Let’s pause for a lemma, a building block in the story: from day one, the iPhone had something no competitor had: iTunes. Apple made having an iTunes account a sine qua non requirement for using an iPhone. For downloading songs and movies, just like its younger brother the iPod? That and more. With iTunes you back up your iPhone, you bring it back to “factory settings”, helpful if a hack “bricked” it, meaning if it became as lively as a brick, you install software updates, most of which defeated the impudent hacks.
Moving forward, the pressure was building: Apple made a very smart move by using a trimmed down version of OS X (the Mac’s software… platform) as the software engine for the iPhone. We know and love OS X, said the developers. Mr. Jobs, tear down that wall! It now looks like Google’s Android helped Dear Leader make up his mind. Rumors were mounting: RSN (Real Soon, Now), Google would announce a free, open-source platform for smartphones. Just as Steve smartly turned around and touted Intel processors after years of expounding the superior PowerPC architecture, on October 17th, 2007, he stood up and announced the SDK (Software Development Kit) for the iPhone. Availability by the end of February 2008.
The belief in Providence benignly smiling on Apple now comes in. In 2001, Apple sweated the servers, the legal agreements with publishers, the one-click payment system, the client software on PC and Mac. All this to create the still-unequaled iTunes experience. Now, one bright 2007 morning, they have an epiphany: Songs are zeroes and ones. One click and they land in a bin, a directory in the iPhone. But applications are also strings of zeroes and ones. If we put up iPhone applications in the iTunes store, they land in a different bin inside the iPhone but the one-click purchase and download is the same. Hallelujah! All the work to build the iTunes business now pays off for the applications. We must be The Chosen Ones. This is no small detail. Today, if you’re an independent software developer, writing good code is the easy part. The Evil S&M, Sales and Marketing, await you. Shelf space, physical or on the Web, is very expensive. Setting up download and payment systems isn’t for the faint of wallet either. With the iPhone, Apple removes (most of) these hurdles. All you have to do is write good code.
Picture the young developer still living in his mother’s basement; he sells 50,000 copies of his work for $10, the price of an iTunes album. Apple keeps $3, he gets $7. Times 50,000, he makes $350,000 and can now pay rent to his mother and buy her a car. (For perspective, the current forecast is for between 30 and 45 million iPhones sold by the end of 2009.) Picture also the competition. No one else has such a well-oiled, widely known system as iTunes to add applications to a smartphone. (Google says they will eventually offer one for Android.) This is a billion-dollar business. Actually, $1.2 billion in 2009, according to Gene Munster, a Piper Jaffray analyst. (For a healthy counterpoint, see the snarky comments on TechCrunch.) Regardless, the arrival of native applications on the iPhone is a big event, one made possible by an unintended – and rather amusing – consequence of the iTunes music distribution system. How will this be written up in books and Harvard Business School case studies? –JLG
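The basement-developer arithmetic above follows directly from Apple’s 70/30 split ($7 out of every $10); in a few lines of Python (the sales figures are the column’s hypotheticals, not real data):

```python
# Apple keeps 30% of each App Store sale; the developer gets the other 70%.
def developer_revenue(price_usd, copies, developer_share=0.7):
    return price_usd * copies * developer_share

# The column's scenario: 50,000 copies at the price of an iTunes album.
print(round(developer_revenue(10.0, 50_000)))  # 350000
```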
From a distance, it must be hard to comprehend Silicon Valley’s position on the 2008 presidential election. Isn’t Stanford University, the heart of the region, a private, capitalistic university? Aren’t all rich investors and entrepreneurs siding with the party of money, the Republicans? This is the capital of capital, the world-center of free enterprise; how can we support Tax & Spend Big Government Liberals? There are many answers to that, suspiciously too many, perhaps.
Let’s start with the Caviar Left posture: now that we’ve made our money, we tell others to make sacrifices. See Al Gore pontificating about carbon footprints while traveling by private jet and living in a huge energy-hungry mansion. A note in passing: Al is a partner in a Kleiner Perkins venture capital fund. John Doerr, one of Kleiner’s lead VCs, is a hyperactive fund raiser for Democrats. And while we note things in passing, see Colin Powell on the masthead as a Strategic Limited Partner.
A year ago, we were all for Hillary. Out with Bush and the litany of the Iraq War, the frightening deficit, torture, domestic spying, healthcare, education, infrastructure neglect. Not that we were in love with Hillary but, based on her and Bill’s track record, we knew she could be bought, we could do business with her. In a not so perverse way, we like Obama for the exact opposite reason: he can’t be bought. Hillary took money from big donors, from the lobbies our elected officials sold us to. Barack, on the other hand, handily outraised Hillary by an almost two-to-one margin, getting money from small donors, mostly on the Web. This gets us to what I think is the real reason we like Obama: He’s one of us. I’m not saying this because he’s been spotted using an iPhone. No, what we see is someone who connects with the connected generation. We see someone like us, venture investors and entrepreneurs, who holds an optimistic and meritocratic picture of the future. The latter adjective, meritocratic, got him in trouble. Used without enough discretion at a San Francisco meeting, it upset the more pessimistic market segment, the white lower middle class with a justifiably gloomy view of their prospects.
Then, while Hillary banked on her inevitability, Barack out-strategized, outraised, out-organized and outspoke her. And as inevitability switched sides, so did we. The really real reason came into play: visionary sheep that we are, we flocked to the winner. Hillary tried every dirty trick in the Clinton playbook to try to stop him, from raising prospects of Kennedy and Martin Luther King-like assassinations to bad-faith answers to questions about Obama’s own faith. What do you think of rumors that Obama is a Muslim? Instead of saying such libel had no place in a campaign for the highest office of the land, she replied she took him “at his own word he is a Christian”. The interviewer insisted: Come on, you know these rumors are false. “I take him at his own word.” Hillary had one more opportunity to rise above the gutter, to look presidential. Instead, the highly visible low blow, this was on 60 Minutes, strengthened her reputation for being Bill’s even less principled half.
I was half-kidding when I wrote above that we back Obama because we like to back the winner. To us, he looks like a mestizo of JFK and MLK, minus the women and the pharmacy. To us, he looks like he will return the US to a position of exporting hope instead of exporting fear. That’s why we allow ourselves to hope we’ll make history together. In the end, how could we support the Clintons in their re-conquest of the White House? They don’t email, they don’t use Blackberries… Seriously, the BFD (Big Fundable Deal, in VC parlance) this coming week is the iPhone Applications Developers Conference in San Francisco. Watch this space next week. — JLG
A word (or two, or three) of explanation is in order. D6 is a conference organized by Walt Mossberg, the personal technology guru of the Wall Street Journal. Over the years, Walt’s finely tuned columns earned him the position of high tech kingmaker. From there, a conference was born for his subjects to meet once a year near San Diego, California.
Second, “schmooze”, a word that evolved from its Yiddish origin to designate a social networking activity. Sorry, for our younger readers: we’re referring to the BFB (Before Facebook) version of networking. There, we smell each other’s pheromones, make small talk, pin decorations on each other’s chest, discreetly but feverishly check we’re not missing the next Big Idea or slipping down the pecking order.
Third, Honesty. At such an event? With speakers ranging from Bill Gates and Steve Ballmer to Michael Dell, Jeff Bewkes (TimeWarner’s CEO) and Kevin Martin (Chairman of the FCC), the risk of honesty is infinitesimal. And that’s part of the fun. In the audience you have entrepreneurs, corpocrats, journalists, bloggers, investment bankers and venture capitalists. On stage, Walt Mossberg and Kara Swisher, his associate, pretend to interview the magnate of the moment. The fun is trained bullshit artists in the audience watching fellow artists prevaricating on stage. We admire the high wire act or lament the lame, obvious “misstatement”.
The Gates & Ballmer show was highly professional, a testament to their experience, focus and preparation. We were first treated to the mollifying bit of schmaltz: how the love story between the two of them started at Harvard. Thus supposedly oiled, we got into more scabrous topics: Vista and Yahoo! No problem, we sold a lot of Vista, it’s been massively well received and, as always, we look forward to making our product even better. And, you know what, here is a quick taste of the even more wonderful Windows 7, available in 18 months or so, with our new invention: Multi-touch. And the coda: We avoid monopolies, we love to compete. The connoisseurs in the room nodded their appreciation: impeccable, first-class chutzpah, not a single hairline crack in the dam. Moving to Yahoo! things got a little less polished; a whiteboard was brought out and Ballmer did his Scale number: We need Scale in advertising, we’re still talking to Yahoo! about ways to gain Scale while not buying the company. But we’ll gain Scale by ourselves anyway because we never give up, we keep coming back, and coming back and coming back. The pros thought this was protesting a little too much.
But Yahoo’s Jerry Yang and Sue Decker, the next day, made the Micro-couple look like the consummate fabulists that they are. Jerry Yang went through a “he said – she said” recount of the aborted deal and was caught flat-footed when asked to define Yahoo’s business. His minder, Sue Decker, regurgitated the party line, but the damage was done: we were looking at a future has-been.
Jeff Bezos, his usual happy, smart self, unfortunately couldn’t resist bullshitting the bullshitters and danced clumsily around his refusal to release Kindle statistics. Too bad, because the rest of his act was pitch perfect. He is loved and respected for all the right reasons: the vision, execution and culture of the great Amazon.
Mark Zuckerberg brought his new adult guardian with him, the terrifying Sheryl Sandberg. Terrifying? See her quasi-Hillary résumé. Unfortunately, her professional supervision didn’t spare us a dozen robotic Zuckerberg repetitions of the “We help people share information and share themselves” line. Possibly a good company but definitely bad BS.
I saved the best for last. There was one straight shooter: Rupert Murdoch, the head of News Corp, owner of MySpace and a flock of TV and newspaper properties such as the Times of London, the tabloid New York Post and… the Wall Street Journal. Walt and Kara were interviewing their new master. Everyone in the room was paying attention, wondering who was on the high wire, Murdoch or his hosts. The boss didn’t miss a beat, didn’t worry about admitting misfires or slow progress in places like MySpace, changing his mind a bit about the strategy – not the goal, depose the New York Times – for the WSJ. The man was speaking honestly, holding forth about media, newspapers – not the news – in trouble, the economy, in recession. And then came the moment: Who caused his New York Post to endorse Obama? Me. What? You support Obama? Well, I need to meet him but if he is the way he looks, I might. Not a word about Clinton. We knew we were in the presence of a 77-year-old man who had reached a position of power without fear. No wonder the next day 23andMe, the personal genomics company (co-founded by Anne Wojcicki, Sergey Brin’s wife) asked for a sample of Rupert… More artful use of the American-English language here. –JLG