
Nokia Goes Android – Part II

 

Next week, we might see Nokia’s entry-level feature phones replaced by a low-end device running Android Open Source Project software. The phone may just be a fantasy, but the dilemma facing Nokia’s feature phone business is quite real: Embrace Android or be killed by it. 

Nokia will announce an Android phone! So says the persistent rumor, started about three months ago by an @evleaks tweet, and followed by more details as weeks went by. Initially code-named Normandy, the hypothetical feature phone is now called Nokia X, and it already has its own Wikipedia page and pictures:

Nokia X

Nokia is on the path to being acquired by Microsoft. Why introduce an Android-based phone now? The accepted reasoning is simple…

  • Even though it doesn’t generate much revenue per handset (only $42), Nokia’s feature phone business is huge and must be protected. Nokia’s Form 20-F for 2012 (the 2013 report hasn’t been published yet) shows its phone numbers compared to the previous year:
    • 35M smartphones (-55%) at an average price (ASP) of $210 (+ 11%)
    • 300M feature phones (-12%) with an ASP of $42 (- 11%)
  • These 300 million feature phones — or “dumbphones” — keep the Nokia flag waving, particularly in developing economies, and they act as an up-ramp towards more profitable smartphones.
  • Lately, dumbphones have become smarter. With the help of Moore’s Law, vigorous competition, and Android Open Source Project (AOSP) software, yesterday’s underfed, spartan feature phones are being displaced by entry-level smartphones. Asha, Nokia’s offering in this category, has been mowed down by low-end Android devices from China.
  • Nokia can’t help but notice that these AOSP-based feature phones act as a gateway drug to the full-blown Android smartphone experience (and much larger profits) offered by competitors such as Samsung, Huawei, and Motorola’s new owner Lenovo.
  • So Nokia drops its over-the-hill Symbian software core, adopts Android, adds its own (and Microsoft’s) services, design expertise, and carrier relationships, and the result is Nokia X, a cleaner, smarter feature phone.

That’s it. Very tactical. Business as usual, only better. Move along, nothing to see.

It’s not that simple.

There’s an important difference between the Android Open Source Project (AOSP), and the full Android environment that’s offered by Samsung, LG, HTC and the like.

The Android Open Source Project is really Open Source: you can download the source code here, modify it as you see fit for your application, add layers of services, substitute parts… anything you like.
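The “anything you like” claim is easy to make concrete. AOSP is fetched with Google’s repo tool and built with the standard envsetup/lunch/make sequence; the sketch below is illustrative (the branch name and -j values are examples, and a full checkout runs to tens of gigabytes):

```shell
# Illustrative AOSP fetch-and-build; branch and -j values are examples.
mkdir aosp && cd aosp
repo init -u https://android.googlesource.com/platform/manifest -b android-4.4_r1
repo sync -j4                 # pulls the entire source tree (very large)
# Modify whatever you like, then build your own system image:
source build/envsetup.sh
lunch aosp_arm-eng            # choose a build target
make -j8
```

From here a vendor can swap out apps, add service layers, or fork outright: exactly the latitude Amazon exercised with the Kindle Fire.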

Well, almost anything. The one thing you can’t do is slap a figurative “Android Inside” sticker on your device. To do that, you must comply with Google’s strict compatibility requirements that force licensees to bundle Google Mobile (Maps, Gmail, YouTube, etc.) and Google Play (the store for apps and other content). The result isn’t open or free, but smartphone makers who want the Android imprimatur must accept the entire stack.

As an added incentive to stay clean, a “Full Android” licensee cannot also market devices that use a different, incompatible version (or “fork”) of the Android code published by Google. A well-known example of forking is Amazon’s use of Android source code to create the software engine that runs its high-end Kindle Fire tablets. You won’t find a single instance of the word “Android” on these devices: Google won’t license the name for such uses.

(For more on the murky world of Android licensing, bundling, and marketing agreements, see Ben Edelman’s research paper: Secret Ties in Google’s “Open” Android.)

The hypothetical, entry-level Nokia X can’t offer an entire Android stack — it can’t be allowed to compete with the higher-end Lumias powered by Microsoft’s Windows Phone — so it would have to run an “unmentionable” Android fork.

Even without the “Android Inside” label, everyone would soon know the truth about the Android code inside the new device. This could give pause to software developers, carriers, and the more curious users. “Where is Microsoft going with this? Won’t the Android beast inside soon work its way up the product line and displace the Windows Phone OS?”

Microsoft will make soothing sounds: “Trust us, nothing of the sort will ever happen.  Nokia X is a purely tactical ploy, a placeholder that will give Windows Phone enough time to reveal its full potential.” We know how well attempts to create a Reality Distortion Field have worked for Microsoft’s Post-PC denials.

The Redmond not-so-mobile giant faces a dilemma: Lose the Asha feature phone business to aggressive forked-Android makers, or risk poisoning its Windows Phone business by introducing potentially expansionist Android seeds at the bottom of its handset line.

Several observers (see Charles Arthur’s penetrating Guardian column as an example) have concluded that Microsoft should follow Amazon’s lead and accept the “Come To Android” moment. It should drop Windows Phone and run a familiar Embrace and Extend play: Embrace Android and Extend it with Bing, Nokia’s Here Maps, Office, and other Microsoft properties.

Critics, such as Peter Bright, an energetic Microsoft commenter, contend that forking Android isn’t feasible:

“Android isn’t designed to be forked. With GMS, Google has deliberately designed Android to resist forking. Suggestions that Microsoft scrap its own operating system in favor of such a fork simply betray a lack of understanding of the way Google has built the Android platform.”

Dianne Hackborn, a senior Android engineer (and a former comrade of mine during a previous OS war) contradicts Bright in great point-by-point detail and concludes:

“Actually, I don’t think you have an understanding of how Google has built Android. I have been actively involved in designing and implementing Android since early on, and it was very much designed to be an open-source platform… Android creates a much more equal playing field for others to compete with Google’s services than is provided by the proprietary platforms it is competing with. I also think a good argument can be made that Android’s strategy for addressing today’s need to integrate cloud services into the base platform is an entirely appropriate model for a ‘real’ open-source platform to take.”

In the end, Microsoft probably doesn’t trust Google to refrain from the same games that Microsoft itself knows (too well) how to play. Microsoft used its control of Windows to favor its Office applications. Now it’s Google’s turn. The Mountain View company appears set to kill Microsoft Office, slowly but surely, and using all means available: OS platforms and Cloud services.

None of this draws a pretty picture for Microsoft’s mobile future. Damned if it introduces Android bits at the low end, damned if it lets that same software kill its Asha feature phone business.

JLG@mondaynote.com
@gassee
———————-
PS: Almost four years ago, I wrote a light-hearted piece titled Science Fiction: Nokia goes Android. It was actually less fictional than I let on at the time. In June 2010, I was asked to give a talk at Nokia’s US HQ in White Plains, NY. I was supposed to discuss Apple, but I declined to spend too much time on that topic, arguing that the Cupertino company was too “foreign” to Nokia’s culture. Instead, I made two suggestions: Fire your CEO, and drop your four or five software platforms — Symbian and Linux variants — and adopt Android. Nokia’s combination of industrial design expertise, manufacturing might, and long-standing, globe-spanning carrier relationships could make it a formidable Android smartphone maker.

The first recommendation was warmly received — there was no love for Olli-Pekka Kallasvuo, the accountant cum attorney CEO.

The second was met with indignation: “We can’t lose control of our destiny”. I tried to explain that the loss had already taken place, that too many software platforms were a sure way to get killed at the hands of monomaniacal adversaries.

Three months later Kallasvuo was replaced…by a Microsoft alum who immediately osborned Nokia’s smartphone business by pre-announcing the move to Windows Phone almost a year before the new devices became available.

—–

64 bits. It’s Nothing. You Don’t Need It. And We’ll Have It In 6 Months

 

Apple’s A7 processor, the new iOS 7 and “house” apps are all industry firsts: genuine, shipping 64-bit mobile hardware and software. As we’ve seen before with the iPhone or the iPad, this new volley of Apple products is first met with the customary bursts of premature evaluations and counterfactual dismissals.

On September 10th, Apple revealed that the new iPhone 5s would be powered by its new 64-bit A7 processor. The initial reactions were less than enthused. We were treated to exhumations of lazy bromides…

“I don’t drink Kool-Aid. Never liked the stuff and I think we owe it to ourselves to collectively question whether or not Apple’s ‘reality distortion field’ is in effect when we consider how revolutionary the iPhone 5S is and if Apple’s 64-bit A7 processor under its shiny casing will be all its [sic] cracked up to be when the device hits the market in volume.” [Forbes]

…and equally lazy “markitecture” accusations…

“With current mobile devices and mobile apps, there really is no advantage [to 64 bits] other than marketing — the ability to say you’re the first to have it…” [InfoWorld]

…and breezy brush-offs, such as this tweet from an industry expert:

“We’ll see just how good Apple’s marketing team is trying to leverage 64-bit. 64-bit add more memory and maybe registers. Period.” [Twitter]

Rather than wonder what these commenters were drinking, let’s turn to AnandTech, widely regarded as one of the best online hardware magazines.

Founded by Anand Lal Shimpi when he was all of 14 years old, AnandTech is known for its exhaustive (and sometimes exhausting) product reviews. The 14-section September 17th iPhone 5S review doesn’t disappoint. Among other things, it provides detailed iPhone 5S vs. iPhone 5 performance comparisons such as this:

[Chart: iPhone 5S vs. iPhone 5 GeekBench results, via AnandTech]

There are many other charts, comparisons, and considerations of the new 64-bit ARMv8 instruction set, the move from 16 to 32 floating-point NEON 128-bit registers, the hardware acceleration of cryptography operations… It’s a very long read, but not a boring one (at least not for interested geeks).

The bottom line is plain: The A7 processor is a substantial improvement that’s well supported by the 64-bit iOS7. (And I’d like to meet the author and bow to his encyclopedic knowledge.)

Was it because of AnandTech’s cool analysis that the doubters have changed their tune?

“As I predicted, Apple A7 benchmarks well due to CPU arch (for IPC), new GPU, ARM v8”

Now that the A7 has become a Benchmarking Beast, the author of the previous week’s brush-off tweet (“more memory and maybe registers. Period”) has revised his position [emphasis mine]:

“The improvements Apple made with the A7 are truly incredible, and they really went against the grain in their choices. With an industry obsessed with more cores, they went with fewer, larger and efficient cores. With people expecting v8 and 64-bit ARM in late 2014, Apple brings it out in 2013 with full Xcode support and many performance optimizations.” [...] “Apple has done it again, but this time in unexpected fashion.”

That all-purpose defense, unexpected, provides a key to the wrong-footing of many “experts”.

When Apple entered the microprocessor field a mere five years ago with its acquisition of P.A. Semi (Palo Alto Semiconductor), the move was panned: Apple had no future competing with established industry leaders such as Intel, Qualcomm, Nvidia, and Samsung.

But with the successive, increasing refinement of the A4, A5, and A6, the designs were ultimately viewed as good, very good, roughly on par with the rest of the industry. What these processors lacked in raw power was more than made up for by the way they were integrated into Apple’s notion of a purposeful, usable mobile device: Enhanced UI responsiveness, reduced power consumption, obeisance to the unique requirements of media and communications.

The expectation was that Apple would either fail, or produce a “competent” (meaning not particularly interesting) iteration of previous A4-5-6 designs. No one expected that the processor would actually work, with all in-house apps running in 64-bit mode from day one.

But let’s back up and rewrite a bit of history ourselves:

On September 10th, Samsung announced its flagship 64-bit Exynos processor, supported by Android 5.0, the 64-bit version of Google’s market-leading mobile OS. The new Galaxy S64 smartphone, which will ship on September 20th, features both 64-bit hardware and software components. Samsung and Google receive high praise:

“Supercomputer-class processor… Industry-leading performance… Tightly integrated 64-bit software and hardware open a new era of super high-performance applications previously impossible on mobile devices…”

And Apple gets its just deserts:

“Once again, Apple gets out-innovated…This confirms the trend we’ve seen since Tim Cook took over… iPhones have become second-class devices… The beginning of a long decline…”

Apple can be thankful this is fantasy: The real world would never treat it like this (right?).

My fantasy isn’t without basis: Within 24 hours of Apple’s September announcement, Samsung’s mobile business chief Shin Jong-kyun said his company will have its own 64-bit Exynos processor:

“Not in the shortest time. But yes, our next smartphones will have 64-bit processing functionality…” [The Korea Times]

As for Android support, no problem: 64-bit versions of the underlying Linux kernel already exist. Of course, the system software layer that resides on top of the Linux kernel — the layer that is Android — will also need to be converted to take advantage of the 64-bit processor, as will the Software Development Kit (SDK) that third-party developers use to create apps. It’s a sizable challenge, but one that’s well within the Android’s team skills and resources; the process has certainly been under way for a while already.

The real trouble starts outside of Google. Which 64-bit processor? Intel’s (the company says it will add 64-bit “capabilities” to Android)? Samsung’s? Qualcomm’s?

Who writes and supports device drivers for custom SoC modules? This sounds a lot like Windows device driver complications, but the complexity is multiplied by Google’s significantly weaker control over hardware variants.

Apple’s inherent control over all of the components in its platform will pay dividends in the speed and quality of the transition. There will be glitches — there will always be new, factory-fresh bugs — but the new 64-bit hardware is designed to run existing 32-bit apps, and it seems to actually do so in practice.

Now let’s go beyond the iPhone 5S. In his September 10th presentation, Phil Schiller, Apple’s Marketing Supremo, called the A7’s performance “desktop class”. These words were carefully calibrated, rehearsed, and approved. This isn’t a “Can’t innovate anymore, my ass!” aside, blurted while seized by religious fervor at last Spring’s Apple Developers Conference.

Does “desktop class” imply that Apple could use future versions of its 64-bit processor to replace Intel chips in its Mac devices?

In the AnandTech post quoted above, several benchmarks compare Apple’s A7 to a new x86 chip, Intel’s Baytrail, with interesting results:

[Chart: Intel Baytrail vs. Apple A7 benchmarks, via AnandTech]

So, yes, in theory, a future Apple 64-bit processor could be fast enough to power a Mac.

But let’s consider a 3GHz iMac running a high-end media creation application such as Photoshop or Autodesk. The processor doesn’t want to be constrained by power consumption requirements; it’s optimized for performance (this even ignores the upcoming Mac Pro and its thermal management prowess).

Can we see a split in the Mac product line? The lower, more mobile end would use Apple’s processors, and the high-end, no-holds-barred, always-plugged-into-the-wall desktop devices would still use x86 chips. With two code bases to maintain and OS X applications to port? Probably not.

Apple could continue to cannibalize its (and others’) PC business by producing “desktop-class” tablets. Such speculation throws us back to a well-known problem: How do you compose a complex document without a windowing system and a mouse or trackpad pointer?

We’ve seen the trouble with Microsoft’s hybrid PC/tablet, its dual Windows 8 UI which is considered to be “confusing and difficult to learn (especially when used with a keyboard and mouse instead of a touchscreen).”

The best suggestion I’ve seen so far comes from “a veteran design and management surgeon” who calls himself Kontra and proposes An interim solution for iOS ‘multitasking’ based on a multi-slot clipboard.

If Apple provides a real way to compose complex documents on a future iPad, a solution that normal humans will embrace, then it will capture desktop-class uses and users.

Until such time, Macs and iPads are likely to keep using different processors and different interaction models.

JLG@mondaynote.com

 

Android vs. Apple. Market Share vs. Profit Share, Part 255

 

Conventional wisdom and badly reconstructed history can lead to seemingly comfortable but in reality fragile conclusions. Prepare to be confused. 

Ever since the Android platform emerged as the only real competitor to Apple’s iOS devices, we’ve been treated to a debate which I’ll oversimplify: If Apple makes all the money but Android gets all the volume, who will win? A cursory survey of tech journals and blogs would lead one to believe that the case is closed: Market Share trumps Profit Share. It always does.

So Apple should call it a day? I’m skeptical. Not about the conclusion — Market Share isn’t exactly a dark horse — but about the arguments that are trotted out. False memories of Apple’s past have become a template for its future. For example, a recent Wall Street Journal article ends thus [and, sorry, you need a subscription to see the entire article]:

“Unfortunately, Apple has seen this movie before. A generation ago, it also had a top product whose market share was undercut by cheap, inferior rivals. It hopes the iPhone’s story isn’t a sequel to the Mac’s.”

(I emailed the WSJ writer asking three simple, clarifying questions. No answer, but that’s standard practice, as witheringly described by Philip Elmer-DeWitt at the end of this post.)

I was there “a generation ago”. In 1981, when IBM introduced the IBM PC, I was starting Apple France. Big Blue had made startling changes to its old ways, boldly calling its new machine The Personal Computer (we thought the “The” was ours). In an even bolder move, IBM loosened its tie and its dress code, and tried (successfully) to speak to the “common man” by using a Charlie Chaplin imitator as a mascot:

An interesting choice, particularly when juxtaposed with the real Chaplin’s cine-commentary on “labor-saving devices”:

The original PC from IBM’s Boca Raton group was a faithful homage to the Apple ][, right down to the cassette interface. But it wasn't a cheap imitation. There was one important difference: Where the Apple ][ used an 8-bit 6502 processor, IBM splurged on the much-more-powerful 16-bit Intel chip.

Almost overnight, the pages of InfoWorld, previously replete with salivating reviews of Apple products, were filled with IBM PC articles. The new machine got a major boost with the launch of Lotus 1-2-3, a multi-function spreadsheet that became the gold standard for office applications, especially on desktops that sported hard disks and large color screens. Against the Apple ][, the IBM PC was a superior product -- and deftly marketed.

For the next few years, the Apple ][ family stumbled. The Apple ///, beset by early hardware failures, didn't answer the 16-bit question. It wasn't the modernization of the Apple ][ that the company had promised. The Apple II GS was even worse, not compatible enough with the Apple ][ and not powerful enough to attract developers, particularly Bill Gates, who saw no potential for Microsoft applications.

That brings us to 1984. The Macintosh changed the game, right?

Hardly. At its coming out party, the Mac was two years behind schedule. I recall the "Mac's Last Slip" jibes at company meetings. No one would deny the obvious potential, the elegance, the innovative user interface, the clean square pixels on the bit-mapped screen, the fonts, the LaserWriter connection... But the Mac didn't support external hard drives until 1986, and it would be another year before internal disks, additional modularity, and a great Trinitron color monitor were added.

By that time, IBM had had the market to itself for half a decade, and its PC creation had morphed into the Wintel clone industry.

Contrary to the revisionist WSJ story, the "generation ago" Mac never had a market share to undercut. Apple's flagship product -- innovative, elegant, a generation ahead -- was a dreamer's machine. Down-to-earth market wisdom said the Mac was perfect for Stanford undergrads, but not serious enough for real business use. The common view was that application developers wouldn't be able to afford the investment in time and hardware. Starved of competitive software, the Macintosh was doomed to irrelevance and, ultimately, failure.

It almost happened, especially after Apple's desperate attempt to prop up platform share numbers by licensing Mac clones, a move that resulted in a brutal drop in Apple's margins. Market share vs. Profit Share...

The Mac was saved by Gil Amelio's unintentionally self-sacrificing decision to hand the Apple reins back to Steve Jobs. What followed was the most amazing turnaround our industry has ever seen, and it started with two controversial moves: Jobs rescinded the Mac OS license, and he made a deal with the Microsoft Devil. He convinced Gates' company to "invest" $150M in non-voting Apple shares and develop new Mac versions of the Explorer browser and Office apps (although, in reality, the agreement was part of a settlement of an older IP dispute).

We know the rest of the story, including a meme-adverse fact: For close to seven years, the Mac has consistently gained market share at the expense of PC clones.

Since the advent of another flagship product, the iPhone this time, the riches-to-rags Mac meme has led to predictions of a similar fate: Death by drowning in a sea of "cheap" Android clones. Apple's high price ($650 per iPhone on average) gives too much low-end room for competitors. The price will be undercut, there will be a decline in unit share that, in turn, will lead to lower profits, lower developer interest, lower ability to invest in future products. The road to irrelevance is paved with high margins and low market share.

Never mind two differences. First, the iPhone never lacked apps, 750,000 of them at last count. And never mind that it is immensely profitable, that Apple is embarrassingly flush with more cash than all its high-tech colleagues combined. The pundits won't accept evidence as an answer. Market Share will trump Profit Share. Why let facts cloud a good argument?

One is tempted to point to the race to the bottom that PC clone makers have experienced over the past decade. HP enjoys the largest Market Share of all PC makers, but it also "enjoys" less than 4% operating profit for its efforts. Meanwhile, Apple's margin is in the 25% range for its Mac line. That may not be as enjoyable as the 60% margin for the iPhone, but it's a solid business, particularly when you consider that the clone makers, HP and Dell foremost, are angling to get out of the business altogether. (See an earlier MN: Post-PC: Wall Street Likes the View.)

Returning to the iOS vs Android debate, I will state an opinion - not to be confused with a prediction, let alone The Truth: I think the vertical simplicity of Apple's business will tilt the field in its favor as the complicated Android world devolves into anarchy. Apple vs Google isn't Apple vs Microsoft/Intel/IBM.

Let's back up a bit. Google's 2005 acquisition of Android was a visionary move. (Some say Google's vision was sharpened by Eric Schmidt's presence on Apple's Board as the company worked on the future iPhone. Jobs was furious about Google's decision and summarily asked Schmidt to leave.) Android's unprecedented growth -- more than 50% share of the smartphone market in the US, and even more worldwide – is a testament to the "open" approach. Google gives away the Open Source Android OS; processors are another kind of "open", custom-designed under ARM licenses open to all payers.

But Android is a "cushion shot": it's an indirect way for Google to make money. Android is a Trojan horse that infects smartphones so it can install services that collect the user data that feeds Google's true business: advertising.

Now, Google faces several problems. Android's openness leads to incompatibilities between devices, a problem for developers that didn't happen under Microsoft's rule in the PC era. Worse (for Google), the many diverging versions of Android (a.k.a. forks) -- especially those created in China -- carry no Google services. They harvest no data and so they bring no advertising revenue potential back to Google.

This is clearly a concern for Google, so much so that the company now offers "pure" Android smartphones by Samsung (for $650) and HTC (for $599) on its Google Play site.

On the other hand, Android 2013 is a mature, stable OS. It isn't Windows 95, which was nothing more than a shell bolted on top of DOS. While the Mac's system software wasn't fully developed when it first came out, many saw it as superior -- or potentially superior -- to Microsoft's OS. Android is a tougher competitor than Windows was at the same age.

Then there is Google's subsidiary Motorola Mobility and the relationship with Samsung, the most powerful Android handset maker. As discussed last week, Motorola's stated intention is to push Android phone prices well below the $650 (unsubsidized) level. Is Samsung in a position to wag the Android dog? And if so, how will they react to Motorola's moves?

Let's not forget "the small matter of execution", one that might prove more important than lofty "strategic" considerations. And, to further complicate predictions, we have the herd's tendency to assume Company X will make all the mistakes while its competitors will play a perfect game.

Confused? Then I have accomplished one of my goals, to show how unhelpful the old bromides are when trying to guess what will happen next.

-- JLG@mondaynote.com

PS: I'd be remiss if I didn't direct you to recently discovered articles by John Kirk, who calls himself a recovering attorney and indeed writes tightly reasoned posts on Techpinions. I'll whet your appetite with two quotes. One from Does The Rise Of Android's Market Share Mean The End of Apple's Profits? [emphasis mine]:

Steve Jobs wanted, and Apple wants, market share. But they want the RIGHT market share. Apple wants customers who are willing to pay for their products. And Apple wants customers who are good for their platform. In other words, Apple wants market share in their target demographic. Based on the fact that Apple is taking in 72% of the mobile phone profits with only 8% or 9% of the market share, it sure sounds like they’ve aquired the right market share to me.

Does the rise of Android’s market share mean the end of Apple’s profits? Hardly. You can argue as loudly as you like that developers and profit share must necessarily follow market share. But the facts will shout you down.

The other is from 4 Mobile Business Models, 4 Ways To Keep Score where he concludes:

And if you’re going to prophesy that market share alone gives Google data that will someday, somehow, be worth something to someone, then you need to go back and re-read how the “razor-and-blades” business model is scored.

What we desperately need in analyzing mobile computing is far more attention paid to profits and far less attention paid to prophets.

—-

Facebook Home: Another Android Lock Pick

 

Facebook’s new Home on Android smartphones is an audacious attempt to demote the OS to a utility role and keep for itself the user data Android was supposed to feed into Google’s advertising business. Google’s reaction will be worth watching.

Amazon’s Kindle Fire, announced late September 2011, is viewed as a clever “Android lock pick“. Notwithstanding the term’s illicit flavor, Amazon’s burglary is entirely legal, an intended consequence of Google’s decision to Open Source their Android mobile operating system. Download the Android source code here, modify it to your heart’s — or business needs’ — content, load it onto a device and sell as many as you’d like.

Because it doesn’t fully meet the terms of the Android Compatibility Program, Amazon’s proprietary version isn’t allowed to use the Android trademark and the company had to open its own App Store. In industry argot, Amazon “forked” Android; they spawned an incompatible branch in the Android Source Tree.
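In source-control terms, “forking” is nothing exotic: a branch diverges from upstream and stops merging back. A toy git sketch (the directory and branch names are invented for illustration):

```shell
# Toy fork: diverge from an "upstream" line of development (names are invented).
demo=$(mktemp -d)
git init -q "$demo/android"
cd "$demo/android"
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "upstream AOSP work"
git checkout -q -b kindle-fork      # the fork starts here
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "vendor-only changes upstream never sees"
```

The fork keeps the shared history but accumulates changes upstream will never carry; keeping such a branch current with upstream is the ongoing cost every forker pays.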

The result of this heretical version of Android is a platform that’s tuned to Amazon’s own needs: promoting its e-commerce without feeding Google’s advertising money pump.

And that brings us to Facebook’s new Home.

(The company’s slick presentation is here. Business Insider also provides a helpful gallery.)

Zuckerberg’s new creation is the latest instance of the noble pursuit of making the user’s life easier by wrapping a shell around existing software. Creating a shell isn’t a shallow endeavor; Windows started its life as a GUI shell wrapped around MS-DOS.  Even venerable Unix command line interfaces such as C shell, Bourne, and Bash (which can be found inside OS X) are user-friendly — or “somewhat friendlier” — wrappers around the Unix kernel. (Sometimes this noble pursuit is taken too far — remember Microsoft’s Bob? It was the source of many jokes.)
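The idea that a shell adds presentation while delegating real work to the layer below can be sketched in a few lines of POSIX shell (a toy, with invented names; Home does the same thing at vastly larger scale):

```shell
# A toy "shell": it only presents choices and delegates every real action
# to the programs beneath it (names invented for illustration).
show_menu() {
  echo "1) list files   2) show date   q) quit"
}
run_choice() {
  case "$1" in
    1) ls ;;                    # delegate to the underlying system
    2) date ;;                  # the wrapper adds no capability of its own
    q) echo "bye" ;;
    *) echo "unknown choice" ;;
  esac
}
```

An interactive loop around show_menu/run_choice is all that separates this from a usable launcher; everything of substance still happens underneath, which is precisely the position Home puts Android in.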

Facebook Home is a shell wrapped around Android; it’s a software layer that sits on top of everything else on your smartphone. Your Facebook friends, your timeline, conversations, everything is in one place. It also gives you a simple, clean way to get to other applications should you feel the need to leave the Facebook corral… but the intent is clear: Why would you ever want to leave Home?

This is audacious and clever, everything we’ve come to expect from the company’s founder.

To start with, and contrary to the speculation leading up to the announcement, Facebook didn’t unveil a piece of hardware. Why bother with design, manufacture, distribution and support, only to sell a few million devices — a tiny fraction of your one billion users — when you can sneak in and take over a much larger number of Android smartphones at a much smaller cost?

Second, Home is not only well-aligned with Facebook’s real business, advertising revenue, it’s even more aligned with an important part of the company’s business strategy: keeping that revenue out of Google’s hands. Android’s only raison d’être is to attract a captive audience, to offer free services (search, email, maps…) in order to gain access to the users’ actions and data, which Google then cashes in by selling eyeballs to advertisers. By “floating” above Android, Home can keep these actions and data to itself, out of Google’s reach.

Facebook, like Amazon, wants to keep control of its core business. But unlike Amazon, Facebook didn’t “fork” Android, it merely demoted it to an OS layer that sits underneath the Home shell.

On paper and in the demos, it sounds like Zuckerberg has run the table… but moving from concept to reality complicates matters.

First, Facebook Home isn’t the only Android shell. An important example is Samsung, the leading Android player: it provides its own TouchWiz UI. Given that the Korean giant is obviously determined to stay in control of its own core business, one wonders how the company will welcome Facebook Home into the family of Galaxy phones and phablets. Will it be a warm embrace, or will Samsung continually modify its software in order to keep Home one step behind?

More generally, Facebook has admitted that differences in Android implementations prevent the first release of Home from working on all Android phones. In order to achieve the coverage they’ll need to keep Google (and its Google+ social networking effort) at bay, Facebook could be sucked into a quagmire of development and support.

Last but not least, there’s Google’s reaction.

So far, we’ve heard little but mellifluous pablum from Google in response to Home. (Microsoft, on the other hand, quickly attempted to point out that they were first with an all-your-activities-friends-communications shell in Windows Phone but, in this game, Android is the new Windows and Microsoft is the Apple of the early ’90s.)

Google has shown that it can play nice with its competitors — as long as they aren’t actually competing on the same turf. The Mountain View company doesn’t mind making substantial ($1B or more) Traffic Acquisition payments to Apple because the two don’t compete in the Search and Advertising business. Facebook taking over an Android smartphone is another matter entirely. Google and Facebook are in the same game; they both crave access to user data.

Google could sit back and observe for a while, quantify Facebook’s actual takeover of Android phones, keep tabs on users’ reactions. Perhaps Home will be perceived as yet another walled garden with a massive handover of private data to Facebook.

But Google already sees trouble for its Android strategy.

Many Asian handset makers now adopt Android without including services such as Google Search, Gmail, and Google Maps, the all-important user data pumps. Samsung still uses many of these services but, having gained a leading role on the Android platform, it might demand more money for the user data it feeds to Google, or even fork the code.

In this context, Facebook Home could be perceived as yet another threat to the Android business model.

A number of possible responses come to mind.

In the computer industry, being annoyed or worse by “compatible” hardware or software isn’t new. As a result, the responses are well honed. You can keep changing the interface, thus making it difficult for the parasitic product to bite into its host and suck its blood (data, in this case), or you change the licensing terms.

Google could change or hide its APIs (Application Programming Interfaces) in order to limit Home’s functionality, or even prevent it from running at all (at least until a particularly nasty “bug” is fixed). Worse, Google could make changes that cause the Facebook shell to still run, but poorly.

I’ll hasten to say that I doubt Google would do any of this deliberately — it would violate the company’s Don’t Be Evil ethos. But… accidents could happen, such as when a hapless Google engineer mistakenly captured Wi-Fi data.

Seriously, Facebook Home is yet another pick of the Android lock, a threat against Google’s core strategy that will have to be addressed, either with specific countermeasures or with more global changes in the platform’s monetization.

JLG@mondaynote.com

Google’s Red Guide to the Android App Store

 

As they approach the one million apps mark, smartphone and tablet app stores leave users stranded in thick, uncharted forests. What are Google and Apple waiting for?

Last week, Google made the following announcement:

Mountain View, February 24th, 2013 — As part of an industry that owes so much to Steve Jobs, we remember him on this day, the 58th anniversary of his birth, with great sadness but also with gratitude. Of Steve’s many achievements, we particularly want to celebrate the Apple App Store, the venerable purveyor of iPhone software. 

Introduced in 2008, the App Store was an obvious and natural descendant of iTunes. What wasn’t obvious or foreseen was that the App Store would act as a catalyst for an entire market segment, that it would metamorphose the iPhone from mere smartphone to app phone. This metamorphosis provided an enormous boost to the mobile industry worldwide, a boost that has benefitted us all and Google more than most.

But despite the success of the app phone there’s no question that today’s mobile application stores, our own Google Play included, are poorly curated. No one seems to be in charge, there’s no responsibility for reviewing and grading apps, there’s no explanation of the criteria that go into the “Editors’ Picks”, and app categorization is skin-deep and chaotic.

Today, we want to correct this fault and, at the same time, pay homage to Steve’s elegant idea by announcing a new service: The Google Play Red Guide. Powered by Google’s human and computer resources, the Red Guide will help customers identify the trees as they wander through the forest of Android apps. The Red Guide will provide a new level of usefulness and fun for users — and will increase the revenue opportunities for application developers.

With the Google Play Red Guide, we’ll bring an end to the era of the uncharted, undocumented, and poorly policed mobile app store.

The Red Guide takes its name from another great high-tech company, Michelin. At the turn of the 20th century, Michelin saw it needed to promote automotive travel in order to stimulate tire sales. It researched, designed and published great maps, something we can all relate to. To further encourage travel, Michelin published Le Guide Rouge, a compendium of hotels and restaurants. A hundred years later, the Michelin Red Guide is still considered the world’s standard; its inspectors are anonymous and thus incorruptible, their opinions taken seriously. Even a single star award (out of three) can put an otherwise unknown restaurant on the map — literally.

Our Red Guide will comprise the following:

- “Hello, World”, a list of indispensable apps for the first-time Android customer (or iPhone apostate), with tips, How-To guides, and FAQs.
- “Hot and Not”. Reviews of new apps and upgrades — and the occasional downgrade.
- “In Our Opinion”. This is the heart of the Guide, a catalogue of reviews written by a select group of Google Play staff who have hot line access to Google’s huge population of in-house subject matter experts. The reviews will be grouped into sections: Productivity, e-Learning, Games, Arts & Creativity, Communication, Food & Beverage, Healthcare, Spirituality, Travel, Entertainment, Civics & Philanthropy, Google Glass, with subcategories for each.

Our own involvement in reviewing Android apps is a novel — perhaps even a controversial — approach, but it’s much needed. We could have taken the easy path: Let users and third-parties provide the reviews. But third party motives are sometimes questionable, their resources quickly exhausted. And with the Android Store inventory rapidly approaching a million titles, our users deserve a trustworthy guide, a consistent voice to lead them to the app that fits.

We created the Red Guide because we care about our Android users, we want them to “play safe” and be productive, and we feel there’s no better judge of whether an application will degrade your phone’s performance or do what it claims than the people who created and maintain the Android framework. For developers, we’re now in a position to move from a jungle to a well-tended garden where the best work will be recognized, and the not-so-great creations will be encouraged to raise their game.

We spent a great deal of time at Google identifying exactly the right person to oversee this delicate proposition…and now we can reveal the real reason why Google’s Motorola division hired noted Macintosh evangelist, auteur, and investor Guy Kawasaki as an advisor: Guy will act as the Editor in Chief of the Google Play Red Guide.

With Guy at the helm, you can expect the same monkish dedication and unlimited resources we deployed when we created Google Maps.

As we welcome everyone to the Google Play Red Guide, we again thank Steve Jobs for his leadership and inspiration. Our algorithms tell us he would have approved.

The Red Guide is an open product and will be published on the Web at AppStoreRedguide.com as well as in e-book formats (iBookstore and Kindle formats pending approval) for open multi-platform enjoyment.
——– 

No need to belabor the obvious: you’ve already figured out that this is all a fiction. Google is no better than Apple when it comes to its mobile application store. Both companies let users and developers fend for themselves, lost in a thick forest of apps.

That neither company seems to care about their online stores’ customers makes no sense: Smartphone users download more apps than songs and videos combined, and the trend isn’t slowing. According to MobiThinking:

IDC predicts that global downloads will reach 76.9 billion in 2014 and will be worth US$35 billion.

Unfortunately, Apple appears to be resting on its laurels, basking in its great App Store numbers: 40 billion served, $8B paid to developers. Perhaps the reasoning goes like this: iTunes served the iPod well; the App Store can do the same for the iPhone. It ain’t broke; no fix needed.

But serving up music and movies — satisfying the user’s established taste with self-contained morsels of entertainment — is considerably different from leading the user to the right tool for a job that may be only vaguely defined.

Apple’s App Store numbers are impressive… but what would these numbers look like if someone else, Google for example, showed the kind of curation leadership Apple fails to assert?

JLG@mondaynote.com

Carnival Barker Edition: Show me your iOS licensing certificate!

Apple is doing it wrong, Apple is living on borrowed time! Apple will Fail Again!

This idea, this meme, isn’t new. For more than 30 years we’ve heard a number of versions of the “Apple is doomed” requiem.

December 12th 1980 — the day of Apple’s IPO, coincidentally — I’m in Geneva, signing my employment agreement with Apple. My mission: start Apple France. Back in Paris I meet a chorus of naysayers: You’re deranged. Look at the respectable companies you’ve worked for: HP, Data General, Exxon Office Systems. (They don’t know that I can’t wait to leave the latter.) And now you’re going to work for these California hippies? They don’t have CP/M; the Apple ][ has a 40-column screen and lacks standard 8” floppies…and Fortune Systems is coming up with a Wang emulator that will wipe Apple off the planet’s surface!

The latest Dies Irae comes from a trio of highly skilled artists: Henry Blodget of Wall Street and Business Insider fame; Fred Wilson, co-founder of the VC firm Union Square Ventures and an eloquent and insightful blogger (AVC blog); and Dan Lyons, the sharp and eerily hilarious author of the Fake Steve Jobs parody blog (currently on hiatus), now writing for the Daily Beast and Newsweek. (See here, here, and here for but a few examples of their refrain. Google will oblige with more.)

I’ll start by intoning their cantus firmus.

In 1984, Apple came out with a superior personal computer, the Macintosh. And then it lost the market to an inferior genus: the IBM PC clone.

Why?

Ignoring universal advice — including Bill Gates’ — Apple arrogantly refused to license the Mac operating system, leaving the field to Microsoft’s technically inferior product. DOS and Windows clones proliferated and almost exterminated the Mac, relegating it to a minuscule, irrelevant market share.

With the iPhone — and out of the same deeply ingrained arrogance — Apple is making the same mistake. Apple won’t license its iOS software platform. As a result, Android-powered smartphones and tablets will do to the iPhone and the iPad what Windows did to the Mac.

The story ends with Andy Rubin at the wheel of the Android steamroller. Behind him we see Henry, Fred, and Dan throwing rose petals on themselves and singing I Told You So.

(I have personal reasons to like Android. Several of my Be associates moved on to Google where they were instrumental in the creation of the platform. I admire what the engineering team accomplished in a very short time. There’s little wonder that Nokia and RIM have lost their footing. One of my two smartphones is an Android device, from Motorola; I see every day why the platform is so successful. And as an iPhone user, I’m glad Google is fueling Apple’s competitive fires.)

iPhone = Mac 2.0

by Jean-Louis Gassée

There are two ways to interpret the equation above.

Doomsayers will sing the licensing blues. Because Apple refuses to license the operating system—iOS, in this case—the iPhone will drown in a sea of Android smartphones. We’ve seen it before: Apple is repeating the mistake that allowed Windows clones to scuttle the Mac.

Others, such as yours truly, see the iPhone—or, more properly, its pole position in the smartphone race—as a perfect illustration of lessons learned from the Mac’s struggle to find breathing room in the PC industry.

We know how the first reading of the equation continues. The Mac had immense promise, a much better personal computer than the 16-bit clone of the Apple ][ called the IBM PC. But Apple’s arrogance beleaguered the platform. Instead of following the Microsoft model—focusing on software and letting licensees create a prosperous ecosystem—Apple repeatedly nixed Mac clones and was marginalized, with the Mac market share sinking as low as 2%.

The iPhone is equally promising and, the argument goes, just as equally destined to a marginal role. Like the original Mac, the iPhone has inaugurated a new era, and will ultimately see others dominate the market.

This is a resilient meme, one that gives rise to regular kommentariat pieces predicting trouble for Jobs and his company. Last October, a New York Times piece asked: Will Apple’s Culture Hurt the iPhone? Just last week, a Fortune columnist joined the herd and declared ‘2011 will be the year Android explodes’.

Unsurprisingly, others tore the “closed = marginalization” formula apart. The new smartphone world isn’t a replica of the PC industry, the analogy doesn’t apply. John Gruber argues here that the real race is in reducing the cost of monthly agreements: A “free” Android smartphone versus a $99 or $199 iPhone won’t make much of a difference if the monthly plan costs $80 to $100. Another observer, whose nom de plume is Kontra, thinks we’ll reach a different kind of duopoly where Android will get the volume and Apple will make all the money. See “The Unbearable Inevitability of Being Android, 1995”. And take a look at this great piece felicitously titled “Fragmandroid: Google’s mad dash to Microsoftdom”.

I have my own set of questions about the Mac’s “failure”.

First, shall we agree that Microsoft’s “open” model is the exception rather than the rule? How many other examples of the Microsoft platform licensing model, with its caveats, prohibitions, and insistence on fealty, do we see? Have we forgotten that Microsoft’s methods led to its conviction as a monopolist?

Second, there is the Mac’s rebirth. Last year, its US market share approached 10%, with a 90% unit share in the $1k-and-greater segment. For the past five years, Mac unit sales have grown faster than the PC industry.

Even more important: profits. HP is the leading PC manufacturer, with quarterly revenues in the $10B range and 5% operating income. Apple makes only a third of HP’s PC dollar volume per quarter—but with an operating income in the 30% to 35% range. (More details in this May 2nd, 2010 Monday Note.) We’ll have numbers for the October-December quarter in a few days. We’re likely to see a continuation of the dual rise of Mac market share and profits. In the meantime, Apple, for its sins, has been punished with the highest market cap of all high-tech companies, close to $300B.
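The contrast is easy to check with back-of-the-envelope arithmetic, sketched here using the approximate figures cited above:

```python
# Worked comparison using the figures cited above (quarterly, approximate).
hp_pc_revenue = 10e9               # HP "Personal Systems": ~$10B per quarter
hp_margin = 0.05                   # ~5% operating income
mac_revenue = hp_pc_revenue / 3    # Apple's Mac business: about a third of HP's
mac_margin = 0.30                  # low end of the 30%-35% range

hp_profit = hp_pc_revenue * hp_margin    # ~$0.5B
mac_profit = mac_revenue * mac_margin    # ~$1.0B

print(f"HP PC operating profit:  ${hp_profit / 1e9:.1f}B")
print(f"Mac operating profit:    ${mac_profit / 1e9:.1f}B")
```

On a third of the revenue, the Mac throws off roughly twice HP’s PC operating profit.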

This could be a blueprint for the iPhone’s future: smaller market share, bigger profits.

Back to the equation and my own interpretation: Applying the lessons from the Mac’s troubled beginnings.

When the Mac came out, it showed immense promise. The execution wasn’t flawless and it suffered from several important shortcomings—the lack of a hard disk, next to nothing in the way of application software compared to the PC. Steve Jobs tried—and tried hard—to get Lotus, Microsoft, and Software Publishing (of PFS: fame) to write apps for the Mac. In a pre-introduction Sales Conference in Honolulu in 1983, we were treated to a mock Dating Game where Mitch Kapor, Bill Gates and Fred Gibbons pledged to date the Mac, to write applications for the new wonder-PC. Ironically, the only “date” that produced anything helpful was Gates with Excel and Word. This was in exchange for a UI licensing agreement that produced no end of trouble.

“Never again.” This must have been Steve Jobs’ motto when, in 1997, he finally assumed undisputed leadership of the company he had co-founded. From then on, Apple was going to control its own future.

Fast-forward to the iPhone: It has the polish the early Mac lacked, it has the support of Apple’s own retail network, it has rid itself of the carriers’ mucking around with handsets and content distribution and, thanks to the iTunes infrastructure, it has its App Store, giving it a huge lead in the breadth and depth of available applications. Not everything works flawlessly but it has been an amazingly well organized campaign that has taken the establishment by surprise.

The result? A fundamentally different situation: While the Mac struggled from day one, the iPhone immediately took the prize.

So, will Android ultimately win, just as Windows prevailed?

My own guess is we’ll get to today’s version of the Mac vs. Windows wars, only faster and better. Faster means the iPhone skipped over the Mac’s early struggles; better means profits. While Android clones proliferate and race to the bottom, iOS devices are likely to retain a substantial share of consumer dollars. Today, Apple reaps close to half of all smartphone profits (see this Asymco post). That dominance probably won’t last, but in a sea of Android clones, Apple is likely to remain the most profitable smartphone maker. And this is without considering the other devices the iOS platform will power: tablets, iPods, Apple TV…

JLG@mondaynote.com

The OS Doesn’t Matter…

by Jean-Louis Gassée

Once upon a time, operating systems used to matter a lot; they defined what a computer could and couldn’t do. The “old” OS orchestrated the use of resources: memory, processors, I/O (input/output) to external devices (screen, keyboard, disks, network, printers…). It’s a complicated set of tasks that requires delicate juggling of conflicting constraints, and every OS handled them differently—or not at all. In those days, the OS was married to the hardware and only a handful of “wizards” with Electrical Engineering degrees—and a deep understanding of circuitry—understood (and invented, and protected) the arcana of OS construction.

Over time, the secrets of these illuminati leaked out. About 20 years ago, the OS lost its mystery. We had learned enough about writing an OS kernel that it became a college-level topic and a Christmas break hack.

Today, there’s only one operating system: Unix. (Okay, there are two, but we’ll get to that.) This is why I contend that the OS doesn’t matter—or that we need to take another look at the word’s content, at what we mean when we say ‘Operating System’.

When RIM decides to go with QNX for its upcoming tablet, the PlayBook, tongues wag. After calling its Blackberry OS the “best of breed” (a tired markitecture phrase), RIM is surrendering to reality: The “proven OS” foundation proved to be unfixable. Because of the layers of software silt that had accumulated over the years, the edifice couldn’t be modernized. Better to quit and make a fresh start. QNX is a Unix-like OS for embedded applications that dates back to 1982(!), when it was first released for the Intel 8088 microprocessor.

The same thing happened at Apple…twice. The Apple ][ OS (or lack thereof, purists will say) couldn’t be brought up to modern standards, so the Macintosh had to be built on a fresh foundation. The original Mac OS foundered on its own beachhead and was replaced by OS X. Based on the Mach kernel, OS X is another Unix derivative, co-authored at CMU by Avie Tevanian. Mr Tevanian improved the system during his tenure as head of software at NeXT and was instrumental in convincing Apple that their purchase of NeXT would breathe new life into the company.

Open the Terminal application on a Mac and what do you see? A noble and worthy Unix “shell”, a program that geeks use to interact with the OS. Terminal uses the bash shell, for “Bourne-Again Shell”: created by Brian Fox, bash is a free successor to the sh shell invented by Stephen Bourne. (Unix mavens love their word-play acronyms.)
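That lineage is visible in the syntax itself. A small sketch (the version number it prints will vary by machine):

```shell
# Classic Bourne-style constructs run unchanged in sh, bash, and zsh --
# the family resemblance that makes "it's all Unix" tangible at the prompt.
for vol in Music Movies Apps; do
  echo "store: $vol"
done

# bash identifies itself through a built-in variable; plain sh leaves it unset.
echo "shell version: ${BASH_VERSION:-not bash}"
```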

And now we have the Apple iOS, an OS X derivative that uses bits from the same kernel.

Regard Palm. The sine qua non of handset makers saw that its PalmOS couldn’t be fixed, so it pressed the restart button and created WebOS, a Linux derivative.

Android? It’s based on a Linux kernel. Nokia’s MeeGo? Ditto.

The list goes on. We have the spiritual children of Unix living inside the Cloud, powering the millions of Linux servers running at Google, Facebook, Amazon…

The only exception is Windows. Initially built on top of DOS, Microsoft painstakingly added version after version, always striving for backward compatibility while, at the same time, adding new features. It didn’t always work well (who wants to remember Windows Me and Vista?) but it worked well enough because Microsoft never gave up. They fixed mistakes that they claimed didn’t exist, and now we have the well-respected Windows 7. (Inevitably, critics will say that Microsoft wouldn’t have gotten away with such a tortuous path if it weren’t for its vigorously enforced monopoly.)

Windows will live on — in a PC industry now at a plateau. But otherwise, in the high-growth Cloud and smartphone segments, it’s a Unix/Linux world. We need to look elsewhere to find the differences that matter.

The technical challenges have migrated to two areas: UI (User Interface, or the more poetic—and more accurate—UX, for User Experience) and programming tools.

Now that all “system functions” are similar, the game for hardware and software makers is to convince the user that his/her experience will be smooth and intuitive. Your device will walk on water (with the programmer right under the surface), catch you as you fall, make sure you don’t get your feet wet.

For the developer, what we now call the OS must supply ever-growing expressive power—think a fife versus a twelve-keyboard organ. To wield that expressive power, the programmer needs software tools. The industry uses acronyms such as API (Application Programming Interface), IDE (Integrated Development Environment) or phrases such as Application Frameworks. They define the rules and conventions—which ideas are allowed and how to express them—and the software tools that programmers need to develop an application.

This is today’s OS. User experience. Development tools.

One last element that is and isn’t the OS: This new creature called an App Store (or Marketplace, depending upon the…OS). In my non-technical view, the App Store must be considered part of the OS totality, part of its gestalt. Applications have always been in a feedback loop with the OS. A program can only do as much as the OS allows, so programs played tricks to create multi-tasking, to allow smooth audio/video playback. These “tricks” were incorporated into the OS (and the hardware—think GPU), which then bred another generation of apps that wanted more, and so on.

The App Store genre, invented or not in Cupertino, is now part of that loop, a killer OS component, one that deserves a Monday Note of its own.

JLG@mondaynote.com

The Carriers’ Rebellion

Before the Steve Jobs hypnosis session, AT&T ruled. Handsets, their prices, branding, applications, contractual terms, content sales…AT&T decided everything and made pennies on each bit that flowed through its network. Then the Great Mesmerizer swept the table. Apple provided the hardware, the operating system, and “everything else”: applications, music, ringtones, movies, books… The iTunes cash register rang and AT&T didn’t make a red cent on content.

In the eyes of other carriers, AT&T sold its birthright. But they didn’t sell cheap. The industry-wide ARPU (Average Revenue Per User per month) is a little more than $50. AT&T’s iPhone ARPU hovers above $100. Subtract $25 kicked back to Apple, and AT&T still wins. More important, AT&T’s iPhone exclusivity in the US “stole” millions of subscribers from rivals Verizon, Sprint, and T-Mobile—more than 1 million per quarter since the iPhone came out in June, 2007.
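The ARPU arithmetic above is worth making explicit; here is a sketch using the figures just cited (the $25 kickback is the reported number, and the iPhone ARPU is rounded to $100):

```python
# Back-of-the-envelope ARPU comparison from the figures cited above.
industry_arpu = 50     # industry-wide average revenue per user, per month
iphone_arpu = 100      # AT&T's iPhone ARPU hovers above this
apple_kickback = 25    # reported monthly payment to Apple per subscriber

att_net = iphone_arpu - apple_kickback
premium = (att_net - industry_arpu) / industry_arpu

print(f"AT&T net iPhone ARPU: ${att_net}/month")
print(f"Premium over industry average: {premium:.0%}")
```

Even after the kickback, AT&T nets about 50% more per iPhone user than the industry average, before counting the subscribers taken from rivals.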

(Legend has it that Jobs approached Verizon before AT&T, but Apple’s demands were deemed “obscene”. If the story is true, Verizon’s disgust lost them 10 million subscribers and billions in revenue—much more than it would have made in content sales putatively under its control. Another theory, unprovable but preferable, is that Apple went for the worldwide “GSM” standard, hence AT&T.)

To the industry at large, the damage had been done. Jobs disintermediated carriers. Consumers woke up to a different life, one where the carrier supplied the bit pipe and nothing else. Yesterday’s smartphones became today’s mobile personal computers and carriers devolved into wireless ISPs, their worst fear.

Enter Android.

Android is like Linux, it’s Open Source, it’s free. And it’s very good, and rapidly getting better. But there are two important differences. Android is Linux with money, Google’s money. And Android is Linux without a Microsoft adversary. There’s no legally—or illegally—dominant player in the smartphone/really personal computer space. Nokia, Palm, Microsoft, and RIM were and still are much larger than the Disintermediating Devil from Cupertino.

Handset makers and software developers love Android, new handsets and new applications are released daily; see the Android Market here. The current guess is that Android will grab the lion’s share of the handset market by 2012. Nokia, RIM, and Microsoft may disagree with that forecast, and Apple is certain to stick to its small market share/high margin, vertical, bare-metal-to-flesh strategy.

Carriers get excited about Android, too. For two reasons. First, Android (and the very good bundled Google apps) allows handset makers to make inexpensive devices. Carriers and Google both encourage a race to the bottom where handsets are commoditized, but smart.

Second, because Android is an Open Source platform, carriers can work with handset makers, they can dictate the feature set and, as a result, revitalize the revenue stream. They can promote their favorite apps, content, and services sales that have been choked by disintermediation.

But it’s not a straight shot. Android lays out the playing field for a contest between Google and carriers.

Very Personal Computing

The center of financial gravity in the computing world—the Center of Money—has shifted. No longer directed at the PC, the money pump now gushes full blast at the smartphones market. One of my colleagues, Bob Ackerman, calls smartphones the very personal computers. Measured by size and potential, they’re both smaller and bigger than today’s PCs.

The Math

Consider the numbers: HP, the world’s foremost PC maker, sold $10B of “Personal Systems” in its last reported quarter.

Despite their premier position, HP isn’t making much PC money: $500M, 5% Operating Profit. (The full HP Q1 report in PDF can be found here.)

Now let’s turn to Apple’s most recent quarter, in which smartphones constituted 40% of the company’s revenue.

When we add up the numbers, we see that the iPhone = Mac + iPods. And this rough calculation “misunderestimates” the weight of the iPhone OS. In the more mature iPod category, the iPod Touch (the iPhone without a phone) grew by 63% year-to-year according to Apple COO Tim Cook in the most recent earnings conference call. (Full Q2 2010 SEC filing available here.)