
64 bits. It’s Nothing. You Don’t Need It. And We’ll Have It In 6 Months

 

Apple’s A7 processor, the new iOS 7 and “house” apps are all industry firsts: genuine, shipping 64-bit mobile hardware and software. As we’ve seen before with the iPhone or the iPad, this new volley of Apple products is first met with the customary bursts of premature evaluations and counterfactual dismissals.

On September 10th, Apple revealed that the new iPhone 5s would be powered by its new 64-bit A7 processor. The initial reactions were less than enthused. We were treated to exhumations of lazy bromides…

“I don’t drink Kool-Aid. Never liked the stuff and I think we owe it to ourselves to collectively question whether or not Apple’s ‘reality distortion field’ is in effect when we consider how revolutionary the iPhone 5S is and if Apple’s 64-bit A7 processor under its shiny casing will be all its [sic] cracked up to be when the device hits the market in volume.” [Forbes]

…and equally lazy “markitecture” accusations…

“With current mobile devices and mobile apps, there really is no advantage [to 64 bits] other than marketing — the ability to say you’re the first to have it…” [InfoWorld]

…and breezy brush-offs, such as this tweet from an industry expert:

“We’ll see just how good Apple’s marketing team is trying to leverage 64-bit. 64-bit add more memory and maybe registers. Period.” [Twitter]

Rather than wonder what these commenters were drinking, let’s turn to AnandTech, widely regarded as one of the best online hardware magazines.

Founded by Anand Lal Shimpi when he was all of 14 years old, AnandTech is known for its exhaustive (and sometimes exhausting) product reviews. The 14-section September 17th iPhone 5S review doesn’t disappoint. Among other things, it provides detailed iPhone 5S vs. iPhone 5 performance comparisons such as this:

[Chart: iPhone 5S vs. iPhone 5 Geekbench comparison, via AnandTech]

There are many other charts, comparisons, and considerations of the new 64-bit ARMv8 instruction set, the move from 16 to 32 floating-point NEON 128-bit registers, the hardware acceleration of cryptography operations… It’s a very long read, but not a boring one (at least not for interested geeks).

The bottom line is plain: The A7 processor is a substantial improvement that’s well supported by the 64-bit iOS 7. (And I’d like to meet the author and bow to his encyclopedic knowledge.)

Was it because of AnandTech’s cool analysis that the doubters have changed their tune?

“As I predicted, Apple A7 benchmarks well due to CPU arch (for IPC), new GPU, ARM v8.” [Twitter]

Now that the A7 had become a Benchmarking Beast, the author of the previous week’s brush-off tweet (“more memory and maybe registers. Period”) has revised his position [emphasis mine]:

“The improvements Apple made with the A7 are truly incredible, and they really went against the grain in their choices. With an industry obsessed with more cores, they went with fewer, larger and efficient cores. With people expecting v8 and 64-bit ARM in late 2014, Apple brings it out in 2013 with full Xcode support and many performance optimizations.” [...] “Apple has done it again, but this time in unexpected fashion.”

That all-purpose defense, unexpected, provides a key to the wrong-footing of many “experts”.

When Apple entered the microprocessor field a mere five years ago with its acquisition of Palo Alto Semiconductor, the move was panned: Apple had no future competing with established industry leaders such as Intel, Qualcomm, Nvidia, and Samsung.

But with the successive, increasing refinement of the A4, A5, and A6, the designs were ultimately viewed as good, very good, roughly on par with the rest of the industry. What these processors lacked in raw power was more than made up for by the way they were integrated into Apple’s notion of a purposeful, usable mobile device: Enhanced UI responsiveness, reduced power consumption, obeisance to the unique requirements of media and communications.

The expectation was that Apple would either fail, or produce a “competent” (meaning not particularly interesting) iteration of previous A4-5-6 designs. No one expected that the processor would actually work, with all in-house apps running in 64-bit mode from day one.

But let’s back up and rewrite a bit of history, ourselves:

On September 10th, Samsung announced its flagship 64-bit Exynos processor, supported by Android 5.0, the 64-bit version of Google’s market-leading mobile OS. The new Galaxy S64 smartphone, which will ship on September 20th, features both 64-bit hardware and software components. Samsung and Google receive high praise:

“Supercomputer-class processor… Industry-leading performance… Tightly integrated 64-bit software and hardware open a new era of super high-performance applications previously impossible on mobile devices…”

And Apple gets its just deserts:

“Once again, Apple gets out-innovated…This confirms the trend we’ve seen since Tim Cook took over… iPhones have become second-class devices… The beginning of a long decline…”

Apple can be thankful this is fantasy: The real world would never treat it like this (right?).

My fantasy isn’t without basis: Within 24 hours of Apple’s September announcement, Samsung’s mobile business chief Shin Jong-kyun said his company will have its own 64-bit Exynos processor:

“Not in the shortest time. But yes, our next smartphones will have 64-bit processing functionality…” [The Korea Times]

As for Android support, no problem: 64-bit versions of the underlying Linux kernel already exist. Of course, the system software layer that resides on top of the Linux kernel — the layer that is Android — will also need to be converted to take advantage of the 64-bit processor, as will the Software Development Kit (SDK) that third-party developers use to create apps. It’s a sizable challenge, but one that’s well within the Android team’s skills and resources; the process has certainly been under way for a while already.
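To make the 32-to-64-bit porting hazard concrete, here’s a small illustrative sketch (it assumes a 64-bit Python build, using ctypes to stand in for an LP64 target such as 64-bit Linux on ARMv8): C’s `int` stays 4 bytes while pointers widen to 8, so code that stashes a pointer in an int-sized slot silently drops the high bits.

```python
import ctypes

# On an LP64 platform (64-bit Linux, as on ARMv8), pointers are
# 8 bytes while C's `int` stays at 4. Assumes a 64-bit Python build.
print(ctypes.sizeof(ctypes.c_void_p))   # 8 on a 64-bit build
print(ctypes.sizeof(ctypes.c_int))      # 4

# The classic porting bug: storing a pointer-sized value in an
# int-sized slot keeps only the low 32 bits.
addr = 0x1234_5678_9ABC          # a plausible 64-bit address
truncated = addr & 0xFFFF_FFFF   # what survives in a 32-bit slot
print(hex(truncated))            # 0x56789abc: the top bits are gone
```

This is why every layer above the kernel (the Android runtime, native libraries, the SDK and NDK toolchains) has to be audited and recompiled, not merely relinked.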

The real trouble starts outside of Google. Which 64-bit processor? Intel’s (the company says it will add 64-bit “capabilities” to Android)? Samsung’s? Qualcomm’s?

Who writes and supports device drivers for custom SoC modules? This sounds a lot like Windows device driver complications, but the complexity is multiplied by Google’s significantly weaker control over hardware variants.

Apple’s inherent control over all of the components in its platform will pay dividends in the speed and quality of the transition. There will be glitches — there will always be new, factory-fresh bugs — but the new 64-bit hardware is designed to run existing 32-bit apps, and it seems to actually do so in practice.

Now let’s go beyond the iPhone 5S. In his September 10th presentation, Phil Schiller, Apple’s Marketing Supremo, called the A7’s performance “desktop class”. These words were carefully calibrated, rehearsed, and approved. This isn’t a “Can’t innovate anymore, my ass” aside, blurted while seized by religious fervor at last Spring’s Apple Developers Conference.

Does “desktop class” imply that Apple could use future versions of its 64-bit processor to replace Intel chips in its Mac devices?

In the AnandTech post quoted above, several benchmarks compare Apple’s A7 to a new x86 chip, Intel’s Bay Trail, with interesting results:

[Chart: AnandTech benchmark comparison, Apple A7 vs. Intel Bay Trail]

So, yes, in theory, a future Apple 64-bit processor could be fast enough to power a Mac.

But let’s consider a 3GHz iMac running a high-end media creation application such as Photoshop or an Autodesk package. Here, the processor doesn’t want to be constrained by power consumption requirements; it’s optimized for performance (and this even ignores the upcoming Mac Pro and its thermal management prowess).

Can we see a split in the Mac product line? The lower, more mobile end would use Apple’s processors, while the high-end, no-holds-barred, always-plugged-to-the-wall desktop devices would still use x86 chips. With two code bases to maintain and OS X applications to port? Probably not.

Apple could continue to cannibalize its (and others’) PC business by producing “desktop-class” tablets. Such speculation throws us back to a well-known problem: How do you compose a complex document without a windowing system and a mouse or trackpad pointer?

We’ve seen the trouble with Microsoft’s hybrid PC/tablet, its dual Windows 8 UI which is considered to be “confusing and difficult to learn (especially when used with a keyboard and mouse instead of a touchscreen).”

The best suggestion I’ve seen so far comes from “a veteran design and management surgeon” who calls himself Kontra and proposes An interim solution for iOS ‘multitasking’, based on a multi-slot clipboard.

If Apple provides a real way to compose complex documents on a future iPad, a solution that normal humans will embrace, then it will capture desktop-class uses and users.

Until such time, Macs and iPads are likely to keep using different processors and different interaction models.

JLG@mondaynote.com

 

Apple Market Share: Facts and Psychology

 

Remember netbooks? When Apple was too greedy and stupid to make a truly low-cost Macintosh? Here we go again: Apple refuses to make a genuinely affordable iPhone. There will be consequences — similar to what happened when the Mac refused to join netbooks circling the drain.

My first moments with the iPad back in April 2010 were mistaken attempts to use it as a Mac. Last year, it took a long overdue upgrade to my eyeglasses before I warmed to the nimbler iPad mini, never to go back to its older sibling.

With that in mind, I will withhold judgment on the new iPhone until I have a chance to play customer, buy the product (my better half seems to like the 5C while I pine for a 5S), and use it for about two weeks — the time required to go beyond my first and often wrong impressions.

While I wait to put my mitts on the new device, I’ll address the conventional hand-wringing over the 5C’s $549 price tag (“It’s Too Damned High!” cry the masses).

[Image: the iPhone 5C]

Henry Blodget, who pronounced the iPhone Dead In Water in April 2011, is back sounding the alarm: Apple Is Being Shortsighted — And This Could Clobber The Company. His argument, which is echoed by a number of pundits and analysts, boils down to a deceptively simple equation:

Network Effect + Commoditization = Failure

The Network Effect posits that the power of a platform is an exponential function of the number of users. Android, with 80% of the smartphone market, will (clearly) crush iOS by sucking all resources into its gravitational well.

Commoditization means that given an army of active, resourceful, thriving competitors, all smartphones will ultimately look and feel the same. Apple will quickly lose any qualitative advantage it now enjoys, and by having to compete on price it could easily fall behind.

Hence the preordained failure.

As a proof-of-concept, the nay-sayers point to the personal computer battle back in the pre-mobile dark ages: Didn’t we see the same thing when the PC crushed the Mac? Microsoft owned the personal computer market; PC commoditization drove prices into the bargain basement…

Interpret history how you will, the facts show something different. Yes, the Redmond Death Star claimed 90% of the PC market, but it failed to capture all the resources in the ecosystem. There was more than enough room for the Mac to survive despite its small market share.

And, certainly, commoditization has been a great equalizer and price suppressant — within the PC clone market. Microsoft kept most of the money with the de facto monopoly enjoyed by its Windows + Office combo, while it let hardware manufacturers race to the bottom (netbooks come to mind). Last quarter, this left HP, the (still) largest PC maker, with a measly 3% operating profit for its Personal Systems Group. By contrast, Apple’s share of the PC market may only be 10% or less, but the Mac owns 90% of the $1000+ segment in the US and enjoys a 25% to 35% margin.

After surviving a difficult birth, a ruthlessly enforced Windows + Office platform, and competition from PC makers large and small, the Mac has ended up with a viable, profitable business. Why not look at iDevices in the same light and see a small but profitable market share in its future?

Or, better yet, why not look at more than one historical model for comparison? For example, how is it that BMW has remained so popular and profitable with its One Sausage, Three Lengths product line strategy? Aren’t all cars made of steel, aluminium (for Sir Jony), plastic, glass, and rubber? When the Bavarian company remade the Mini, were they simply in a race to the bottom with Tata’s Nano, or were they confidently addressing the logical and emotional needs of a more affluent — and lasting — clientèle?

Back to the colorful but “expensive” 5C, Philip Elmer-DeWitt puts its price into perspective: For most iPhone owners, trading up to the 5C is ‘free’ due to Apple’s Reuse and Recycle program. We’ll have to see if The Mere Matter of Implementation supports the theory, and where these recycled iPhones end up. If the numbers work, these reborn iPhones could help Apple gain a modest foothold in currently underserved price segments.

Still thinking about prices, I just took a look at the T-Mobile site where, surprise, the 5C is “free”, that is, no money down and 24 months at $22, plus a $10 “SIM Kit” (read the small print). You can guess what AT&T offers: 24 months at $22/month (again, whip out your reading glasses). Verizon is more opaque, with a terrible website. Sprint also offers a no-money-down iPhone 5C, although with more expensive voice/data plans.
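A quick back-of-the-envelope comparison of those installment terms against the posted price (figures as quoted above; the helper function is just for illustration):

```python
# iPhone 5C cost over the installment period, using the carrier
# terms quoted above (no money down, 24 x $22/month).
def total_cost(down, monthly, months=24, extras=0):
    """Total handset cost: down payment + installments + extras."""
    return down + monthly * months + extras

t_mobile = total_cost(down=0, monthly=22, extras=10)  # $10 "SIM Kit"
at_and_t = total_cost(down=0, monthly=22)
posted   = 549  # the unsubsidized "posted" price

print(f"T-Mobile: ${t_mobile}")       # $538
print(f"AT&T:     ${at_and_t}")       # $528
print(f"Unsubsidized: ${posted}")
```

Not coincidentally, the installment totals land within a few dollars of the $549 posted price: “free” here simply means financed.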

This is an interesting development: Less than a week ago, Apple introduced the iPhone 5C with a “posted price” of $99 — “free” a few days later.

After much complaining to the media about “excessive” iPhone subsidies, carriers now appear to agree with Horace Dediu who sees the iPhone as a great “salesman” for carriers, because it generates higher revenue per user (ARPU). As a result, the cell philanthropists offer lower prices to attract and keep users — and pay Apple more for the iPhone sales engine.

Of course, none of this will dispel the anticipation of the Cupertino company’s death. We could simply dismiss the Apple doomsayers as our industry’s nattering nabobs of negativism, but let’s take a closer look at what lies under the surface. Put another way, what are the emotions that cause people to reason against established facts, to feel that the small market share that allowed the Mac to prosper at the higher end will inevitably spell failure for iDevices?

I had a distinct recollection that Asymco’s Horace Dediu had offered a sharp insight into the Apple-is-doomed mantra. Three searches later, first into my Evernote catchall, then to Google, then to The Guardian, I found a Juliette Garside article where Horace crisply states the problem [the passage quoted here is from a longer version that's no longer publicly available; emphasis and elision mine]:

“[There's a] perception that Apple is not going to survive as a going concern. At this point of time, as at all other points of time in the past, no activity by Apple has been seen as sufficient for its survival. Apple has always been priced as a company that is in a perpetual state of free-fall. It’s a consequence of being dependent on breakthrough products for its survival. No matter how many breakthroughs it makes, the assumption is (and has always been) that there will never be another. When Apple was the Apple II company, its end was imminent because the Apple II had an easily foreseen demise. When Apple was a Mac company its end was imminent because the Mac was predictably going to decline. Repeat for iPod, iPhone and iPad. It’s a wonder that the company is worth anything at all.”

This feels right, a legitimate analysis of the analysts’ fearmongering: Some folks can’t get past the “fact” that Apple needs hit products to survive because — unlike Amazon, as an example — it doesn’t own a lasting franchise.

In the meantime, we can expect to see more hoses attached to Apple’s money pump.

Next week, I plan to look at iOS and 64-bit processing.

JLG@mondaynote.com

Apple’s Wearables Future

 

Wearable technologies have a huge future. For Apple, they’ll create a new product category with an iPhone-like revenue stream! Not so fast. Smartwatches and other wearable consumer products lack key attributes for breaking out of the novelty prison.

“I think the wrist is interesting.” Thus spake Tim Cook on the opening night of last May’s D11 conference.

When pressed to discuss his company’s position on wearable technologies, Cook was unusually forthcoming: Instead of pleading Apple’s Fifth, Cook launched into a substantial discussion of opportunities for his company to enter the field, calling wearables “a very key branch of the tree”.

But when asked about the heavily publicized Google Glass he parried the question by suggesting that people who don’t otherwise wear glasses might be reluctant to don such an accoutrement.

I don’t find Tim Cook’s dismissal of eyewear very insightful: Just go to a shopping center and count the eyewear stores. Many belong to the same rich Italian conglomerate, Luxottica, a company with about ten house brands such as Oakley, Persol, and Ray-Ban, and a supplier to more than twenty designer labels ranging from Armani to Versace. (As the perturbing 60 Minutes exposé on Luxottica pointed out, the company nicely rounds out its vertical dominance of the sector through its ownership of EyeMed, a vision insurance business.)

Eyewear, necessary or not, is a pervasive, fashionable, rich product category, a fact that hasn’t escaped Google’s eye for numbers. The company is making an effort to transmute their geeky spectacles into fashion accessories. Courtesy of Counternotions I offer this picture of Sergey Brin and fashionista Diane von Furstenberg proudly donning the futuristic eyewear at the NY Fashion Week:

[Photo: Sergey Brin and Diane von Furstenberg wearing Google Glass at NY Fashion Week]

On a grander scale, we have a Vogue article, Google Glass and a Futuristic Vision of Fashion:

[Image: Google Glass featured in Vogue]

The company’s efforts to make Google Glass fashionable might be panned today for pushing the envelope a little too far but, in a not-too-distant future, they stand a chance of being viewed as truly visionary.

If eyewear doesn’t excite Tim Cook, what does? To him, the wrist feels more natural, more socially acceptable. We all wear one or more objects around our wrist(s).

The wristwear genre isn’t new (recall Microsoft’s 2004 SPOT watch). Ask Google to show you pictures of smartwatches and you get 23M results and screen after screen like this one:

[Image: a page of Google Images smartwatch results]

The genre seems to be stuck in the novelty state. Newer entries such as Samsung’s Gear have gotten mixed reviews. Others contend a 2010 iPod nano with a wristband makes a much nicer smartwatch.

Regardless, by comparison, pre-iPod MP3 players and pre-iPhone smartphones were getting better press – and more customers. Considering the putative iWatch, the excitement about Apple getting into this class of devices appears to be excessive.

The litmus test for the potential of a device is the combination of pervasiveness and frequency of use. Smartphones are a good example: they’re always with us, and we look at their screens often (too often, say critics who pretend to ignore the relationship between human nature and the Off button).

The iWatch concept makes two assumptions: a) we’ll wear one and, b) we’ll only wear that one.

Checking around, we see young adults who no longer wear watches — they have a smartphone — and middle-agers who use watches as jewelry, often owning more than one. This defeats both the pervasiveness and frequency-of-use requirements.

Then there’s the biometry question: How much useful information can a wearable device extract from its wearer?

To get a better idea of what’s actually available (as opposed to fantasized), I bought a Jawbone UP wristband a little over a month ago. With its accelerometers and embedded microprocessors, UP purports to tell you how many steps you took and how long you’ve been inactive during the day; it logs your stretches of light and deep sleep, and even “makes it fun and easy to keep track of what you eat”. Once or twice a day, you plug it into your smartphone and it syncs with an app that displays your activity in graphic form and tells you how well you’re doing versus various goals and averages. It also suggests that you log your mood in order to “discover connections that affect how you feel.”
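Step counting from a wrist accelerometer is conceptually simple. The sketch below is a deliberately naive illustration (threshold crossing on acceleration magnitude, with a refractory period), not Jawbone’s actual algorithm, and it hints at why such bands misfire: any sufficiently vigorous arm motion crosses the threshold.

```python
import math

def count_steps(samples, threshold=1.3, refractory=5):
    """Naive step counter: count crossings of an acceleration-magnitude
    threshold (in g), ignoring crossings within `refractory` samples of
    the previous step. Real trackers add filtering and adaptive thresholds."""
    steps = 0
    last = -refractory
    for i, (x, y, z) in enumerate(samples):
        mag = math.sqrt(x * x + y * y + z * z)
        if mag > threshold and i - last >= refractory:
            steps += 1
            last = i
    return steps

# Synthetic data: resting (~1 g of gravity) with periodic spikes as "steps".
resting = [(0.0, 0.0, 1.0)] * 10
spike = [(0.0, 0.0, 1.8)]
walk = (resting + spike) * 3
print(count_steps(walk))  # 3
```

A breakfast gesture and a brisk step can look identical to this kind of detector, which matches my experience below.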

At first, I found the device physically grating. I couldn’t accept it the way I’m oblivious to my watch, and I even found it on the floor next to my bed a couple of mornings. But I stuck with it. The battery life is as promised (10 days) and I’ve experienced none of the first version’s troubles. I traveled, hiked, and showered with it without a hitch, other than the cap covering the connecting pin getting a bit out of alignment.

Will I keep using it? Probably not.

Beyond the physical discomfort, I haven’t found the device to be very useful, or even accurate. It’s not that difficult to acquire a useful approximation of hours slept and distance walked during the day — you don’t need a device for these things.

As for accuracy, the other day it declared that I had exhibited a substantial level of physical activity… while I was having breakfast. (I may be French, but I no longer move my hands all that much as I speak.)

The app’s suggestion that I log my food consumption falls into the magical thinking domain of dieting. A Monday morning step on a scale tells us what we know already: Moderation is hard, mysterious, out of the reach of gadgets and incantations.

For a product to start a new worthy species for a company as large as Apple, the currency unit to consider is $10B. Below that level, it’s either an accessory or exists as a member of the ecosystem’s supporting cast. The Airport devices are neat accessories; the more visible Apple TV supports the big money makers — Macs, iPads and iPhones — by enhancing their everyday use.

With this in mind, will “wearables” move the needle, will they cross the $10B revenue line in their second or third year, or does their nature direct them to the supporting cast or accessory bins?

Two elements appear to be missing for wearable technologies to have the economic impact that companies such as Apple would enjoy:

  • The device needs to be easily, naturally worn all the time, even more permanently than the watch we tend to take off at night.
  • It needs to capture more information than devices such as the Jawbone do.

A smartwatch that’s wirelessly linked to my smartphone and shows a subset of the screen in my pocket… I’m not sure this will break out of the novelty category where the devices have been confined thus far.

Going back to Tim Cook’s oracular pronouncement on wearables being “a very key branch of the tree”, I wonder: Was he having fun misdirecting his competition?

JLG@mondaynote.com

—————————————–

PS: After two July Monday Notes on the company, I’ll wait for the Microsoft centipede to drop one or two more shoes before I write about the Why, When, How and Now What of Ballmer’s latest unnatural acts. There is an Analyst Day coming September 19th — and the press has been disinvited.

PPS: In coming days, to keep your sanity when trying to drink from the Apple kommentariat fire hydrant, you can safely direct your steps to three sites/blogs:

  • Apple 2.0, where Philip Elmer-DeWitt provides rational news and commentary, skewers idiots, and links to other valuable fodder.
  • Asymco, where Horace Dediu provides the absolute best numbers, graphs and insights into the greatest upheaval the tech industry has ever seen. Comments following his articles are lively but thoughtful and civilized.
  • Apple Insider. You might want to focus on learned, detailed editorials by Daniel Eran Dilger such as this one where he discusses Microsoft and Google (partially) shifting to an Apple-like business model. Daniel can be opinionated, animated even, but his articles come with tons of well-organized data.

Blackberry’s Future

 

by Jean-Louis Gassée

Once the king of smartphones for business users, BlackBerry got a new CEO, a new operating system, and new devices, with and without the traditional keyboard. In spite of these changes, the company’s latest numbers don’t paint a picture of revival.

Thorsten Heins doesn’t suffer a lack of enthusiasm. During the run up to the March release of its BlackBerry 10 operating system, RIM’s CEO painted an optimistic picture of a company on the rebound, a company that would correct the mistakes of the industry leaders:

“It’s still the same,” Heins said of the iPhone. “It is a sequential way to work and that’s not what people want today anymore. They want multitasking.”

With RIM rechristened as BlackBerry, Heins told us that the company would energize the developer community and spawn devices too exciting to describe:

“There’s one new product I’m really excited about, but I can’t really share it,” Heins told CNET in an interview today.

Last week, the company released its latest quarterly numbers and they are exciting, although not in the sense that Heins would like. The forecast was $3.4B in revenue and $0.07 in earnings per share; the reality was $3.1B in sales and, more important, a loss of $0.13 per share.

The numbers “excited” traders so much that BBRY shares lost 28% of their value in a single trading session, putting them back to their one-year-ago level.

The earnings release was followed by the customary conference call where the CEO and CFO review the numbers and answer questions from Wall Street analysts. Courtesy of Seeking Alpha, the call transcript is here and contains the obligatory pablum, including an excessive abuse of the F-word (22 occurrences):

Embracing our heritage of mobility first is very important as we build our culture and go through this transition. We don’t have to be all things to all people and all markets, and embracing this focus allows us to drive efficiency, be flexible and agile, and to ultimately drive best-in-class innovations. [...] We’re continuing to focus on improving all areas of the business…

Curiously, there’s no breakdown of the sales of BlackBerry devices. How much of their revenue was “energized” by the BB10? Without actual numbers, we’re left in a cloud of doubt about how well the new platform is actually doing.

The disquietude continues: There are no subscriber numbers, and no guidance other than an expectation of more losses next quarter. The glowing comments about cash-flow from operations ($630M, a nice number) are undercut by the disclosure of a substantial tax refund, without which the company would have eaten through $400M to $500M of cash.

As for tablets, the BlackBerry PlayBook is no more, says the CEO. He’s unhappy with the device’s performance and is determined to focus on the company’s “core hardware portfolio”. (The company’s website no longer describes the product and only offers a software update for existing customers.)

Inevitably, the How Many Moves Remain? question comes up. BlackBerry professes to do more than just devices: it claims to offer strong enterprise services and says it will propagate its BBM (BlackBerry Messenger) to other platforms, including Android and iOS. It also promotes a form of (limited) compatibility for (some) Android apps on its newer smartphones. But is anyone buying, and in numbers that can save the company?

More to the point: Who wants to buy Blackberry (the company), for what reasons, and at what price?

Let’s back up. Last week, we heard that Microsoft had once again given up on its perennial hunt to capture a handset maker. This time, the prey was Nokia, Microsoft’s “special” Windows Phone licensee.

The official explanation for the Nokia blowup was that the price tag was too high, but price clearly wasn’t an issue. Nokia’s $14B market capitalization weighs in at about 5% of Microsoft’s $288B. Even when you tack on a 25% acquisition premium, the purchase should have been a reasonably easy sell, especially given Microsoft’s desire to take the handset business into its own hands, if only to counter (or mimic) the strategy established by Google and Motorola.
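Checking the arithmetic in the paragraph above (figures as quoted; treat them as rough, 2013-era numbers):

```python
# Back-of-the-envelope check on the Nokia acquisition math quoted above.
nokia_cap = 14e9    # Nokia market capitalization, ~$14B
msft_cap = 288e9    # Microsoft market capitalization, ~$288B
premium = 0.25      # typical acquisition premium

offer = nokia_cap * (1 + premium)   # $17.5B
share_of_msft = offer / msft_cap    # ~6.1%
print(f"offer: ${offer / 1e9:.1f}B, {share_of_msft:.1%} of Microsoft")
```

Even with the premium, the deal would have consumed only about 6% of Microsoft’s market value, which supports the point that price alone doesn’t explain the walk-away.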

There’s really only one explanation, as I speculated last week: The engagement was dissolved because of Microsoft’s bleak view of Nokia’s business, that the Finnish company no longer has the technological acumen and brand loyalty that Microsoft needs to make Windows Phone a legitimate competitor with Android and iOS.

BlackBerry’s market capitalization now stands at about $6B. That’s less than half of Nokia’s. If Nokia, supported by Microsoft, can’t gain ground on Google and Apple devices, what gives us confidence that BlackBerry isn’t sliding into insignificance?

The BlackBerry name, as a brand, is strong. But a brand only exists as the carrier of a promise. A brand writes checks that the product cashes. Without a successful product, the brand dies (go ask Kodak).

While Nokia could be acquired by someone interested in the Windows Phone business, one is hard pressed to form a similar thought for BlackBerry. Windows Phone may be struggling, but there is an ecosystem around it, including handset makers. There is no such thing around BlackBerry. Developers aren’t writing apps for BB10 in ecosystem-making numbers, carriers have taken a wait-and-see posture, and even the core group of dedicated users (I used to be one of them) appears to be losing faith.

This isn’t a brightly optimistic picture. Today, Blackberry finds itself caught between Samsung and Apple at the high end, and a rabidly fermenting crowd of Android (official or not) clones at the lower price range.

So, why not consider heresy, or apostasy: Ditch the newer BlackBerry OS too few developers believe in, and bet on Android devices to support BlackBerry’s enterprise services.

The answer is probably the same as it is for Nokia: It’s too late.

JLG@mondaynote.com

Microsoft and Nokia won’t beget a Googorola clone

 

by Jean-Louis Gassée

Microsoft, after its highly visible 2011 bet on Nokia, could have decided to go one step further and buy the company, becoming a fully integrated smartphone maker. That it didn’t happen doesn’t portend a great future for Windows Phone.

Last week, the Wall Street Journal outed Microsoft’s unsuccessful attempt to acquire Nokia:

Microsoft recently held advanced talks with Nokia about buying its handset business, people familiar with the matter said, as laggards in the fast-moving mobile market struggle to gain ground.

Many saw an acquisition as the inevitable next step: by acquiring the Finnish handset maker, Microsoft could “finish the job” it started when it licensed a special Windows Phone to Nokia. It would be a blessed union of two vigilant, watchful companies: Microsoft had watched as Android and iOS made its own OS a distant also-ran; Nokia, once the world’s largest cell phone maker, couldn’t help but notice that Google and Apple had killed its handset business from both the high and low ends.

But, according to the WSJ, the parlay came to a negative and apparently definitive end:

The discussions faltered over price and worries about Nokia’s slumping market position, among other issues, these people said. One of the people said talks took place as recently as this month but aren’t likely to be revived.

To call Nokia’s fall a “slump” is more than polite. The company saw its market share fall from 39% in 2009 — more than 100 million handsets per quarter — to an estimated (and angrily debated) 3% by the end of 2012.

Microsoft hasn’t done much better with its mobile software. In 2008, Windows Mobile held an 11% market share, even as the underlying Windows CE engine was getting long in the tooth, particularly when compared to the Unix-ish Android and iOS engines. With a modern NT kernel, Microsoft’s mobile OS was reborn as Windows Phone 8 and scored a modest 3.2% market share in Q1 2013. This number comes from IDC, the “research” group that has assured us that come 2016, Microsoft will be the number 2 mobile OS provider with a 19.2% share:

[IDC table: mobile OS market share forecasts through 2016]

Behold the vision and precision of IDC’s psychics: Back in June 2012, they could see four years into the future and predict that Windows Phone would edge out iOS… by two tenths of a percent!

We’ve heard the Microsoft-is-buying-a-handset-maker rumors before. Starting in 2007 and recurring year after year, Microsoft was said to be eyeing RIM/Blackberry. For some, yours truly included in January 2012, the RIM story was compellingly straightforward: RIM’s clientèle of loyal, hardcore Blackberry users in businesses and governments made it an ideal fit for the Redmond giant.

Microsoft’s defenders will argue that RIM ’07 was too expensive. Priced at $200 a share (they’re running at about $14 today), RIM would have cost more than $100B before any acquisition premium. At the time, Microsoft was valued at approximately $250B (similar to today’s $277B). Ideal or not, the match didn’t make sense for Microsoft shareholders. Then, when RIM’s price began to slide, the Blackberry was seen as having lost too much of its shine, too much of its market momentum. The company was damaged goods. (Or, as we might have forgotten, the two co-CEOs, Mike Lazaridis and Jim Balsillie, the ones who spoke in tongues, may have proved too difficult for even Steve Ballmer to deal with.)

Someday, Microsoft’s inability to grab RIM might be seen as a signal failure, a key episode in the company’s slide into irrelevance in the smartphone market. I doubt anyone will see Nokia in a similar light, as the “one who got away”.

The “MicroNokia” relationship has been challenging from the start. In February 2011, Nokia committed itself to a special partnership with Microsoft. It would ditch its operating systems (Symbian, Meego, QT) and become a beacon and standard bearer for Windows Phone 7. Money changed hands: $250M of “platform support” per quarter was sent from Redmond to Espoo in order to offset the unspecified Windows Phone licensing payments that flowed in the opposite direction.

This messy, technologically and culturally unsound arrangement only got worse when Stephen Elop, the former Microsoft exec now running Nokia, announced the switch to Windows Phone ten months before the company would end up shipping devices that ran the new (and problematic) OS. Unsurprisingly, Nokia’s revenue evaporated, leaving it with losses and a minuscule 5% market share (including Symbian-based smartphones).

Why Elop would make an announcement that effectively Osborned the business still mystifies and enrages Nokia supporters such as Tomi Ahonen who keeps calling for Elop’s head in long, irate blog posts. (In industry lore, to “Osborne” is to prematurely announce a product that so clearly obsoletes your current offering that it kills revenue. The suicidal maneuver is named in loving memory of portable computer pioneer Adam Osborne who destroyed his business by bragging that his next product would be so much better than the current one.)

I’m also mystified, but for another reason. I can’t fathom why Nokia picked Windows Phone instead of Android, whose explosive success was obvious even as early as 2010 when the company ditched its CEO. (I’m a little biased here as, in June 2010, I wrote a tongue-in-cheek piece titled Science Fiction: Nokia goes Android.)

Nokia’s excuses for not adopting Android were vague, ranging from “we don’t want to lose control of our destiny”, to Microsoft being a “stronger partner” (read: They paid us). The potential-loss-of-destiny rhetoric falls flat, especially when you look at Android’s licensing terms and see the freedom Samsung and others enjoy with their interpretations of the platform. (We’ve heard that Nokia and Google once talked, but we don’t yet know the reason for their not becoming highly visible partners.)

Today, investors say Nokia is worth about $15B, a tenth of its 2007 peak (I’m excluding the 2000 Internet Bubble number from the comparison). Even with a “25% acquisition premium”, a Nokia acquisition would cost Microsoft less than 10% of its capitalization. So, contrary to the charitable explanation offered to the WSJ by “persons familiar with the matter”, price couldn’t have been an obstacle. That leaves us with Nokia’s “slump”: Microsoft thinks Nokia would be unable to carry Windows Phone to an influential, sustainable market position.
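A quick back-of-the-envelope check of that claim, using only the figures cited above (a ~$15B valuation, a 25% premium, and Microsoft’s ~$277B capitalization):

```python
# Sanity check of the acquisition math cited in the article.
nokia_value_b = 15.0   # Nokia's market value, in billions
premium = 0.25         # the "25% acquisition premium"
msft_cap_b = 277.0     # Microsoft's capitalization, in billions

deal_cost_b = nokia_value_b * (1 + premium)  # cost of the deal
share_of_cap = deal_cost_b / msft_cap_b      # fraction of Microsoft's cap

print(f"Deal cost: ${deal_cost_b:.2f}B, or {share_of_cap:.1%} of Microsoft's cap")
```

At roughly $18.75B, or under 7% of Microsoft’s capitalization, the deal would indeed have been comfortably affordable.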

Now, what?

Nokia’s revenue keeps sliding and, after a brief incursion into the black, it keeps losing money. Is there anything in sight that will reverse the trend? It’s doubtful that the company can take the high end by offering better hardware than Samsung, nor can it squeeze into a low end inhabited by official and unofficial Android clones that are swiftly killing off feature phones. This leaves Nokia’s future as an independent company in doubt and, logically, gives rise to more acquisition speculation.

And what will happen to Windows Phone? We now hear that Microsoft is paying developers as much as $100,000 to write or port an application to the platform. This is a rational move on Microsoft’s part, an attempt to create the critical mass that doesn’t seem able to emerge naturally. But it can also be seen as desperation, an admission that Windows Phone is having trouble gaining momentum as developers and customers are caught in a downward spiral.

One can’t imagine that Ballmer will call it a day and cede the field to Google and Apple. Personally, I admire his never-give-up attitude, always talking up the future, unfazed by past bold pronouncements gone wrong, but enthusiasm isn’t a strategy. And in the smartphone market, Microsoft doesn’t have many moves left. Regardless of the technical merits of its new mobile OS, momentum seems elusive; market forces that once worked against Windows competitors in the PC field now seem to confine Windows Phone to an insignificant market share against the two dominant players and their complementary business models.

We don’t know yet how Google’s acquisition of Motorola will fare, but the Android platform is healthy enough without it. The same can’t be said of Windows Phone without Nokia, which leads one to believe there will be a forced marriage between the once proud Finnish handset maker and an ambitious player, probably Chinese — with Microsoft providing a substantial dowry once again.

In the meantime, we can count on IDC to provide fresh numbers… for 2017.

JLG@mondaynote.com

Android vs. Apple. Market Share vs. Profit Share, Part 255

 

Conventional wisdom and badly reconstructed history can lead to seemingly comfortable but in reality fragile conclusions. Prepare to be confused. 

Ever since the Android platform emerged as the only real competitor to Apple’s iOS devices, we’ve been treated to a debate which I’ll oversimplify: If Apple makes all the money but Android gets all the volume, who will win? A cursory survey of tech journals and blogs would lead one to believe that the case is closed: Market Share trumps Profit Share. It always does.

So Apple should call it a day? I’m skeptical. Not about the conclusion — Market Share isn’t exactly a dark horse — but about the arguments that are trotted out. False memories of Apple’s past have become a template for its future. For example, a recent Wall Street Journal article ends thus [and, sorry, you need a subscription to see the entire article]:

“Unfortunately, Apple has seen this movie before. A generation ago, it also had a top product whose market share was undercut by cheap, inferior rivals. It hopes the iPhone’s story isn’t a sequel to the Mac’s.”

(I emailed the WSJ writer asking three simple, clarifying questions. No answer, but that’s standard practice, as witheringly described by Philip Elmer-DeWitt at the end of this post.)

I was there “a generation ago”. In 1981, when IBM introduced the IBM PC, I was starting Apple France. Big Blue had made startling changes to its old ways, boldly calling its new machine The Personal Computer (we thought the “The” was ours). In an even bolder move, IBM loosened its tie and its dress code, and tried (successfully) to speak to the “common man” by using a Charlie Chaplin imitator as a mascot:

An interesting choice, particularly when juxtaposed with the real Chaplin’s cine-commentary on “labor-saving devices”:

The original PC from IBM’s Boca Raton group was a faithful homage to the Apple ][, right down to the cassette interface. But it wasn't a cheap imitation. There was one important difference: Where the Apple ][ used an 8-bit 6502 processor, IBM splurged on the much-more-powerful 16-bit Intel chip.

Almost overnight, the pages of InfoWorld, previously replete with salivating reviews of Apple products, were filled with IBM PC articles. The new machine got a major boost with the launch of Lotus 1-2-3, a multi-function spreadsheet that became the gold standard for office applications, especially on desktops that sported hard disks and large color screens. Against the Apple ][, the IBM PC was a superior product -- and deftly marketed.

For the next few years, the Apple ][ family stumbled. The Apple ///, beset by early hardware failures, didn't answer the 16-bit question. It wasn't the modernization of the Apple ][ that the company had promised. The Apple II GS was even worse, not compatible enough with the Apple ][ and not powerful enough to attract developers, particularly Bill Gates who saw no potential for Microsoft applications.

That brings us to 1984. The Macintosh changed the game, right?

Hardly. At its coming out party, the Mac was two years behind schedule. I recall the "Mac's Last Slip" jibes at company meetings. No one would deny the obvious potential, the elegance, the innovative user interface, the clean square pixels on the bit-mapped screen, the fonts, the LaserWriter connection... But the Mac didn't support external hard drives until 1986, and it would be another year before internal disks, additional modularity, and a great Trinitron color monitor were added.

By that time, IBM had had the market to itself for half a decade, and its PC creation had morphed into the Wintel clone industry.

Contrary to the revisionist WSJ story, the "generation ago" Mac never had a market share to undercut. Apple's flagship product -- innovative, elegant, a generation ahead -- was a dreamer's machine. Down-to-earth market wisdom said the Mac was perfect for Stanford undergrads, but not serious enough for real business use. The common view was that application developers wouldn't be able to afford the investment in time and hardware. Starved of competitive software, the Macintosh was doomed to irrelevance and, ultimately, failure.

It almost happened, especially after Apple's desperate attempt to prop up platform share numbers by licensing Mac clones, a move that resulted in a brutal drop in Apple's margins. Market share vs. Profit Share...

The Mac was saved by Gil Amelio's unintentionally self-sacrificing decision to hand the Apple reins back to Steve Jobs. What followed was the most amazing turnaround our industry has ever seen, and it started with two controversial moves: Jobs rescinded the Mac OS license, and he made a deal with the Microsoft Devil. He convinced Gates' company to "invest" $150M in non-voting Apple shares and develop new Mac versions of the Explorer browser and Office apps (although, in reality, the agreement was part of a settlement of an older IP dispute).

We know the rest of the story, including a meme-adverse fact: For close to seven years, the Mac has consistently gained market share at the expense of PC clones.

Since the advent of another flagship product, the iPhone this time, the riches-to-rags Mac meme has led to predictions of a similar fate: Death by drowning in a sea of "cheap" Android clones. Apple's high price ($650 per iPhone on average) gives too much low-end room for competitors. The price will be undercut, there will be a decline in unit share that, in turn, will lead to lower profits, lower developer interest, lower ability to invest in future products. The road to irrelevance is paved with high margins and low market share.

Never mind two differences. First, the iPhone never lacked apps, 750,000 of them at last count. Second, it is immensely profitable; Apple is embarrassingly flush with more cash than all its high-tech colleagues combined. The pundits won't accept evidence as an answer. Market Share will trump Profit Share. Why let facts cloud a good argument?

One is tempted to point to the race to the bottom that PC clone makers have experienced over the past decade. HP enjoys the largest Market Share of all PC makers, but it also "enjoys" less than 4% operating profit for its efforts. Meanwhile, Apple's margin is in the 25% range for its Mac line. That may not be as enjoyable as the 60% margin for the iPhone, but it's a solid business, particularly when you consider that the clone makers, HP and Dell foremost, are angling to get out of the business altogether. (See an earlier MN: Post-PC: Wall Street Likes the View.)
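To make the contrast concrete, here is a small illustrative calculation (my own, not from the article) of the operating profit each margin cited above throws off on the same $10B of revenue:

```python
# Illustrative only: operating profit on $10B of sales at the
# margins cited in the article (HP PCs <4%, Mac ~25%, iPhone ~60%).
revenue_b = 10.0
margins = {
    "HP PCs": 0.04,
    "Mac": 0.25,
    "iPhone": 0.60,
}

for name, margin in margins.items():
    profit_b = revenue_b * margin
    print(f"{name}: ${profit_b:.1f}B operating profit per $10B of sales")
```

On identical revenue, the iPhone business generates fifteen times the operating profit of the PC clone business: that is the asymmetry the Market Share argument glosses over.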

Returning to the iOS vs Android debate, I will state an opinion - not to be confused with a prediction, let alone The Truth: I think the vertical simplicity of Apple's business will tilt the field in its favor as the complicated Android world devolves into anarchy. Apple vs Google isn't Apple vs Microsoft/Intel/IBM.

Let's back up a bit. Google's 2005 acquisition of Android was a visionary move. (Some say Google's vision was sharpened by Eric Schmidt's presence on Apple's Board as the company worked on the future iPhone. Jobs was furious about Google's decision and summarily asked Schmidt to leave.) Android's unprecedented growth -- more than 50% share of the smartphone market in the US, and even more worldwide – is a testament to the "open" approach. Google gives away the Open Source Android OS; processors are another kind of "open", custom-designed under ARM licenses open to all payers.

But Android is a "cushion shot", it's an indirect way for Google to make money. Android is a Trojan horse that infects smartphones so it can install services that collect the user data that feeds Google's true business: advertising.

Now, Google faces several problems. Android's openness leads to incompatibilities between devices, a problem for developers that didn't happen under Microsoft's rule in the PC era. Worse (for Google), the many diverging versions of Android (a.k.a. forks) -- especially those created in China -- carry no Google services. They harvest no data and so they bring no advertising revenue potential back to Google.

This is clearly a concern for Google, so much so that the company now offers "pure" Android smartphones by Samsung (for $650) and HTC (for $599) on its Google Play site.

On the other hand, Android 2013 is a mature, stable OS. It isn't Windows 95, which was nothing more than a shell bolted on top of DOS. While the Mac's system software wasn't fully developed when it first came out, many saw it as superior -- or potentially superior -- to Microsoft's OS. Android is a tougher competitor than Windows was at the same age.

Then there is Google's subsidiary Motorola Mobility and the relationship with Samsung, the most powerful Android handset maker. As discussed last week, Motorola's stated intention is to push Android phone prices well below the $650 (unsubsidized) level. Is Samsung in a position to wag the Android dog? And if so, how will they react to Motorola's moves?

Let's not forget "the small matter of execution", one that might prove more important than lofty "strategic" considerations. And, to further complicate predictions, we have the herd's tendency to assume Company X will make all the mistakes while its competitors will play a perfect game.

Confused? Then I have accomplished one of my goals, to show how unhelpful the old bromides are when trying to guess what will happen next.

-- JLG@mondaynote.com

PS: I'd be remiss if I didn't direct you to recently discovered articles by John Kirk, who calls himself a recovering attorney and indeed writes tightly reasoned posts on Techpinions. I'll whet your appetite with two quotes. One from Does The Rise Of Android's Market Share Mean The End of Apple's Profits? [emphasis mine]:

Steve Jobs wanted, and Apple wants, market share. But they want the RIGHT market share. Apple wants customers who are willing to pay for their products. And Apple wants customers who are good for their platform. In other words, Apple wants market share in their target demographic. Based on the fact that Apple is taking in 72% of the mobile phone profits with only 8% or 9% of the market share, it sure sounds like they’ve acquired the right market share to me.

Does the rise of Android’s market share mean the end of Apple’s profits? Hardly. You can argue as loudly as you like that developers and profit share must necessarily follow market share. But the facts will shout you down.

The other is from 4 Mobile Business Models, 4 Ways To Keep Score where he concludes:

And if you’re going to prophesy that market share alone gives Google data that will someday, somehow, be worth something to someone, then you need to go back and re-read how the “razor-and-blades” business model is scored.

What we desperately need in analyzing mobile computing is far more attention paid to profits and far less attention paid to prophets.

—-

Post-PC: Wall Street Likes the View

 

The conventional PC business is now in decline, and yet share prices of key players Microsoft and HP are moving up. Why?

In an April press release, IDC painted a bleak picture for the PC. Compared to last year’s first quarter, worldwide shipments of PCs are down 13.9%, the “steepest decline ever in a single quarter”. US numbers are about the same: -12.7%. On a graph, the trend is unmistakable:

Is this a trend Wall Street likes?

When you consider Microsoft, it seems so. In a corporate blog post titled Windows 8 at 6 months, the company proudly claims to have “recently surpassed the 100 million licenses sold mark for Windows 8.” This is an interesting number. A quarter ago, MS announced it had sold 60 million licenses, meaning that only 40 million were sold in the last three months. That’s a 33% drop…hardly a rousing success. (The phrase “licenses sold” requires caution: it doesn’t only mean “sold with new PCs”; it also covers updates to existing machines, with or without enthusiasm for the new Windows OS.)
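The arithmetic behind that 33% figure, spelled out with the two numbers Microsoft disclosed:

```python
# The license-sales math: 100M cumulative vs. 60M announced a quarter earlier.
total_licenses_m = 100   # cumulative "licenses sold", per Microsoft
first_period_m = 60      # cumulative figure announced one quarter earlier

latest_quarter_m = total_licenses_m - first_period_m         # licenses sold in the latest quarter
drop = (first_period_m - latest_quarter_m) / first_period_m  # quarter-over-quarter decline

print(f"Latest quarter: {latest_quarter_m}M licenses, a {drop:.0%} drop")
```

40 million against the previous period’s 60 million is a one-third decline, exactly the slowdown the post points to.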

“Ignore the Windows 8 numbers and IDC analysis”, says Wall Street. While the tech-heavy Nasdaq climbed only 6.6% in the last 60 days, Microsoft shares went up by 21%.

The same apparent illogic holds for Hewlett-Packard. Last week, the largest PC maker disclosed its second quarter numbers. Compared to the same quarter last year, they’re not exactly pretty:

Revenue down by 10% to $27.6B
Operating Margin at 5.8%, down by about 20% (HP prefers “down 1.4 points”)
EPS (Earnings Per Share) at 55 cents, down 31%

Zeroing in on HP’s PC business, things look worse:

Revenue down by 20% to $7.6B
Operating Margin at 3.2%, down 44% (“down 2.2 points” sounds better)

As one would expect, Wall Street reacted, and HP shares went…up. By 17.8% the day after the announcement:

What was the good news for investors? Resorting to one of the usual bromides, HP “handily beat Street expectations” by posting Earnings Per Share (EPS) of $0.55 vs. a projected $0.30 to $0.40.

As discussed in the December 16th Monday Note, Chapter 2 of the Turnaround Artist Manual prescribes exactly what we’re seeing: Drastically lower expectations within days of taking on the job. “Things are worse than I was told. We’ll have to touch bottom before we bounce back…”

Following the script, HP CEO Meg Whitman called 2013 a “fix and rebuild year”. Everyone should expect a “broad-based profit decline”. But a 17% rebound in the stock price can’t be explained solely by a collective sigh of relief when the actual numbers aren’t as bad as the CEO had led everyone to expect.

(In its earnings release, HP still calls itself “The world’s largest technology company”. I guess they think smartphones and tablets aren’t “technology”, but PCs and printers are…)

As quoted in a VentureBeat post, Whitman thinks that the other US PC maker, Dell, is in no better shape:

“You saw a competitor, Dell, completely crater earnings,” Whitman said in response to a question. “Maybe that is what you do when you are going private. We are setting up the company for the long term.”

Ironically, and without a hint of self-awareness, she accuses Dell of playing the Setting Artificially Low Expectations game:

She implied that Dell did that on purpose, since Michael Dell is motivated to repurchase shares in the company as cheaply as possible, and deliberately lowering earnings is a good way to get the share prices to fall.

Actually, Whitman must envy what Dell is attempting to do: Get out of the PC clone Race To The Bottom. Because PCs make up half of Dell’s revenue, getting out of that hopelessly commoditized business would cause trouble if done in public. Going private allows Dell to close the curtain, perform the unappetizing surgery out of view and, later, return to Wall Street with a smaller company endowed with a more robust earnings engine, focused on higher-end enterprise gear and services.

This helps explain the apparent paradox: Wall Street likes HP and Microsoft shares not despite their lower PC numbers but because of them. Investors want to believe that future earnings (the ones they count on when buying shares today) will come from “Post-PC” products and services instead of being weighed down by shrinking PC volumes and margins. In particular, those who buy HP shares must believe that the company will sooner or later exit the PC clone business. For Microsoft, the bet is that the company will artfully manage a smooth transition to higher Enterprise and Entertainment revenues and their fatter margins.

I’m not fond of the “Post-PC” label; it lacks nuance and it’s premature. The desktop and laptop machines we’ve known for more than three decades may no longer be the sole incarnations of our personal computing – our affection, time, and money have shifted to smartphones and tablets – but the PC will continue to live in our offices and homes.

Consider Lenovo, the Chinese company that took over IBM’s PC business when Big Blue decided to exit the race. It’s doing quite well, posting a record $34B in revenue this year.

There is life left in the PC business, just not for US incumbents.

JLG@mondaynote.com

 

Otellini’s Striking Confession

 

We know Intel shunned ARM processors and played virtually no role in the smartphone revolution. But we now learn Steve Jobs asked Intel to build the iPhone microprocessor. Paul Otellini, Intel’s departing CEO, admits he should have followed his gut – and made the smartphone world a very different place.

CEO valedictions follow a well-known script: My work is done here, great team, all mistakes are mine, all good deeds are theirs, I leave the company in strong hands, the future has never been brighter… It’s an opportunity for a leader to offer a conventional and contrived reminiscence, what the French call la toilette des souvenirs (which Google crudely translates as toilet memories instead of the affectionate and accurate dressing up memories).

For his farewell, Paul Otellini, Intel’s departing CEO, chose the interview format with The Atlantic Monthly’s senior editor Alexis Madrigal. They give us a long (5,700+ words) but highly readable piece titled Paul Otellini’s Intel: Can the Company That Built the Future Survive It?

Photo: Guardian.co.uk

The punctuation mark at the title’s end refers to the elephantine question in the middle of Otellini’s record: Why did Intel miss out on the smartphone? Why did the company that so grandly dominates the PC market sit by while ARM architecture totally, and perhaps irretrievably, took over the new generation of phones — and most other embedded applications?

According to Otellini, it was the result of Intel’s inertia: It took a while to move the machine.

Madrigal backfills this uneasy explanation with equal unease:

“The problem, really, was that Intel’s x86 chip architecture could not rival the performance per watt of power that designs licensed from ARM based on RISC architecture could provide. Intel was always the undisputed champion of performance, but its chips sucked up too much power. In fact, it was only this month that Intel revealed chips that seem like they’ll be able to beat the ARM licensees on the key metrics.”

Note the tiptoeing: Intel’s new chips “seem like” they’ll be fast enough and cheap enough. Madrigal charitably fails to note how Intel, year after year, kept promising to beat ARM at the mobile game, and failed to do so. (See these 2010, 2011 and 2012 Monday Notes.) Last year, Intel was still at it, dismissively predicting “no future for ARM or any of its competitors“. Tell that to ARM Holdings, whose licensees shipped 2.6 billion chips in the first quarter of this year.

Elsewhere in the article, Otellini offers a striking revelation: Fresh from anointing Intel as the microprocessor supplier for the Mac, Steve Jobs came back and asked Intel to design and build the CPU for Apple’s upcoming iPhone. (To clarify the chronology, the iPhone was announced early January, 2007; the CPU conversation must have taken place two years prior, likely before the June, 2005 WWDC where Apple announced the switch to x86. See Chapter 36 of Walter Isaacson’s Jobs bio for more.)

Intel passed on the opportunity [emphasis mine]:

“We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we’d done it, […]

Indeed, the world would have been different. Apple wouldn’t be struggling through a risky transition away from Samsung, its frenemy CPU supplier; the heart of the iPhone would be Made In America; Intel would have supplied processors for more than 500 million iOS devices, sold even more such chips to other handset makers to become as major a player in the smartphone (and tablet) space as it is in the PC world.

Supply your own adjectives…

Indulging briefly in more What If reverie, compare the impact of Intel’s wrong turn to a better one: How would the world look if, at the end of 1996, Gil Amelio hadn’t returned Apple to Steve Jobs? (My recollection of the transaction’s official wording could be faulty.)

So, again, what happened?

At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.

A little later, Otellini completes the train of thought with a wistful reverie, a model of la toilette des souvenirs:

“The lesson I took away from that was, while we like to speak with data around here, so many times in my career I’ve ended up making decisions with my gut, and I should have followed my gut,” he said. “My gut told me to say yes.”

The frank admission is meant to elicit respect and empathy. Imagine being responsible for missing the opportunity to play a commanding role in the smartphone revolution.

But perhaps things aren’t as simple as being a “gut move” short of an epochal $100B opportunity.

Intel is a prisoner of its x86 profit model and Wall Street’s expectations. Its dominant position in the x86 space gives Intel the pricing power to command high margins. There’s no such thing in the competitive ARM space, where prices are lower. Even factoring in the lower inherent cost of the somewhat simpler devices (simpler for the time being; they’ll inevitably grow more complex), the profit per ARM chip is too thin to sustain Intel’s business model.

(Of course, this assumes a substitution, an ARM chip that displaces an x86 device. As it turns out, the smartphone business could have been largely additive, just as we now see with tablets that cannibalize classical PCs.)

Another factor is the cultural change that would have been required were Intel to have gotten involved in making ARM devices. As both the designer and manufacturer of generation after generation of x86 microprocessors, Intel can wait until they’re good and ready before they allow PC makers to build the chips into their next products. The ARM world doesn’t work that way. Customers design their own chips (often called a System on a Chip, or SoC), and then turn to a semiconductor manufacturer (a foundry) to stamp out the hardware. Taking orders from others isn’t in Intel’s DNA.

And now?

The answer might lie in another French expression: L’histoire ne repasse pas les plats. Google Translate is a bit more felicitous this time: History does not repeat itself. I prefer the more literal image — History doesn’t come around offering seconds — but the point remains: Will there be seconds at the smartphone repast?

Officially, Intel says its next generation of x86 processors will (finally!) topple the ARM regime, that their chips will offer more computing might with no cost or power dissipation penalty. In their parlance “the better transistor” (the basic unit of logic processing) will win.

I doubt it. The newer x86 devices will certainly help Microsoft and its OEMs make Windows 8 devices more competitive, but that won’t prevent the spread of ARM in the legion of devices on which Windows is irrelevant. For these, Intel would have to adopt ARM, a decision Otellini has left to the new tandem leadership of Brian Krzanich (CEO) and Renée James (President). Will they stick to the old creed, to the belief Intel’s superior silicon design and manufacturing technology will eventually overcome the disadvantages of the more complex x86 architecture? Or will they take the plunge?

They might be helped by a change in the financial picture.

In 2006, that is, after throwing Jobs into Samsung’s arms (pun unintended), Intel sold its ARM business, the XScale line, to Marvell. The reason was purely financial: for similar capital expenditures (costly fabs), ARM processors achieved much lower per-unit profit, because the ARM scene is much more competitive than the x86 space.

Now, if Intel really wants to get a place at the smartphone table with new and improved x86 devices, the company will have to price those to compete with established ARM players. In other words, Intel will have to accept the lower margins they shunned in 2006. Then, why not do it with the ARM-based custom processors Apple and others require?

JLG@mondaynote.com

—————————-

(I’ll confess a weakness for The Atlantic and, in particular, for its national correspondent James Fallows, a literate geek and instrument-rated pilot who took it upon himself to live in Beijing for a while and, as a result, can speak more helpfully about China than most members of the Fourth Estate. Going back to last week’s reference to the Gauche Caviar, when my Café de Flore acquaintances fall into their usual rut of criticizing my adopted country for its lack of “culture”, I hold out that The Atlantic — which sells briskly at the kiosk next door — is one of many examples of American journalistic excellence.

And, if you’re interested in more strange turns, see this other striking Alexis Madrigal piece in the same Atlantic: The Time Exxon Went Into the Semiconductor Business (and Failed). I was there, briefly running an Exxon Information Systems subsidiary in France and learning the importance of corporate culture.)–JLG

This Wristband Could Change Healthcare

 

Jawbone is launching its UP wristband in Europe. Beyond the quirky gadget lies a much larger project: changing healthcare — for better or for worse.

Hyperkinetic as he is, Hosain Rahman, the Jawbone founder, must be saturating his Jawbone UP wristband with data. The rubberized band, nicely designed by Yves Behar, is filled with miniaturized electronics: accelerometers and sensors monitor your activity throughout the day, recording every motion in your life, from walking in the street to the micro-movements of your hand in a paradoxical sleep phase. For the fitness freak, the UP is a great stimulus to sweat even more; for the rest of us, it’s more an activity and sleep monitoring device. (For a complete product review, see this article from Engadget, and also watch Hosain Rahman’s interview by Kevin Rose; it’s well worth your time.) Last week in Paris, after my meeting with Hosain, I headed straight to the nearest Apple Store to pick up my UP (for €129), with the goal of exploring my sleeping habits in greater depth.

After using the device for a couple of days, the app that comes with it tells me I’m stuck in a regime of 5 to 6 hours of bad sleep — including less than three hours of slow-wave sleep, commonly known as deep sleep. Interesting: Two years ago, I spent 36 hours covered with electrodes and sensors in a hospital specializing in studying and (sometimes) treating insomnia — after six months on a waiting list to get the test. At one point, to monitor my sleep at home, doctors lent me a cumbersome wristband the size of a matchbox. The conclusion was unsurprising: I was suffering from severe insomnia, and there was very little they could do about it. The whole sleep exploration process must have cost the French public healthcare system €3,000, 20 times the price of the Jawbone gadget (or of the ones that do a similar job). I’m not contending that medical monitoring performed by professionals can be matched by a wristband loaded with sensors purchased in an electronics store. But, aside from the cost, there is another key difference: the corpus of medical observations is based on classic clinical tests of a small number of patients. Jawbone, on the other hand, thinks of the UP wristband — to be worn 24/7 by millions of people — in a Big Data frame of mind. Hosain Rahman is, or will soon be, right when he says his UP endeavor contributes to the largest sleep study ever conducted.
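To make the Big Data framing concrete, here is a toy sketch of the kind of population-scale statistic a fleet of wristbands enables and a clinical study of a few dozen patients cannot. All numbers and field layouts below are invented for illustration; they are not Jawbone data.

```python
# Toy sketch: aggregating nightly sleep logs across many users.
# Each record is (total_sleep_hours, deep_sleep_hours) for one night.
# Figures are made up; a real deployment would aggregate millions of nights.

from statistics import mean

nightly_logs = [
    (5.5, 2.1), (6.0, 2.8), (7.5, 3.4), (5.0, 1.6), (8.0, 3.9),
]

avg_total = mean(t for t, _ in nightly_logs)
avg_deep_share = mean(d / t for t, d in nightly_logs)

print(f"average sleep: {avg_total:.1f} h")            # average sleep: 6.4 h
print(f"average deep-sleep share: {avg_deep_share:.0%}")  # 42%
```

The computation itself is trivial; the point is the sample size. Run over millions of users instead of five invented nights, the same two lines of statistics become a sleep study no hospital could afford.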

Then it gets interesting. As fun as they can be, existing wearable monitoring devices are in the stone age compared to what they will become in three to five years. When I offered Hosain a list of features that could be embedded in future versions of the UP wristband — such as a GPS module (for precise location, including altitude), heartbeat, blood pressure, skin temperature and acidity sensors, and a Bluetooth transmitter — he simply smiled and conceded that my suggestions were not completely off-track. (Before going that far, Jawbone must solve the battery-life issue and most likely design its own dedicated, super-low-consumption processor.) But Hosain also acknowledges that his company is fueled by a much larger ambition than simply building a cool piece of hardware aimed at fitness enthusiasts or hypochondriacs.

His goal is nothing less than disrupting the healthcare system.

The VC firms backing Jawbone are on the same page. The funding calendar compiled by Crunchbase speaks for itself: of the stunning $202m raised since 2007, most of it ($169m) has been raised since 2011, the year of the first iteration of the UP wristband (a failure, due to major design flaws). All the big houses are on board: Khosla Ventures, Sequoia, Andreessen-Horowitz, Kleiner Perkins, Deutsche Telekom… They all came with an identical scheme in mind: massive deployment of the monitoring wristband, and a series of deals with the biggest healthcare companies in America to subsidize the device. All this could result in the largest health-related dataset ever built.

The next logical step would be the development of large statistical models based on customers’ recorded data. As far as privacy is concerned, no surprises: Jawbone is pretty straightforward and transparent (see its disclosure here). It collects everything: name, gender, height and weight, location (thanks to the IP address) and, of course, all the information gathered by the device or entered by the user, such as eating habits. A trove of information.

Big Data businesses focusing on health issues drool over what can be done with such a detailed dataset coming from, potentially, millions of people. Scores of predictive morbidity models can be built, from the most mundane — back pain correlated to sleep deprivation — to the most critical, involving heart conditions linked to various lifestyle factors. When asked about privacy issues, Hosain Rahman insists on Jawbone’s obsessive protection of its customers, but he also acknowledges his company can build detailed population profiles and characterize various risk factors with substantially greater granularity.
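The back-pain example above is, at bottom, a correlation hunt. A minimal sketch of the idea, on entirely invented data (six imaginary users, a made-up 0-to-10 pain score), might look like this:

```python
# Toy correlation mining: does short sleep co-vary with self-reported
# back pain? All data points are invented for illustration only.

from statistics import mean, stdev

sleep_hours = [5.0, 5.5, 6.0, 7.0, 7.5, 8.0]
pain_score  = [7.0, 6.0, 6.5, 4.0, 3.0, 2.5]  # 0 = none, 10 = severe

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

r = pearson(sleep_hours, pain_score)
print(f"sleep vs. back pain: r = {r:.2f}")  # strongly negative on this toy data
```

On real population data the modeling would be far more elaborate (confounders, longitudinal effects), but the business logic is the same: enough users, enough sensors, and the correlations fall out.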

This means serious business for the healthcare and insurance sectors — and equally serious concerns for citizens. Imagine, just for a minute, the impact of such data on the pricing structure of your beloved insurance company. What about your credit rating if you fall into an at-risk category? Or simply your ability to get a job? Of course, the advent of predictive health models potentially benefits everyone. But, at this time, we don’t know if and how the benefits will outweigh the risks.

frederic.filloux@mondaynote.com

Apple Buys Intel

 

Getting rid of Samsung as a processor supplier and, at the same time, capturing the crown jewel of the American semiconductor industry. How could Apple resist the temptation to solve its cash problem and make history again?

Halfway through the second quarter of the 2013 fiscal year, most of Apple’s top execs meet at an undisclosed location (Eddy Cue’s chair is empty – he’s been called away to a Ferrari board meeting). They’re joined by a few trusted industry insiders: Bill “the Coach” Campbell, Apple and Intuit Director and adviser to Google’s founders, Messrs. Page and Brin; Larry Sonsini, the Silicon Valley consigliere of more than three decades; and Frank Quattrone, the star investment banker with nine lives.

The meeting isn’t about the company’s dwindling profit margins. The smaller margins were expected and invited: The reduced-price iPad and heavy promotion of the “old” iPhone 4 as an entry-level product are part of the long term strategy of guarding Apple’s lower end (so to speak). And no whining about AAPL’s grim slide over the last six months, a problem that has only one solution: Apple needs to record a series of better quarters.

The problem of the day is, once again, what to do with Apple’s obscene pile of cash.

By the end of December 2012, the company held about $137B in cash (or equivalents such as marketable securities), including $23B from operations for the quarter.

CFO Peter Oppenheimer delivers the bad news: It looks like operations will disgorge another $35B this quarter. The stock buy-back and dividend program that was designed to bleed off $45B over the next few years (see this March 2012 Monday Note) won’t be enough if the company continues at this rate.
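The arithmetic behind Oppenheimer’s bad news is easy to sketch. Using the article’s round numbers ($137B on hand, roughly $30B–$35B per quarter from operations, a $45B return program spread over about three years), the pile keeps growing no matter what:

```python
# Back-of-the-envelope projection of Apple's cash pile, using the
# article's round figures. This is illustrative arithmetic, not a
# financial model; the per-quarter inflow is an approximation.

cash = 137.0          # $B, end of December 2012
inflow = 30.0         # $B per quarter from operations (approximate)
outflow = 45.0 / 12   # $45B program spread over ~12 quarters

for quarter in range(1, 9):
    cash += inflow - outflow
    print(f"after quarter {quarter}: ${cash:.0f}B")
# after quarter 8: $347B
```

Even with the buy-back and dividend program running flat out, the pile more than doubles in two years. Hence the line that follows.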

Apple needs something bigger.

Quattrone has been sitting quietly at the end of the table. He clears his throat and speaks:

Buy Intel.

Well, yes, Frank (says Tim Cook), we’ve been buying Intel processors for the Mac since 2005.

Not the chips. The company. The planets are aligned for Apple to strike a blow that will leave the industry forever changed. Make history, acquire Intel.

Quattrone has their attention. He unfolds the celestial calibration:

  • Apple needs to extract itself from the toxic relationship with Samsung, its ARM supplier.
  • Intel is the best large-scale silicon manufacturer in the world. They have the people, the technology, and the plant capacity to match Apple’s needs for years to come.
  • “But Intel doesn’t do ARM!” you say. Indeed, Intel has no interest in the fierce competition and small margins in the ARM-based SoC market. Joining the ARM fray would severely disrupt Intel’s numbers and infuriate Wall Street. But if Intel were to essentially “go private” as Apple’s semiconductor manufacturing arm (pun intended), catering to all of Apple’s x86 and ARM needs (and whatever else Bob Mansfield is secretly plotting), Wall Street would have no such objection.
  • Intel is flailing. The traditional PC market – Intel’s lifeblood – continues to shrink, yet the company does nothing to break into the ARM-dominated mobile sector. In the meantime, the company makes perplexing investments such as buying McAfee for $7.68B.
  • There’s a leadership vacuum at Intel. Six months after announcing CEO Paul Otellini‘s “retirement”, Intel’s Board has yet to find a replacement who can sail the ship in more competitive waters. Apple could commission Pat Gelsinger, a 30-year Intel veteran and former CTO (Intel’s first) who fled to VMware after his career stalled at Intel. Despite being a bit of a Bill Gates look-alike (once upon a time), Gelsinger is a real technologist who would fit well within Apple, especially if he were given the opportunity to really “go for” the ARM architecture instead of iteratively tweaking x86 devices.
  • Last but not least, Intel’s market cap is about $115B, eminently affordable. The company is profitable and generates a good deal of cash, even after the heavy capital expenditures required by its constant need to build new and expensive manufacturing plants.
  • …oh, and one more thing: Wouldn’t it be fun to “partner” more closely with Microsoft, HP and Dell, working on x86 developments, schedules and… pricing?

A lively discussion ensues. Imagine solving many of Apple’s problems with a single sweeping motion. This would really make Cupertino the center of the high-tech world.

It’s an interesting idea, but there will be obstacles, both cultural and legal.

The Coach goes first: “Knowing both of these companies more than a little bit, I can attest to the pride they have in their respective cultures. They’re both disinclined to reconsider their beliefs in any meaningful way. Merging these two dissimilar groups, shedding unnecessary activities such as McAfee and the like, would be dangerously disruptive to Apple’s well-honed, cohesive culture. As a general rule, merging two large organizations rarely succeeds… unless you consider merging airlines a success…”

Finally, the Consigliere speaks: “It’s a tempting fantasy, it will mean years of work for my firm and many, many others, but as a friend of the company, as a past confidant of your departed Founder, don’t do it. There will be too much legal trouble with the Feds, with competitors, with Intel partners. Most fantasies aren’t meant to be enacted.”

I won’t dwell on the reality of the meeting: I made it up as a way to explain why Apple really has no choice other than to submit to another cash phlebotomy, this time for an additional $60B. And, as with real-world phlebotomies, the procedure will treat the problem, but it won’t cure it. With $30B from operations per quarter, the $60B lancing will have to be repeated.

Some read the decision to return gobs of cash to shareholders as an admission of defeat. Apple has given up making big moves, as in one or more big acquisitions.

I don’t agree: We ought to be glad that the Apple execs (and their wise advisers) didn’t allow themselves to succumb to transaction fever, to a mirage of ego aggrandizement held out by a potential “game changing” acquisition.

A final word on taxes. To return the additional $60B (for a total of $100B when including the ongoing program announced last year) through increased dividends and repurchased shares, Apple will have to borrow money.

Borrow? When they have so much cash?

Yes, thanks to our mangled tax code. As explained here, about $100B of Apple’s cash is stored overseas. If repatriated, it would be “heavily” (read “normally”) taxed. Like most US companies that have international operations, Apple plays complicated, entirely legal tax games that allow their international profits to be taxed at very low rates as long as the profits — and the resulting cash — stay outside Uncle Sam’s reach. And thus we have the apparent paradox of borrowing money when cash-rich.

The benefit of these tax code contortions is difficult to explain to normal humans — as opposed to legislators who allowed the loopholes.

All this now makes Apple a different company. Once a fledgling challenger of established powerhouses such as IBM, Microsoft or HP, it now makes “too much cash” and is condemned to a life of paying dividends and buying back shares — like the old fogies it once derided.

JLG@mondaynote.com