
BlackBerry: The Endgame

 

The BlackBerry was the first truly modern smartphone, the king of Personal Information Management On The Go. But under its modern presentation lurked a fatal flaw: a software engine that couldn’t be adapted to the Smartphone 2.0 era.

Jet-lagged in New York City on January 4th, 2007, just back from New Year’s in Paris, I left my West 54th Street hotel around 6am in search of coffee. At the corner of the Avenue of the Americas, I saw glowing Starbucks stores in every direction. I walked to the nearest one and lined up to get my first ration of the sacred fluid. Ahead of me, behind me, and on down the line, everyone held a BlackBerry, checking email and BBM messages, wearing a serious but professional frown. The BlackBerry was the de rigueur smartphone for bankers, lawyers, accountants, and anyone else who, like me, wanted to be seen as a four-star businessperson.

Five days later, on January 9th, Steve Jobs walked on stage holding an iPhone, and the era of the BlackBerry, the Starbucks of smartphones, would soon be over. Even if it took three years for BlackBerry sales to start their plunge, the iPhone introduction truly was a turning point in BlackBerry’s life.

RIM (as the company was once called) shipped 2M BlackBerry smartphones in the first quarter of 2007 and quickly ascended to a peak of 14.6M units by Q4 2010, only to fall back to pre-2007 levels by the end of 2013:

[Chart: RIM/BlackBerry quarterly smartphone shipments, 2007–2013]

Last week, BlackBerry Limited (now the name of the company) released its latest quarterly numbers and they are not good: Revenue plunged to $916M vs. $1.57B a year ago (-42%); the company lost $207M and shipped just 2.1M smartphones, right back at its Q1 2007 level. For reference, IDC tells us that the smartphone industry shipped about 300M units in the second quarter of 2014, with Android and iOS devices accounting for 96% of the global market.

Explanations abound for BlackBerry’s precipitous fall.

Many focus on the company’s leaders, with ex-CEO Jim Balsillie and RIM founder Mike Lazaridis taking the brunt of the criticism. In a March 2011 Monday Note uncharitably titled The Inmates Have Taken Over The Asylum, I quoted the colorful but enigmatic Jim Balsillie speaking in tongues:

“There’s tremendous turbulence in the ecosystem, of course, in mobility. And that’s sort of an obvious thing, but also there’s tremendous architectural contention at play. And so I’m going to really frame our mobile architectural distinction. We’ve taken two fundamentally different approaches in their causalness. It’s a causal difference, not just nuance. It’s not just a causal direction that I’m going to really articulate here—and feel free to go as deep as you want—it’s really as fundamental as causalness.”

This and a barely less bizarre Lazaridis discussion of “application tonnage” led one to wonder what had happened to the two people who had so energetically led RIM/BlackBerry to the top of the industry. Where did they take the wrong turn? What was the cause of the panic in their disoriented statements?

Software. I call it the Apple ][ syndrome.

Once upon a time, the Apple ][ was a friendly, capable, well-loved computer. Its internal software was reliable because of its simplicity: The operating system launched applications and managed the machine’s 8-bit CPU, memory, and peripherals. But the Apple ][ software wasn’t built from the modular architecture that we see in modern operating systems, so it couldn’t adapt as Moore’s Law allowed more powerful processors. A radical change was needed. Hence the internecine war between the Apple ][ and Steve Jobs’ Mac group.

Similarly, the BlackBerry had a simple, robust software engine that helped the company sell millions of devices to the business community, as well as to lay consumers. I recall how my spouse marveled at the disappearance of the sync cable when I moved her from a Palm to a BlackBerry, and how her data (emails, calendar, and address book) effortlessly flew from her PC to her new smartphone. (And her PC mechanic was happy to be freed from Hotsync Not Working calls.)

But like the Apple ][, advances in hardware and heightened customer expectations outran the software engine’s ability to evolve.

This isn’t something that escaped RIM’s management. As recounted in a well-documented Globe and Mail story, Mike Lazaridis quickly realized what he was up against:

“Mike Lazaridis was at home on his treadmill and watching television when he first saw the Apple iPhone in early 2007. There were a few things he didn’t understand about the product. So, that summer, he pried one open to look inside and was shocked. It was like Apple had stuffed a Mac computer into a cellphone, he thought.

[…] the iPhone was a device that broke all the rules. The operating system alone took up 700 megabytes of memory, and the device used two processors. The entire BlackBerry ran on one processor and used 32 MB. Unlike the BlackBerry, the iPhone had a fully Internet-capable browser.”

So at a very early stage in the shift to the Smartphone 2.0 era, RIM understood the nature and extent of its problem: BlackBerry’s serviceable but outdated software engine was up against a much more capable architecture. The BlackBerry was a generation behind.

It wasn’t until 2010 that RIM acquired QNX, a “Unix-ish” operating system first shipped in 1982 by Quantum Software Systems, a company founded by two University of Waterloo students. Why did Lazaridis’ company take three years to act on its sharp, accurate recognition of its software problem? Three years were lost in attempts to tweak the old software engine, and in fights between Keyboard Forever! traditionalists and would-be adopters of a touch interface.

Adapting BlackBerry’s applications to QNX was more complicated than just fitting a new software engine into RIM’s product line. To start with, QNX didn’t have the thick layer of frameworks developers depend on to write their applications. These frameworks, which make up most of the 700 megabytes Lazaridis saw in the iPhone’s software engine, had to be rebuilt on top of a system that was well-respected in the real-time automotive, medical, and entertainment segments, but ill-suited for “normal” use.

To complicate things, the company had to struggle with its legacy, with existing applications and services. Which ones do we update for the new OS? Which ones need to be rewritten from scratch? And which ones do we drop entirely?

In reality, RIM was much more than three years behind iOS (and, later, Android). Depending on whom we listen to, the 2007 iPhone didn’t just stand on a modern (if incomplete) OS; it stood on 3 to 5 years of development, of trial and error.

BlackBerry had lost the software battle before it could even be fought.

All other factors that are invoked in explaining BlackBerry’s fall — company culture, hardware misdirections, loss of engineering talent — pale compared to the fundamentally unwinnable software battle.

(A side note: Two other players, Palm and Nokia, lost the battle for the same reason. Encumbered by once successful legacy platforms, they succumbed to the fresh approach taken by Android and iOS.)

Now under turnaround management, BlackBerry is looking for an exit. John Chen, the company’s new CEO, comes with a storied résumé that includes turning around database company Sybase and selling it to SAP in 2012. Surely, such an experienced executive doesn’t believe that the new keyboard-based BlackBerry Passport (or its Porsche Design sibling) can be the solution:

[Photo: the BlackBerry Passport]

Beyond serving the needs or wants of die-hard keyboard-only users, it’s hard to see the Passport gaining a foothold in the marketplace. Tepid reviews don’t help (“The Passport just doesn’t offer the tools I need to get my work done”); Android compatibility is a kludge; developers busy writing code for the two leading platforms won’t commit.

Chen, never departing from his optimistic script, touts BlackBerry’s security, Mobile Device Management, and the QNX operating system licenses for embedded industry applications.

None of this will move the needle in an appreciable way. And, because BlackBerry’s future is seen as uncertain, corporate customers who once used BlackBerry’s communication, security, and fleet management services continue to abandon their old supplier and turn to the likes of IBM and Good Technology.

The company isn’t in danger of a sudden financial death: Chen has more than $3B in cash at his disposal and the company burns about $35M of it every quarter. BlackBerry’s current stock price says the company is worth about $5B, $2B more than its cash position. Therefore, Chen’s endgame is to sell the company, either whole or, more likely, in parts (IP portfolio, QNX OS…) for more than $2B net of cash.
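A back-of-the-envelope sketch makes the endgame arithmetic explicit. The figures below are nothing more than the round numbers quoted above:

```python
# BlackBerry endgame arithmetic, using the round figures quoted above.
cash = 3_000              # cash position, in $M ("more than $3B")
burn_per_quarter = 35     # cash burn, in $M per quarter
market_cap = 5_000        # implied by the stock price, in $M

# At the current burn rate, the cash lasts a very long time:
runway_quarters = cash / burn_per_quarter
print(f"Runway: {runway_quarters:.0f} quarters (~{runway_quarters / 4:.0f} years)")
# -> about 86 quarters (~21 years): no sudden financial death

# What the market says the business itself, net of cash, is worth:
enterprise_value = market_cap - cash
print(f"Value net of cash: ${enterprise_value:,}M")
# -> $2,000M: the bar a sale, whole or in parts, must clear
```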

Wall Street knows this, corporate customers know this, carriers looking at selling Passports and some services know this. And potential body parts buyers know this as well… and wait.

It’s not going to be pretty.

JLG@mondaynote.com

An Ancient Love Story: Apple & Payment Systems

 

This week’s product launch should break the mold of Apple’s recent Fall announcements: More products than usual and a challenge to the status quo – in payment systems this time.

A larger iPhone; a line of wearables (unveiled if not yet ready-to-ship); significant iOS improvements (a true “iOS 2.0”); HomeKit and HealthKit devices, applications, and partnerships; payment systems… If only half of the rumors about Apple’s September 9th media event are true, we’re going to have a wider and deeper flood of new products than we’ve seen in Apple’s previous Fall launches.

And let’s not forget the big white cocoon that covers the two-story structure that Apple built for the occasion:

[Photo: the white cocoon covering Apple’s two-story event structure]

(image source: AppleInsider)

Apple is likely to add some drama to the event by lifting the veil at the last moment.

For today, we’ll focus on the recent flurry of leaks and rumors surrounding payment systems. We’ve heard about agreements with American Express, Visa, MasterCard, Bank of America; with retailers such as Nordstrom and Macy’s, CVS and Walgreens; and hoteliers such as Starwood… The predictions may not prove accurate down to the last detail, but the outbreak is too strong not to be taken seriously. Apple is about to get into the payment system business in a serious way.

There have been rumors before. Search for “apple payment system” and you’ll get about 80 million hits on Google (11 million on Bing). Flipping through the pages, we see that the excitement started as far back as five years ago when Apple’s “Grab & Go” patent filings disclosed the company’s interest in near field communication, a wireless data transfer method that can be used for quick purchases and payments. This led to the birth of a new i-Word around 2010: the iWallet.

From its very beginning, the iPhone has looked like a logical payment device. Our phones are always with us; they’re more secure than the magnetic stripe on a credit card because they can use “payment tokens” — codes that authenticate you without identifying your credit card account; payment apps can be easily downloaded and updated.
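To make the “payment token” idea concrete, here is a minimal sketch of tokenization. This is not Apple’s actual (and, at the time, undisclosed) protocol, just the general shape: the phone obtains a single-use token from the card issuer, the merchant only ever sees the token, and only the issuer can map it back to the account.

```python
# Minimal illustration of payment tokenization. NOT Apple's actual
# protocol: the merchant never sees the card number, only a
# single-use token that the issuer alone can redeem.
import secrets

class Issuer:
    def __init__(self):
        self._vault = {}  # token -> card number, kept on the issuer's side

    def issue_token(self, card_number: str) -> str:
        token = secrets.token_hex(8)   # random, meaningless to outsiders
        self._vault[token] = card_number
        return token

    def settle(self, token: str, amount: float) -> bool:
        card = self._vault.pop(token, None)  # pop: tokens are single-use
        if card is None:
            return False                     # unknown or replayed token
        print(f"Charging ${amount:.2f} to card ending in {card[-4:]}")
        return True

issuer = Issuer()
token = issuer.issue_token("4111111111111111")  # happens on the phone
assert issuer.settle(token, 0.99)       # merchant submits only the token
assert not issuer.settle(token, 0.99)   # replaying the token fails
```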

The possibilities looked endless and, of course, led to overheated predictions: Think of all the trillions of dollars sloshing around in debit/credit cards. If Apple captured only a small fraction of the flow, they’d be filthy rich!

Others disagreed. In January 2011, PCWorld’s Tom Spring explained why Apple’s Mobile Payment System Will Fail. Among his objections was the unproven assumption that phones are somehow easier than cards (“What’s gained…by waving an iPhone instead of swiping a bank card is not clear to me”), and the prediction that retailers won’t accept phones as payment instruments until the “Another Box at the Register” obstacle is surmounted:

“Near field communication is a technology that requires a physical box/reader on the retailer’s end. Until we know more about what incentives there are for retailers to invest in this technology I think it’s going to be hard sell for Apple to convince millions of merchants to put another box at the point of sale…”

Indeed, attempting to modify ingrained customer behavior isn’t a well-trodden path to riches, nor is asking retailers to install a new box next to their cash register. This is why many payment system innovations (Google Wallet is a recent example) have failed to amass enough gravitational pull to gain currency (pardon the pun). There just hasn’t been enough acceptance by consumers and retailers for “fast lane” payment devices to become as matter-of-fact as the incumbents.

Still… Apple has repeatedly shown great patience and willingness to challenge settled wisdom.

The company’s embrace of payment systems started in 2003 when its newly-opened iTunes Store offered two innovations: Single tracks were sold for 99 cents apiece (at the time), and we could settle the purchase with a credit card. Critics scoffed: The price is too low! The credit card companies’ fixed+percentage transaction fees will be a profit-killer!

How can Apple possibly make money with such a proposition?
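The critics’ arithmetic is easy to reconstruct. As an illustration only, assume a typical fixed+percentage card fee of 30 cents plus 2.9% per transaction (a common small-merchant rate today; Apple’s actual 2003 terms were never public):

```python
# Why per-track card fees looked like a profit-killer. The 30 cents +
# 2.9% figure is a typical small-merchant rate, used for illustration;
# Apple's actual negotiated 2003 rates were never public.
price = 0.99
fee = 0.30 + 0.029 * price           # fixed + percentage transaction fee
print(f"Fee on one track: ${fee:.2f} ({fee / price:.0%} of revenue)")
# -> about $0.33, a third of the sale price, before paying the labels

# The classic workaround, which Apple is generally understood to use:
# batch several purchases into one card charge to amortize the fixed part.
basket = 10 * price                  # ten tracks batched into one charge
batched_fee = 0.30 + 0.029 * basket
print(f"Fee per track when batched: ${batched_fee / 10:.3f}")
```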

This was myopia. The iTunes Store wasn’t intended to be a money maker. Its only purpose was to sell more iPods at higher margins; that’s where the money was – and still is. In retrospect, Jobs was pouring the foundations of the Apple ecosystem business model: Hardware is the star; everything else supports the star’s volumes and margins.

Returning to today’s (or this coming Tuesday’s) topic, Apple doesn’t want to displace the key players — the banks and credit card companies — any more now than they did a decade ago. Credit card companies, for example, play a hard-to-replace role in policing transactions. It’s not always pretty or convenient when one has to call a US number from Europe because the system “tripped” over an unusual transaction, but it works.

One can’t imagine Apple even thinking of storing and lending money, of trying to “capture a fraction of the flow”. If the company does introduce a near field payment system, it won’t be as an attempt to make money in itself, it will simply be another extension of the Apple ecosystem, another way to make iDevices more attractive.

Beyond this neat playbook theory lurks the matter of modifying consumer behavior and retail infrastructure; Tom Spring’s objections are just as cogent today as they were in 2011. And perhaps Apple’s answer — its rebuttal to the conventional reluctance — is hiding in the still-cocooned show-and-tell building.

JLG@mondaynote.com

PS: On today’s topic, see Horace Dediu’s views on the value of payment systems as bit pipes.

PPS: Unrelated but hard to resist: People from the fashion industry now working at Apple. And their friends, fashion editors, unusual invitees to a Cupertino product launch.

Macintel: The End Is Nigh

When Apple announced its 64-bit A7 processor, I dismissed the speculation that this could lead to a switch away from Intel chips for the Macintosh line, in favor of a homegrown “desktop-class” chip. I might have been wrong.

“I don’t know exactly when, but sooner or later, Macs will run on Apple-designed ARM chips.” Thus spake Matt Richman in a 2011 blog post titled “Apple and ARM, Sitting in a Tree”. Richman explained why, after a complicated but ultimately successful switch from PowerPC chips to Intel processors in 2005, Apple will make a similar switch, this time to ARM-based descendants of the A4 chip designed by Apple and manufactured by Samsung.

Cost is the first reason invoked for the move to an An processor:

“Intel charges $378 for the i7 chip in the new high-end 15 inch MacBook Pro. They don’t say how much they charge for the i7 chip in the low-end 15 inch MacBook Pro, but it’s probably around $300. …When Apple puts ARM-based SoC’s in Macs, their costs will go down dramatically. ”

We all know why Intel has been able to command such high prices. Given two microprocessors with the same manufacturing cost, power dissipation, and computing power, but where one runs Windows and the other doesn’t, which chip will achieve the higher market price in the PC market? Thus Intel runs the table: it tells clone makers which new x86 chips they’ll receive, when they’ll receive them, and, most important, how much they’ll cost. Intel’s margins depend on it.

ARM-based processors, on the other hand, are inherently simpler and therefore cost less to make. Prices are driven even lower because of the fierce competition in the world of mobile devices, where the Wintel monopoly doesn’t apply.

[Photo: Apple’s A7 chip]

Cost is the foremost consideration, but power dissipation runs a close second. The aging x86 architecture is beset by layers of architectural silt accreted from a succession of additions to the instruction set. Emerging media formats demand new extensions, while obsolete constructs must be maintained for the sake of Microsoft’s backward compatibility religion. (I’ll hasten to say this has been admirably successful for more than three decades. The x86 nickname used to designate Wintel chips originates from the 8086 processor introduced in 1978 – itself a backward-compatible extension of the 8080…)
Because of this excess baggage, an x86 chip needs more transistors than its ARM-based equivalent, and thus it consumes more power and must dissipate more heat.

Last but not least, Richman quotes Steve Jobs:

“I’ve always wanted to own and control the primary technology in everything we do.”

Apple’s leader has often been criticized for being too independent and controlling, for ignoring hard-earned industry wisdom. Recall how Apple’s decision to design its own processors was met with howls of protest, accusations of arrogance, and the usual predictions of doom.

Since then, interest in another Grand Processor Switch has been alive and well. Googling “Mac running on ARM” gets you close to 10M results. (When you Bing the same query, you get 220M hits — 22x Google’s results. SEO experts are welcome to comment.)

Back to the future…

In September 2013, almost a year ago already, Apple introduced the 64-bit A7 processor that powers new iPhones and iPads. The usual suspects pooh-poohed Apple’s new homegrown CPU, and I indulged in a little fun skewering the microprocessor truthers: 64 bits. It’s Nothing. You Don’t Need It. And We’ll Have It In 6 Months. Towards the end of the article, unfortunately, I dismissed the speculation that Apple An processors would someday power the Mac. I cited iMacs and Mac Pros — the high end of the product line — as examples of what descendants of the A7 couldn’t power.

A friend set me straight.

In the first place, Apple’s drive to own “all layers of the stack” continues unabated years after Steve’s passing. As a recent example, Apple created its own Swift programming language that complements its Xcode IDE and Clang/LLVM compiler infrastructure. (For kremlinology’s sake I’ll point out that there is an official Apple Swift blog, a first in Apple 2.0 history if you exclude the Hot News section of the apple.com site. Imagine what would happen if there was an App Store blog… But I digress.)

Secondly, the Mac line is held up, quite literally, by the late delivery of Intel’s Broadwell x86 processors. (The delay stems from an ambitious move to a bleeding edge fabrication technology that shrinks the basic building block of a chip to 14 nanometers, down from 22 nanometers in today’s Haswell chips.) Of course, Apple and its An semiconductor vendor could encounter similar problems – but the company would have more visibility, more control of its own destiny.

Furthermore, it looks like I misspoke when I said an An chip couldn’t power a high-end Mac. True, the A7 is optimized for mobile devices: battery life, a small memory footprint, less demanding screen graphics than an iMac or a MacBook Pro with a Retina display. But having shown its muscle in designing a processor for the tight constraints of mobile devices, why would we think that the team that created the most advanced smartphone/tablet processor couldn’t now design a 3GHz A10 machine optimized for “desktop-class” (a term used by Apple’s Phil Schiller when introducing the A7) applications?

If we follow this line of reasoning, the advantages of ARM-based processors vs. x86 devices become even more compelling: lower cost, better power dissipation, natural integration with the rest of the machine. For years, Intel has argued that its superior semiconductor design and manufacturing technology would eventually overcome the complexity downsides of the x86 architecture. But that “eventually” is getting a bit stale. Other than a few showcase design wins that have never amounted to much in the real world, x86 devices continue to lose to ARM-derived SoC (System On a Chip) designs.

The Mac business is “only” $20B a year, while iPhones and iPads generate more than 5 times that. Still, $20B isn’t chump change (HP’s Personal Systems Group generates about $30B in revenue), and unit sales are up 18% in last June’s numbers vs. a year ago. Actually, Mac revenue ($5.5B) approaches the iPad’s flagging sales ($5.9B). Today, an 11″ MacBook Air costs $899 while a 128GB iPad Air goes for $799. What would happen to the cost, battery life, and size of an A10-powered MacBook Air? And so on for the rest of the Mac line.

By moving to ARM, Apple could continue to increase its PC market share and scoop much of the profits – it currently rakes in about half of the money made by PC makers. And it could do this while catering to its customers in the Affordable Luxury segment who like owning both an iPad and a Mac.

While this is entirely speculative, I wonder what Intel’s leadership thinks when contemplating a future where their most profitable PC maker goes native.

JLG@mondaynote.com

———-

Postscript: The masthead on Matt Richman’s blog tells us that he’s now an intern at Intel. After reading several of his posts questioning the company’s future, I can’t help but salute Intel management’s open mind and interest in tightly reasoned external viewpoints.

And if it surprises you that Richman is a “mere” intern, be aware that he was all of 16 years old when he wrote the Apple and ARM post. Since then, his blog has treated us to an admirable series of articles on Intel, Samsung, Blackberry, Apple, Washington nonsense – and a nice Thank You to his parents.

 

The Beats Music Rorschach Blot

 

Apple has a long track record of small, cautious, unheralded acquisitions. Has the company gone off course with the hugely risky purchase of Beats Music and Electronics, loudly announced at an industry conference?

As Benedict Evans’ felicitous tweet put it, Apple’s $3B acquisition of Beats, the headphone maker and music streaming company, is a veritable Rorschach blot:

[Image: Benedict Evans’ Rorschach tweet]

The usual and expected interpretations of Anything Apple – with the implied or explicit views of the company’s future – were on full display at last week’s Code Conference after the Beats acquisition was officially announced during the second day of the event. Two of the conference’s high-profile invitees, Apple’s SVP Craig Federighi and Beats’ co-founder, Dr. Dre (né André Young), quickly exited the program so all attention could be focused on the two key players: Eddy Cue, Apple’s Sr. VP of Internet Software and Services; and Jimmy Iovine, Beats’ other co-founder and freshly minted Apple employee. They were interviewed on stage by Walt Mossberg and Kara Swisher, the conference creators (59-minute video here).

Walt and Kara had booked Cue and Iovine weeks before Tim Bradshaw scooped the Apple/Beats story on May 8th in the Financial Times (the original FT article sits behind a paywall; TechCrunch version here). Was the booking a sign of prescience? smart luck? a parting gift from Katie Cotton as she retires as head of Apple PR? (And was Swisher’s warmly worded valentine to Cotton for her 18 years of service a quid pro quo acknowledgment?)

After the official announcement and the evening fireside chat, the Rorschach analysis began. Amidst the epigrams, which were mostly facile and predictable, one stood out with its understated questioning of culture compatibility:

‘Iovine: Ahrendts or Browett?‘ 

The “Browett”, here, is John Browett, the British executive who ran Dixons and Tesco, two notoriously middle-brow retail chains. Apple hired him in April 2012 to succeed Ron Johnson as the head of Apple Retail… and showed him the door seven months later, a clear case of cultural incompatibility. When Browett tried to apply his estimable cost-cutting knowledge and experience to the Italian-marble Apple Stores, things didn’t work out — and the critics were quick to blame those who hired him.

Nothing of the sort can be said of Dame Angela Ahrendts. Now head of Apple’s physical and on-line stores, Ahrendts was lured from Burberry, a culturally compatible and Apple-friendly affordable luxury enterprise.

Will Iovine be a Browett or an Ahrendts?

In a previous Monday Note, I expressed concern for the cultural integration challenges involved in making the Beats acquisition work. What I learned from the on-stage interview is that Jimmy Iovine and Eddy Cue have known and worked with each other for more than ten years. Iovine says he’ll be coming to Cupertino ‘about once a month’, so my initial skepticism may have been overstated; Apple isn’t acquiring a company of strangers.

But are they acquiring a company that creates quality products? While many see Beats Music’s content curation as an important differentiator in the streaming business, one that would give new life to Apple’s flagging music sales, others are not so sure. They find Beats Music’s musical choices uninspiring. I’m afraid I have to agree. I downloaded the Beats Music app, defined a profile, and listened for several hours while walking around Palo Alto or sitting at my computer. Perhaps it’s me, my age, or my degenerate tastes, but none of the playlists that Beats crafted for me delivered either the frisson of discovery or the pleasure of listening to an old favorite long forgotten. And my iPhone became quite hot after using the app for only an hour or so.

Regarding the headphones: They’re popular and sell quite well in spite of what The Guardian calls “lacklustre sound”. I tried Beats Electronics’ stylish Studio headphones for a while, but have since returned to the nondescript noise-canceling Bose QC 20i, a preference that was shared (exactly or approximately) by many at the conference.

There was no doubt, at the conference, that Apple understands there are problems with Beats, but there’s also a feeling that the company sees these problems as opportunities. An overheard hallway discussion about the miserable state of the iTunes application (too strongly worded to repeat here verbatim) neatly summed up the opportunity: ‘Keeping Beats as a separate group affords Cook and Cue an opening for independently developing an alternative to iTunes instead of trying to fix the unfixable.’ It’s worth noting that the Beats Music app is available on mobile devices only, and it appears there’s no plan to create a desktop version. This underlines the diminished role of desktops, and points to the possibility of a real mobile successor to the aging iTunes application.

Continuing with the blot-reading exercise, many members of the audience found it necessary to defend the $3B price tag. Some point out that since Apple’s valuation is about 3X its revenue, Beats’ purported $1.5B hardware revenue easily “justifies” the $3B number. (Having consorted with investment bankers at various moments of my business life, as an entrepreneur, a company director, and a venture investor, I know they can be trusted to explain a wide range of valuations. Apparently, Apple is paying $500M for the streaming business and $2.5B for the hardware part.)
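Bankers’ flexibility notwithstanding, the multiple argument is simple arithmetic; a sketch using only the figures quoted above:

```python
# The bankers' multiple argument, using the figures quoted above.
apple_multiple = 3.0       # Apple's valuation ~ 3x its revenue
beats_hw_revenue = 1.5e9   # Beats' purported hardware revenue, $/yr

implied_value = apple_multiple * beats_hw_revenue
print(f"Implied Beats value: ${implied_value / 1e9:.1f}B")
# -> $4.5B, comfortably above the $3B price tag... provided you accept
# that a headphone maker deserves the same multiple as Apple.
```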

My own reading is that the acquisition price won’t matter: If the acquisition succeeds, the price will be easily forgotten; if it fails, Apple will have bigger worries.

Ultimately, the Apple-Beats products and services we haven’t yet seen will do the talking.

–JLG@mondaynote.com

Peak PC. Intel Fork.

 

Propelled by Moore’s Law and the Internet, PCs have enjoyed four decades of strong growth, defying many doomsday prophecies along the way. But, with microprocessor performance flattening out, the go-go years have come to an end. Intel, the emperor of PC processors and a nobody in mobile devices, needs to react.

I’m suspicious of Peak <Anything> predictions. Some of us became aware of the notion of a resource zenith during the 1973 OPEC oil embargo, with its shocking images of cars lined up at gas stations (in America!):

[Photo: cars lined up at gas stations during the 1973 oil embargo]

This was Peak Oil, and it spelled doom to the auto industry.

We know what happened next: Cars improved in design and performance, manufacturers became more numerous. Looking at this bit of history through my geek glasses, I see three explanations for the rebound: computers, computers, and computers. Computer Assisted Design (CAD) made it easier to design new car models as variations on a platform; Volkswagen’s MQB is a good example. Massive computer systems were used to automate the assembly line and manage the supply chain. It didn’t take long for computers to work their way into the cars themselves, from the ECU under the hood to the processors that monitor the health of the vehicle and control the entertainment and navigation systems.

Since then, we’ve had repeated predictions of Peak Oil, only to be surprised by the news that the US will soon become a net oil exporter and, as Richard Muller points out in his must-read Physics for Future Presidents, we have more than a century of coal reserves. (Unfortunately, the book, by a bona fide, middle-of-the-road physicist, can’t promise us that physics will eventually push politics aside when considering the rise of CO2 in the atmosphere…)

I’ve heard similar End of The Go-Go Days predictions about personal computers since 1968 when my love affair with these machines started at HP France (I was lucky enough to be hired to launch their first desktop machine).

I heard the cry again in 1985 when I landed in Cupertino in time for the marked slowdown in Apple ][ sales. The unprecedented round of layoffs at Apple prompted young MBAs, freshly imported from Playtex and Pepsi, to intone the It’s All Commodities Now dirge. I interpreted the cry (undiplomatically; I hadn’t yet learned to speak Californian) as a self-serving It’s All Marketing Now ploy. In the meantime, engineers ignored the hand-wringing, went back to work, and, once again, proved that the technology “mines” were far from exhausted.

In 1988, a Sun Microsystems executive charitably warned me: “PCs are driving towards the Grand Canyon at 100 mph!” A subscriber to Sun’s The Network Is The Computer gospel, the gent opined that heavy-duty computing tasks would be performed by muscular computers somewhere (anywhere) on the network. Desktop devices (he confusingly called them “servers” because they were to “serve” a windowing protocol, X11) would become commodities no more sophisticated or costly than a telephone. He had no answer for multimedia applications that require local processing of music, video, and graphics, nor could he account for current and imminent mobile devices. His view wasn’t entirely new. In 1965, Herb Grosch gave us his Law, which told us that bigger computers provide better economics; smaller machines are uneconomical.

And yet, personal computers flourished.

I have vivid memories of the joy of very early adopters, yours truly included. Personal computers are liberating in many ways.

First, they don’t belong to the institution and there’s no need for the intercession of a technopriest: I can lift my PC with my arms, my brains, and my credit card.

Second, and more deeply, the PC is a response to a frustration, to a sense of something amiss. One of mankind’s most important creations is the symbol, a sign without a pre-existing meaning: X as opposed to a drawing of a deer on a cave wall. Strung together, these symbols show formidable power. The expressive and manipulative power of symbol strings runs through the Song of Songs, Rumi’s incandescent poetry, Wall Street greed, and quantum physics.

But our central nervous system hasn’t kept up with our invention. We don’t memorize strings well; we struggle with long division, let alone extracting cube roots in our heads.

The PC comes to the rescue, with its indefatigable ability to remember and combine symbol strings. Hence the partnership with an object that extends the reach of our minds and bodies.

Around 1994, the Internet came out of the university closet, gave the PC access to millions of servers around the world (thus fulfilling a necessary part of the Sun exec’s prophecy), and extended our grasp.

It’s been great and profitable fun.

But today, we once again hear Peak PC stories. Sales have gone flat, never to return:

[Chart: PC shipment forecast, 2014–2018]

This time, I’m inclined to agree.

Why?

Most evenings, my home-builder spouse and I take a walk around Palo Alto. Right now, this smallish university town is going through a building boom. Offices and three-layer retail + office + residence buildings are going up all around University Avenue. Remodels and raze-and-build projects can be found in the more residential parts of town. No block is left unmolested.

I can’t help but marvel. None of this activity, none of Silicon Valley would exist without Moore’s Law, the promise made in 1965 that semiconductor performance would double every 18 months. And, for the better part of 40 years, it did, and rained money on the tech ecosystem, companies and people. PCs, servers, embedded electronics, giant network routers, cars… they’ve all been propelled because Moore’s Law has been upheld… until recently.

The 1977 Apple ][ had a 1MHz 8-bit processor. Today’s PCs and Macs reach 3.7GHz, a number that hasn’t changed in more than three years. This isn’t to say that Intel processors aren’t still improving, but the days when each new chip brought substantial increases in clock speed seem to be over.
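A quick compounding check shows how far clock speed has fallen off the exponential. A sketch using the 18-month doubling cadence quoted above; clock frequency is, admittedly, only a crude proxy for the performance Moore’s Law actually tracks:

```python
# The 18-month doubling cadence vs. clock-speed reality.
# Clock frequency is only a crude proxy for "performance", but it
# makes the flattening easy to see.
years = 2014 - 1977               # Apple ][ (1MHz) to today's 3.7GHz
doublings_promised = years / 1.5  # one doubling every 18 months
promised_gain = 2 ** doublings_promised
actual_gain = 3.7e9 / 1e6         # 3.7GHz vs. 1MHz

print(f"Promised gain: {promised_gain:,.0f}x over {years} years")
# -> roughly 27 million x
print(f"Actual clock gain: {actual_gain:,.0f}x, one doubling every ~3 years")
# -> 3,700x: clock speed never kept that pace, and now it has stalled
```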

One should never say never, but Moore’s Law is now bumping into the Laws of Physics. The energy needed to vibrate matter (electrons in our case) increases with frequency. The higher the clock frequency, the higher the power dissipation and the greater the heat that’s generated…and a PC can withstand only so much heat. Consider the cooling contraptions used by PC gamers when they push the performance envelope of their “rigs”:

[Photo: a water-cooling rig for overclocked gaming PCs]
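The physics invoked here is usually summarized by the first-order CMOS dynamic-power relation, P ≈ C·V²·f: power grows linearly with clock frequency, and faster clocks have historically also required higher voltage, which enters squared. A sketch of this textbook model, with made-up round numbers:

```python
# First-order CMOS dynamic power: P ~ C * V^2 * f. A textbook model
# with made-up round numbers; real chips add leakage and other terms.
def dynamic_power(c: float, v: float, f: float) -> float:
    return c * v**2 * f

base = dynamic_power(c=1.0, v=1.0, f=1.0)    # normalized baseline

# Doubling the clock alone doubles power...
print(dynamic_power(1.0, 1.0, 2.0) / base)   # -> 2.0
# ...but if the faster clock also needs, say, 30% more voltage,
# power more than triples, and all of it comes out as heat.
print(dynamic_power(1.0, 1.3, 2.0) / base)   # -> 3.38
```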

To work around the physical limits, Intel and others resort to stratagems such as “multiple cores”, more processors on the same chip. But if too many computations need the result of the previous step before moving forward, it doesn’t matter how many cores you have. Markitects have an answer to that as well: “speculative branch execution”, the use of several processors to execute possible next steps. When the needed outcome appears, the “bad” branches are pruned and the process goes forward on the already-computed good branch. It makes for interesting technical papers, but it’s no substitute for an 8GHz clock speed.
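The “result of the previous step” objection is Amdahl’s Law: if a fraction of the work is inherently serial, extra cores stop helping very quickly. A small sketch:

```python
# Amdahl's Law: the speedup from n cores when a fraction p of the
# work can be parallelized and the rest is inherently serial.
def speedup(n_cores: int, p: float) -> float:
    return 1.0 / ((1.0 - p) + p / n_cores)

# Even with 90% of the work parallelizable, cores hit a wall fast:
for n in (2, 4, 8, 16, 1024):
    print(f"{n:5d} cores -> {speedup(n, 0.90):.2f}x")
# -> 1.82x, 3.08x, 4.71x, 6.40x, 9.91x... never more than 10x
```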

If we need confirmation of the flattening out of microprocessor progress, we can turn to Intel and the delays in implementing its Broadwell chips. The move to a 14-nanometer “geometry” — the term here denotes the size of a basic circuit building block — is proving more difficult than expected. And the design isn’t meant to yield faster processors, just less power-hungry ones (plus other goodies such as better multi-media processing).

One possible reaction to this state of affairs is to look at tablets as a new engine of growth. This is what Microsoft seems to be doing by promoting its Intel-inside Surface Pro 3 as a laptop replacement. But even if Microsoft tablets turn out to be every bit as good as Microsoft says they are, they aren’t immune to the flattening out of Intel processor performance. (I don’t have an opinion yet on the product — I tried to buy one but was told to wait till June 20th.)

Does this broaden the opening for ARM-based devices? Among their advantages is a cleaner architecture, one devoid of the layers of backwards-compatibility silt x86 devices need. ARM derivatives need less circuitry for the same computing task and, as a result, dissipate less power. This is one of the key reasons for their dominance in the battery-powered world of mobile devices. (The other is the customization and integration flexibility provided by the ARM ecosystem.) But today’s ARM derivatives run at lower speeds (a little above 1GHz for some) than Intel chips. The challenge will be to run at higher speeds without hurting battery life and without adding the fan that Microsoft tablets need.

With no room to grow, PC players exit the game. Sony just did. Dell took itself private and is going through the surgery and financial bleeding a company can’t withstand in public. Hewlett-Packard, once the leading PC maker, now trails Lenovo. With no sign of turning its PC business around, HP will soon find itself in an untenable position.

Intel doesn’t have the luxury of leaving its game — it only has one. But I can’t imagine that Brian Krzanich, Intel’s new CEO, will look at Peak PC and be content with the prospect of increasingly difficult x86 iterations. There have been many discussions of Intel finally taking the plunge and becoming a “foundry” for someone else’s ARM-based SoC (System On a Chip) designs instead of owning x86 design and manufacturing decisions. Peak PC will force the Intel CEO’s hand.

JLG@mondaynote.com

Misunderstanding Apple

 

We’ve come to expect analysts and pundits to misunderstand Apple. More puzzling is when Apple misunderstands itself.

My three-week Road Trip of a Lifetime, driving all the way from Key West, FL to Palo Alto, was interrupted by a bout of pneumonia, low blood oxygen, a paroxysmal cough and, most alarming, a loss of appetite. Thankfully, all indicators are looking good and I’m back walking Palo Alto’s leafy streets.

The succession of wheel time and downtime gave me an opportunity to contemplate two recent controversies: Fred Wilson’s prediction of Apple’s imminent fall, and rumors of Apple’s purchase of Beats Electronics. These are both manifestations of what I’ll call, for lack of a better term, Misunderstanding Apple.

First, Fred Wilson. At the recent TechCrunch Disrupt conference, the successful and articulate venture investor predicted that by 2020 Apple will no longer hold the #1 position in the tech world. They won’t even be in the top three. According to Wilson, Apple “doesn’t think about things the way they need to think about things”. Specifically, the company is “too rooted in hardware…[which] is increasingly becoming a commodity” and “Their stuff in the cloud is largely not good. I don’t think they think about data and the cloud.”

I’d be surprised by Wilson’s facile, insight-free truisms, except this isn’t the first time he’s shown a blind spot when considering Apple. Wilson is famous for dumping his Apple shares at $91 in January 2009; AAPL is now at $590 or so. (He also sold Google, which closed at $528 on Friday, for a split-adjusted $160. Perhaps there’s a difference between being a venture investor, an insider who watches and influences a young company, and an outsider subjected to forces and emotions outside of one’s control.)

Calling Apple “too rooted in hardware” misunderstands the company. From its inception, Apple has been in one and only one business: personal computers (which, today, includes smartphones and tablets). Indeed, Apple’s quarterly numbers show that the sale of personal computers makes up 87% of its revenue. Everything else that Apple does, from iTunes to the Apple Store, exists to make its smartphones, tablets, laptops, and desktops more useful, more pleasant. And this “everything else” includes the lovingly machined hardware of the MacBook Air and iPhone 5. If the supporting cast does its job well, the main acts will sell in larger numbers and at higher prices.

Customers don’t buy Apple “hardware” in the same way a weekend carpenter buys nails at the friendly neighborhood hardware store. What Fred Wilson seems to miss is that hardware is more than an inert “thing” for Apple: It’s a conduit to an entire ecosystem, and it can yield an enormous strategic advantage. One such example is the 64-bit A7 processor that took everyone by surprise: 64 bits. It’s Nothing. You Don’t Need It. And We’ll Have It In 6 Months.

When the subject of commoditization comes up, I invite people to look at the cars they see in the street. Are the likes of Audi, BMW, and Mercedes being commoditized? Do their owners only care that the wheels are black and round? Serendipitously, someone called “SubstrateUndertow” answers the question in a comment on Wilson’s blog:

“…when I look around at all the cars out there, from the high end models to the most utilitarian models, almost no one buys the base stripped versions. Key devices that are central to people’s lives, comfort and utility have enough emotional SubstrateUndertow to sustain premium pricing.”

The 30-year-old Mac business and its healthy margins (about 25% versus HP’s sub-5% for its PCs) show that Apple has successfully avoided the commoditized race to the bottom that has plagued Wintel devices and is likely to accelerate for smartphones.

Wilson’s criticism of Apple’s “stuff in the cloud”, on the other hand, carries some sting. As a user of Apple’s products and services, I’m often disappointed with Apple’s Cloud offerings. I find iMessage’s quirks irritating, I see a lack of proper synchronization between iBooks on Macs and iDevices, and I’m still waiting for the Cloud version of iWork to mature. But let’s turn to Horace Dediu for a crisp summary of Apple’s place in the Cloud:

“Not getting the cloud” means that in the last 12 months Apple obtained:
• 800 million iTunes users and
• an estimated 450 million iCloud users spending
• $3 billion/yr for end-user services plus
• $4.7 billion/yr for licensing and other income which includes
• more than $1 billion/yr paid by Google for traffic through Apple devices and
• $13 billion/yr in app transactions of which
• $9 billion/yr was paid to developers and
• $3.9 billion/yr was retained as operating budget and profit for the App Store.

In addition,
• more than $1 billion/yr in Apple TV (aka Apple’s Kindle) and video sales and
• $2.7 billion/yr in music download sales and
• $1 billion/yr in eBooks sold

In summary, iTunes, Software and Services has been growing between 30% and 40% for four years and is on its way to $30 billion/yr in transactions and sales for 2014.

Horace is right; Fred Wilson clearly hasn’t done the numbers.

———————

I was still on the road when I read about the rumored $3.2B acquisition of Beats Electronics, the company that began in the headphone business and then spawned a streaming music service.

I’m puzzled. If the rumors prove true, Apple may be guilty of misunderstanding itself.

The hardware side, headphones, is immaterial: The products may look good, but their audio quality is regularly panned. And the revenue, about $500M, doesn’t move the needle.

The current wisdom is that Apple is mostly interested in Beats Music, the subscription streaming service. But that business isn’t big, either; it has only attracted about 110K subscribers.

Maybe Apple is interested in Beats Music’s technology and its vision for the future of streaming and music curation. I took the time to watch Walt Mossberg’s interview of Jimmy Iovine in which the Beats co-founder gives hints about his plans. Iovine’s AI-with-a-human-touch solution for delivering “what comes next” is technically vague — and vaguely dystopian (“we’ll scrape your hard drive… we’ll know where you are tomorrow”). I’m not convinced.

We also have rumors that Iovine and Dr. Dre, Beats’ other co-founder, might become some kind of senior advisers to Apple management. Given what I’ve read about Dre’s troubles with the Law, including a battery charge that landed him in jail, and an assault on a female that was settled out of court, I’m troubled. How will this play inside and outside Apple?

I don’t see how such an acquisition would enhance Apple’s business model or reputation.

That said, I hope I’m as wrong as I was when I thought the iPod was doomed to fail against commoditized, yes, that word, MP3 players. I hadn’t seen iTunes behind the hardware, the Cloud storage, the distribution and micro-payments infrastructure that would one day make the iPhone an App Phone.

I also see people whose intellect and motives I respect strongly supporting the rumored acquisition. Preeminent among them is Ben Thompson who, in his Stratechery blog, explores Why Apple Is Buying Beats. There, after positing that personal computers might have reached their peak, Ben asks whether Apple is in fact reinventing itself as a kind of fashion house [emphasis mine]:

“Or are we witnessing a reinvention, into the sort of company that seeks to transcend computing, demoting technology to an essential ingredient of an aspirational brand that identifies its users as the truly with it? Is Apple becoming a fashion house? Think about it: you have Jony Ive as all-up head of design, the equivalent of a Tom Ford or Donatella Versace. There is the hire of Angela Ahrendts – why would she leave the CEO position of Burberry for a Senior VP role? You have an iPhone framed as an experience, not a product. And now you acquire an accessory maker differentiated almost completely by its brand, not its inherent technical quality.”

And ponders the Damned If You Do, Damned If You Don’t nature of such a cultural change:

“Still, I can imagine the very thought of Apple positioning itself as a fashionable luxury brand is somewhat nauseating for many of my readers. It’s an understandable reaction, and one I somewhat share. I worry that Apple is losing what makes Apple, Apple, especially that desire to make the power of computing accessible for normal people. But I also know that stasis means stagnation, and over the long-run, death.”

To be continued…

JLG@mondaynote.com

The iPad Is a Tease

 

As Apple is about to release its latest quarterly numbers, new questions arise about the iPad’s “anemic” growth. The answer is simple – but the remedies are not.

The iPad isn’t growing anymore. What happened? 

In anticipation of Apple’s latest quarterly numbers – they’ll be announced on April 23rd – the usual prerelease estimates swirl around the Web. You can find Yahoo’s summary of analysts’ estimates here; Paul Leitao’s Posts At Eventide provides a detailed and tightly reasoned history and forecast for the March 2014 quarter.

The consensus is that for the company as a whole, there won’t be any surprises: Apple will meet the guidance stated in its January 27th earnings call. Revenue will be down, as befits the quarter following the Christmas shopping frenzy, but earnings per share (EPS) will be up a bit.

Boring. With one glaring exception:

[Chart: iPad quarterly unit estimates and year-on-year growth]
(Source: The Braeburn Group)

In the same quarter of 2013, the iPad’s year-on-year growth was about 55%. Some of this phenomenal growth was due to a rebound from earlier iPad mini supply constraints, but that doesn’t explain the precipitous drop from 2013 to this year.

Are the iPad’s go-go years over?

As Philip Elmer-DeWitt reports on his Apple 2.0 site, this gloomy prediction appears to be the majority opinion among analysts. Elmer-DeWitt acknowledges that there are outliers — Horace Dediu comes in at the high end with an estimate of 21.8M units (and positive growth) — but “the consensus estimate of 19.3 million, would represent a 0.7% decline”.

It’s one thing for a product to increase in unit volume sales but still grow less than the overall market — that’s simply a loss of market share. And we know how fallacious share numbers can be in the absence of an honest disclosure of sales volumes. No, assuming the estimates are right, what we have here isn’t market share dilution, it isn’t a post-Christmas lull, it’s a year-to-year decline in absolute unit numbers.

Why?

I’ll offer an opinion: The iPad is a tease. Its meteoric debut raised expectations that it can’t currently meet.

To explain, let’s go back four years.

Steve Jobs’ last creation took us by surprise, price included, and was initially panned by many in the kommentariat, from Eric Schmidt to Dan Lyons (who subsequently recanted). But normal humans joyously took to the iPad. In 1984, one of Apple’s tag lines for the Mac was “Macintosh – the computer for the rest of us.” Decades later, the iPad was quickly perceived as a sort of second coming. As MacWorld put it in June 2011: Now Apple’s really “for the rest of us”.

Indeed, the iPad wasn’t targeted at a particular type — or generation — of user. David Hockney has produced exquisite iPad “paintings”. Daniel Borel, Logitech’s co-founder, told me that his two-year-old grandson immediately “got” the iPad (even if it was just to play games, but… he’s two). Coming out of our breakfast meeting, I crossed paths with a couple of seniors — octogenarians, probably — who proudly told me that they were going to an iPad training session at the Palo Alto Apple Store.

The iPad rose and rose. It won legions of admirers because of its simplicity: No windows (no pun), no file system, no cursor keys (memories of the first Mac). Liberated from these old-style personal computer ways, the iPad cannibalized PC sales and came to be perceived as the exemplar Post-PC device.

But that truly blissful simplicity exacts a high price. I recall my first-day disappointment when I went home and tried to write a Monday Note on my new iPad. It’s difficult — impossible, really — to create a real-life composite document, one that combines graphics, spreadsheet data, rich text from several sources and hyperlinks. For such tasks, the Rest of Us have to go back to our PCs and Macs.

I realize there are iPad users who happily perform “productivity tasks” on their iPads. Most of them use a stand and keyboard sold in a number of guises. The number of different offerings is a testament to a real need. (We’ll note that Apple doesn’t seem eager to address this issue directly. They don’t offer an “iPad-sized” keyboard — the Bluetooth keyboard I use is fine for my iMac, but feels gargantuan when I pair it with my iPad. And Apple’s iPad Dock hasn’t been updated to work with the “Lightning” connector on the newer iPads.)

The iPad’s limitations extend beyond classic office productivity tasks. I just tried to build an itinerary for a long postponed road trip, driving all the way from Key West, Florida to Palo Alto. On a Mac, you can easily “print to PDF” to produce a map for each leg of the trip. Then you use the wonderful Preview app (I salute its author and dedicated maintainer) to remove unneeded pages, drag and drop, combine and rearrange the PDF files into a single document. Don’t try this on an iPad: How would you “print-to-PDF” a map page, let alone combine such pages?
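For reference, the desktop-class version of that combine step is a few lines of scripting; a sketch using the third-party pypdf package (one tool among several; the file names are made up for the example):

```python
# Combining per-leg "printed to PDF" maps into a single itinerary,
# the desktop workflow the iPad can't reproduce. Uses the third-party
# pypdf package (pip install pypdf); the file names are hypothetical.
from pypdf import PdfWriter

legs = ["key_west_to_miami.pdf",
        "miami_to_new_orleans.pdf",
        "new_orleans_to_palo_alto.pdf"]

writer = PdfWriter()
for leg in legs:
    writer.append(leg)    # pull in each leg, pages and all
writer.write("road_trip_itinerary.pdf")
writer.close()
```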

Despite the inspiring ads, Apple’s hopes for the iPad overshot what the product can actually deliver. Although there’s a large number of iPad-only users, there’s also a substantial population of dual-use customers for whom both tablets and conventional PCs are now part of daily life.

I see the lull in iPad sales as a coming down to reality after unrealistic expectations, a realization that iPads aren’t as ready to replace PCs as many initially hoped.

In his introduction of the iPad in January, 2010, Jobs himself seemed a bit tentative when positioning his latest creation. Sitting in the Le Corbusier chair, Jobs stated that his new tablet would have to “find its place between the iPhone and the Mac”.

This “in-between place” is still elusive.

Microsoft tried to find that “in-between place”, and we know how well that worked. For the Redmond company, the iPad’s limitations were an opportunity: Simply emulate the charm and intuitiveness of the market-leading tablet and cater to the needs of the “professional” user. With its touch interface and keyboard, the Surface device sounded like the solution that had eluded Microsoft’s earlier PC Tablets. In the field, customers didn’t like the dueling interfaces, nor the introduction of layers of complexity where simplicity had been promised. Surface tablets didn’t move the revenue needle and cost Microsoft a $900M write-down.

The iPad represents about 20% of Apple’s revenue; allowing iPad numbers to plummet isn’t acceptable. So far, Apple’s bet has been to keep the iPad simple, rigidly so perhaps, rather than creating a neither-nor product: No longer charmingly simple, but not powerful enough for real productivity tasks. But if the iPad wants to cannibalize more of the PC market, it will have to remove a few walls.

Specifically, the iPad is a computer; it has a file system, directories, and the like — why hide these “details” from users? Why prevent us from hunting around for the bits and bobs we need to assemble a brochure or a trip itinerary?

None of this is news to Apple execs, but they also know that success doesn’t depend on What, on a simple feature list. The next step in iPad growth will depend on How new features are integrated into the user experience. It’s a tricky game of the Best of Both Worlds…and it tripped up Microsoft.

When will we know? I have no idea. Perhaps at the WWDC this coming June.

JLG@mondaynote.com

TV Done Right: Still A Dream

 

As the strong reactions to even the slightest Apple TV rumor demonstrate, there’s a vigorous appetite for a simple, modern Internet TV experience. The technology is ready but carriers aren’t.

Last week started with Big Apple TV News in an authoritative-sounding Wall Street Journal article:

“Apple Inc. is in talks with Comcast Corp. about teaming up for a streaming-television service that would use an Apple set-top box and get special treatment on Comcast’s cables to ensure it bypasses congestion on the Web, people familiar with the matter say.”

Search for “Comcast” in a news aggregator such as Feedly (there are many other good choices), and you’ll see a wide range of reactions to the Apple-Comcast rumor. Given the heat the article generated, it’s odd that there has been zero follow-up from the main players — nothing from Apple and Comcast, no additional information in the WSJ or any other journal. When a deal of such importance is in the works, “people familiar with the matter” have a strong incentive to keep talking, to add color, to spin their side of the story. Of course, no one expects Apple to do much leaking, but the radio silence from Comcast spinmeisters is another matter entirely.

Philip Elmer-DeWitt offers the most likely explanation: The Wall Street Journal got played by someone intent on throwing a wrench into Comcast’s plan to acquire Time Warner’s cable operations. (This wouldn’t be the first time: Cellphone carriers have repeatedly used the WSJ to air their perennial Poor Me complaints about excessive smartphone subsidies.)

Echoes of the WSJ non-story ricocheted around the blogosphere. Some, such as this BBC article, make painful points about the abuse that US consumers undergo at the hands of broadband carriers:

[Chart: what US broadband costs compared with other countries]

As a sharp-witted Be engineer liked to remark: “It costs more… But it does less.”

Carriers take too much money for a user-hostile experience simply because they can. In most locations, cable companies have little or no competition, so there’s no reason for them to do anything more than milk the most profit from a cheap infrastructure. As Apple Insider’s Neil Hughes reminds us, the user experience isn’t a priority for cable providers. Indeed, as I write this from Paris, I have to juggle set-top box restarts and malfunctioning secondary content subscriptions only reluctantly allowed by the main provider.

It doesn’t have to be that way. No miracle is required to make our Cable TV experience easy and gratifying.

Consider today’s cable arrangement, simplified for our discussion. A coax cable is strung from the street into your basement or crawl space. You plug the coax into a signal splitter, connect one output to your cable modem for Internet access, while the others feed the TVs in your household.

Next, you run an Ethernet cable from your modem to your WiFi access point and maybe you also run a wire from the access point to your “most trusted” computer. Upstairs, we see a set-top box, an Internet TV streaming device (Roku, Apple TV, Boxee, or other), and, if your TV is of a certain age, a digital adaptor.

That’s four or five devices that you have to connect and, when things go wrong, disconnect, power down, and restart in the “proper” order.

It’s only too easy to imagine how a next-generation Apple TV could collapse this maze of impenetrable interfaces into one box: Coax in, Wifi and HDMI out and, miracle, one and only one remote! This is something that Apple seems to have the taste and resources to do well.

There are no technical obstacles, no new technology is required, no new software platform, just a careful integration job. I realize I’m veering dangerously close to the “mere matter of implementation” deception, but regardless of the amount of work it would take to integrate the various technologies, the benefit to the user would make the engineering effort worth it.

And there are many benefits: We can throw away our DVRs as content becomes an app that we can stream whenever we want — the 60 Minutes iPad app is an elegant, flexible exemplar of the type. Rather than paying for a “package” of channels selected by the cable provider, we’ll be able to buy shows, series, and channels a la carte through iTunes or a similar content vendor. We’ll be able to watch the free-with-ads version of a show, or we can pay for the ad-free edition.

Some day, the status quo will break, perhaps as the result of a patient encirclement and infrastructure buildup — a better, vertically integrated Content Delivery Network — both very much compatible with Apple’s playbook. As the reactions to the (possibly planted) Apple-Comcast rumor amply demonstrate, users are becoming increasingly aware of the disconnect between the experience that the cable companies offer and TV Done Right.

JLG@mondaynote.com

Wearables Fever

 

While Google, Motorola, and Samsung seem eager to jump into the wearables market, Apple characteristically keeps its counsel – and wisely so: Smartwatches and other wearables produce more pageviews than profits.

Wearables are a danger to your health – your mental health, that is. Smartwatches and sensor-laden bracelets aren’t so new anymore — see Microsoft’s 2004 SPOT Watch — but the vernal equinox seems to have triggered a bout of Wearables Fever the likes of which we haven’t seen since the Tablet Fever of January 2011, when 76 tablets were announced at the Consumer Electronics Show in Las Vegas. As so often happens with pandemics, there was a smaller outbreak, called the Dawn of the Tablet PC, days before the January 2010 iPad launch.

In this year’s derangement, we are witnessing the birth of another epoch-making class of product — the Wearable. As Wired sees it, for example, Jawbone Is Now the Startup Apple Should Fear Most.

In one respect, Jawbone’s devices are a lot like Apple’s. The company admires minimalism…[b]ut Apple’s minimalism is cold — all brushed metal and glass — while Jawbone’s is warm, squishy, and textured… There’s a chance Apple has designed itself into a corner. But for Jawbone, the future is full of possibility.

Then there’s this analysis, quoted and mocked by John Gruber [emphasis mine]:

Cadie Thompson, writing for CNBC in “Time Is Ticking for Apple to Announce an iWatch, Say Analysts”: Apple needs an iWatch sooner rather than later, or the company will risk losing its innovative edge to rivals, analysts say.

“They only have 60 days left to either come up with something or they will disappear,” said Trip Chowdhry, managing director at Global Equities Research. “It will take years for Apple’s $130 billion in cash to vanish, but it will become an irrelevant company… it will become a zombie, if they don’t come up with an iWatch.”

I’m guessing the ellipsis denotes when he paused for another line of coke.

Parenthetically, it would be wrong to imply that Mr. Chowdhry might be “incentivized” to shout from the rooftops by rewards more satisfying than pageviews — no allegations of stock manipulation complicity here — but I wonder about the games that he and other anal-ists play. As Philip Elmer-DeWitt pointedly noted in a CNN Money column last year, Mr. Chowdhry urged his clients to unload Apple stock for eight months and then blamed the CEO and CFO “for destroying Apple’s shareholder value”.

If you’re curious enough to look at Mr. Chowdhry’s spartan Global Equities Research site, you’ll see he claims to have Commission Sharing Agreements with Goldman Sachs, Merrill Lynch, Barclays, Jefferies, Morgan Stanley and JP Morgan. As the Wikipedia article points out, such agreements “ask that broker to allocate a portion of the commission directly to an independent research provider.” Here, one wonders what the word independent really means…

Back to Wearables: The announcements pile up.

Samsung tells us it’s moving its smartwatches away from Android to a version of Tizen, itself based on the ubiquitous Linux.

Google announces Android Wear, a version of Android for smartwatches.

Motorola, soon to be a Lenovo brand, announces that its Moto 360 smartwatch is “Coming Summer 2014 in a selection of styles” and provides these artful renderings:

Moto Wrist Edited

and…

Moto Modern

(I write renderings because, as the Android Wear intro video indicates, these are simulated pictures. This doesn’t mean that the final product won’t be better looking, but we’re clearly not there yet.)

Why the haste? Did Tim Cook succeed in misdirecting Apple’s competition when he pronounced wearables a “very key branch of the tree”? Or is there a giant business to be had?

We have many unanswered questions.

First, paraphrasing Horace Dediu, there are the twin questions of For What and By Whom: For what job is a smartwatch “hired”, and by whom? If we look at phones as a model, we see two “employers”: Carriers hire smartphones to increase their ARPU; normal consumers use them as small, ubiquitous, always-connected personal computers.

Will this model work for smartwatches? We can almost certainly eliminate carriers from the equation: Subsidies are out of the question because a watch is unlikely to generate carrier revenue.

For us users, a smartwatch collects sensor data, connects to our smartphone, displays alerts, responds to touch and voice commands… and even tells us the time. These are all worthwhile functions that make for neat promo videos, but to keep users interested after the novelty wears off, smartwatches will have to do more than log the miles we’ve run, give us weather updates, and show us the name of the person who’s ringing the smartphone in our pocket. Put another way: We’re willing to pay a premium for our smartphones (whether directly or by contract) because of the huge range of features they provide and the enormous number of apps in the app stores. Will we be as durably aroused – and willing to part with substantial amounts of money – by (yet another) pulse rate app?

Another batch of questions: Since we no longer need a dedicated timepiece to tell us the time — our smartphone does that — Who wears a (dumb) watch these days, How, When, and Why?

Simplifying a bit, younger people don’t wear watches at all and older generations use them as jewelry — and gender-specific jewelry, at that. Furthermore, how many veteran watch-wearers wear the same watch all the time? Many of us own more than one watch, and select the appropriate timepiece (or two — or none at all) for the occasion. These aren’t trivial issues; they’re uncharted territory for mobile device makers and marketers.

Next question: How will smartwatch makers handle the delicate equilibrium between computing power and battery power? As smartwatches evolve and offer more features, a better display, and a more responsive user interface, they’ll need more computing power — and more computing power means a quicker battery drain. Will we put up with watches that run out of power at the end of the day? Will designers retard functionality in order to extend battery life to 24 hours and beyond… or make a smartwatch so big it’ll look like a miniature phone?
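To see why this equilibrium is so delicate, a back-of-envelope calculation helps. A minimal sketch, with assumed (not measured) numbers:

```python
# Back-of-envelope runtime math for the compute-vs-battery tradeoff.
# All numbers are assumptions chosen for illustration, not the specs
# of any real watch; the point is the shape of the tradeoff.

BATTERY_WH = 1.5  # assumed watch battery capacity, watt-hours

def runtime_hours(avg_draw_mw: float) -> float:
    """Hours until empty at a constant average draw in milliwatts."""
    return BATTERY_WH * 1000 / avg_draw_mw

profiles = [
    (30,  "e-paper display, radios mostly asleep"),
    (60,  "modest color display, periodic sync"),
    (150, "bright display, rich UI, always-on radio"),
]
for draw_mw, label in profiles:
    print(f"{label}: ~{runtime_hours(draw_mw):.0f} hours")

# Doubling the average draw halves the runtime: that is the corner
# a richer display and a snappier UI paint designers into.
```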

The power equilibrium question is why Samsung moved to a dedicated (and pared-down) version of Tizen, and why Google did the same for Android Wear. All without saying much about battery life.

Finally: Is there a business there? Here in the Valley, Pebble CEO Eric Migicovsky claims to have sold 400,000 watches since January 2013. At around $150 each, that’s $60M in revenue — a real tribute to Eric’s long-standing belief in wearables (he’s been working at it for six years).

But even if you multiplied this number by 10, it would barely nudge the needle for large companies such as Samsung, Motorola/Lenovo, or Apple, which means these devices will be confined to the role of smartphone companion. They’ll help make money by enhancing the main product; they’re not going to be a $10B business in themselves.
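For scale, here’s the arithmetic made explicit; the Apple comparison figure is an assumption (fiscal 2013 revenue of roughly $171B), used only to size the needle:

```python
# The Pebble arithmetic above, made explicit. Back-of-envelope only.

units = 400_000          # watches Pebble claims to have sold
asp = 150                # approximate average selling price, USD
pebble_revenue = units * asp
print(f"Pebble revenue: ${pebble_revenue / 1e6:.0f}M")        # ~$60M

optimistic = 10 * pebble_revenue                # the 10x scenario
apple_fy2013 = 171e9                            # assumed, roughly $171B
print(f"10x Pebble vs. Apple's year: {optimistic / apple_fy2013:.2%}")
# ~0.35 percent: needle, barely nudged.
```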

As Charles Arthur writes in The Guardian, there are fewer than half a million smartwatches in use in the UK: “Wearable computing faces an uphill battle breaking through to the mainstream…”. Similarly, the Register doesn’t see any good, large-scale answers to the question. It calls Google wearables “A solution looking for a rich nerd”.

These challenges might explain why Apple doesn’t seem to have caught this Spring’s Wearables Fever. Smartwatches are destined to be ecosystem extensions, not The Next Big Thing.

JLG@mondaynote.com

One last thought before we close: Not all Ecosystem Extensions are equal. The no-longer-a-hobby Apple TV now brings substantial revenue and growth:

“Sales of the Apple TV are estimated to have grown by 80 percent in 2013, reaching around 10 million units for the calendar year, or some $1 billion worth of set-top boxes sold to end users.”

Horace Dediu puts a “Fortune 130” label on iTunes. By itself, with yearly gross revenue of $23.5B and growing 34%, iTunes is large enough to rank #130 in the Fortune list of the 500 largest US companies:

“On a yearly basis iTunes/Software/Services is nearly half of Google’s core business and growing slightly faster.”

While music sales are on the wane, apps and video (mostly Apple TV) show healthy growth. Compared to an Apple TV, how much would an iWatch add to the iTunes business? Apps? Content?

Apple seems wise to stay out of the game until it can make something more lasting than a novelty.

CarPlay Thoughts

 

Who wouldn’t want an iPhone- or Android-like experience in their car instead of today’s misbegotten navigation and entertainment systems? CarPlay’s answer looks nice – until one looks at the details.

Apple’s CarPlay has an air of inevitability. Previously dubbed “iOS in the Car”, CarPlay brings the iPhone’s aesthetics, ease of use, consistency, and universe of apps to the ugly and dumbfounding world of car navigation and entertainment systems.

Seven years after the iPhone launched the Smartphone 2.0 wave, Apple kickstarts another mobile revolution…

It’s an enticing, simple vision. Instead of today’s disjointed systems — which often cost in the $1,000 range, plus $249 for a DVD of updated maps — you get a screen the size of a small tablet running iOS apps with voice and touch control (on-screen and armrest), over-the-air map updates, open-ended flexibility… We have arrived.

I’ve struggled with dashboard electronics from German, Japanese, and French car makers (no electronics on the old family Chevrolets), and I’ve seen what happened to Ford when it tried to use Microsoft software for its Sync system. Replacing these hairballs with an iOS system only makes sense.

But sense and reality are still living apart.

carplay2

To start, the “iOS in the Car” phrase is misleading. The iOS device “in your car” is the iPhone or iPad that you’ve brought with you — Apple isn’t about to license iOS to automakers (which may be part of the reason why Apple changed the name to “CarPlay”).

And Apple isn’t going to try to take the place of suppliers such as Delphi, VDO, and Aisin by making subsystems for carmakers — it’s not in Apple’s DNA. Not that it would matter if they tried: Automakers have made an art of pinching fractions of cents from their suppliers’ prices; they’d never tolerate Apple’s margins.

CarPlay replicates your iDevice’s screen as H.264 video spewed through an intelligent Lightning cable connected to your car’s USB port. The video format is widely accepted, so the in-car device either understands it already, or can be updated to do so.

So far, so good. As many observers have pointed out, the idea is a wired echo of Apple’s AirPlay, the technology that connects your iDevices (and other compliant products) to your television via the Apple TV black puck. Complications may arise when you consider the various in-dash screen sizes and resolutions, the actual uses of USB connections (my car’s USB connector is useless for anything other than charging my smartphone), and other mysterious incompatibilities that are beyond Apple’s control. Still, in general, screen replication demands little from the car maker. As with AirPlay and a dumb TV set, the intelligence stays inside the smartphone.
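To make that division of labor concrete, here’s a schematic sketch; every function name is hypothetical, and this models the architecture as described, not Apple’s actual protocol:

```python
# A schematic of the screen-replication split described above: the
# phone renders and encodes, the cable transports, the head unit only
# decodes and displays. Every function is a stub with a hypothetical
# name; this models the architecture, not Apple's actual protocol.

def capture_ui_frame() -> bytes:
    """Phone side: grab the rendered CarPlay UI as a raw frame."""
    return b"\x00" * 64  # placeholder pixels

def h264_encode(frame: bytes) -> bytes:
    """Phone side: compress the frame; H.264 is widely decodable."""
    return frame[:16]    # stand-in for a real encoder

def send_over_usb(packet: bytes) -> bytes:
    """Lightning-to-USB transport: the cable is just a smart pipe."""
    return packet

def head_unit_display(stream: bytes) -> None:
    """Car side: decode and blit. No app logic lives in the dash."""
    print(f"displaying {len(stream)}-byte H.264 packet")

# One frame through the pipeline; all the intelligence stays in the phone.
head_unit_display(send_over_usb(h264_encode(capture_ui_frame())))
```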

The CarPlay proposal is much more limited than the Open Automotive Alliance, a Google initiative that implants a customized version of Android into a car’s electronics. (“Audi connect” is available today; we can expect similar collaborations with Honda, GM, and Hyundai.) But if the in-car system runs Android (or QNX, as is often the case today), so much the better, from the carmaker’s point of view: Let Google or one of its partners do all the work to create an Android-based all-in-one car system and let Apple hitch a ride after the work is done. Serving both Android and iOS users is a no-brainer.

It sounds good… but I can’t help but harbor uneasy feelings about this whole “scene”.

To begin with, we have a clash of cultures. To be sure, Eddy Cue, Apple’s Senior VP of Internet Software and Services, is a dealmaking expert and, as a member of the Board of Ferrari, he has serious automotive industry connections. But the spirit that drives Apple is far from that which motivates automakers.

The automotive industry expects to be in control of everything that gets into its cars. The coup that Apple pulled off with the iPhone and AT&T — taking full control of the content, no crapware, iTunes only for media — isn’t going to happen with Mercedes-Benz, or BMW, or even Hyundai. Cars aren’t phones. We’re not going to see aftermarket Toyota CarPlay kits (let alone entire cars) in Apple Stores. Apple won’t get what it always strives for: Controlled Distribution.

Then there’s the F-word: Fragmentation. In-car electronics are a mess, a new culture grafted onto an old one, Silicon Valley and Detroit in a loveless marriage. Actually, that’s unfair: Under the hood, embedded electronics do wonders to improve the reliability, safety, and economy of our cars. But where the union breaks down is in the User Experience domain. Competent combustion management engineers and the accountants watching over their shoulders have no empathy for smartphone-loving drivers.

The meanderings get more twisted when we consider a key difference between Google and Apple. Google could tell Audi that it will pay, in some form, for the user data collected by Audi connect — but Audi already makes a lot of money and doesn’t want to open that can of worms. As its privacy agreement says:

“We will not share information about you or your Audi vehicle that is connected with your use of Audi connect’s in-car features with third parties for their own purposes without your consent.”

But what would a legally-troubled, profit-starved automaker such as GM say in response to Google’s offer to subsidize the in-car system?

Apple hasn’t played that game.

An all-in-one navigation/communications/entertainment system is a pleasant dream; it feels “right”. But the technical, business model, and cultural obstacles could make for a long, arduous march.

CarPlay could be a very smart way to hitch a ride on many in-car systems without having to struggle with their design and cost challenges: yet another ecosystem extension play.

JLG@mondaynote.com