
Posts by Jean-Louis Gassée:

BlackBerry: The Endgame

 

The BlackBerry was the first truly modern smartphone, the king of Personal Information Management On The Go. But beneath its modern presentation lurked a fatal flaw: a software engine that couldn’t be adapted to the Smartphone 2.0 era.

Jet-lagged in New York City on January 4th 2007, just back from New Year’s in Paris, I left my West 54th Street hotel around 6am in search of coffee. At the corner of the Avenue of the Americas, I saw glowing Starbucks stores in every direction. I walked to the nearest one and lined up to get my first ration of the sacred fluid. Ahead of me, behind me, and on down the line, everyone held a BlackBerry, checking email and BBM messages, wearing a serious but professional frown. The BlackBerry was the de rigueur smartphone for bankers, lawyers, accountants, and anyone else who, like me, wanted to be seen as a four-star businessperson.

Five days later, on January 9th, Steve Jobs walked on stage holding an iPhone and the era of the BlackBerry, the Starbucks of smartphones, would soon be over. Even if it took three years for BlackBerry sales to start their plunge, the iPhone introduction truly was a turning point in BlackBerry’s life.

RIM (as the company was once called) shipped 2M BlackBerrys in the first quarter of 2007 and quickly ascended to a peak of 14.6M units by Q4 2010, only to fall back to pre-2007 levels by the end of 2013:

[Chart: RIM quarterly BlackBerry shipments, 2007–2013]

Last week, BlackBerry Limited (now the name of the company) released its latest quarterly numbers and they are not good: Revenue plunged to $916M vs. $1.57B a year ago (-42%); the company lost $207M and shipped just 2.1M smartphones, more than a half-million shy of the Q1 2007 number. For reference, IDC tells us that the smartphone industry shipped about 300M units in the second quarter of 2014, with Android and iOS devices accounting for 96% of the global market.

Explanations abound for BlackBerry’s precipitous fall.

Many focus on the company’s leaders, with ex-CEO Jim Balsillie and RIM founder Mike Lazaridis taking the brunt of the criticism. In a March 2011 Monday Note uncharitably titled The Inmates Have Taken Over The Asylum, I quoted the colorful but enigmatic Jim Balsillie speaking in tongues:

“There’s tremendous turbulence in the ecosystem, of course, in mobility. And that’s sort of an obvious thing, but also there’s tremendous architectural contention at play. And so I’m going to really frame our mobile architectural distinction. We’ve taken two fundamentally different approaches in their causalness. It’s a causal difference, not just nuance. It’s not just a causal direction that I’m going to really articulate here—and feel free to go as deep as you want—it’s really as fundamental as causalness.”

This and a barely less bizarre Lazaridis discussion of “application tonnage” led one to wonder what had happened to the two people who had so energetically led RIM/BlackBerry to the top of the industry. Where did they take the wrong turn? What was the cause of the panic in their disoriented statements?

Software. I call it the Apple ][ syndrome.

Once upon a time, the Apple ][ was a friendly, capable, well-loved computer. Its internal software was reliable because of its simplicity: The operating system launched applications and managed the machine’s 8-bit CPU, memory, and peripherals. But the Apple ][ software wasn’t built from the modular architecture that we see in modern operating systems, so it couldn’t adapt as Moore’s Law allowed more powerful processors. A radical change was needed. Hence the internecine war between the Apple ][ and Steve Jobs’ Mac group.

Similarly, the BlackBerry had a simple, robust software engine that helped the company sell millions of devices to the business community, as well as to lay consumers. I recall how my spouse marveled at the disappearance of the sync cable when I moved her from a Palm to a BlackBerry, and at the sight of her data (emails, calendar, and address book) effortlessly flying from her PC to her new smartphone. (And her PC mechanic was happy to be freed from Hotsync Not Working calls.)

But as with the Apple ][, advances in hardware and heightened customer expectations outran the software engine’s ability to evolve.

This isn’t something that escaped RIM’s management. As recounted in a well-documented Globe and Mail story, Mike Lazaridis quickly realized what he was up against:

“Mike Lazaridis was at home on his treadmill and watching television when he first saw the Apple iPhone in early 2007. There were a few things he didn’t understand about the product. So, that summer, he pried one open to look inside and was shocked. It was like Apple had stuffed a Mac computer into a cellphone, he thought.

[…] the iPhone was a device that broke all the rules. The operating system alone took up 700 megabytes of memory, and the device used two processors. The entire BlackBerry ran on one processor and used 32 MB. Unlike the BlackBerry, the iPhone had a fully Internet-capable browser.”

So at a very early stage in the shift to the Smartphone 2.0 era, RIM understood the nature and extent of its problem: BlackBerry’s serviceable but outdated software engine was up against a much more capable architecture. The BlackBerry was a generation behind.

It wasn’t until 2010 that RIM acquired QNX, a “Unix-ish” operating system that was first shipped in 1982 by Quantum Software Systems, founded by two University of Waterloo students. Why did Lazaridis’ company take three years to act on the sharp, accurate recognition of its software problem? Three years were lost in attempts to tweak the old software engine, and in fights between Keyboard Forever! traditionalists and would-be adopters of a touch interface.

Adapting BlackBerry’s applications to QNX was more complicated than just fitting a new software engine into RIM’s product line. To start with, QNX didn’t have the thick layer of frameworks developers depend on to write their applications. These frameworks, which make up most of the 700 megabytes Lazaridis saw in the iPhone’s software engine, had to be rebuilt on top of a system that was well-respected in the real-time automotive, medical, and entertainment segments, but that was ill-suited for “normal” use.

To complicate things, the company had to struggle with its legacy, with existing applications and services. Which ones do we update for the new OS? Which ones need to be rewritten from scratch? And which ones do we drop entirely?

In reality, RIM was much more than three years behind iOS (and, later, Android). Depending on whom we listen to, the 2007 iPhone didn’t just stand on a modern (if incomplete) OS, it stood on 3 to 5 years of development, of trial and error.

BlackBerry had lost the software battle before it could even be fought.

All other factors that are invoked in explaining BlackBerry’s fall — company culture, hardware misdirections, loss of engineering talent — pale compared to the fundamentally unwinnable software battle.

(A side note: Two other players, Palm and Nokia, lost the battle for the same reason. Encumbered by once successful legacy platforms, they succumbed to the fresh approach taken by Android and iOS.)

Now under turnaround management, BlackBerry is looking for an exit. John Chen, the company’s new CEO, comes with a storied résumé that includes turning around database company Sybase and selling it to SAP in 2012. Surely, such an experienced executive doesn’t believe that the new keyboard-based BlackBerry Passport (or its Porsche Design sibling) can be the solution:

[Photo: BlackBerry Passport]

Beyond serving the needs or wants of die-hard keyboard-only users, it’s hard to see the Passport gaining a foothold in the marketplace. Tepid reviews don’t help (“The Passport just doesn’t offer the tools I need to get my work done”); Android compatibility is a kludge; developers busy writing code for the two leading platforms won’t commit.

Chen, never departing from his optimistic script, touts BlackBerry’s security, Mobile Device Management, and the QNX operating system licenses for embedded industry applications.

None of this will move the needle in an appreciable way. And, because BlackBerry’s future is seen as uncertain, corporate customers who once used BlackBerry’s communication, security, and fleet management services continue to abandon their old supplier and turn to the likes of IBM and Good Technology.

The company isn’t in danger of a sudden financial death: Chen has more than $3B in cash at his disposal and the company burns about $35M of it every quarter. BlackBerry’s current stock price says the company is worth about $5B, $2B more than its cash position. Therefore, Chen’s endgame is to sell the company, either whole or, more likely, in parts (IP portfolio, QNX OS…) for more than $2B net of cash.
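A back-of-the-envelope sketch of that arithmetic, using the round numbers above (an illustration of the reasoning, not a valuation model):

```python
# Round figures from the paragraph above, in billions of dollars.
cash = 3.0                 # cash on hand
market_cap = 5.0           # implied by the stock price
burn_per_quarter = 0.035   # ~$35M burned per quarter

# What the market says the business itself, net of cash, is worth.
enterprise_value = market_cap - cash

# At the current burn rate, the cash pile alone lasts a very long time.
runway_years = cash / burn_per_quarter / 4

print(enterprise_value)        # 2.0
print(round(runway_years, 1))  # 21.4
```

The two-decade runway is why "sudden financial death" is off the table; the $2B enterprise value is the floor Chen has to beat in a sale.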

Wall Street knows this, corporate customers know this, carriers looking at selling Passports and some services know this. And potential body parts buyers know this as well… and wait.

It’s not going to be pretty.

JLG@mondaynote.com

Apple Watch Is And Isn’t…

 

The Apple Watch isn’t just another iDevice, a “wearables” accessory to the Apple ecosystem. It’s a bold attempt to create a new kind of wrist-worn personal computer that looks like a smartwatch.

In previous Monday Notes dealing with the putative iWatch and other “wearables”, I thought the new product would be a nice add-on to the iDevices ecosystem — a bit player that would make the iPhone more desirable —  but that it wouldn’t move the needle, meaning $10B or more in revenue. I reasoned that a watch battery would be too small to feed a computer powerful enough to offer a wide range of apps and communications capabilities.

I was wrong.

In his demonstration (76 minutes into the official video) at the Cupertino Flint Center last Tuesday, Kevin Lynch, the Adobe defector who now runs the Apple Watch software engineering effort, showed us that the Watch isn’t just a shrunk-down iPhone: It can stand on its own, it introduces an entirely new genre of user interface, and it will have its own App Store. The reinterpreted watch crown, a side button, touch and pressure on the face, plus voice all combine into a potentially rich and unique set of ways to interact with this newest very personal computer.

As Horace Dediu, our disruption scholar, puts it:

“The Apple Watch is as much a Watch as the iPhone is a Phone.”

The almost overwhelming richness of the user interface and of demonstrated apps led one twitterer to express a concern I can’t suppress:

[Tweet from Dr. Drang about the army of Apple software engineers]

Will the software overwhelm the hardware, resulting in problematic battery-life, or befuddle normal humans?

Indeed, I remember how I worried when Steve Jobs first demonstrated the iPhone on January 9th, 2007 and stated it ran OS X. Knowing Jobs’ occasionally robust relationship with facts, I feared embarrassment down the road. But, no. When the iPhone shipped almost six months later, on June 29th, hackers immediately dissected it and discovered it ran a bona fide pared-down version of OS X — later renamed iOS.

As with the original iPhone, we might be six months away from a shipping product, time for Apple to fine-tune its software and work on the S1 SoC (System on a Chip) that drives the watch… and to put in place the supply chain and retail operations for the many Apple Watch variations.

In the meantime, some choice morsels of context will help as we consider the impact of Apple’s new Watch. We’ll start with Marc Newson, the famed designer (and Jony Ive’s friend and collaborator)  who just joined Apple. If you haven’t done so already, take a look at this video where Newson flips through his portfolio of watch and clock designs, including this striking reinterpretation of a great classic, the Atmos Clock from Jaeger-LeCoultre:

[Image: Marc Newson’s reinterpretation of the Jaeger-LeCoultre Atmos clock]

(The pages that Newson surveys in the video are taken from a book published by Taschen, the noted publisher of lovingly designed art books.)

For more context, follow this link supplied by Kontra (a.k.a. @counternotions) and regard the sea of watch designs from Newson’s Ikepod days, a company Newson left in 2012.

[Image: Newson-designed Ikepod watches]

Turning to the Apple Watch mega-site, we see a family resemblance:

[Image: Apple Watch lineup]

Professional watchmakers and industry executives seem to appreciate Newson’s influence and Apple’s efforts, although they are quick to point out that they don’t think the Apple Watch is a threat to their high-end wares (“It’s a techno-toy more than a watch, but what a fun toy,” says Laurent Picciotto of Chronopassion Paris).  Watches by SJX provides a quick collation of What The Watch Industry Thinks Of The Apple Watch. Swiss watchmaker Eric Giroud voices the majority opinion:

“It’s a nice product; good shape and amazing bracelet – thank you Marc Newson for the resurrection of the Ikepod strap. It’s difficult to speak about its impact on watchmaking because the Apple Watch is not a watch except that it is also worn on the wrist.”

Benjamin Clymer is the editor of Hodinkee, an on-line magazine dedicated to the world of watches. In a post titled A Watch Guy’s Thoughts On The Apple Watch, Clymer provides a review that’s informed by a deep personal knowledge of the watch scene. If you don’t have time to read the whole article — it’s a long piece — the author provides a good summary in the introduction [emphasis mine]:

[…] though I do not believe it poses any threat to haute horology manufactures, I do think the Apple Watch will be a big problem for low-priced quartz watches, and even some entry-level mechanical watches. In years to come, it could pose a larger threat to higher end brands, too. The reason? Apple got more details right on their watch than the vast majority of Swiss and Asian brands do with similarly priced watches, and those details add up to a really impressive piece of design. It offers so much more functionality than other digitals it’s almost embarrassing. But it’s not perfect, by any means.

Not everyone in the watch industry is so impressed. In an article titled Apple Watch ‘too feminine and looks like it was designed by students’, says LVMH executive, The Telegraph provides the money quote [emphasis mine]:

“To be totally honest, it looks like it was designed by a student in their first trimester,” added Mr Biver, who heads up the brands Tag Heuer, Zenith and Hublot.

The article evoked general hilarity and prompted more than one commenter to dig up the infelicitous Ed Colligan quote about the iPhone:

“PC guys are not going to just figure this out. They’re not going to just walk in.”

I’ll offer a rewrite for Jean-Claude Biver and his haute horlogerie colleagues:

“We like Apple products, they provide productivity and fun in our daily lives; we respect the sense of design Sir Jony and now Marc Newson bring to the company. I wish I could say more but, try as I might, I couldn’t get the livestream of Mr. Cook’s presentation to work in my Rue de Rive office in Geneva. First, there was this Mandarin dubbing, I can understand why but it was really annoying. Then, the transmission kept breaking down. I imagine that the tons of concrete now being poured for Apple’s next headquarters will provide a suitable resting place for the individual in charge.
Again, congratulations on a well-executed global launch.”

More seriously, let’s put the streaming glitches aside; they won’t matter in the longer run because they don’t concern the product itself. Last week’s launch and its detailed preparations, including the no-longer mysterious white building, attest to the gravity of Apple’s long-term ambition.

As additional evidence that the Apple Watch isn’t just a hobby, recall that the iPhone was initially offered in one size and one color. By comparison, the Apple Watch is an explosion: It comes in three styles and two sizes (in millimeters, 38 and 42, because that’s the trade vocabulary), two materials/finishes for each style (silver and space gray, yellow or rose gold), nine bands for the basic Apple Watch, six for the Apple Watch Sport, and at least four for the gold Apple Watch Edition — and all with matching crown buttons. Henry Ford has definitely left the building.

The fact that Apple invited fashion editors to Cupertino (some of whom had to be told where that town is) is another Think Different sign. Nerds are still welcome, but this is a new game. Again, turn to the Apple Watch site and look at the bands/bracelets. As Ben Clymer notes in his piece, the level of detail tells us this isn’t just another iDevice.

Stepping back a little, when I see the team of watch industry execs, design luminaries, and fashion experts Apple has brought on board, I have a hard time believing that Apple is going to stop at watches. At the very least, will Messrs. Ive and Newson bring livelier, more varied designs to the iPhone? And what does Tim Cook mean when he slyly alludes to products that “haven’t even been rumored yet…”?

But let’s not get ahead of ourselves — we’re still barely past the demo. We’ll have to wait for the actual product to come to the wrists of real users. Only then will we have the Apple Watch make-or-break moment: Word-of-mouth from non-experts.

And, still in the not getting ahead of ourselves department, for Apple, today’s make-or-break product is the iPhone 6. The Apple Watch makes great “ink” and iPhones make the money.

JLG@mondaynote.com

An Ancient Love Story: Apple & Payment Systems

 

This week’s product launch should break the mold of Apple’s recent Fall announcements: More products than usual and a challenge to the status quo – in payment systems this time.

A larger iPhone; a line of wearables (unveiled if not yet ready-to-ship); significant iOS improvements (a true “iOS 2.0”); HomeKit and HealthKit devices, applications, and partnerships; payment systems… If only half of the rumors about Apple’s September 9th media event are true, we’re going to have a wider and deeper flood of new products than we’ve seen in Apple’s previous Fall launches.

And let’s not forget the big white cocoon that covers the two-story structure that Apple built for the occasion:

[Photo: the white cocoon covering Apple’s temporary structure at Flint Center]

(image source:  AppleInsider)

Apple is likely to add some drama to the event by lifting the veil at the last moment.

For today, we’ll focus on the recent flurry of leaks and rumors surrounding payment systems. We’ve heard about agreements with American Express, Visa, MasterCard, Bank of America; with retailers such as Nordstrom and Macy’s, CVS and Walgreens; and hoteliers such as Starwood… The predictions may not prove accurate down to the last detail, but the outbreak is too strong not to be taken seriously. Apple is about to get into the payment system business in a serious way.

There have been rumors before. Search for “apple payment system” and you’ll get about 80 million hits on Google (11 million on Bing). Flipping through the pages, we see that the excitement started as far back as five years ago when Apple’s “Grab & Go” patent filings disclosed the company’s interest in near field communication, a wireless data transfer method that can be used for quick purchases and payments. This led to the birth of a new i-Word around 2010: the iWallet.

From its very beginning, the iPhone has looked like a logical payment device. Our phones are always with us; they’re more secure than the magnetic stripe on a credit card because they can use “payment tokens” — codes that authenticate you without identifying your credit card account; payment apps can be easily downloaded and updated.
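As a loose illustration of the token idea (my own sketch; Apple’s actual scheme hadn’t been disclosed at the time), a device can derive a one-time code from a secret provisioned by the bank, so the card number itself never reaches the merchant:

```python
import hashlib
import hmac
import os

def payment_token(device_secret: bytes, transaction_id: bytes) -> str:
    """Derive a one-time token; the merchant never sees the card number."""
    return hmac.new(device_secret, transaction_id, hashlib.sha256).hexdigest()

# Hypothetical setup: the bank provisions a secret to the phone once.
device_secret = os.urandom(32)

token_a = payment_token(device_secret, b"txn-0001")
token_b = payment_token(device_secret, b"txn-0002")

# Each transaction yields a different code; a breached merchant database
# holds tokens, not reusable card credentials.
```

The design point is the asymmetry: the bank can recompute and verify any token, but a stolen token is useless for a second purchase.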

The possibilities looked endless and, of course, led to overheated predictions: Think of all the trillions of dollars sloshing around in debit/credit cards. If Apple captured only a small fraction of the flow, they’d be filthy rich!

Others disagreed. In January 2011, PCWorld’s Tom Spring explained why Apple’s Mobile Payment System Will Fail. Among his objections was the implicit assumption that phones are somehow easier than cards (“What’s gained…by waving an iPhone instead of swiping a bank card is not clear to me”), and that retailers won’t accept phones as payment instruments until the “Another Box at the Register” obstacle is surmounted:

“Near field communication is a technology that requires a physical box/reader on the retailer’s end. Until we know more about what incentives there are for retailers to invest in this technology I think it’s going to be hard sell for Apple to convince millions of merchants to put another box at the point of sale…”

Indeed, attempting to modify ingrained customer behavior isn’t a well-trodden path to riches, nor is asking retailers to install a new box next to their cash register. This is why many payment system innovations (Google Wallet is a recent example) have failed to amass enough gravitational pull to gain currency (pardon the pun). There just hasn’t been enough acceptance by consumers and retailers for “fast lane” payment devices to become as matter-of-fact as the incumbents.

Still… Apple has repeatedly shown great patience and willingness to challenge settled wisdom.

The company’s embrace of payment systems started in 2003 when its newly-opened iTunes Store offered two innovations: Single tracks were sold for 99 cents apiece (at the time), and we could settle the purchase with a credit card. Critics scoffed: The price is too low! The credit card companies’ fixed+percentage transaction fees will be a profit-killer!

How can Apple possibly make money with such a proposition?

This was myopia. The iTunes Store wasn’t intended to be a money maker. Its only purpose was to sell more iPods at higher margins; that’s where the money was – and still is. In retrospect, Jobs was laying the foundations of the Apple ecosystem business model: Hardware is the star; everything else supports the big shots’ volumes and margins.

Returning to today’s (or this coming Tuesday’s) topic, Apple doesn’t want to displace the key players — the banks and credit card companies — any more now than they did a decade ago. Credit card companies, for example, play a hard-to-replace role in policing transactions. It’s not always pretty or convenient when one has to call a US number from Europe because the system “tripped” over an unusual transaction, but it works.

One can’t imagine Apple even thinking of storing and lending money, of trying to “capture a fraction of the flow”. If the company does introduce a near field payment system, it won’t be as an attempt to make money in itself, it will simply be another extension of the Apple ecosystem, another way to make iDevices more attractive.

Beyond this neat playbook theory lurks the matter of modifying consumer behavior and retail infrastructure; Tom Spring’s objections are just as cogent today as they were in 2011. And perhaps Apple’s answer — its rebuttal to the conventional reluctance — is hiding in the still-cocooned show-and-tell building.

JLG@mondaynote.com

PS: On today’s topic, see Horace Dediu’s views on the value of payment systems as bit pipes.

PPS: Unrelated but hard to resist: People from the fashion industry now working at Apple. And their friends, fashion editors, unusual invitees to a Cupertino product launch.

Three Years Later: Tim Cook’s Apple

 

On September 9th, Apple will announce products likely to be seen as a new milestone in Tim Cook’s tenure as Apple’s CEO.

You Break It You Own It. This Labor Day weekend sits about midway between two anniversaries: Tim Cook assumed the CEO mantle a little over three years ago – and Steve Jobs left this world, too soon, in early October 2011. And, in a few days, Apple will announce new products, part of a portfolio that caused one of Cook’s lieutenants, Eddy Cue, to gush that Apple had the “best product lineup in 25 years”. Uttered at last Spring’s Code Conference, Cue’s saeta was so unusual it briefly disoriented Walt Mossberg, a seasoned interviewer if there ever was one. After a brief pause, Walt slowly asked Apple’s exec to repeat. Cue obliged with a big I Ate The Canary smile – and raised expectations that will soon meet reality.

After three years at the helm, we’ll soon know in what sense Tim Cook “owns” Apple. For having broken Steve’s creation, for having created a field of debris littered with occasionally recognizable remains of a glorious, more innovative, more elegant past. Or for having followed the spirit of Steve’s dictum – not to think of what he would have done – and led Apple to new heights.

For the past three years, detractors have relentlessly criticized Cook for not being Steve Jobs, for failing to bring out the Next Big Thing, for lacking innovation.
Too often, clickbaiters and other media mountebanks veered into angry absurdity. One recommended Cook buy a blazer to save his job; another told us he had a direct line to Apple’s Board and knew directors were demanding more innovation from their CEO; and, last Spring, a Valley bloviator commanded Apple to bring out a smartwatch within 60 days – or else! (No links for these clowns.)

More measurably, critics pointed to slower revenue growth: + 9% in 2013 vs + 65% in 2011 and + 52% in 2010, the last two “Jobs Years”. Or the recent decrease in iPad sales: – 9% in the June 2014 quarter – a never-seen-before phenomenon for Apple products (I exclude the iPod, now turning into an ingredient of iPhones and iPads).

Through all this, Apple’s CEO never took the bait and, unlike Jobs, either ignored jibes, calmly offered his counterpoint, or even apologized when warranted, as with the Maps fiasco. One known – and encouraging – exception to his extremely controlled public manner took place when he told a representative of a self-described conservative think-tank what to do with his demand “to commit right then and there to doing only those things that were profitable” [emphasis mine]:

“When we work on making our devices accessible by the blind, […] I don’t consider the bloody ROI.”
and…
“If you want me to do things only for ROI reasons, you should get out of this stock.”

Not everything that counts can be counted and… you know the rest of the proverb. Apple shareholders (not to be confused with pump-and-dump traders) at large seem to agree.

The not-taken road to perdition hasn’t been a road to perfection either. Skipping over normal, unavoidable irritants and bugs – the smell of sausage factories is still with me – a look at Apple’s Mail client makes one wish for stronger measures than bug fixes that lead to new crashes. This is a product, or a team, that doesn’t represent Apple at its best and needs stronger decisions. Another long-time offender is the iTunes client. One unnamed Apple friend calls it “our Vista” and explains that it might suffer from its laudable origin as a cross-platform Mac/Windows application, a feature vital to the iPod’s success – we’ll recall its 2006 revenue ($7.7B, +69% year-to-year growth!) was higher than the Mac’s ($7.4B, +18%).

Now looking forward, we see this:

[Photo: the cocooned structure next to the Flint Center]

A large, cocooned structure being built by an “anonymous” company, next to Cupertino’s aptly named Flint Center for the Performing Arts, where Apple will unveil its next products this coming September 9th. Someone joked that this was yet another instance of Apple’s shameless imitation of Google’s innovations: This time Apple copied Google’s barges, but couldn’t even get its own clone to float.

Seriously, this is good news. The structure is likely to be a demo house, one in which to give demonstrations of HomeKit, HealthKit or, who knows, payment systems, features of the coming iOS 8 release for “communicating with and controlling connected accessories”. The size of the structure speaks to Apple’s ambitions.

In other good news, we hear Apple’s entry into “wearables”, or into the “smartwatch” field, won’t see any shipments until 2015. The surprise here is that Apple would show or tease the product on 9/9. There have been exactly zero leaks of body parts, circuit boards, packages, and other accessories, leading more compos mentis observers (not to be confused with compost mentis on Fox News) to think a near-term announcement wasn’t in the cards. But John Paczkowski, a prudent and well-informed re/code writer, assures us Apple will indeed announce a “wearable” — only to tell us, two days later, it won’t ship until next year. The positive interpretation is this: Apple’s new wearable category isn’t just a thing, a gizmo you can throw into the channel to get the money pump running – at nice but immaterial accessory rates. Rather, Apple’s newest creation is a function-rich device that needs commitment, software, and partnerships to make a material difference. For this it needs time. Hence the painful but healthy period of frustration. (Electronic Blue Balls, in the immortal words of Regis McKenna, the Grand Master of Silicon Valley Marketing, who was usually critical of firms making an exciting product announcement, only to delay customer gratification for months.)

The topic of payments is likely to be a little less frustrating – but could lead to another gusher of media commentary. Whether Apple partners with Visa, American Express or others is still a matter of speculation. But one thing is clear: the idea isn’t for Apple to displace or disintermediate any of the existing players. Visa, for example, will still police transactions. And Apple isn’t out to make any significant amount of money from payments.

The goal, as always, is to make Apple devices more helpful, pleasurable – and to sell more of these at higher margins as a result. Like HomeKit or HealthKit, it’s an ecosystem play.

There’s also the less surprising matter of new iPhones. I don’t know if there will be a 4.7” model, a 5.5” model, or both. To form the beginning of an opinion, I went to the Palo Alto Verizon store on University Avenue and asked to buy the 5” Lumia Icon Windows Phone on display. The salesperson only expressed polite doubt and excused himself “to the back” to get one. It took eight minutes. The rest of the transaction was quick and I walked out of the store $143.74 lighter. I wanted to know how a larger phone would feel in daily use, riding in jeans and jacket breast pockets. It’s a little heavy (167 grams, about 50 grams more than an iPhone 5S), with a very nice, luminous screen and a great Segoe WP system font:

[Photo: Lumia Icon]

I won’t review the phone or Windows Phone here. Others have said everything that needs to be said on the matter. It’s going to be a tough road for Microsoft to actually become a weighty number three in the smartphone race.

But mission accomplished: I now feel that a larger iPhone, perhaps a tad lighter than the Lumia, will deliver a pleasant experience. True, one-handed use will probably be restricted to a subset of the (mostly male) population. And today’s 4” screen size will continue to be available.

There remains the question of what size exactly: 4.7”, or 5.5” (truly big), or both. For this I’ll leave readers in John Gruber’s capable hands. In a blog post titled Conjecture Regarding Larger iPhone Displays, John carefully computes possible pixel densities for both sizes and offers a clarifying discussion of “points”, an important iOS user interface concept.
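The pixel-density arithmetic behind such conjectures is easy to reproduce: density is the diagonal resolution divided by the diagonal size. A quick sketch (the resolutions below were speculation at the time, not confirmed specs):

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixel density = diagonal pixel count / diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Conjectured panels: 1334x750 at 4.7 inches, 1920x1080 at 5.5 inches.
ppi_47 = pixels_per_inch(1334, 750, 4.7)   # ≈ 326, the familiar "Retina" density
ppi_55 = pixels_per_inch(1920, 1080, 5.5)  # ≈ 401
```

Note how the first conjecture lands exactly on the 326 ppi of the iPhone 5S, which would let existing apps scale cleanly in points.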

We’ll know soon.

As usual, the small matter of implementation remains. There are sure to be the usual hiccups, to be corrected in a .1 or .2 update to iOS 8. And there won’t be any dearth of bilious comments about prices and other entries on the well-worn list of Apple sins.

But I’ll be surprised if the public perception of Tim Cook’s Apple doesn’t take yet another turn for the better.

JLG@mondaynote.com

 

Shift Happens: Apple + IBM. This Time It’ll Be Different.

 

Strategic Alliances and other grandly named partnerships never seem to live up to their florid marriage announcements. Apple and IBM are at it – again – but this time, Apple is the larger, more prosperous company, and IBM is trying the bad old recipe of regaining growth by cutting down.

Let me slip into something more comfortable: Devil’s Advocate robes. Thus togged out, I will explain why this Apple + IBM rapprochement won’t work – or, worse, it will.

First, the clash of cultures.

Apple is a focused company; its financial statements tell the story: Its money is made in hardware. All other activities, such as the important contributions from the App Store, make up an ecosystem that supports the hardware volumes and margins. Everyone in the company knows this.

A look at IBM’s latest quarterly report tells a much more complicated story. In its simplest analysis, the company consists of three main segments, each with its own P&L (Profit & Loss) numbers and, one assumes, its own goals, rewards and punishments, and fights for resources. Counterintuitively, given the lingering shadow of its former grandeur, IBM is now a smaller business than Apple: $24.4B last quarter (-2% year-to-year) vs. $37.4B (+6%).

I asked WolframAlpha for per employee, per year revenue and profit comparisons and got this:

Wofram IBM Apple Revenue

and…

Wolfram IBM Apple Profit
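The WolframAlpha comparison is easy to sanity-check by hand. A minimal sketch, using the quarterly revenues quoted above and rough public headcounts of the era (about 430,000 for IBM and 90,000 for Apple — my approximations, not WolframAlpha’s exact inputs):

```python
def annual_revenue_per_employee(quarterly_revenue_b: float, employees: int) -> float:
    """Annualize a quarterly revenue figure (in $B) and divide by headcount."""
    return quarterly_revenue_b * 4 * 1e9 / employees

# Quarterly revenues from the latest reports: IBM $24.4B, Apple $37.4B.
# Headcounts are rough public figures of the period (my assumption).
ibm = annual_revenue_per_employee(24.4, 430_000)
apple = annual_revenue_per_employee(37.4, 90_000)
print(f"IBM:   ${ibm:,.0f} per employee per year")
print(f"Apple: ${apple:,.0f} per employee per year")
```

Even with generous rounding, Apple generates several times IBM’s revenue per employee — the gap the charts above make visually obvious.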

Inside IBM, morale isn’t great. Following a series of layoffs, management is perceived as using Excel as a windshield to drive the company.

Two groups with widely differing habits of the heart and mind.

Second, earlier embraces haven’t worked.

We have memories of AIM, the 1991 accord between Apple, IBM, and Motorola that gave us the PowerPC processor, the Kaleida multimedia venture, and Taligent, Apple and IBM’s attempt at a more modern operating system. Big announcements, big plans – and nothing but debris.

Even earlier, we have memories of the Apple/DEC Alliance: In the Summer of 1987, my boss and benefactor John Sculley had given me the mission to bring to a conclusion a conversation he’d started with DEC’s CEO. Things went well and, in January 1988, we reached our goal:

“…Apple Computer and Digital Equipment announced a joint development agreement under which the two companies would work together to integrate Macintosh and the AppleTalk network system with the VAX and DECnet.”

At the celebratory dinner, I sat next to DEC’s founder, Ken Olsen. The likable Grand Old Man professed happiness with our collaboration and calmly told me that while he knew lots of people who used PCs, he couldn’t comprehend why. At home, he said, he had a “glass teletype” — a CRT, remember those? — and an Ethernet connection back to the factory, quite expensive at the time. Combined with DEC’s ALL-IN-1 office productivity suite (all commands were two characters long), he had everything he needed.

The Apple/DEC Alliance went nowhere. As with many such covenants, the product of the announcement was the announcement itself. The marriage itself was a sham.

Third and more generally, alliances don’t work.

There was a time when strategic alliances were all the rage. In 1993, my friend Denise Caruso published the aptly titled Alliance Fever, a 14-page litany of more than 500 embraces. The list started at 3DO and ended with Zenith Electronics, neither of which still stands: 3DO went bankrupt in 2003; Zenith was absorbed by LG Electronics.

These aren’t isolated bad endings. If you have the time and inclination for a nostalgic stroll through the list, you’ll see many more such disappearances.

But, you’ll object, this was more than twenty years ago. The industry has learned from these examples; we won’t fall into the same rut.

One would hope. And one would be disappointed.

The tendency remains strong for sheepish company execs to congregate and participate in what Valley wags call a Clusterf#^k. In two Monday Notes (Mobile World 2010 and 2011), I offered examples such as this one:

Do your eyes glaze over when you read such BS?

“Global leaders Intel Corporation and Nokia merge Moblin and Maemo to create MeeGo*, a Linux-based software platform that will support multiple hardware architectures across the broadest range of device segments, including pocketable mobile computers, netbooks, tablets, mediaphones, connected TVs and in-vehicle infotainment systems.”

Relax, you’re normal. Who are they kidding? Themselves, most likely.

All the holy words are there: Linux (mandatory), based (to make things clearer), platform (the p-word), multiple hardware architectures (we don’t know what we’re doing so we’re covering all bases), broadest range of devices (repeat the offense just committed), segments (the word adds a lot of meaning to the previous phrase), including pocketable mobile computers, netbooks, tablets, mediaphones, connected TVs and in-vehicle infotainment systems (only microprocessor-driven Toto toilets are missing from the litany).

Alliances generally don’t work because there’s no one really in charge, no one has the power to mete out reward and punishment, to say no, to change course. Often, the partners in an alliance are seen as a bunch of losers clinging to each other with the hope that there’s safety in numbers. It’s a crude but, unfortunately, not inaccurate caricature.

I’ll switch sides now and explain why It’ll Be Different This Time.

Division of labor is the most convincing argument for this partnership. IBM is and always has been an Enterprise Services company. As it did in its glorious mainframe days, it can take care of everything: analyze your business, recommend changes, re-engineer your organization, write software, maintain everything. Today, there’s much less focus on hardware revenue, but the broad scope remains.

Then came the mobile revolution, which IBM missed. It’s not that they didn’t have the opportunity. The company could have jumped on the mobile-everything wave, but that would have meant breaking the “Roadmap 2015” promise made by IBM’s former CEO, Sam Palmisano. Palmisano might be forgiven for not anticipating the size and importance of mobile when he promised, in his 2010 letter to investors, that IBM’s share value would double by 2015, but Ginni Rometty, Palmisano’s successor, has no excuse. The 2012 changing of the guard was a perfect opportunity for Rometty to stand up, say Things Have Changed, and re-jigger the roadmap. Ah well.

On the positive side, IBM’s clients are re-organizing their businesses as a result of the mobile deluge, some late, some early. The smarter ones have realized that mobile devices aren’t just “small PCs” and have turned to broad-range professional services vendors such as IBM to re-engineer their business.

For Apple’s part, the iPhone and the iPad have gained increasingly wider acceptance with large Enterprise customers: “98% of Fortune 500 companies have deployed iOS devices and more than 90% of tablet activations in enterprise environments are iPads.” Of course, a few BYOD devices don’t constitute wholesale adoption inside a company. Apple doesn’t have the manpower and culture to come in, engineer, deploy, and maintain company-wide applications and fleets of devices. That’s IBM’s forte.

What’s new in the arrangement is IBM’s decision to invest in extending its ability to develop applications that fully integrate iOS devices — as opposed to “suffering” them.

On the numbers side, naysayers mistakenly use the “98%” figure quoted above to opine that the partnership won’t create much additional revenue. They’re probably right — at least initially. But the partnership could herald a move from “anecdotal” to systematic deployments that are deep and wide. This will take time and the needle won’t move right away…it will be more like the hour hand on a clock face.

Another more immediate effect, across a wide range of enterprises, will be the corporate permission to use Apple devices. Recall the age-old mantra You Don’t Get Fired For Buying IBM, which later became DEC, then Microsoft, then Sun…and now Apple. Valley gossip has it that IBM issued an edict stating that Macs were to be supported internally within 30 days. Apparently, at some exec meetings, it’s MacBooks all around the conference room table — except for the lonely Excel jockey who needs to pivot tables.

We’ll see if the company whose motto once was Think actually works well with the Think Different squad.

JLG@mondaynote.com

 

The Sweet Spot On Apple’s Racket

 

iPad sales are falling – but the sky is not. We’re merely dealing with a healthy case of expectations adjustment.

The tablet computer has always felt inevitable. The desire to harness the power of a computer in the comfortable form of a letter-size tablet with a keyboard, or perhaps a stylus for more natural interaction — or why not both? — has been with us for a very long time. Here we see Alan Kay holding a prototype of his 1972 Dynabook (the photo is from 2008):

Alan_Kay_and_the_prototype_of_Dynabook,_pt._5_(3010032738)

(credit: http://en.wikipedia.org/wiki/Dynabook)

Alan prophetically called his invention a personal computer for children of all ages.

For more than 40 years, visionaries, entrepreneurs, and captains of industry have whetted our appetite for such tablets. Before it was recast as a PDA, a Personal Digital Assistant, Steve Sakoman’s Newton was a pen-based letter-size tablet. Over time, we saw the GridPad, Jerry Kaplan’s and Robert Carr’s Go, and the related Eo Personal Communicator. And, true to its Embrace and Extend strategy, Microsoft rushed a Windows for Pen Computing extension into Windows 3.1.

These pioneering efforts didn’t succeed, but the hope persisted: ‘Someone, someday will get it right’. Indeed, the tablet dream got a big boost from no less than Bill Gates when, during his State of The Industry keynote speech at Comdex 2001 (Fall edition), Microsoft’s chairman declared that tablets were just around the corner [emphasis mine]:

“The Tablet takes cutting-edge PC technology and makes it available wherever you want it, which is why I’m already using a Tablet as my everyday computer. It’s a PC that is virtually without limits — and within five years I predict it will be the most popular form of PC sold in America.”

Unfortunately, the first Tablet PCs, especially those made by Toshiba (I owned two), were competent but unwieldy. All the required ingredients were present, but the sauce refused to take.

Skip ahead to April 2010. The iPad ships and proves Alan Kay right: The first experience with Apple’s tablet elicits, more often than not, a child-like joy in children of all ages. This time, the tablet mayonnaise took and the “repressed demand” finally found an outlet. As a result, tablets grew even faster than PCs ever did:

PNG - Tablets Fastest Ever

(Source: Mary Meeker’s regular Internet Trends 2014 presentation, always long, never boring)

In her 2013 report, Meeker showed iPads topping the iPhone’s phenomenal growth, climbing three times faster than its more pocketable sibling:

PNG - iPad 3X iPhone Meeker 2013

(Source: Mary Meeker Internet Trends 2013)

There were, however, two unfortunate aspects to this rosy picture.

First, there was the Post-PC noise. The enthusiasm for Android and iOS tablets, combined with the end of the go-go years for PC sales, led many to decree that we had finally entered the “Post-PC” era.

Understandably, the Post-PC tag, with its implication that the PC is no longer necessary or wanted, didn’t please Microsoft. As early as 2011, the company was ready with its own narrative which was delivered by Frank Shaw, the company’s VP of Corporate Communications: Where the PC is headed: Plus is the New “Post”. In Microsoft’s cosmos, the PC remains at the center of the user’s universe while smartphones and tablets become “companion devices”. Reports of the PC’s death are greatly exaggerated, or, as Shaw puts it, with a smile, “the 30-year-old PC isn’t even middle aged yet, and about to take up snowboarding”.

(Actually, the current debate is but a new eruption of an old rash. “Post-PC” seems to have been coined by MIT’s David Clark around 1999, causing Bill Gates to pen a May 31st, 1999 Newsweek op-ed titled: Why the PC Will Not Die…)

Both Bill and Frank are right – mostly. Today’s PC, the descendant of the Altair 8800 for which Gates programmed Microsoft’s first Basic interpreter, is alive and, yes, it’s irreplaceable for many important tasks. But classical PCs — desktops and laptops — are no longer at the center of the personal computing world. They’ve been replaced by smaller (and smallest) PCs — in other words, by tablets and smartphones. The PC isn’t dead or passé, but it is shape-shifting.

There was a second adverse consequence of the iPad’s galloping growth: Expectations ran ahead of reality. Oversold or overbought, it doesn’t matter, the iPad (and its competitors) promised more than they could deliver. Our very personal computers — our tablets and smartphones — have assumed many of the roles that previously belonged to the classical PC, but there are some things they simply can’t do.

For example, in an interview with the Wall Street Journal, Tim Cook confides that “he does 80% of the work of running the world’s most valuable company on an iPad.” Which is to say that Tim Cook needs a Mac for the remaining 20%…but the WSJ quote doesn’t tell us how important that 20% is.

We now come to the downward trend in iPad’s unit sales: -2.29% for the first quarter of calendar year 2014 (compared to last year). Even more alarming, unit sales are down 9% for the quarter ending in June. Actually, this seems to be an industry-wide problem rather than an Apple-specific trend. In an exclusive Re/code interview, Best Buy CEO Hubert Joly says tablet sales are “crashing”, and sees hope for PCs.

Many explanations have been offered for this phenomenon, the most common of which is that tablets have a longer replacement cycle than smartphones. But according to some skeptics, such as Peter Bright in an Ars Technica op-ed, there’s a much bigger problem [emphasis mine]:

“It turns out that tablets aren’t the new smartphone…[t]hey’re the new PC; if you’ve already got one, there’s not much reason to buy a new one. Their makers are all out of ideas and they can’t make them better. They can only make them cheaper.”

Bright then concludes:

“[T]he smartphone is essential in a way that the tablet isn’t. A large screen smartphone can do…all the things a tablet can do… Who needs tablets?”

Hmmm…

There is a simpler – and much less portentous – explanation. We’re going through an “expectations adjustment” period in which we’ve come to realize that tablets are not PC replacements. Each personal computer genre carries its own specifics; each instils unique habits of the body, mind, and heart; none of them is simply a “differently sized” version of the other two.

The realization of these different identities manifests itself in Apple’s steadfast refusal to hybridize, to make a “best of both worlds” tablet/laptop product.

Microsoft thinks otherwise and no less steadfastly (and expensively) produces Surface Pro hybrids. I bought the first generation two years ago, skipped the second, and recently bought a Surface Pro 3 (“The tablet that can replace your laptop”). After using it daily for a month, I can only echo what most reviewers have said, including Joanna Stern in the WSJ:

“On its third attempt, Microsoft has leapt forward in bringing the tablet and laptop together—and bringing the laptop into the future. But the Pro 3 also suffers from the Surface curse: You still make considerable compromises for getting everything in one package.”

Trying to offer the best of tablets and laptops in one product ends up compromising both functions. In my experience, too many legacy Windows applications work poorly with my fingers on the touch screen. And the $129 Type Cover is a so-so keyboard and a poor trackpad. Opinions will differ, of course, but I prefer using Windows 8.1 on my Mac. We’ll see how the upcoming Windows 9, code name Threshold, will cure the ills of what Mary Jo Foley, a well-connected Microsoft observer, calls Vista 2.0.

If we consider that Mac unit sales grew 18% last quarter (year-to-year), the company’s game becomes clear: The sweet spot on Apple’s racket is the set of customers who, like Tim Cook, use MacBooks and iPads. It’s by no means the broadest segment, just the most profitable one. Naysayers will continue to contend that the prices of competing tablets are preordained to crash and will bring ruin to Apple’s Affordable Luxury product strategy…just as they predicted netbooks would inflict damage on MacBooks.

As for Peter Bright’s contention that “[tablet] makers are all out of ideas and they can’t make them better”, one can easily see ways in which Google, Lenovo, Microsoft, Apple, and others could make improvements in weight, speed, input methods, system software, and other factors I can’t think of. After we get over the expectations adjustment period, the tablet genre will continue to be innovative, productive, and fun – for children of all ages.

JLG@mondaynote.com

App Store Curation: An Open Letter To Tim Cook

 

With one million titles and no human guides, the Apple App Store has become incomprehensible for mere mortals. A simple solution exists: curation by humans instead of algorithms.

Dear Tim,

You know the numbers better than anyone — I don’t need to quote them to you — but we all know that the iOS App Store is a veritable gold mine. Unfortunately, the App Store isn’t being mined in the best interests of Apple’s customers and developers, nor, in the end, in the interests of the company itself.

The App Store may be a gold mine, but it’s buried in an impenetrable jungle.

Instead of continuing with this complaint, I’ll offer a suggestion: Let humans curate the App Store.

Instead of using algorithms to sort and promote the apps that you permit on your shelves, why not assign a small group of adepts to create and shepherd an App Store Guide, with sections such as Productivity, Photography, Education, and so on. Within each section, this team of respected but unnamed (and so “ungiftable”) critics will review the best-in-class apps. Moreover, they’ll offer seasoned opinions on must-have features, UI aesthetics, and tips and tricks. A weekly newsletter will identify notable new titles, respond to counter-opinions, perhaps present a developer profile, footnote the occasional errata and mea culpa…

The result will be a more intelligible App Store that makes iOS users happier.

If I’m so convinced, why don’t I drive it myself? You might recall that I offered to do so — for free — in a brief lobby conversation at the All Things D conference a couple of years ago. The ever-hovering Katie Cotton gave me the evil eye and that was the end of the exchange.

I look back on my years at Apple with a certain affection, and would be happy to repay the company for what it did for me, so, yes, I would do it for free… but I can’t bankroll a half dozen tech writers, nor can I underwrite the infrastructure costs. And it won’t pay for itself: As an independent publication (or, more likely, an app) an App Store Guide isn’t financially viable. We know it’s next to impossible to entice people to pay for information and, as the Monday Note proves, I have no appetite for becoming a nano-pennies-per-pageview netwalker.

So, the App Store Guide must be an Apple publication, a part of its ecosystem.

Best,

JLG

PS: We both understand that ideas are just ideas, they’re not actual products. As Apple has shown time and again — and most vividly with the 30-year-old tablet idea vs. the actual iPad — it’s the product that counts. If you see the wisdom of a human-curated Apple App Guide, and I hope you do, I will not seek credit.

——————————-

Regular Monday Note readers will remember I already tilted at the App Store curation windmill: Why Apple Should Follow Michelin and the tongue-in-cheek Google’s Red Guide to the Android App Store. Who knows, the third time might be the charm.

To play devil’s advocate, let’s consider a developer’s bad reaction to an Apple App Guide review. Suppose my MyNewApp gets a thumbs down in the Productivity section of the Guide. I’m furious; I write Tim or Eddy Cue an angry letter, huff and puff, threaten to take my business elsewhere — to Windows Phone, for example. I exhort my friends, family, and satisfied customers to contribute to a letter-writing campaign…

Why risk this sort of backlash? Particularly when today’s formula of “featuring” apps seems to be working:

330-Apps_curation

But…does it really work all that well? Today’s way of choosing this app over that one already upsets the non-chosen. Further, the stars used to “measure” user feedback are known to be less than reliable. A thoughtful, detailed, well-reasoned review would serve customers and developers alike.

This leads us to the Guide’s most important contribution to the app universe: Trust. An Apple-sponsored App Guide can be trusted for a simple reason: The company’s one and only motive is to advance its users’ interests by making the App Store more trustworthy, more navigable. As for developers, they can rely on a fair and balanced (seriously) treatment of their work. The best ones will be happier and the “almost best” others will see an opportunity to get their improved work noticed in a future review cycle.

There is also the temptation to shrug the suggestion off with the customary ‘Don’t fix it, it’s not broken.’  Sorry, no, it is broken. See what Marco Arment, a successful Apple developer, says on his blog [emphasis mine]:

“Apple’s App Store design is a big part of the problem. The dominance and prominence of “top lists” stratifies the top 0.02% so far above everyone else that the entire ecosystem is encouraged to design for a theoretical top-list placement that, by definition, won’t happen to 99.98% of them. Top lists reward apps that get people to download them, regardless of quality or long-term use, so that’s what most developers optimize for. Profits at the top are so massive that the promise alone attracts vast floods of spam, sleaziness, clones, and ripoffs.”

and…

“Quality, sustainability, and updates are almost irrelevant to App Store success and usually aren’t rewarded as much as we think they should be, and that’s mostly the fault of Apple’s lazy reliance on top lists instead of more editorial selections and better search.

The best thing Apple could do to increase the quality of apps is remove every top list from the App Store.”

We can now turn to my own biases.

Why do I care? Good question: I’m now 70 and could just sit in zazen and enjoy the show. And there’s a lot of show to enjoy: The tech industry is more exciting now than when I was a rookie at HP France in 1968. But in today’s app stores, the excitement fades — and I’m not just talking about Apple; Android’s Google Play is every bit as frustrating. I see poorly exploited gold mines where quantity obscures quality and the lack of human curation ruins the Joy of Apps. There are caves full of riches but, most of the time, I can’t find a path to the mother lode.

Is it a lack of courage in anticipation of imagined protests? Hunger sated by too much success too soon? An addiction to solving all problems by algorithm instead of by human judgment?

I hope it’s none of these, and that we’ll soon see a newsletter/blog and a reasoned, regularly enriched guide that leads us to the better App Store titles.

—JLG

 

Macintel: The End Is Nigh

When Apple announced its 64-bit A7 processor, I dismissed the speculation that this could lead the Macintosh line to switch away from Intel chips to a homegrown “desktop-class” chip. I might have been wrong.

“I don’t know exactly when, but sooner or later, Macs will run on Apple-designed ARM chips.” Thus spake Matt Richman in a 2011 blog post titled “Apple and ARM, Sitting in a Tree”. Richman explained why, after a complicated but ultimately successful switch from PowerPC chips to Intel processors in 2005, Apple will make a similar switch, this time to ARM-based descendants of the A4 chip designed by Apple and manufactured by Samsung.

Cost is the first reason invoked for the move to an An processor:

“Intel charges $378 for the i7 chip in the new high-end 15 inch MacBook Pro. They don’t say how much they charge for the i7 chip in the low-end 15 inch MacBook Pro, but it’s probably around $300. …When Apple puts ARM-based SoC’s in Macs, their costs will go down dramatically. ”

We all know why Intel has been able to command such high prices. Given two microprocessors with the same manufacturing cost, power dissipation, and computing power, but where one runs Windows and the other doesn’t, which chip will achieve the higher market price in the PC market? Thus, Intel runs the table: It tells clone makers which new x86 chips they’ll receive, when they’ll receive them, and, most important, how much they’ll cost. Intel’s margins depend on it.

ARM-based processors, on the other hand, are inherently simpler and therefore cost less to make. Prices are driven even lower because of the fierce competition in the world of mobile devices, where the Wintel monopoly doesn’t apply.

329_A7chip

Cost is the foremost consideration, but power dissipation runs a close second. The aging x86 architecture is beset by layers of architectural silt accreted from a succession of additions to the instruction set. Emerging media formats demand new extensions, while obsolete constructs must be maintained for the sake of Microsoft’s backward compatibility religion. (I’ll hasten to say this has been admirably successful for more than three decades. The x86 nickname used to designate Wintel chips originates from the 8086 processor introduced in 1978 – itself a backward-compatible extension of the 8-bit 8080…)
Because of this excess baggage, an x86 chip needs more transistors than its ARM-based equivalent, and thus it consumes more power and must dissipate more heat.

Last but not least, Richman quotes Steve Jobs:

“I’ve always wanted to own and control the primary technology in everything we do.”

Apple’s leader has often been criticized for being too independent and controlling, for ignoring hard-earned industry wisdom. Recall how Apple’s decision to design its own processors was met with howls of protest, accusations of arrogance, and the usual predictions of doom.

Since then, interest in another Grand Processor Switch has been alive and well. Googling “Mac running on ARM” gets you close to 10M results. (When you Bing the same query, you get 220M hits — 22x Google’s results. SEO experts are welcome to comment.)

Back to the future…

In September 2013, almost a year ago already, Apple introduced the 64-bit A7 processor that powers new iPhones and iPads. The usual suspects pooh-poohed Apple’s new homegrown CPU, and I indulged in a little fun skewering the microprocessor truthers: 64 bits. It’s Nothing. You Don’t Need It. And We’ll Have It In 6 Months. Towards the end of the article, unfortunately, I dismissed the speculation that Apple An processors would someday power the Mac. I cited iMacs and Mac Pros — the high end of the product line — as examples of what descendants of the A7 couldn’t power.

A friend set me straight.

In the first place, Apple’s drive to own “all layers of the stack” continues unabated years after Steve’s passing. As a recent example, Apple created its own Swift programming language that complements its Xcode IDE and Clang/LLVM compiler infrastructure. (For kremlinology’s sake I’ll point out that there is an official Apple Swift blog, a first in Apple 2.0 history if you exclude the Hot News section of the apple.com site. Imagine what would happen if there was an App Store blog… But I digress.)

Secondly, the Mac line is suspended, literally, by the late delivery of Intel’s Broadwell x86 processors. (The delay stems from an ambitious move to a bleeding edge fabrication technology that shrinks the basic building block of a chip to 14 nanometers, down from 22 nanometers in today’s Haswell chips.) Of course, Apple and its An semiconductor vendor could encounter similar problems – but the company would have more visibility, more control of its own destiny.

Furthermore, it looks like I misspoke when I said an An chip couldn’t power a high-end Mac. True, the A7 is optimized for mobile devices: battery life, a small memory footprint, graphics for screens smaller than those of an iMac or a MacBook Pro with a Retina display. But having shown its muscle in designing a processor for the tight constraints of mobile devices, why would we think that the team that created the most advanced smartphone/tablet processor couldn’t now design a 3GHz A10 machine optimized for “desktop-class” (a term used by Apple’s Phil Schiller when introducing the A7) applications?

If we follow this line of reasoning, the advantages of ARM-based processors vs. x86 devices become even more compelling: lower cost, better power dissipation, natural integration with the rest of the machine. For years, Intel has argued that its superior semiconductor design and manufacturing technology would eventually overcome the complexity downsides of the x86 architecture. But that “eventually” is getting a bit stale. Other than a few showcase design wins that have never amounted to much in the real world, x86 devices continue to lose to ARM-derived SoC (System On a Chip) designs.

The Mac business is “only” $20B a year, while iPhones and iPads generate more than 5 times that. Still, $20B isn’t chump change (HP’s Personal Systems Group generates about $30B in revenue), and unit sales are up 18% in last June’s numbers vs. a year ago. Actually, Mac revenue ($5.5B) approaches the iPad’s flagging sales ($5.9B). Today, an 11” MacBook Air costs $899 while a 128GB iPad Air goes for $799. What would happen to the cost, battery life, and size of an A10-powered MacBook Air? And so on for the rest of the Mac line.

By moving to ARM, Apple could continue to increase its PC market share and scoop much of the profits – it currently rakes in about half of the money made by PC makers. And it could do this while catering to its customers in the Affordable Luxury segment who like owning both an iPad and a Mac.

While this is entirely speculative, I wonder what Intel’s leadership thinks when contemplating a future where their most profitable PC maker goes native.

JLG@mondaynote.com

———-

Postscript: The masthead on Matt Richman’s blog tells us that he’s now an intern at Intel. After reading several of his posts questioning the company’s future, I can’t help but salute Intel management’s open mind and interest in tightly reasoned external viewpoints.

And if it surprises you that Richman is a “mere” intern, be aware that he was all of 16 years old when he wrote the Apple and ARM post. Since then, his blog has treated us to an admirable series of articles on Intel, Samsung, Blackberry, Apple, Washington nonsense – and a nice Thank You to his parents.

 

Microsoft’s New CEO Needs An Editor

 

Satya Nadella’s latest message to the troops – and to the world – is disquieting. It lacks focus, specifics, and, if not soon sharpened, his words will worry employees, developers, customers, and even shareholders.

As I puzzled over the public email Microsoft’s new CEO sent to his troops, Nicolas Boileau’s immortal dictum came to mind:

Whatever is well conceived is clearly said,
And the words to say it flow with ease.

Clarity and ease are sorely missing from Satya Nadella’s 3,100 plodding words, which were supposed to paint a clear, motivating future for 127,000 Microsoftians anxious to know where the new boss is leading them.

LE WEB PARIS 2013 - CONFERENCES - PLENARY 1 - SATYA NADELLA

Nadella is a repeat befuddler. His first email to employees, sent just after he assumed the CEO mantle earlier this year, was filled with bombastic, hollow platitudes:

“We are the only ones who can harness the power of software and deliver it through devices and services that truly empower every individual and every organization. We are the only company with history and continued focus in building platforms and ecosystems that create broad opportunity.”

(More in the February 9th, 2014 Monday Note)

In his latest message, Nadella treats us to more toothless generalities:

“We have clarity in purpose to empower every individual and organization to do more and achieve more. We have the right capabilities to reinvent productivity and platforms for the mobile-first and cloud-first world. Now, we must build the right culture to take advantage of our huge opportunity. And culture change starts with one individual at a time.”

Rather than ceding to the temptation of quoting more gems, let’s turn to a few simple rules of exposition.

First, the hierarchy of ideas:

[Diagram: the hierarchy of ideas, four layers from Identity down to Plan]

This admittedly simplistic diagram breaks down an enterprise into four layers and can help diagnose thinking malfunctions.

The top layer deals with the Identity or Culture — I use the two terms interchangeably as one determines the other. One level down, we have Goals, where the group is going. Then come the Strategies or the paths to those goals. Finally, we have the Plan, the deployment of troops, time, and money.

The arrow on the left is a diagnostic tool. It reminds us that as we traverse the diagram from Identity to Plan, the number of words that we need to describe each layer increases.  It should only take a few words to limn a company’s identity (Schlumberger, oil services; Disney, family entertainment), describing the company’s goals will be just a tad more verbose (“in 5 years’ time we’ll achieve $X EPS, Y% revenue growth and Z% market share”), and so on.

The arrow also tells us that the “rate of change” — the frequency at which a description changes — follows the same trajectory. Identity should change only very slowly, if ever. At the other end, the plan will need constant adjustment as the company responds to rapidly shifting circumstances, the economy, the competition.

Using the old Microsoft as an example:
— Identity: We’re the emperor of PC software
— Goals: A PC on every desk and home – running our software
— Strategy: Couple the Windows + Office licenses to help OEMs see the light; Embrace and Extend Office competitors.
— Plan: Changes every week.

Returning to Nadella’s prose, can we mine it for words to fill the top three layers? Definitely not.

Second broken rule: Can I disagree? Any text that relies on platitudes says little; in a message-to-the-troops that’s supposed to give direction, irrefutable statements are deadly. Some randomly selected examples from an unfortunately overabundant field:

“[…] we will strike the right balance between using data to create intelligent, personal experiences, while maintaining security and privacy.”

or…

“Together we have the opportunity to create technology that impacts the planet.”

or…

“Obsessing over our customers is everybody’s job.”

If I’m presented with statements I cannot realistically disagree with – We Will Behave With Utmost Integrity – I feel there’s something wrong. If it’s all pro and no con, it’s a con.

There are other violations, but I’ll stop to avoid committing the tl;dr infraction I reproach Nadella for. One last rule: never make a general statement without immediately following it with the sacramental “For Example”.

For example:

“[…] we will modernize our engineering processes to be customer-obsessed, data-driven, speed-oriented and quality-focused.”

… would be more believable if followed by:

“Specifically, we’ll ask each software engineer to spend two days every month visiting customers on even months, and third party developers on odd ones. They will also spend one day per quarter seconding Customer Service Representatives over our phone banks.”

Satya Nadella is an unusually intelligent man, a Mensa-caliber intellect; he’s well-read, quoting Nietzsche, Oscar Wilde, and Rainer Maria Rilke. Why, then, does he repeatedly break basic storytelling rules?

Two possible explanations come to mind.

First, because he’s intelligent and literate, he forgot to use an unforgiving editor. ‘Chief, you really want to email that?’ Or, if he used an editor, he was victimized by a sycophantic one. ‘Satya, you nailed it!’

Second, and more likely, Nadella speaks in code. He’s making cryptic statements that are meant to prepare the troops for painful changes. Seemingly bland, obligatory statements about the future will decrypt into wrenching decisions:

“Organizations will change. Mergers and acquisitions will occur. Job responsibilities will evolve. New partnerships will be formed. Tired traditions will be questioned. Our priorities will be adjusted. New skills will be built. New ideas will be heard. New hires will be made. Processes will be simplified. And if you want to thrive at Microsoft and make a world impact, you and your team must add numerous more changes to this list that you will be enthusiastic about driving.”

In plainer English: Shape up or ship out.

Tortured statements from CEOs, politicians, coworkers, spouses, or suppliers, in no hierarchical order, mean one thing: I have something to hide, but I want to be able to say I told you the facts.

With all this in mind, let’s see if we can restate Nadella’s message to the troops:

This is the beginning of our new FY 2015 – and of a new era at Microsoft.
I have good news and bad news.
The bad news is the old Devices and Services mantra won’t work.

For example: I’ve determined we’ll never make money in tablets or smartphones.

So, do we continue to pretend we’re “all in” or do we face reality and make the painful decision to pull out so we can use our resources – including our integrity – to fight winnable battles? With the support of the Microsoft Board, I’ve chosen the latter. We’ll do our utmost to minimize the pain that will naturally arise from this change. Specifically, we’ll offer generous transition arrangements in and out of the company to concerned Microsoftians and former Nokians.

The good news is we have immense resources to be a major player in the new world of Cloud services and Native Apps for mobile devices. We let the first innings of that game go by, but the sting energizes us. An example of such commitment is the rapid spread of Office applications – and related Cloud services – on any and all mobile devices. All Microsoft Enterprise and Consumer products/services will follow, including Xbox properties.

I realize this will disrupt the status quo and apologize for the pain to come. We have a choice: change or be changed.

Stay tuned.

Or words (about 200) to that effect.

In parting, Nadella would do well to direct his attention to another literate individual, John Kirk, whose latest essay, Microsoft Is The Very Antithesis Of Strategy, is a devastating analysis that compares the company’s game plan to the advice given by Sun Tzu, Liddell Hart, and Carl von Clausewitz, writers who are more appropriate to the war that Microsoft is in than the authors Microsoft’s CEO seems to favor.

The CEO’s July 10th email promises more developments, probably around the July 22nd Earnings release. Let’s hope he’ll offer sharper and shorter words to describe Microsoft’s entry into the Cloud First – Mobile First era.

JLG@mondaynote.com

The Network Is the Computer: Google Tries Again

 

All you need is a dumb device attached to a smart network. It’s an old idea that refuses to die despite repeated failures. Now it’s Google’s turn.

In the late 1980s, Sun Microsystems used a simple, potent war cry to promote its servers: The Network Is The Computer. Entrust all of your business intelligence, computing power, and storage to Sun’s networked SPARC systems and you can replace your expensive workstation with a dumb, low cost machine. PCs are doomed.

Nothing of the sort happened, of course. Sun’s venture was disrupted by inexpensive servers assembled from the PC organ bank and running Open Source software.

PCs prospered, but that didn’t dampen the spirits of those who would rid us of them.

Fast-forward to the mid-1990s and the thought re-emerges in a new guise: The Browser Will Be The Operating System (a statement that’s widely misattributed to Marc Andreessen, who holds a more nuanced view on the matter). The browser will serve as a way to access networked services that will process your data. The actual OS on your device, what sort of apps it can run — or even if it can run any (other than a browser) — these questions will fade into insignificance.

Soon after, Oracle took a swing at the Network is the Computer piñata by defining the Network Computer Reference Profile (or NCRP), a specification that focused on network connectivity and deemphasized local storage and processing. It was understood, if not explicitly stated, that an NCRP device must be diskless. A number of manufacturers offered NCRP implementations, including Sun (which would ultimately be acquired by Oracle) with its JavaStation. But despite Larry Ellison’s strongly expressed belief that Network Computers would rid the industry of the evil Microsoft, the effort went nowhere.

Today, The Network Is The Computer lives on under the name Cloud Computing, the purest example of which is a Google Chromebook running on Chrome OS. (And thus, in a sense, Sun’s idea lives on: Google’s first investor was Sun co-founder Andy Bechtolsheim.)

So far, Chromebooks have shown only modest penetration (a topic for musings in a future Monday Note), but despite the slow adoption, Google has become one of the largest and most important Cloud Computing companies on the planet. Combine this with the Android operating system that powers more than a billion active devices, and the question arises: could Google bring us to the point where The Network Really Is The Computer?

It’s a complicated question, partly because the comparison with the previous generation of devices, traditional PCs, can (excuse me) cloud the view.

Unlike PCs, smartphones rely on an expensive wireless infrastructure. One can blame the oligopolistic nature of the wireless carrier industry (in English: too few companies to have a really competitive market), but that doesn’t change the simple fact that wireless bandwidth isn’t cheap. The dumber the device, the more it has to rely on the Cloud to process and store data, and the more bandwidth it will consume.

Let’s visit Marc Andreessen’s actual words regarding Network-As-Computer, from a 2012 Wired interview [emphasis mine]:

“[I]f you grant me the very big assumption that at some point we will have ubiquitous, high-speed wireless connectivity, then in time everything will end up back in the web model.”

If we interject, on Andreessen’s behalf, that wireless connectivity must be as inexpensive as it is ubiquitous, then we begin to see the problem. The “data hunger” of media intensive apps, from photo processing to games, shows no sign of slowing down. And when you consider the wireless bandwidth scarcity that comes from the rapid expansion of smartphone use, it seems that conditions are, yet again, conspiring against the “dumb device” model.

The situation is further confounded when we consider that Google’s business depends on delivering users to advertisers. Cloud computing will help drive down the cost of Android handsets and thus offer an even wider audience to advertisers…but these advertisers want a pleasant and memorable UI, they want the best canvas for their ads. When you dumb down the phone, you dumb down the ad playback experience.

In a recent blog post titled The next phase of smartphones, Benedict Evans neatly delineates the two leading “cloud views” by contrasting Apple and Google [emphasis mine]:

“Apple’s approach is about a dumb cloud enabling rich apps while Google’s is about devices as dumb glass that are endpoints of cloud services…”

But Google’s “dumb glass” can’t be too dumb.  For its mobile advertising business, Google needs to “see” everything we do on our smartphones, just like it does on our PCs. Evans intimates as much:

“…it seems that Google is trying to make ‘app versus web’ an irrelevant discussion – all content will act like part of the web, searchable and linkable by Google.”

Native apps running on a “really smart” device are inimical to Google’s business model. To keep the advertisers happy, Google would have to “instrument” native apps, insert deep links that will feed its data collection activities.

This is where the Apple vs. Google contrast is particularly significant: iOS apps are not allowed to let advertisers know what we are doing – unless explicitly authorized. Apple’s business model doesn’t rely on peddling our profile to advertisers.

In the end, I wonder if Google really believes in the “dumb glass” approach to smartphones. Perhaps, at least for now, The Computer will remain The Computer.

JLG@mondaynote.com