Posts by Jean-Louis Gassée:

The Intel Enigma


by Jean-Louis Gassée

Intel once turned down the opportunity to become the sole supplier of iPhone processors. Why haven’t they let go of their defocused search for the Next Big Thing and, instead, used All Means Necessary to regain the account?

Intel is a prosperous company. For the quarter ended last September, Intel scored $14.6B in Sales, 65% Gross Margin and $4.5B in Operating Income, a nice progression from the same period a year ago:

[Chart: Intel quarterly P&L]

A 65% Gross Margin is enviable for any company, and exceptional for a hardware maker: Intel’s GM is up in software territory. By comparison, Apple’s Gross Margin – considered too comfortable by followers of the Church of Market Share – stands at 38.6% for the 2014 Fiscal Year ended last September.

But when we take a closer look at the numbers, the picture isn’t as rosy. Nearly 90% of Intel’s revenue — $12.9B of the total $14.6B — comes from two groups: PC and Data Center (servers, networking, storage). Intel’s presence in the mobile world? Nonexistent:

[Chart: Intel revenue by segment]

Essentially no revenue for Mobile and Communications, and a $1B loss. Looking at the past four quarters, Intel has lost about $4B in the pursuit of the mobile market (Daniel Eran Dilger says $7B in the past two years).

How did Intel handle the problem? By sweeping it under the rug. In November, Intel CEO Brian Krzanich announced that the company was merging Mobile into the PC group and would discontinue its $51 per Android tablet subsidy in 2015. This came just weeks after Krzanich had proclaimed Mission Accomplished in the tablet field:

“‘We’ve made good progress getting into tablets’ Krzanich told reporters ahead of the annual Intel Developer Forum in San Francisco. ‘We’ve gone from nothing to something where I consider us a real tablet manufacturer.’”

The company’s inability to break into the mobile field — into any field other than PCs and servers — isn’t new, and it has worried Intel for decades. Company execs and strategists aren’t happy being the hardware half of Wintel, yoked to Microsoft’s fortunes. They like the money, but they want a “second source” for their profits, something other than the x86 market, so they’ve embarked on a never-ending quest for the next stage in the Intel rocket.

(Of course, the company isn’t blind to the benefits of the Wintel alliance: Given two processors of equal merit, the one running Windows fetches the higher price, hence the ferocious tactics that have landed the company in court on several occasions.)

In its search for the Next Big Thing, Intel has tried alternatives to the x86 architecture and come up with failures such as the iAPX 432 and the Itanium high-end server processor. The latter, a puzzling bet on an architecture co-developed with HP as the successor to its PA-RISC line, was quickly dubbed Itanic by tech wags as results failed to match lofty launch projections.

Intel has tried server farms, modems, networking equipment and, I kid you not, toy microscopes, but they somehow never got around to mobile. In the pre-iPhone days of the mobile world, the dominant players — Nokia, Motorola, Palm, Blackberry — all used processors based on the ARM architecture, processors that were too small and inexpensive to interest Intel. No money there: such chips cost a tenth or less of the price of a PC processor.

Steve Jobs offered Intel a chance to get into the mobile game: He asked the company to bid on an ARM-derivative for the iPhone. As Paul Otellini, Intel’s CEO at the time, wistfully and gallantly recounted, he gave the opportunity a pass, thinking the numbers (price and quantity) were too low. (An ex-Intel acquaintance told me that the business people felt they should go after Nokia, instead, because of its huge volume at the time.)

In 2006, after missing the iPhone, Intel sold its ARM processor business to Marvell.

When iPhones and Android-based smartphones took off, Intel insisted they weren’t concerned, that they would triumph in the end: We will win because our unapproachable manufacturing technology will produce x86 processors that are superior in every way to ARM-based competitors.

We’ve heard this line every year since. The latest version is summarized in this slide from a November Investor Meeting:

[Slide: Intel’s transistor technology lead, November Investor Meeting]

What Intel contends here is that they always have a three-year lead over their competition — it’s just a given. What company execs fail to explain is why smartphone manufacturers have failed to see the light, and why Android tablet makers had to be bribed.

Now it seems that Intel has discovered the Internet of Things… and Wearables, of course. If you have the patience, flip through this 66-slide presentation that tells us that IoT will be huge because the objects around us will all become intelligent (a story we’ve already heard from companies such as Cisco — which is also looking for its Next Big Thing).

[Slide: Intel’s “IoT everywhere” vision]

Naturally, wearables are in there:

[Slide: wearables in Intel’s IoT presentation]

This is painful. The whole presentation is an Everything And The Kitchen Sink assemblage of unoriginal ideas. There’s no focus in Intel’s Theory of Everything, no way to see when, where, and how the company will actually rise above the IoT noise.

As for wearables — now fashionable in more ways than one — Intel touts its new MICA bracelet:

[Photo: Intel’s MICA bracelet]

You can “pre-order” yours at Opening Ceremony and have it delivered in time for Christmas.

Let’s not forget Intel’s partnership with Google for the next-gen Google Glass, nor the company’s acquisition of Basis, a maker of fitness wearables.

Certainly, the more “initiatives” Intel throws at the wall the higher the chances that one of them will stick. But from the outside, it feels like Intel is being driven by courtiers and PowerPoint makers, that senior management really doesn’t know what to do – and what not to do. (Krzanich says he green-lighted the MICA project because his wife approved of it “after using it for several days”.)

Of all the things Intel should and shouldn’t have done, the Apple element figures mightily. Since Intel offered a whopping $51 Android tablet subsidy, a charity that landed its mobile activities $7B in the red over two years, why didn’t the company offer Apple a $10 or $20 subsidy per processor as a way to get the manufacturing relationship restarted? ‘We’ll beat Samsung’s prices, we’ll be your second source.’ If Intel’s 14nm process is so superior, how come Intel execs didn’t convince Apple to dump frenemy Samsung?

I see three possible answers.

One is that the 14 nanometer process is woefully late. Deliveries of some Broadwell chips (the code name of the next round of x86 processors) are now slated for early to mid-2015. Apple might feel that Intel’s process needs to mature before it can deliver 300M units.

The second is that Intel’s claim of a three-year technology lead might be less than reliable. Samsung could be closer to delivering 14nm chips than Intel would like us (and itself) to believe.

Or perhaps Intel sees Apple as a real adversary that’s intent on designing all of its own processors, even for laptops and desktops that are currently powered by x-86 chips. But even so, why not become the preferred fabricator?

The Intel enigma remains: There’s no clear, resounding answer to the What’s Next? question, only some lingering puzzlement over What Happened?

JLG@mondaynote.com

Apple Watch: Hard Questions, Facile Predictions


by Jean-Louis Gassée

Few Apple products have agitated forecasters and competitors as much as the company’s upcoming watch. The result is an escalation of silly numbers – and one profound observation from a timepiece industry insider.

Apple Watch 2015 sales predictions are upon us: 10 million, 20 million, 24 million, 30 million, even 40 million! Try googling “xx million apple watch”, you won’t be disappointed. Microsoft’s Bing doesn’t put a damper on the enthusiasm either: It finds a prediction for first year sales of 60 million Apple Watches!

These are scientific, irony-free numbers, based on “carefully weighed percentages of iPhone users” complemented by investigations into “supplier orders” and backed up by interviews with “potential buyers”. Such predictions reaffirm our notion that the gyrations and divinations of certain anal-ists and researchers are best appreciated as black comedy — cue PiperJaffray’s Gene Munster with his long-running Apple TV Set gag.

Fortunately, others are more thoughtful. They consider how the product will actually be experienced by real people and how the new Apple product will impact the watch industry.

As you’ll recall from the September 14th “Apple Watch Is And Isn’t”, Jean-Claude Biver, the LVMH executive in charge of luxury watch brands such as Hublot and TAG Heuer, offered his frank opinion of the “too feminine” Apple Watch:

“To be totally honest, it looks like it was designed by a student in their first trimester.” 

At the time, it sounded like You Don’t Need This sour grapes from a disconcerted competitor. But recently, Biver has also given us deeper, more meaningful thoughts:

“A smartwatch is very difficult for us because it is contradictory,” said Mr. Biver. “Luxury is supposed to be eternal … How do you justify a $2,000 smart watch whose technology will become obsolete in two years?” he added, waving his iPhone 6. 

Beautiful. All the words count. Luxury and Eternity vs. Moore’s Law.

To help us think about the dilemma that preoccupies the LVMH exec, let’s take a detour through another class of treasured objects: Single Lens Reflex cameras.

[Photo: Nikon F Photomic FTn]


Unless you were a photojournalist or fashion photographer taking hundreds of pictures a day, these cameras lasted forever. A decade of use would come and go without impact on the quality of your pictures or the solid feel of the product. People treasured their Hasselblads, Leicas (not an SLR), Canons, and more obscure marques such as the Swiss Alpa. (I’m a bit partial here: I bought a Nikon exactly like the one pictured above back in 1970.)

These were purely mechanical marvels. No battery: the light sensor was powered by… light.

Then, in the mid-nineties, digital electronics began to sneak in. Sensor chips replaced silver-halide film; microcomputers automated more and more of the picture-taking process.

The most obvious victim was Eastman Kodak, a company that had dominated the photographic film industry for more than a century – and filed for bankruptcy in 2012. (A brief moment of contemplation: Kodak owned many digital photography patents and even developed the first digital camera in 1975, but “…the product was dropped for fear it would threaten Kodak’s photographic film business.” [Wikipedia].)

The first digital cameras weren’t so great. Conventional film users rightly criticized the lack of resolution, the chromatic aberrations, and other defects of early implementations. But better sensors, more powerful microprocessors, and clever software won the day. A particular bit of cleverness that has saved a number of dinner party snapshots was introduced in the late-nineties: A digital SLR sends a short burst of flash to evaluate the scene, and then uses the measurements to automatically balance shutter speed and aperture, thus correcting the classical mistake of flooding the subject in the foreground while leaving the background in shadows.
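
To make the balancing act concrete, here is a toy sketch in Python. Everything in it is illustrative: the guide-number rule and the EV formula are standard photography math, but the function, its parameters, and the numbers are mine, not any camera’s firmware (a real body meters the preflash reflection instead of trusting a guide number, and bounds the exposure time).

```python
def balanced_fill_flash(guide_number, subject_distance_m, background_ev):
    # The flash exposes the foreground subject: the classic guide-number
    # rule picks the aperture (N = GN / distance).
    aperture = guide_number / subject_distance_m
    # Ambient light exposes the background: at ISO 100, EV = log2(N^2 / t),
    # so the shutter stays open for t = N^2 / 2^EV seconds.
    shutter_s = aperture ** 2 / 2 ** background_ev
    return aperture, shutter_s

# A dim room (EV 5), subject two meters away, small guide-number-11 flash:
f_number, seconds = balanced_fill_flash(11, 2.0, 5)
print(f"f/{f_number:.1f} at {seconds:.2f}s")
# -> f/5.5 at 0.95s: a slow exposure that lets the background register
#    instead of leaving it in shadow behind a flash-lit subject.
```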

Digital cameras have become so good we now have nostalgia “film packs” that recreate the defects — sorry, the ambiance — of analog film stock such as Ektachrome or Fuji Provia.

But Moore’s Law exacts a heavy price. At the high end, the marvelous digital cameras from Nikon, Canon, and Sony are quickly displaced year after year by new models that have better sensors, faster microprocessors, and improved software. Pros and prosumers can move their lenses — the most expensive pieces of their equipment — from last year’s model to this one’s, but the camera body is obsolete. In this regard, the most prolific iterator seems to be Sony, today’s king of sensor chips; the company introduces new SLR models once or twice a year.

At the medium to low end, the impact of Moore’s law was nearly lethal. Smartphone cameras have become both so good and so convenient (see Chase Jarvis’ The Best Camera is the One That’s With You) that they have displaced almost all other consumer picture taking devices.

What does the history of cameras say for watches?

At the high-end, a watch is a piece of jewelry. Like a vintage Leica or Canon mechanical camera, a Patek watch works for decades, it doesn’t use batteries, and it doesn’t run on software. Mechanical watches have even gained a retro chic among under-forty urbanites who have never had to wind a stem. (A favorite of techies seems to be the Officine Panerai.)

So far, electronic watches haven’t upended the watch industry. They’ve mostly replaced a spring with a battery and have added a few functions and indicator displays – with terrible user interfaces. This is about to change. Better/faster/cheaper components are poised to invade watches: sensors, microprocessors + software, wireless links…

Jean-Claude Biver is right to wonder how the onslaught of ever-improving technology will affect the “eternity” of the high-end, fashion-conscious watch industry… and he’ll soon find out: He’s planning a (yet-to-be-announced) TAG Heuer smartwatch.

With this in mind, Apple’s approach is intriguing: The company plays the technology angle, of course, and has loaded their watch with an amazing — some might say disquieting — amount of hardware and software, but they also play the fashion and luxury game. The company invited fashion writers to the launch; it hosted a celebrity event at Colette in Paris with the likes of Karl Lagerfeld and Anna Wintour in attendance. The design of the watch, the choice of materials for the case and bands/bracelets… Apple obviously intends to offer customers a differentiated combination of traditional fashion statement and high-tech functions.

But we’re left with a few questions…

Battery life is one question — we don’t know what it will be. The Apple Watch user interface is another.

The product seems to be loaded with features and apps… will users “get” the UI, or will they abandon hard-to-use functions, as we’ve seen in many of today’s complicated watches?

But the biggest question is, of course, Moore’s Law. Smartphone users have no problem upgrading every two years to new models that offer enticing improvements, but part of that ease is afforded by carrier subsidies (and the carriers play the subsidy game well, despite their disingenuous whining).

There’s no carrier subsidy for the Apple Watch. That could be a problem when Moore’s Law makes the $5K high-end model obsolete. (Expert Apple observer John Gruber has wondered if Apple could just update the watch processor or offer a trade-in — that would be novel.)

We’ll see how all of this plays out with regard to sales. I’ll venture that the first million or so Apple Watches will sell easily. I’ll certainly buy one, the entry-level Sports model with the anodized aluminum case and elastomer band. If I like it, I’ll even consider the more expensive version with a steel case and ingenious Marc Newson link bracelet — reselling my original purchase should be easy enough.

Regardless of the actual sales, first-week numbers won’t matter. It’s what happens after that that matters.

Post-purchase Word of Mouth is still the most potent marketing device. Advertising might create awareness, but user buzz is what makes or breaks products such as a watch or phone (as opposed to cigarettes and soft drinks). It will take a couple of months after the Apple Watches arrive on the shelves before we can judge whether or not the product will thrive.

Only then can we have a sensible discussion about how the luxury segment of the line might plan to deal with the eternity vs. Moore’s Law question.

JLG@mondaynote.com

Clayton Christensen Becomes His Own Devil’s Advocate


by Jean-Louis Gassée

Every generation has its high tech storytellers, pundits who ‘understand’ why products and companies succeed and why they fail. And each next generation tosses out the stories of their elders. Perhaps it’s time to dispense with “Disruption”.

“I’m never wrong.”

Thus spake an East Coast academic, who, in the mid- to late-eighties, parlayed his position into a consulting money pump. He advised — terrorized, actually — big company CEOs with vivid descriptions of their impending failure, and then offered them salvation if they followed his advice. His fee was about $200K per year, per company; he saw no ethical problem in consulting for competing organizations.

The guru and I got into a heated argument while walking around the pool at one of Apple’s regular off-sites. When I disagreed with one of his wild fantasies, his retort never varied: I’m never wrong.

Had I been back in France, I would have told him, in unambiguous and colorful words, what I really thought, but I had acclimated myself to the polite, passive-aggressive California culture and used therapy-speak to “share my feelings of discomfort and puzzlement” at his Never Wrong posture. “I’ve always been proved right… sometimes it simply takes longer than expected”, was his comeback. The integrity of his vision wasn’t to be questioned, even if reality occasionally missed its deadline.

When I had entered the tech business a decade and a half earlier, I marveled at the prophets who could part the sea of facts and reveal the True Way. Then came my brief adventures with the BCG-advised diversification of Exxon into the computer industry.

Preying on the fear of The End of Oil in the late-seventies, consultants from the prestigious Boston firm hypnotized Exxon executives with their chant: Information Is The Oil of The 21st Century. Four billion dollars later (a lot of money at the time), Exxon finally recognized the cultural mismatch of the venture and returned to its well-oiled habits of heart and mind.

It was simply a matter of time, but BCG was ultimately proved right — we now have our new Robber Barons of zeroes and ones. But the consultants were wrong about something even more fundamental but slippery, something they couldn’t divine from their acetate foils: culture.

A little later, we had In Search of Excellence, the 1982 best-seller that turned into a cult. Tom Peters, the more exuberant of the book’s two authors, was a constant on pledge-drive public TV. As I watched him one Sunday morning with the sound off, his sweaty fervor and cutting gestures reminded me of the Bible-thumping preacher, Jimmy “I Sinned Against You” Swaggart. (These were my early days in California; I flipped through a lot of TV channels before Sunday breakfast, dazzled by the excess.)

Within a couple of years, several of the book’s exemplary companies — NCR, Wang, Xerox — weren’t doing so well. Peters’ visibility led to noisy accusations and equally loud denials of faking the data, or at least of carefully picking particulars.

These false prophets commit abuses under the color of authority. They want us to respect their craft as a form of science, when what they’re really doing is what Neil Postman, one of my favorite curmudgeons, views as simple storytelling: They felicitously arrange the facts in order to soothe anxiety in the face of a confusing if not revolting reality. (Two enjoyable and enlightening Postman books: Conscientious Objections, a series of accessible essays, and Amusing Ourselves To Death, heavier, very serious fare.)

A more recent and widely celebrated case of storytelling in a scientist’s lab coat is Clayton Christensen’s theory of disruptive innovation. In order to succeed these days — and, especially, to pique an investor’s interest — a new venture must be disruptive, with extra credit if the disrupter has attended the Disrupt conference and bears a Renommierschmiss from the Startup Battlefield.

[Photo: Clayton Christensen. Credit: www.claytonchristensen.com]

Christensen’s body of work is (mostly) complex, sober, and nuanced storytelling that’s ill-served by the overly-simple and bellicose Disruption! battle cry. Nonetheless, I’ll do my share and provide my own tech world simplification: The incumbency of your established company is forever threatened by lower cost versions of the products and services you provide. To avoid impending doom, you must enrich your offering and engorge your price tag. As you abandon the low end, the interloper gains business, muscles up, and chases you farther up the price ladder. Some day — and it’s simply a matter of time — the disruptor will displace you.

According to Christensen, real examples abound. The archetypes, in the tech world, are the evolution of the disk drive, and the disruptive ascension from mainframe to minicomputer to PC – and today’s SDN (Software Defined Networking) entrants.

But recently, skeptical voices have disrupted the Disruption business.

Ben Thompson (@monkbent) wrote a learned paper that explains What Clayton Christensen Got Wrong. In essence, Ben says, disruption theory is an elegant explanation of situations where the customer is a business that’s focused on cost. If the customer is a consumer, price is often trumped by the ineffable values (ease-of-use, primarily) that can only be experienced, that can’t be described in a dry bullet list of features.

More broadly, Christensen came under attack by Jill Lepore, the New Yorker staff writer who, like Christensen, is a Harvard academic. In a piece titled The Disruption Machine, What the gospel of innovation gets wrong, Lepore asserts her credentials as a techie and then proceeds to point out numerous examples where Christensen’s vaunted storytelling is at odds with facts [emphasis and edits mine]:

“In fact, Seagate Technology was not felled by disruption. Between 1989 and 1990, its sales doubled, reaching $2.4 billion, “more than all of its U.S. competitors combined,” according to an industry report. In 1997, the year Christensen published ‘The Innovator’s Dilemma,’ Seagate was the largest company in the disk-drive industry, reporting revenues of nine billion dollars. Last year, Seagate shipped its two-billionth disk drive. Most of the entrant firms celebrated by Christensen as triumphant disrupters, on the other hand, no longer exist.

Between 1982 and 1984, Micropolis made the disruptive leap from eight-inch to 5.25-inch drives through what Christensen credits as the ‘Herculean managerial effort’ of its C.E.O., Stuart Mahon. But, shortly thereafter, Micropolis, unable to compete with companies like Seagate, failed. 

MiniScribe, founded in 1980, started out selling 5.25-inch drives and saw quick success. ‘That was MiniScribe’s hour of glory,’ the company’s founder later said. ‘We had our hour of infamy shortly after that.’ In 1989, MiniScribe was investigated for fraud and soon collapsed; a report charged that the company’s practices included fabricated financial reports and ‘shipping bricks and scrap parts disguised as disk drives.’”

Echoes of the companies that Tom Peters celebrated when he went searching for excellence.

Christensen is admired for his towering intellect and also for his courage facing health challenges — one of my children has witnessed both and can vouch for the scholar’s inspiring presence. Unfortunately, his reaction to Lepore’s criticism was less admirable. In a BusinessWeek interview Christensen sounds miffed and entitled:

“I hope you can understand why I am mad that a woman of her stature could perform such a criminal act of dishonesty—at Harvard, of all places.”

At Harvard, of all places. Hmmm…

In another attempt to disprove Jill Lepore’s disproof, a San Francisco-based investment banker wrote a scholarly rearrangement of Disruption epicycles. In his TechCrunch post, the gentleman glows with confidence in his use of the theory to predict venture investment successes and failures:

“Adding all survival and failure predictions together, the total gross accuracy was 84 percent.”

and…

“In each case, the predictions have sustained 99 percent levels of statistical confidence without a flinch.”

Why the venture industry hasn’t embraced the model, and why the individual hasn’t become richer than Warren Buffett as a result of the unflinching accuracy, remains a story to be told.

Back to the Disruption sage: he didn’t help his case when, as soon as the iPhone came out, he predicted Apple’s new device was vulnerable to disruption:

“The iPhone is a sustaining technology relative to Nokia. In other words, Apple is leaping ahead on the sustaining curve [by building a better phone]. But the prediction of the theory would be that Apple won’t succeed with the iPhone. They’ve launched an innovation that the existing players in the industry are heavily motivated to beat: It’s not [truly] disruptive. History speaks pretty loudly on that, that the probability of success is going to be limited.”

Not truly disruptive? Five years later, in 2012, Christensen had an opportunity to let “disruptive facts” enter his thinking. But no, he stuck to his contention that Modularity always defeats integration:

“I worry that modularity will do its work on Apple.”

In 2013, Ben Thompson, in his already quoted piece, called Christensen out for sticking to his theory:

“[…] the theory of low-end disruption is fundamentally flawed. And Christensen is going to go 0 for 3.”

Perhaps, like our poolside guru, Christensen believes he’s always right…but, on rare occasions, he’s simply wrong on the timing.

Apple will, of course, eventually meet its maker, whether through some far off, prolonged mediocrity, or by a swift, regrettable decision. But such predictions are useless, they’re storytelling – and a bad, facile kind at that. What would be really interesting and courageous would be a detailed scenario of Apple’s failure, complete with a calendar of main steps towards the preordained ending. No more Wrong on the Timing excuses.

A more interesting turn for a man of Christensen’s intellect and reach inside academia would be to become his own Devil’s Advocate. Good lawyers pride themselves on researching their cases so well they could plead either side. Perhaps Clayton Christensen could explain, with his usual authority, how the iPhone defines a new theory of innovation. Or why the Macintosh has prospered and ended up disrupting the PC business by sucking up half of the segment’s profits. He could then draw comparisons to other premium goods that are happily chosen by consumers, from cars to clothes and… watches.

JLG@mondaynote.com

Payment Systems Adventures – Part II: Counting Friends And Foes


by Jean-Louis Gassée

It’s still too early to tell if Apple Pay will square the circle and emerge as a payment system that’s more secure, more convenient, and widely accepted. MCX, a competing solution that faces more challenges than Apple Pay, helps shed light on the problem.

Apple Pay was announced on September 9th with the new iPhone 6, and rolled out on October 20th.

Where it works, it works well. The roster of banks and merchants that accept Apple’s new payment system is impressive, with big names such as Visa, American Express, Bank of America, Macy’s, Walgreens, and Whole Foods.

But it doesn’t work everywhere.

At launch, Apple Pay covered just a corner of the territory blanketed by today’s debit and credit cards. Then we had a real surprise. Within 24 hours of the roll-out, a handful of merchants, notably CVS, Rite-Aid, Target, and Wal-Mart, pulled the plug on Apple Pay. Apparently, these retailers suddenly remembered they had signed an exclusive agreement with Merchant Customer Exchange (MCX), a consortium of merchants that’s developing a competing payment system and mobile app called CurrentC. How a company as well-managed as CVS could have “forgotten” about its contract with MCX, and what the threatened consequences were for this lapse of memory aren’t known…yet.

We could wade through the professions of good faith and sworn allegiance (“We are committed to offering convenient, reliable, and secure payment methods that meet the needs of our customers”, says Rite Aid PR flack Ashley Flower), but perhaps we’re better off just listing MCX’s Friends and Foes.

Let’s start with the Foes: MCX hates credit cards. As Ron Shevlin of Snarketing 2.0 reports, the hatred isn’t even veiled:

“At last year’s BAI Retail Delivery conference…I asked Mr. Scott [Lee Scott, former Wal-Mart CEO] why, in the face of so many failed consortia before it, would MCX succeed? He said: ‘I don’t know that it will, and I don’t care. As long as Visa suffers.’”

This open animosity is understandable. When we look at Wal-Mart’s latest financials, we see that the company’s net income is 3.1% of sales. A typical Visa transaction costs them 1.51% of the amount that was charged. (See Credit Card Processing Fees & Rates for mind-numbing esoterica.)

For Wal-Mart and other big merchants, this 1.51% “donation” cuts too close to the bone, which is why they banded together to form the MCX consortium.
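
A quick back-of-the-envelope calculation, using only the two percentages quoted above, shows why (the $100 basket is just an illustration):

```python
sale = 100.00                # an illustrative $100 basket
net_income = 0.031 * sale    # Wal-Mart nets about $3.10 on it
card_fee = 0.0151 * sale     # Visa's cut on a credit transaction: $1.51

# The fee is roughly half the size of the merchant's entire net margin,
# which is why "1.51%" sounds small and feels enormous.
print(f"{card_fee / net_income:.0%}")   # -> 49%
```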

So we know who MCX’s Foes are…but does it have any Friends?

Not really. Counting the MCX merchants themselves as Friends is a bit of a circular argument — no sin there, it’s business — but it doesn’t build a compelling case for the platform.

What about consumers?

On paper, the MCX idea is simple: You download the CurrentC app onto your mobile phone and connect it to a bank account (ABA routing and account number). When it comes time to pay for a purchase, CurrentC displays a QR code that you present to the cashier. The code is scanned, there’s a bit of network chatter, and money is pumped directly out of your bank account.
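
Here’s a minimal sketch of that flow in Python. MCX never published its protocol, so every name and payload below is hypothetical; the point is only the shape of the exchange: one-time code on the phone, scan at the register, direct debit from the bank account.

```python
import uuid

def phone_requests_code(backend, bank_account):
    # 1. The CurrentC app asks the backend for a one-time code
    #    tied to the shopper's bank account (hypothetical API).
    code = str(uuid.uuid4())
    backend[code] = bank_account
    return code                      # rendered on screen as a QR code

def register_scans_code(backend, code, amount):
    # 2. The cashier scans the QR code; the backend resolves it...
    account = backend.pop(code)      # one-time: the code is consumed
    # 3. ...and pulls the money straight from the account (an ACH debit).
    return f"ACH debit of ${amount:.2f} from {account}"

backend = {}
code = phone_requests_code(backend, "routing# + account#")
print(register_scans_code(backend, code, 42.00))
```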

Set-up details are still a bit sketchy. For example, the CurrentC trial run required the customer’s social security and driver’s license numbers in addition to the bank info. MCX says it doesn’t “expect” to have these additional requirements when CurrentC launches in early 2015, but I’m not sure that it matters. The requirement that the customer supply full banking details and then watch as money is siphoned off without delay is essentially no different from a debit card — but with a middle man inserted into the process. And while debit card use surpassed credit cards as far back as 2007, US shoppers are loath to leave the warm embrace of their credit cards when it comes to big ticket purchases (average debit card charge in 2012: $37; credit card: $97; see here for yet more esoterica).

What do MCX and CurrentC offer that would entice consumers to abandon their credit and debit cards and give merchants direct access to their bank accounts? The consortium can’t offer much in the way of financial incentives, not when the whole point is to remedy Visa’s 1.51% processing fee.

Now let’s look at Apple Pay; first, consumers.

Apple has recognized the strong bond between consumers and their credit cards: The average wallet contains 3.7 cards, with a balance of $7.3K outstanding. Apple Pay doesn’t replace credit cards so much as it makes the relationship more secure and convenient.

Setup is surprisingly error-free — and I’m always expecting bugs (more on that in a future note). The credit card that’s connected to your iTunes account is used by default; all you have to do is launch Passbook and re-enter the CVV number on the back. If you want to use a different credit card account, you take a picture of the card and Passbook verifies it with the issuer. Debit cards also work, although you have to call the bank… as in an actual telephone call. In my case, the bank had a dedicated 877 number. Less than 30 seconds later a confirmation appeared on my device.

Paying is simple: Gently tap the phone on a compatible, NFC-enabled point-of-sale terminal and place a registered finger on the TouchID button; the phone logs the transaction in Passbook and then vibrates pleasantly to confirm.


On the security side, Apple Pay doesn’t store your credit card number, neither on your phone nor on Apple’s servers. Instead, the card is represented by an encrypted token; the most you can ever see are the last four digits of the card — even on an unlocked phone, even when you’re deleting a card from your Passbook.

Simplifying a bit (or a lot), during a transaction this encrypted token is sent through the NFC terminal back to your bank where it’s decrypted. Not even the merchant can see the card.
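
Simplifying even further, the division of labor looks something like the sketch below. It is deliberately schematic: Apple Pay’s real scheme involves a device account number held in a Secure Element plus a one-time cryptogram per transaction, and all the names here are mine, not Apple’s.

```python
import secrets

class IssuerVault:
    """The bank side: the only place the real card number lives."""
    def __init__(self):
        self._cards = {}                  # token -> real card number

    def enroll(self, card_number):
        token = secrets.token_hex(8)      # opaque token handed to the phone
        self._cards[token] = card_number  # the card number never leaves here
        return token

    def settle(self, token, amount):
        card = self._cards[token]         # only the issuer can map it back
        return f"${amount:.2f} charged to card ending {card[-4:]}"

vault = IssuerVault()
device_token = vault.enroll("4111111111111111")
# The merchant's NFC terminal only ever sees and forwards the token:
print(vault.settle(device_token, 25.00))
```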

We can also count the banks and credit card companies as Friends of Apple Pay. For them, nothing much changes. A small fee goes to Apple (0.15%, $1 for every $700). Apple Pay isn’t meant to make money in itself; its goal is to make iDevices more pleasant, more secure.

Banks also like the potential for cutting down on fraud. In 2013, payment card fraud was pegged at $14B globally, with half of that in the US. How deeply Apple Pay will cut into this number isn’t known, but the breadth and warmth of Apple Pay adoption by financial institutions speaks for their expectations. Wells Fargo, for example, put up a large billboard over the 101 freeway and promoted the service on social media.

What about merchants? This is a mixed bag; some seem to be fully on board, although, as ever, we mustn’t judge by what they say, for the flackery on the left is just as disingenuous as the flackery on the right. Regard the declaration from pro-Apple Pay Walgreens: “Incorporating the latest mobile technology into our business is another way we are offering ultimate convenience for our customers.” Sound familiar?

Others, such as Wal-Mart, are resolute Foes. Of the fence sitters, time will tell if they’ll jump into the Apple Pay camp or desert it. It’s still very early.

Questions remain regarding “loyalty” programs, a cynical word if there ever was one when considering the roach motels of frequent flyer miles. A quick look at in-app payments provides a possible answer.

One such example, no surprise, is Apple’s own App Store app where you can pay with Apple Pay after scanning an accessory’s barcode. The app triggers a confirmation email that shows that the merchant, Apple, is aware of the transaction. Other merchants can, and will, build their own apps, but there’s still the question of how a loyalty program will work for point-of-sale transactions where merchants can’t see your data.

In a clumsily worded comparison, MCX CEO Dekkers Davidson tries to imply that his company’s exclusivity requirement is much like AT&T’s arrangement with Apple in the early days of the iPhone, an arrangement that wasn’t permanent and that worked out well for both parties. In the meantime, one can visualize Apple engaging in an encircling action, patiently adding partners and features quarter after quarter.

We’ll know soon if this battle is won before it’s even fought.

JLG@mondaynote.com

Cultural Adventures In Payment Systems – Part I


by Jean-Louis Gassée

Payment systems and user behaviors have evolved over the past three decades. In this first of a two-part Monday Note, I offer a look at the obstacles and developments that preceded the Apple Pay launch.

When I landed in Cupertino in 1985, I was shocked, shocked to find that so much gambling was going on in here. But it wasn’t the Rick’s Café Américain kind of gambling, it was the just-as-chancy use of plastic: Colleagues would heedlessly offer their credit card numbers to merchants over the phone; serious, disciplined executives would hand their AmEx Platinums to their assistants without a second thought.

This insouciant way of doing business was unheard of in my Gallic homeland. The French (and most Europeans) think that trust is something that must be earned, that it has a value that is debased when it’s handed out too freely. They think an American’s trusting optimism is naïve, even infantile.

After I got over my shock, I came to see that my new countrymates weren’t such greenhorns. They understood that if you want to lubricate the wheels of commerce, you have to risk an occasional loss, that the rare, easily-remedied abuses are more than compensated for by a vibrant business. It wasn’t long before I, too, was asking my assistant to run to the store with my Visa to make last-minute purchases before a trip.

(On the importance of Trust and its contribution to The Wealth of Nations — or their poverty — see Alain Peyrefitte’s La Société de Confiance [The Society of Trust]. Unfortunately the work hasn’t been translated into English, unlike two of Peyrefitte’s other books, The Trouble with France and the prophetic 1972 best-seller The Immobile Empire. The title of the latter is a deplorable translation of Quand la Chine s’éveillera… Le monde tremblera, “When China Awakes, The World Will Shake”, a foreboding attributed to Napoleon.)

The respective attitudes towards trust point out a profound cultural difference between my two countries. But I also noticed other differences that made my new environment feel a little antiquated.

For example, direct deposit and direct deduction weren’t nearly as prevalent in America as in France. In Cupertino, I received a direct deposit paycheck, but checks to cover expenses were still “cut”, and I had to write checks for utilities and taxes and drop them in the mailbox.

Back in Paris, everything had been directly wired into and out of my bank account. Utilities were automatically deducted ten days after the bill was sent, as mandated by law (the delay allowed for protests and stop-payments if warranted). Paying taxes was ingeniously simple: Every month through October, a tenth of last year’s total tax was deducted from your bank account. In November and December, you got a reprieve for Holiday spending fun (or, if your income had gone up, additional tax payments to Uncle François — Mitterrand at the time, not Hollande).

Like a true Frenchman, I once mocked these “primitive” American ways in a conversation with a Bank of America exec in California. A true Californian, she smiled, treated me to a well-rehearsed Feel-Felt-Found comeback, and then, dropping the professional mask, she told me that the distrust of electronic commerce that so astonished me here in Silicon Valley (of all places) was nothing compared to Florida, where it’s common for retirees to cash their Social Security checks at the bank, count the physical banknotes and coins, and then deposit the money into their accounts.

Perhaps this was the heart of the “Trust Gap” between Europe and the US: Europeans have no problem trusting electronic commerce as long as it doesn’t involve people; Americans trust people, not machines.

My fascination with electronic payment modes preceded my new life in Silicon Valley. In 1981, shortly after starting Apple France, I met Roland Moreno, the colorful Apple ][ hardware and software developer who invented the carte à puce (literally “chip card”, but better known as a “smart card”) that’s found in a growing number of credit cards, and in mobile phones where it’s used as a Subscriber Identity Module (SIM).


The key to Moreno’s device was that it could securely store a small amount of information, hence its applicability to payment cards and mobile phones.

I carried memories of my conversations with Moreno with me to Cupertino. In 1986, we briefly considered adding a smart card reader to the new ADB Mac keyboard, but nothing came of it. A decade later, Apple made a feeble effort to promote the smart card for medical applications such as a patient ID, but nothing came of that, either.

The results of the credit card industry’s foray into smart card technology were just as tepid. In 2002, American Express introduced its Blue smart card in the US with little success:

“But even if you have Blue (and Blue accounts for nearly 10% of AmEx’s 50 million cards), you may still have a question: What the hell does that chip (and smart cards in general) do?

The answer: Mostly, nothing. So few stores have smart-card readers that Blue relies on its magnetic strip for routine charges.”

In the meantime, the secure smart chip found its way into a number of payment cards in Europe, thus broadening the Trust Gap between the Old and New Worlds, and heightening Roland’s virtuous and vehement indignation.

(Moreno, who passed away in 2012, was a true polymath; he was an author, gourmand, inventor of curious musical instruments, and, I add without judgment, an ardent connoisseur of a wide range of earthly delights).

Next came the “Chip and PIN” model. Despite its better security — the customer had to enter a PIN after the smart card was recognized — Chip and PIN never made it to the US, not only because there were no terminals into which the customers could type their PINs (let alone that could read the smart cards in the first place), but, just as important, because there was a reluctance on the part of the credit card companies to disturb ingrained customer behavior.

It appeared that smart cards in the US were destined to butt up against these two insurmountable obstacles: The need for a new infrastructure of payment terminals and a skepticism that American customers would change their ingrained behavior to accept them.

In 2003, I made a bad investment in the payment system field on behalf of the venture company I had just joined. The entrepreneur that came to us had extensive “domain knowledge” and proposed an elegant way to jump over both the infrastructure and the customer behavior obstacles by foregoing the smart card altogether. Instead, he would secure the credit card’s magnetic stripe.


Tim Cook Free At Last


by Jean-Louis Gassée

Trading one’s privacy for the benefit of others isn’t an easy decision. Tim Cook just made such a swap, and the reverberations are beginning to be heard.

I’m happy and relieved that Tim Cook decided to “come out”, to renounce his cherished privacy and speak of his sexual orientation in plain terms rather than veiled, contorted misdirections. The unsaid is toxic.

If you haven’t done so already, please take the time to read Tim’s I’m Proud to Be Gay Businessweek editorial. Soberly written and discreetly moving, the piece concludes with:

“…I’m doing my part, however small, to help others. We pave the sunlit path toward justice together, brick by brick. This is my brick.”

It’s an admirable cause… but why should I care? Why does this 70-year-old French-born American, a happily married-up father of three adult and inexplicably civilized children, care that Cook’s sexuality is now part of the public record?

[Photo: Tim Cook]

First, I like and respect Cook for what he does, how he does it, and the way he handles his critics. For the past three years he’s been bombarded by questions about Apple’s slowing growth and the absent Next Big Thing, he’s been criticized for both hastening and impeding the inevitable commoditization of All Things Apple, he’s been called a liar by the NYT. Above all, he’s had to suffer the hidden — and occasionally blatant — accusation: You’re no Steve Jobs.

Throughout it all, Cook has displayed a preternatural calm in refusing to take the bait. In a previous Monday Note, I attributed his ability to deflect the cruel jibes to his having grown up “different” in Alabama. In his editorial, Cook confirms as much:

“It’s been tough and uncomfortable at times… [but] it’s also given me the skin of a rhinoceros, which comes in handy when you’re the CEO of Apple.”

Second, I’ve seen the ravages of homophobia at close range. A salient and personal example is the young gay architect of our first Palo Alto house. He quickly sensed he could be open with us, and would tease my wife Brigitte by showing her pictures of a glorious group of young bucks on vacation in Greece, adding, “What a loss for females”. But he also told us of his shame when he became aware of his desires in his adolescence, that he kneeled down every night to pray that his god would have mercy and make him “normal”. His parents rejected him and refused to keep in touch, even after the HIV virus made him perilously sick.

One morning when we were driving to his place in San Francisco to deliver a painting Brigitte had made for him, his partner called and told us not to come. Our friend had just passed away, still unaccepted by his parents.

Another personal example. A local therapist, a gay Buddhist, told me he couldn’t work as an M.D. in his native Caracas because the oppressive culture wouldn’t allow a gay man to so much as touch another man — even as a doctor. When he decided to tell his parents he was gay, he had to take them to a California mountain and mellow them with a certain herb before they would hear him out, and even then they didn’t entirely embrace his “choice” of sexuality.

Years of conversation with the fellow — who’s exactly my age — in a setting that facilitates honesty have brought empathy and insights that aren’t prevalent or even encouraged in the Parisian culture I come from, even in the supposedly liberated Left Bank that has been the home of lionized gay men such as Yves Saint-Laurent and Karl Lagerfeld. (I recommend Alicia Drake’s The Beautiful Fall: Lagerfeld, Saint Laurent, and Glorious Excess in 1970s Paris, a well-documented and beautifully written parallel life history.)

This leads me to my third point, brought up by my wife. Gays have always been accepted in creative milieus. In many fields — fashion, certainly, but even in high tech — it’s almost expected that a “designer” is homosexual. Despite counterexamples such as Christian Lacroix, or our own Sir Jony, the stereotype endures.

According to the stereotype, it’s okay for “artistes” (I’ve learned the proper dismissive pronunciation, an elongated ‘eee’ after the first ’t’) to be unconventional, but serious business people must be straight. When I landed in Cupertino in 1985, I became acquainted with the creative <=> gay knee jerk. True-blue business people who didn’t like Apple took to calling us “fags” because of our “creative excesses” and disregard of the establishment.

What Brigitte likes most about Cook’s coming out is that it portends a liberation of the Creative Ghetto. Cook isn’t just outing himself as a gay executive; he’s declaring that being gay — or “creatively excessive”, or unconventional — is fully appropriate at the very top of American business. It helps, she concludes, that Apple’s CEO has made his statement from a position of strength, at a time when the company’s fortunes have reached a new peak and his leadership is more fully recognized than ever.

The ripples now start. Perhaps they’ll bring retroactive comfort to many execs such as former BP CEO John Browne who, in 2007, left his job in fear of a revelation about his lifestyle – and an affirmation to myriads of “different” people at the bottom of the pyramid.

Tim Cook brings hope of a more accepting world – both inside and outside of business. For this he must be happy, and so am I.

And, while I’m at it, Happy Birthday.

JLG@mondaynote.com

Science Fiction: Apple Makes A Toaster Fridge…


…a supremely elegant one, naturally.

Plummeting iPad sales rekindle fantasies of a hybrid device, a version that adopts PC attributes, something like a better execution of the Microsoft Surface Pro concept. Or not.

For a company that has gained a well-deserved reputation for its genre-shifting — even genre-creating — devices, it might seem odd that these devices evolve relatively slowly, almost reluctantly, after they’ve been introduced.

It took five years for the iPhone to grow from its original 3.5” screen in 2007, to a doubled 326 ppi resolution on the same screen size for the June 2010 iPhone 4, to a 4” screen for the 2012 iPhone 5.

In the meantime, Samsung’s 5.3” Galaxy Note, released in 2011, was quickly followed by a 5.5” phablet version. Not to be outdone, Sony’s 2013 Xperia Z Ultra reached 6.4” (160 mm). And nothing could match the growth spurt of the long-forgotten (and discontinued) Dell Streak: from 5” in 2010 to 7” a year later.

Moreover, Apple’s leadership has a reputation — again, well-deserved — of being dismissive of the notion that their inspired creations need to evolve. While dealing with the iPhone 4 antenna fracas at a specially convened press event in 2010, a feisty Steve Jobs took the opportunity to ridicule Apple’s Brobdingnagian smartphone rivals, calling them “Hummers” and predicting that no one would buy a phone so big “you can’t get your hand around it”.

A smaller iPad? Nah, you’d have to shave your fingertips. Quoting the Grand Master in October 2010 [emphasis mine]:

“While one could increase the resolution to make up some of the difference, it is meaningless unless your tablet also includes sandpaper, so that the user can sand down their fingers to around one-quarter of their present size. Apple has done expensive user testing on touch interfaces over many years, and we really understand this stuff.

There are clear limits of how close you can place physical elements on a touch screen, before users cannot reliably tap, flick or pinch them. This is one of the key reasons we think the 10-inch screen size is the minimum size required to create great tablet apps.”

For his part, Tim Cook has repeatedly used the “toaster-fridge” metaphor to dismiss the idea that the iPad needs a keyboard… and to diss hybrid tablet-PC devices such as Microsoft’s Surface Pro, starting with an April 2012 Earnings Call [emphasis and stitching mine]:

“You can converge a toaster and a refrigerator, but those aren’t going to be pleasing to the user. […] We are not going to that party, but others might from a defensive point of view.”

Recently, however, Apple management has adopted a more nuanced position. In a May 2013 AllThings D interview, Tim Cook cautiously danced around the iPhone screen size topic — although he didn’t waste the opportunity to throw a barb at Samsung [insert and emphasis mine]:

“We haven’t [done a bigger screen] so far, that doesn’t shut off the future. It takes a lot of really detailed work to do a phone right when you do the hardware, the software and services around it. We’ve chosen to put our energy in getting those right and have made the choices in order to do that and we haven’t become defocused working multiple lines.”

Sixteen months later, Apple’s Fall 2014 smartphone line-up sports three screen sizes: the 4” iPhone 5C and 5S, the new 4.7” iPhone 6, and the 5.5” iPhone 6 Plus phablet.

Is this apostasy? Fecklessness?

Remarking on Jobs’ quotable but not-always-lasting pronouncements, Cook gives us this:

“[Jobs] would flip on something so fast that you would forget that he was the one taking the 180 degree polar [opposite] position the day before. I saw it daily. This is a gift, because things do change, and it takes courage to change. It takes courage to say, ‘I was wrong.’ I think he had that.”

That brings us to the future of the iPad. In the same interview (in 2012) Cook expressed high hopes for Apple’s tablet:

“The tablet market is going to be huge… As the ecosystem gets better and better and we continue to double down on making great products, I think the limit here is nowhere in sight.”

Less than two years after the sky-is-the-limit pronouncement, iPad unit sales started to head south and have now plummeted for three quarters in a row (-2.3%, -9% and -13% for the latest period). This isn’t to say that the iPad is losing ground to its competitors, unless you include $50 models. Microsoft just claimed $903M in Surface Pro revenue for the quarter ended last September, which, at $1K per hybrid, would be 0.9M units, or double that number if the company only sold its $499 year-old model. For reference, 12.3M iPads were sold in the same period (I don’t know any company, other than Apple, that discloses its tablet unit volume).

As Andreessen Horowitz’s Benedict Evans felicitously tweets it: “There’re 2 tablet markets: next-gen computing vision, where Apple has 80%, and, bigger but quite separate, the cheap TV/casual games device.”

Still, the concern remains. Does the iPad own 80% of a shrinking market, or can the Cupertino team reboot sales and fulfill Tim Cook’s The Limit Is Nowhere In Sight promise?

What’s missing?

A hint might lie in plain sight at the coffee shop next door. We see laptops, a Kindle reader or two, and iPads – many with an attached keyboard. Toaster-fridges!

But here’s Craig Federighi, Apple’s Sr. VP of Software Engineering, who is fond of dismissing talk of touch-screen Macs:

“We don’t think it’s the right interface, honestly.”

I find Federighi’s remark a bit facile. Yes, touching the screen makes much more ergonomic sense for a tablet than for a laptop, but in view of the turnabouts discussed above, I don’t quite know what to make of the honestly part.

Federighi may be entombed in the OS X and iOS software caves, but can he honestly ignore the beautiful Apple Wireless Keyboard proposed as an iPad accessory, or the many Logitech, Incase, and Belkin keyboards offered in the company’s on-line store? (Amazon ranks such keyboards between #20 and #30 in their bestsellers lists.) Is he suborning others to commit the crime of toaster-fridging?

In any case, the iPad + keyboard combo is an incomplete solution. It’s not that the device suffers from a lack of apps. Despite its poor curation, the App Store’s 675,000 iPad apps offer productivity, entertainment, education, graphic composition and editing, music creation, story-telling, and many other tools. As Father Horace (Dediu) likes to put it, the iPad can be “hired to do interesting jobs”.

No, what’s missing is that the iOS user interface building blocks are not keyboard-friendly. And when you start to list what needs to be done, such as adding a cursor, the iPad hybrid looks more and more like a Mac… but a Mac with smaller margins. The 128GB iPad plus an Apple Keyboard rings up at $131 less than an 11”, 128GB MacBook Air. (As an added benefit, perhaps the Apple toaster-fridge would come bundled with Gene Munster’s repeatedly predicted TV Set.)
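
For what it’s worth, the $131 gap matches Apple’s late-2014 US list prices; the individual prices below are my assumption, since the note only states the difference.

```python
# Late-2014 US list prices (my assumption; only the $131 gap is in the text):
ipad_air_2_128gb = 699
apple_wireless_keyboard = 69
macbook_air_11_128gb = 899

toaster_fridge = ipad_air_2_128gb + apple_wireless_keyboard  # $768
print(macbook_air_11_128gb - toaster_fridge)                 # -> 131
```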

On to better science fiction.

Let’s imagine what might happen next quarter when Intel finally ships the long-promised Broadwell processors. The new chips’ primary selling point is reduced power consumption. The Broadwell probably won’t dislodge ARM SoCs from smartphones, but a reduced appetite for electricity could enable a smaller, slimmer, lighter MacBook Air 2, with or without a double (linear) density Retina display.

Now consider last quarter’s iPad and Mac numbers, compared to the previous year:

[Table: iPad and Mac unit sales, year-on-year]

Mac units grew 25% year-on-year, while iPads experienced a 7% decrease.

You’re in Apple’s driver’s seat: Do you try to make the iPad feel more like a Mac despite the risks on many levels (internal engineering, app developers, UI issues), or do you let nature take its course and let the segment of more demanding users gravitate to the Mac, cannibalizing iPad sales as a result? Put another way, are you willing to risk the satisfaction of users who enjoy “pure tablet” simplicity in order to win over customers who will naturally choose a nimbler Mac?

JLG@mondaynote.com

PS: John Kirk just published a column titled The Apple Mac Takes Its Place In The Post-PC World where he digs up a prophetic Gates quote and explains the rise of the Mac as the weapon of choice for power users.

The iPad’s Future


The new iPad Air 2 is more than a mere iteration, but the real revolution in the iPad line may be heralded by the introduction of the iPhone 6 Plus.

The new iPad Air 2 looks and feels terrific. Hold an iPad mini in one hand and an iPad Air 2 in the other — they seem to weigh about the same. This is an illusion: The 341 gram mini is lighter than the 444 gram Air 2 (.75 vs .98 pounds; both with cellular equipment), but the Air 2 is almost impossibly thin. At 6.1 mm, the Air 2 makes the mini’s 7.4 mm feel bulky.

[Photo: iPad Air 2]

The iPad Air 2 also has an improved screen, a better camera, enhanced motion capture, faster processing, and, perhaps most important, it has Touch ID, Apple’s fingerprint recognition system. This is a bigger deal than initially reported. For businesses that have increasingly stringent security requirements, Touch ID is a welcome replacement for annoying password entry and will help sell iPads in “compliance-burdened” enterprises. (On this, and the rest of Apple’s announcements, see Charles Arthur’s column in The Guardian, IMHO the best overview.)

And liberation from the password or, more important, from lazy security, isn’t limited to IT-controlled environments. I hear from normal humans that they love the Apple Pay + Touch ID combination for their online shopping, an activity that was previously more convenient on a conventional PC.

If a MacBook Air showed up with a comparable pile of improvements, there would be oohs and aahs all over the Kommentariat. Instead, the slimmed-down, sped up iPad Air 2 has been met with either tepid, supercilious praise (“If the iPad has never appealed to you as a product, the Air 2 probably won’t change your mind”; CNET) or borderline dismissal on the grounds that it won’t fix iPad’s slowing sales (“But it is not clear that making the iPad Air 2 the Twiggy of tablet devices will be enough to reinvigorate Apple’s iPad sales”; NYT).

Indeed, after growing faster than anything in tech history, tablets have stalled. For the past three quarters unit sales have declined: iPad sales fell by 2.29% in the first (calendar) quarter of 2014 versus the same quarter in 2013, and by 9% in Q2:

[Chart: iPad unit sales, year-on-year change by quarter]

(A thank-you to Apple for providing actual unit and revenue numbers for their product lines — does any other company do that?)

When Apple releases its fiscal Q4 numbers this coming Monday, we’ll find out how “poorly” the iPad did in the July to September period. We don’t expect the numbers to show a turnaround, not for the quarter and certainly not for the entire fiscal year.

Some folks look at these numbers and question the device’s future (Apple iPad Fad is Over). But technological viability and short-term sales effects are two different topics.

In The iPad Is a Tease last April and The Sweet Spot On Apple’s Racket in August, I tried to separate the merits of the tablet genre, which I see as established and durable, from the unreasonable expectations that arose from the sense of liberation from PC obfuscation. If you see the tablet as a one-for-one replacement for a PC, you’ll be disappointed, and the falling iPad sales will look like an inevitable skid into obsolescence. I flirted with membership in that camp when I accused the iPad of being unsympathetic to “ambitious” users (iPad and File Systems: Failure of Empathy; in my defense, that was in early 2013 — eons ago in tech time).

I’ve since recanted. Rather than the hybrid product promoted by Microsoft, the sweet spot in Apple’s business model seems to be a tablet and a laptop, each one used for what it does best, unencumbered by hybridization.

As Tim Cook noted last week, Mac sales (laptops, mostly) grew 18% in the last reported quarter. This time, contrary to earlier expectations, it looks like the Mac is cannibalizing the iPad… not a bad “problem” to have. And it’s nothing like the evisceration of iPod sales after the iPhone was introduced. With the advent of the iPhone, the music player became an ingredient; it was no longer a standalone genre.

The new Air 2 won’t put iPad sales back on their previous growth curve… and I don’t think Apple is troubled by this. Making the iPad Air nimbler and more useful, a stand-out in a crowd of tablets, that’s Apple’s strategy, and it’s more than good enough — for the time being.

Talk of Apple’s game plan brings us to the iPhone 6 Plus. (These lengthening product names bring back bad memories from the auto industry, but what can Apple do?) Does the new, larger iPhone say something about the future of the iPad mini?

I once thought the mini was the “real” iPad because I could carry it everywhere in a jacket pocket. But about two weeks ago I bought an iPhone 6 Plus, and I haven’t touched my mini since. (As punishment for my sin, I found 52 apps awaiting an update when I finally turned on the mini this morning…) Now I have an “iPad micro” in my (front) jeans pocket…and it makes phone calls.

With the introduction of the iPhone 6 Plus, the iDevices playing field has changed: A broader range of iPhones could “chase” the iPad upwards, creating opportunity for a beefier “iPad Pro”. Or perhaps Apple will use its now-proven microprocessor design muscle to make a lighter, nimbler MacBook Air.

Whatever Apple does next, the iPhone 6 Plus might prove to be a turning point.

JLG@mondaynote.com

HP’s Old Curses

 

Finally! HP did what everyone but its CEO and Board thought inevitable: They spun off the commoditized PC and printing businesses. This is an opportunity to look deeper into HP’s culture for the roots of today’s probably unsolvable problems.

The visionary sheep of Corporate America are making a sharp 180° turn in remarkable lockstep. Conglomerates and diversification strategies are out. Focus, focus, focus is now the path to growth and earnings purity.

As reported in last week’s Monday Note, eBay’s John Donahoe no longer believes that eBay and PayPal “make sense together”; he now says that splitting the companies “gives the kind of strategic focus and flexibility that we think will be necessary in the coming period”. This week, Symantec announced that it will spin off its storage division (née Veritas) so that “the businesses would be able to focus better on growth opportunities including M&A”.

And now Meg Whitman tells us that HP will be “a lot more nimble, a lot more focused” as two independent companies: HP Inc. for PCs and printers, Hewlett Packard Enterprise for everything else.

Spinning off the PC and printer business made sense three years ago when Léo Apotheker lost his CEO job for suggesting it, and it still makes sense today, but this doesn’t mean that an independent HP PC company will stay forever independent. In a declining PC market that it once dominated, HP has fallen behind Lenovo, the company that acquired IBM’s PC business and made the iconic ThinkPad even more ubiquitous. HP Inc. will also face a newly energized Dell, as well as determined Asian makers such as Acer and Asus. That Acer is losing money and Asus’ profits have fallen by 24% will make the PC market even more prone to price wars and consolidation. It doesn’t take much imagination to foresee HP Inc. shareholders agitating for a sale.

Many think that Hewlett Packard Enterprise’s future isn’t so bright, either. The company’s latest numbers show that the enterprise business, which competes with the likes of IBM, Oracle, and SAP, isn’t growing. As with the PC business, such an unexciting state of affairs leads to talk of consolidation, of the proverbial “merger of equals”.

Such unhappy prospects for what once was a pillar of Silicon Valley lead to bitter criticism of a succession of executives and of an almost surreal procession of bad Board decisions. Three years ago, I partook in such criticism in a Monday Note titled How Bad Boards Kill Companies: HP. This was after an even older column, The Incumbent’s Curse: HP, where I wistfully contemplated the company’s rise and fall.

I’m fascinated by the enduring power, both negative and positive, of corporate cultures, of the under-the-surface system of emotions and permissions. After thinking about it, I feel HP’s current problems are rooted more deeply and started far earlier than the Board’s decisions and the sorry parade of executives over the past 15 years.

Founded in 1939, HP spent a quarter century following one instinct: Make products for the guy at the next bench. HP engineers could identify with their customers because their customers were people just like them…it was nerd heaven.

HP’s line of pocket calculators is the exemplar of a company following its instincts. They worked well because they appealed to techies. The amazingly successful HP-80 was a staple of the financial world; its successor, the HP-12C, is still sold today.

But HP’s initial success bred a strain of Because We Can that led the company into markets for which its culture had no natural feeling. I’m not just referring to the bizarre attempt in 1977 to sell the HP-01 “smartwatch” through jewelry stores…

[Image: Hewlett-Packard’s HP-01 digital watch, 1977]

No, I’m referring to computers. Not the technical/scientific desktop kind, but computers that were marketed to corporate IT departments. In the late ’60s, HP embarked on the overly ambitious Omega project, a 32-bit, real-time computer that was cancelled in 1970. The Because We Can impulse of HP engineers wasn’t supported by a reliable internal representation of the customer’s ways, wants, and emotions. (A related but much more modest project, the 16-bit Alpha, ultimately led to the successful HP 3000 — but even the HP 3000 had a difficult birth.)

Similarly, when 8-bit microprocessors emerged in 1974, HP engineers had no insights into the desires of the unwashed hobbyist. They couldn’t understand why anyone would embrace designs that were clearly inferior to their pristine 16-bit 9800 series of desktop machines.

By the late ’70s, the company was bisected into engineers who stuck to the “guy at the next bench” approach and engineers who targeted IT workers they mistakenly thought they understood. Later, in 1999, the instrument engineers and products — the “real” HP to many of us — were split off into Agilent, a relatively small business that’s not very profitable: Agilent’s revenue of less than $7B is nothing compared to the pre-split HP’s more than $100B a year.

In all industries, some companies manage to stick to their story, while others drift from the script. I’m thinking of Volkswagen and its 40-year-old Golf (not the misbegotten Phaeton) versus Honda’s sprightly 1972 Civic hatchback that later lost its soul and turned into today’s banal little sedan. (To be fair, I see the Civic as alive and well in the Honda Fit.)

In the tech world, Oracle has kept to the plot – no doubt because the founder, Larry Ellison, is still at the helm after 37 years. Others, like Cisco, make bizarre acquisitions: Flip, a consumer camera company that it quickly shut down, and home networking company LinkSys (purchased at a time when CEO John Chambers called The Home his company’s next frontier). And now Cisco is going after the $19T (trillion!) Internet of Things.

The now dysfunctional Wintel duo lost the plot by letting the PC-centric intuitions that worked well for so long blind them to the fact that mobile devices aren’t “PCs – only smaller”.

I have a personal feeling of melancholy when I see that the once mighty HP has drifted from its instincts. The company hired me in June 1968 to launch their first desktop computer on the French market. After years in the weeds, I saw it as the chance of a lifetime for a geeky college drop-out. At the time I joined, HP’s vision was concentrated. They rarely acquired other companies…why buy what you can build yourself? That all changed, and in a big way, in the ’90s.

To this day, I’m grateful for the kindness and patience of the HP that took me in. It was the company that David Packard describes in The HP Way, not today’s tired conglomeration.

JLG@mondaynote.com

eBay Under New Management – Again

 

Apple Pay, not even launched yet, is already making waves. Apple’s payment system has caused eBay to move people and business units around.

Early in 2012, PayPal’s President, Scott Thompson, abruptly left the company to become CEO of Yahoo. During his four-year tenure at the eBay subsidiary, Thompson had doubled PayPal’s user population and increased payment volume by 26% per year to over $120B. So why did he leave? eBay CEO John Donahoe put it this way:

“Scott wanted to be a CEO, and that’s great. He felt the opportunity wasn’t going to come along again. He had the best non-CEO job in the world, but he wanted to be a CEO, and wanted to go for it.”

Yes, Thompson wanted to be CEO…of an independent PayPal, but Donahoe and the eBay Board wouldn’t have it.

Fast forward to this year. Carl Icahn believes that PayPal would be more creative and make more money for its shareholders if it were freed from eBay tangles, so he makes a non-binding proposal to separate PayPal from its parent company.

In a January 23rd, 2014 blog post, Donahoe rebuffs the offer and doubles down on his position:

“PayPal and eBay make sense together for many reasons. Let me highlight three that we believe are among the most important [emphasis his]:
One: eBay accelerates PayPal’s success.
Two: eBay data makes PayPal smarter.
And three: eBay funds PayPal’s growth.”

Donahoe prays at the Church of Synergy and Leverage: Together, eBay and PayPal will ascend to heights neither is able to reach on its own.

That was then.

Last week, Donahoe left the Church. He and the eBay Board announced their three-part game plan for 2015:

  • PayPal will become an independent company led by Dan Schulman (American Express, AT&T, Priceline, Virgin Mobile).
  • Devin Wenig, currently president of eBay Marketplace, will replace Donahoe as eBay CEO.
  • After the separation is complete, Donahoe will no longer have an executive role but will serve on the Board of one or both companies.

(Compensation packages for the new CEOs are detailed in this SEC filing.)

What happened?

In eBay’s Investor Presentation, Donahoe extols the union’s accomplishments, but explains that “Now the Time is Right for Two World Class Independent Platforms” and that the decision to part company “[r]eflects confidence we can preserve relationships and avoid dis-synergies through arm’s-length operating agreements”.

Spoken like a true consultant. (Prior to joining eBay, Donahoe had a stellar career at Bain & Company, where his eBay CEO predecessor Meg Whitman also worked.)

There is a shorter explanation: Apple Pay.

Apple’s new payment system, tied to the iPhone 6, is supported by American Express, Visa, and MasterCard, and recognized by a number of merchants including Walgreens, Macy’s, Target, and Whole Foods.

This changes the competitive landscape in two ways.

The first is the gravitational well, the network effect: More participants will attract more participants. It remains to be seen how well Apple Pay will perform, but we know the Touch ID feature works well — better than this skeptical user expected, and better and more securely than its current competitors.

The second way Apple Pay changes the landscape is much more alarming to competitors: Business Model Disruption. For Apple, revenue from a payment system is peripheral; it’s yet another part of the larger ecosystem that sustains the iDevice money makers. To PayPal, of course, payment revenue is all there is.

This distinction isn’t clear to everyone. In a conversation in Paris last week, an otherwise sensible friend insisted that Apple Pay will be a “huge profit opportunity”. No, Apple will earn about $1 for every $700 charged through Apple Pay. In order to reach the $10B “unit of needle movement”, Apple Pay would have to transact $7T (trillion). For reference, 2013 US retail revenue was $4.5T.

According to their 2013 Annual Report, eBay processed about $180B in payments in 2013, yielding $6.1B in transaction revenues. For that same $180B, Apple would content itself with $270M… that’s about 0.15% of the company’s overall revenue.
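For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python. The inputs are the figures quoted above (roughly $1 per $700 for Apple, $6.1B on $180B of volume for PayPal); they’re illustrative, not official disclosures:

    # Back-of-the-envelope check of the take rates discussed above.
    # All inputs are the figures quoted in the text, not official disclosures.
    apple_take_rate = 1 / 700            # ~$1 earned per $700 charged, i.e. ~0.14%
    needle_movement = 10e9               # the $10B "unit of needle movement"

    # Volume Apple Pay would have to transact to earn $10B at that rate:
    required_volume = needle_movement / apple_take_rate
    print(f"Required Apple Pay volume: ${required_volume / 1e12:.0f}T")      # -> $7T

    # PayPal's economics on its ~$180B of 2013 payment volume:
    paypal_volume, paypal_revenue = 180e9, 6.1e9
    print(f"PayPal take rate: {paypal_revenue / paypal_volume:.1%}")         # -> 3.4%

    # Apple's cut on that same volume (the text rounds this to $270M):
    print(f"Apple on $180B: ${paypal_volume * apple_take_rate / 1e6:.0f}M")  # -> ~257M

The gap, about 0.14% versus 3.4%, is the Business Model Disruption in one line: at Apple’s rate, payments are an ecosystem feature, not a standalone business.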

When eBay purchased PayPal for $1.5B in 2002, the deal made sense — it certainly made much more sense than the later acquisition and disposition of Skype. In recent years, PayPal has grown faster than eBay’s Marketplaces business, to the point where the two were roughly equal last year ($6.1B vs $6.8B). Today, Wall Street values the combined companies at approximately $67B (although it will be interesting to see how much the PayPal “half” fetches).

The fast-growth, synergistic business Donahoe vigorously guarded last January has been kicked to the curb because its business model is threatened by Apple Pay.

It didn’t have to be that way. We’ve recently heard that PayPal and Apple had been in “massive” talks earlier this year…until Apple found out about PayPal’s partnership with Samsung, thus ending any hope of a collaboration with the Cupertino team. Recall that PayPal’s President David Marcus unexpectedly left the company last June to lead Facebook’s mobile messaging initiative. The official explanation at the time was that Marcus was simply looking for a new adventure, but it’s more likely that Marcus was frustrated with Donahoe:

“eBay CEO John Donahoe pushed for the Samsung deal even though PayPal president at the time, who left for Facebook following the Apple-PayPal deal collapse, David Marcus was ‘purposely categorically against the Samsung deal, knowing that it would jeopardize PayPal’s relationship with Apple.’”

Looking at the game board three months later, Donahoe dissolved the eBay-PayPal union and deliberately wrote himself out of a job — undoubtedly with the “help” of his Board.

In the meantime, we have PayPal’s reaction to Apple Pay: An ad mocking Apple for the selfies fracas. Yes, a number of individual iCloud accounts were compromised by clever social engineering techniques and outright password theft, but no one seriously believes the iCloud infrastructure itself was penetrated. Conversely, in May of this year, eBay suffered a massive security breach requiring all users to change their passwords because hackers did gain access to the company’s servers, something PayPal management chose to ignore.

Again, we don’t yet know if Apple’s payment system will live up to its promise, but with the iPhone 6 and 6 Plus looking like The Mother of All Upgrades (two weeks after the launch, people are still lining up outside Apple Stores), Apple Pay should be on solid ground on its rumored October 20th opening day. Nonetheless, with an ex-Amex exec at the helm of a soon-to-be independent PayPal, the payment game is going to be interesting.

JLG@mondaynote.com