Finally! HP did what everyone but its CEO and Board thought inevitable: They spun off the commoditized PC and printing businesses. This is an opportunity to look deeper into HP’s culture for roots of today’s probably unsolvable problems.
Apple Pay, not even launched yet, is already making waves. Apple’s payment system has caused eBay to move people and business units around.
The BlackBerry was the first truly modern smartphone, the king of Personal Information Management On The Go. But under its modern presentation lurked a fatal flaw: a software engine that couldn’t be adapted to the Smartphone 2.0 era.
The Apple Watch isn’t just another iDevice, a “wearables” accessory to the Apple ecosystem. It’s a bold attempt to create a new kind of wrist-worn personal computer that looks like a smartwatch.
In previous Monday Notes dealing with the putative iWatch and other “wearables”, I thought the new product would be a nice add-on to the iDevices ecosystem — a bit player that would make the iPhone more desirable — but that it wouldn’t move the needle, meaning $10B or more in revenue. I reasoned that a watch battery would be too small to feed a computer powerful enough to offer a wide range of apps and communications capabilities.
I was wrong.
In his demonstration (76 minutes into the official video) at the Cupertino Flint Center last Tuesday, Kevin Lynch, the Adobe defector who now runs the Apple Watch software engineering effort, showed us that the Watch isn’t just a shrunk-down iPhone: It can stand on its own, it introduces an entirely new genre of user interface, and it will have its own App Store. The reinterpreted watch crown, a side button, touch and pressure on the face, plus voice all combine into a potentially rich and unique set of ways to interact with this newest very personal computer.
As Horace Dediu, our disruption scholar, puts it:
“The Apple Watch is as much a Watch as the iPhone is a Phone.”
The almost overwhelming richness of the user interface and of demonstrated apps led one twitterer to express a concern I can’t suppress:
Will the software overwhelm the hardware, resulting in problematic battery-life, or befuddle normal humans?
Indeed, I remember how I worried when Steve Jobs first demonstrated the iPhone on January 9th, 2007 and stated it ran OS X. Knowing Jobs’ occasionally robust relationship with facts, I feared embarrassment down the road. But, no. When the iPhone shipped almost six months later, on June 29th, hackers immediately dissected it and discovered it ran a bona fide pared-down version of OS X — later renamed iOS.
As with the original iPhone, we might be six months away from a shipping product, time for Apple to fine-tune its software and work on the S1 SoC (System on a Chip) that drives the watch… and to put in place the supply chain and retail operations for the many Apple Watch variations.
In the meantime, some choice morsels of context will help as we consider the impact of Apple’s new Watch. We’ll start with Marc Newson, the famed designer (and Jony Ive’s friend and collaborator) who just joined Apple. If you haven’t done so already, take a look at this video where Newson flips through his portfolio of watch and clock designs, including this striking reinterpretation of a great classic, the Atmos Clock from Jaeger-LeCoultre:
Turning to the Apple Watch mega-site, we see a family resemblance:
Professional watchmakers and industry executives seem to appreciate Newson’s influence and Apple’s efforts, although they are quick to point out that they don’t think the Apple Watch is a threat to their high-end wares (“It’s a techno-toy more than a watch, but what a fun toy,” says Laurent Picciotto of Chronopassion Paris). Watches by SJX provides a quick collation of What The Watch Industry Thinks Of The Apple Watch. Swiss watchmaker Eric Giroud voices the majority opinion:
“It’s a nice product; good shape and amazing bracelet – thank you Marc Newson for the resurrection of the Ikepod strap. It’s difficult to speak about its impact on watchmaking because the Apple Watch is not a watch except that it is also worn on the wrist.”
Benjamin Clymer is the editor of Hodinkee, an on-line magazine dedicated to the world of watches. In a post titled A Watch Guy’s Thoughts On The Apple Watch, Clymer provides a review that’s informed by a deep personal knowledge of the watch scene. If you don’t have time to read the whole article — it’s a long piece — the author provides a good summary in the introduction [emphasis mine]:
[…] though I do not believe it poses any threat to haute horology manufactures, I do think the Apple Watch will be a big problem for low-priced quartz watches, and even some entry-level mechanical watches. In years to come, it could pose a larger threat to higher end brands, too. The reason? Apple got more details right on their watch than the vast majority of Swiss and Asian brands do with similarly priced watches, and those details add up to a really impressive piece of design. It offers so much more functionality than other digitals it’s almost embarrassing. But it’s not perfect, by any means.
Not everyone in the watch industry is so impressed. In an article titled Apple Watch ‘too feminine and looks like it was designed by students’, says LVMH executive, The Telegraph provides the money quote [emphasis mine]:
“To be totally honest, it looks like it was designed by a student in their first trimester,” added Mr Biver, who heads up the brands Tag Heuer, Zenith and Hublot.
The article evoked general hilarity and prompted more than one commenter to dig up the infelicitous Ed Colligan quote about the iPhone:
“PC guys are not going to just figure this out. They’re not going to just walk in.”
I’ll offer a rewrite for Jean-Claude Biver and his haute horlogerie colleagues:
“We like Apple products, they provide productivity and fun in our daily lives; we respect the sense of design Sir Jony and now Marc Newson bring to the company. I wish I could say more but, try as I might, I couldn’t get the livestream of Mr. Cook’s presentation to work in my Rue de Rive office in Geneva. First, there was this Mandarin dubbing, I can understand why but it was really annoying. Then, the transmission kept breaking down. I imagine that the tons of concrete now being poured for Apple’s next headquarters will provide a suitable resting place for the individual in charge.
Again, congratulations on a well-executed global launch.”
More seriously, let’s put streaming glitches aside; they won’t matter in the longer run because they don’t concern the product itself. Last week’s launch and its detailed preparations, including the no-longer mysterious white building, attest to the gravity of Apple’s long-term ambition.
As additional evidence that the Apple Watch isn’t just a hobby, recall that the iPhone was initially offered in one size and one color. By comparison, the Apple Watch is an explosion: It comes in three styles and two sizes (in millimeters, 38 and 42, because that’s the trade vocabulary), two material/finishes for each style (silver and space gray, yellow or rose gold), nine bands for the basic Apple Watch, six for the Apple Watch Sport, and at least four for the gold Apple Watch Edition — and all with matching crown buttons. Henry Ford has definitely left the building.
The fact that Apple invited fashion editors to Cupertino (some of whom had to be told where that town is) is another Think Different sign. Nerds are still welcome, but this is a new game. Again, turn to the Apple Watch site and look at the bands/bracelets. As Ben Clymer notes in his piece, the level of detail tells us this isn’t just another iDevice.
Stepping back a little, when I see the team of watch industry execs, design luminaries, and fashion experts Apple has brought on board, I have a hard time believing that Apple is going to stop at watches. At the very least, will Messrs. Ive and Newson bring livelier, more varied designs to the iPhone? And what does Tim Cook mean when he slyly alludes to products that “haven’t even been rumored yet…”?
But let’s not get ahead of ourselves — we’re still barely past the demo. We’ll have to wait for the actual product to come to the wrists of real users. Only then will we have the Apple Watch make-or-break moment: Word-of-mouth from non-experts.
And, still in the not getting ahead of ourselves department, for Apple, today’s make-or-break product is the iPhone 6. The Apple Watch makes great “ink” and iPhones make the money.
This week’s product launch should break the mold of Apple’s recent Fall announcements: More products than usual and a challenge to the status quo – in payment systems this time.
A larger iPhone; a line of wearables (unveiled if not yet ready-to-ship); significant iOS improvements (a true “iOS 2.0”); HomeKit and HealthKit devices, applications, and partnerships; payment systems… If only half of the rumors about Apple’s September 9th media event are true, we’re going to have a wider and deeper flood of new products than we’ve seen in Apple’s previous Fall launches.
And let’s not forget the big white cocoon that covers the two-story structure that Apple built for the occasion:
(image source: AppleInsider)
Apple is likely to add some drama to the event by lifting the veil at the last moment.
For today, we’ll focus on the recent flurry of leaks and rumors surrounding payment systems. We’ve heard about agreements with American Express, Visa, MasterCard, Bank of America; with retailers such as Nordstrom and Macy’s, CVS and Walgreens; and hoteliers such as Starwood… The predictions may not prove accurate down to the last detail, but the outbreak is too strong not to be taken seriously. Apple is about to get into the payment system business in a serious way.
There have been rumors before. Search for “apple payment system” and you’ll get about 80 million hits on Google (11 million on Bing). Flipping through the pages, we see that the excitement started as far back as five years ago when Apple’s “Grab & Go” patent filings disclosed the company’s interest in near field communication, a wireless data transfer method that can be used for quick purchases and payments. This led to the birth of a new i-Word around 2010: the iWallet.
From its very beginning, the iPhone has looked like a logical payment device. Our phones are always with us; they’re more secure than the magnetic stripe on a credit card because they can use “payment tokens” — codes that authenticate you without identifying your credit card account; payment apps can be easily downloaded and updated.
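To make the “payment token” idea concrete, here’s a toy Python sketch. The names and mechanics are illustrative inventions, not Apple’s or the card networks’ actual protocol: the point is simply that the merchant handles only a disposable token, while the mapping back to the real card number stays with the payment network.

```python
import secrets

class TokenVault:
    """Toy stand-in for a payment network's tokenization service."""

    def __init__(self):
        self._vault = {}  # token -> card number (PAN)

    def issue_token(self, pan: str) -> str:
        # Hand the device an opaque, random token instead of the card number.
        token = secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def resolve(self, token: str) -> str:
        # Only the network can map the token back to the real account.
        return self._vault[token]

network = TokenVault()
token = network.issue_token("4111 1111 1111 1111")
assert token != "4111 1111 1111 1111"                    # merchant never sees the PAN
assert network.resolve(token) == "4111 1111 1111 1111"   # network still can
```

Real tokenization schemes add expiry, per-transaction cryptograms, and domain restrictions on top of this, but the privacy principle — authenticate without identifying the account — is the same.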
The possibilities looked endless and, of course, led to overheated predictions: Think of all the trillions of dollars sloshing around in debit/credit cards. If Apple captured only a small fraction of the flow, they’d be filthy rich!
Others disagreed. In January 2011, PCWorld’s Tom Spring explained why Apple’s Mobile Payment System Will Fail. Among his objections were the implicit assumption that phones are somehow easier than cards (“What’s gained…by waving an iPhone instead of swiping a bank card is not clear to me”), and the prediction that retailers won’t accept phones as payment instruments until the “Another Box at the Register” obstacle is surmounted:
“Near field communication is a technology that requires a physical box/reader on the retailer’s end. Until we know more about what incentives there are for retailers to invest in this technology I think it’s going to be hard sell for Apple to convince millions of merchants to put another box at the point of sale…”
Indeed, attempting to modify ingrained customer behavior isn’t a well-trodden path to riches, nor is asking retailers to install a new box next to their cash register. This is why many payment system innovations (Google Wallet is a recent example) have failed to amass enough gravitational pull to gain currency (pardon the pun). There just hasn’t been enough acceptance by consumers and retailers for “fast lane” payment devices to become as matter-of-fact as the incumbents.
Still… Apple has repeatedly shown great patience and willingness to challenge settled wisdom.
The company’s embrace of payment systems started in 2003 when its newly-opened iTunes Store offered two innovations: Single tracks were sold for 99 cents apiece (at the time), and we could settle the purchase with a credit card. Critics scoffed: The price is too low! The credit card companies’ fixed+percentage transaction fees will be a profit-killer!
How can Apple possibly make money with such a proposition?
This was myopia. The iTunes Store wasn’t intended to be a money maker. Its only purpose was to sell more iPods at higher margins, that’s where the money was – and still is. In retrospect, Jobs was pouring the foundations of the Apple ecosystem business model: Hardware is the star; everything else supports the big shots’ volumes and margins.
Returning to today’s (or this coming Tuesday’s) topic, Apple doesn’t want to displace the key players — the banks and credit card companies — any more now than they did a decade ago. Credit card companies, for example, play a hard-to-replace role in policing transactions. It’s not always pretty or convenient when one has to call a US number from Europe because the system “tripped” over an unusual transaction, but it works.
One can’t imagine Apple even thinking of storing and lending money, of trying to “capture a fraction of the flow”. If the company does introduce a near field payment system, it won’t be as an attempt to make money in itself, it will simply be another extension of the Apple ecosystem, another way to make iDevices more attractive.
Beyond this neat playbook theory lurks the matter of modifying consumer behavior and retail infrastructure; Tom Spring’s objections are just as cogent today as they were in 2011. And perhaps Apple’s answer — its rebuttal to the conventional reluctance — is hiding in the still-cocooned show-and-tell building.
PS: On today’s topic, see Horace Dediu’s views on the value of payment systems as bit pipes.
On September 9th, Apple will announce products likely to be seen as a new milestone in Tim Cook’s tenure as Apple’s CEO.
You Break It You Own It. This Labor Day weekend sits about midway between two anniversaries: Tim Cook assumed the CEO mantle a little over three years ago – and Steve Jobs left this world – too soon – early October 2011. And, in a few days, Apple will announce new products, part of a portfolio that caused one of Cook’s lieutenants, Eddy Cue, to gush that Apple had the “best product lineup in 25 years”. Uttered at last Spring’s Code Conference, Cue’s saeta was so unusual it briefly disoriented Walt Mossberg, a seasoned interviewer if there ever was one. After a brief pause, Walt slowly asked Apple’s exec to repeat. Cue obliged with a big I Ate The Canary smile – and raised expectations that will soon meet reality.
After three years at the helm, we’ll soon know in what sense Tim Cook “owns” Apple. For having broken Steve’s creation, for having created a field of debris littered with occasionally recognizable remains of a glorious, more innovative, more elegant past. Or for having followed the spirit of Steve’s dictum – not to think of what he would have done – and led Apple to new heights.
For the past three years, detractors have relentlessly criticized Cook for not being Steve Jobs, for failing to bring out the Next Big Thing, for lacking innovation.
Too often, clickbaiters and other media mountebanks veered into angry absurdity. One recommended Cook buy a blazer to save his job; another told us he had a direct line to Apple’s Board and knew directors were demanding more innovation from their CEO; and, last Spring, a Valley bloviator commanded Apple to bring out a smartwatch within 60 days – or else! (No links for these clowns.)
More measurably, critics pointed to slower revenue growth: + 9% in 2013 vs + 65% in 2011 and + 52% in 2010, the last two “Jobs Years”. Or the recent decrease in iPad sales: – 9% in the June 2014 quarter – a never-seen-before phenomenon for Apple products (I exclude the iPod, now turning into an ingredient of iPhones and iPads).
Through all this, Apple’s CEO never took the bait and, unlike Jobs, either ignored jibes, calmly laid out his counterpoint, or even apologized when warranted, as with the Maps fiasco. One known – and encouraging – exception to his extremely controlled public manner took place when he told a representative of a self-described conservative think-tank what to do with his demand “to commit right then and there to doing only those things that were profitable” [emphasis mine]:
“When we work on making our devices accessible by the blind, […] I don’t consider the bloody ROI.”
“If you want me to do things only for ROI reasons, you should get out of this stock.”
The not-taken road to perdition hasn’t been a road to perfection either. Skipping over normal, unavoidable irritants and bugs – the smell of sausage factories is still with me – a look at Apple’s Mail client makes one wish for stronger actions than bug fixes that lead to new crashes. This is a product, or a team, that needs stronger decisions; it does not represent Apple at its best. Another long-time offender is the iTunes client. One unnamed Apple friend calls it “our Vista” and explains it might suffer from its laudable origin as a cross-platform Mac/Windows application, a feature vital to the iPod’s success – we’ll recall its 2006 revenue ($7.7B, +69% year-to-year growth!) was higher than the Mac’s ($7.4B, +18%).
Now looking forward, we see this:
A large, cocooned structure being built by an “anonymous” company, next to Cupertino’s aptly named Flint Center for the Performing Arts, where Apple will unveil its next products this coming September 9th. Someone joked this was yet another instance of Apple’s shameless imitation of Google’s innovations. This time Apple copied Google’s barges, but couldn’t even get its own clone to float.
Seriously, this is good news. This is likely to be a demo house, one in which to give HomeKit, HealthKit or, who knows, payment systems demonstrations, features of the coming iOS 8 release for “communicating with and controlling connected accessories”. The size of the structure speaks for Apple’s ambitions.
In other good news, we hear Apple’s entry into “wearables”, or into the “smartwatch” field, won’t see any shipments until 2015. The surprise here is that Apple would show or tease the product on 9/9. There have been exactly zero leaks of body parts, circuit boards, packages, and other accessories, leading more compos mentis observers (not to be confused with compost mentis on Fox News) to think a near-term announcement wasn’t in the cards. But John Paczkowski, a prudent and well-informed re/code writer, assures us Apple will indeed announce a “wearable” — only to tell us, two days later, it won’t ship until next year. The positive interpretation is this: Apple’s new wearable category isn’t just a thing, a gizmo, you can throw into the channel to get the money pump running – at nice but immaterial accessory rates. Rather, Apple’s newest creation is a function-rich device that needs commitment, software, and partnerships to make a material difference. For this it needs time. Hence the painful but healthy period of frustration. (Electronic Blue Balls, in the immortal words of Regis McKenna, the Grand Master of Silicon Valley Marketing, who was usually critical of firms making an exciting product announcement only to delay customer gratification for months.)
The topic of payments is likely to be a little less frustrating – but could lead to another gusher of media commentary. Whether Apple partners with Visa, American Express or others is still a matter of speculation. But one thing is clear: the idea isn’t for Apple to displace or disintermediate any of the existing players. Visa, for example, will still police transactions. And Apple isn’t out to make any significant amount of money from payments.
The goal, as always, is to make Apple devices more helpful, pleasurable – and to sell more of these at higher margins as a result. Like HomeKit or HealthKit, it’s an ecosystem play.
There’s also the less surprising matter of new iPhones. I don’t know if there will be a 4.7” model, or a 5.5” model, or both. To form the beginning of an opinion, I went to the Palo Alto Verizon store on University Avenue and asked to buy the 5” Lumia Icon Windows Phone on display. The sales person only expressed polite doubt and excused himself “to the back” to get one. It took eight minutes. The rest of the transaction was quick and I walked out of the store $143.74 lighter. I wanted to know how a larger phone would feel in daily use, carried in jeans and jacket breast pockets. It’s a little heavy (167 grams, about 50 grams more than an iPhone 5S), with a very nice, luminous screen and great Segoe WP system font:
I won’t review the phone or Windows Phone here. Others have said everything that needs to be said on the matter. It’s going to be a tough road for Microsoft to actually become a weighty number three in the smartphone race.
But mission accomplished: The experiment suggests a larger iPhone, perhaps a tad lighter than the Lumia, will deliver a pleasant experience. True, one-handed use will probably be restricted to a subset of the (mostly male) population. And today’s 4” screen size will continue to be available.
There remains the question of what size exactly: 4.7”, or 5.5” (truly big), or both. For this I’ll leave readers in John Gruber’s capable hands. In a blog post titled Conjecture Regarding Larger iPhone Displays, John carefully computes possible pixel densities for both sizes and offers a clarifying discussion of “points”, an important iOS User Interface definition.
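The arithmetic behind such conjectures is easy to reproduce: a display’s density is its pixel diagonal divided by its physical diagonal. A quick sketch, using resolutions of the kind discussed for the rumored sizes (the specific numbers here are illustrative, chosen to land on familiar densities):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch, from a screen's pixel dimensions and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixel-count diagonal
    return diagonal_px / diagonal_in

# Hypothetical resolutions for 4.7" and 5.5" screens
print(round(ppi(1334, 750, 4.7)))   # ~326, the classic Retina density
print(round(ppi(2208, 1242, 5.5)))  # ~461, a "3x" density
```

The “points” discussion matters because iOS apps lay out their interfaces in points, not pixels; a density change forces a decision about how many pixels each point maps to.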
We’ll know soon.
As usual, the small matter of implementation remains. There are sure to be the usual hiccups to be corrected in a .1 or .2 update of iOS 8. And there won’t be any dearth of bilious comments about prices and other entries on the well-worn list of Apple sins.
But I’ll be surprised if the public perception of Tim Cook’s Apple doesn’t take yet another turn for the better.
Strategic Alliances and other grandly named partnerships never seem to live up to their florid marriage announcements. Apple and IBM are at it – again – but this time, Apple is the larger, more prosperous company, and IBM is trying the bad old recipe of regaining growth through cost-cutting.
Let me slip into something more comfortable: Devil’s Advocate robes. Thus togged out, I will explain why this Apple + IBM rapprochement won’t work – or, worse, it will.
First, the clash of cultures.
Apple is a focused company; its financial statements tell the story: Its money is made in hardware. All other activities, such as the important contributions from the App Store, make up an ecosystem that supports the hardware volumes and margins. Everyone in the company knows this.
A look at IBM’s latest quarterly report tells a much more complicated story. In its simplest analysis, the company consists of three main segments, each with its own P&L (Profit & Loss) numbers and, one assumes, its own goals, rewards and punishments, and fight for resources. Counterintuitively, given the lingering shadow of its former grandeur, IBM is now a smaller business than Apple: $24.4B last quarter (-2% year-to-year) vs. $37.4B (+6%).
I asked WolframAlpha for per employee, per year revenue and profit comparisons and got this:
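For readers who want to reproduce the back-of-the-envelope arithmetic, here’s a rough sketch. The quarterly revenues are the figures quoted above; the headcounts are approximate 2014 numbers I’m assuming for illustration, not exact filings:

```python
# Quarterly revenues from the figures above, annualized by multiplying by 4.
ibm_quarterly_rev = 24.4e9
apple_quarterly_rev = 37.4e9

# Approximate 2014 headcounts (assumptions for illustration).
ibm_headcount = 431_000
apple_headcount = 92_600   # includes retail employees

ibm_per_head = ibm_quarterly_rev * 4 / ibm_headcount
apple_per_head = apple_quarterly_rev * 4 / apple_headcount

print(f"IBM:   ${ibm_per_head:,.0f} revenue per employee per year")
print(f"Apple: ${apple_per_head:,.0f} revenue per employee per year")
```

Whatever the exact headcounts, the gap is stark: Apple generates several times more revenue per employee, a structural difference between a hardware company and a services giant.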
Inside IBM, morale isn’t great. Following a series of layoffs, management is perceived as using Excel as a windshield to drive the company.
Two groups with widely differing habits of the heart and mind.
Second, earlier embraces haven’t worked.
We have memories of AIM, the 1991 accord between Apple, IBM, and Motorola that gave us Kaleida, the multimedia PowerPC processor, and Taligent, Apple and IBM’s attempt at a more modern operating system. Big announcements, big plans – and nothing but debris.
Even earlier, we have memories of the Apple/DEC Alliance: In the Summer of 1987, my boss and benefactor John Sculley had given me the mission to bring to a conclusion a conversation he’d started with DEC’s CEO. Things went well and, in January 1988, we reached our goal:
“…Apple Computer and Digital Equipment announced a joint development agreement under which the two companies would work together to integrate Macintosh and the AppleTalk network system with the VAX and DECnet.”
At the celebratory dinner, I sat next to DEC’s founder, Ken Olsen. The likable Grand Old Man professed happiness with our collaboration and calmly told me that while he knew lots of people who used PCs, he couldn’t comprehend why. At home, he said, he had a “glass teletype” — a CRT, remember those? — and an Ethernet connection back to the factory, quite expensive at the time. Combined with DEC’s ALL-IN-1 office productivity suite (all commands were two characters long) he had everything he needed.
The Apple/DEC Alliance went nowhere. As with many such covenants, the product of the announcement was the announcement itself. The marriage itself was a sham.
Third and more generally, alliances don’t work.
There was a time when strategic alliances were all the rage. In 1993, my friend Denise Caruso published the aptly titled Alliance Fever, a 14-page litany of more than 500 embraces. The list started at 3DO and ended with Zenith Electronics, neither of which still stands: 3DO went bankrupt in 2003, Zenith was absorbed by LG Electronics.
These aren’t isolated bad endings. If you have the time and inclination for a nostalgic stroll through the list, you’ll see many more such disappearances.
But, you’ll object, this was more than twenty years ago. The industry has learned from these examples; we won’t fall into the same rut.
One would hope. And one would be disappointed.
The tendency remains strong for sheepish company execs to congregate and participate in what Valley wags call a Clusterf#^k. In two Monday Notes (Mobile World 2010 and 2011), I offered examples such as this one:
Do your eyes glaze over when you read such BS?
“Global leaders Intel Corporation and Nokia merge Moblin and Maemo to create MeeGo*, a Linux-based software platform that will support multiple hardware architectures across the broadest range of device segments, including pocketable mobile computers, netbooks, tablets, mediaphones, connected TVs and in-vehicle infotainment systems.”
Relax, you’re normal. Who are they kidding? Themselves, most likely.
All the holy words are there: Linux (mandatory), based (to make things clearer), platform (the p-word), multiple hardware architectures (we don’t know what we’re doing so we’re covering all bases), broadest range of devices (repeat the offense just committed), segments (the word adds a lot of meaning to the previous phrase), including pocketable mobile computers, netbooks, tablets, mediaphones, connected TVs and in-vehicle infotainment systems (only microprocessor-driven Toto toilets are missing from the litany).
Alliances generally don’t work because there’s no one really in charge, no one has the power to mete out reward and punishment, to say no, to change course. Often, the partners in an alliance are seen as a bunch of losers clinging to each other with the hope that there’s safety in numbers. It’s a crude but, unfortunately, not inaccurate caricature.
I’ll switch sides now and explain why It’ll Be Different This Time.
Division of labor is the most convincing argument for this partnership. IBM is and always has been an Enterprise Services company. As it did in its glorious mainframe days, it can take care of everything: analyze your business, recommend changes, re-engineer your organization, write software, maintain everything. Today, there’s much less focus on hardware revenue, but the broad scope remains.
Then came the mobile revolution, which IBM missed. It’s not that they didn’t have the opportunity. The company could have jumped on the mobile-everything wave, but that would have meant breaking the “Roadmap 2015” promise made by IBM’s former CEO, Sam Palmisano. Palmisano might be forgiven for not anticipating the size and importance of mobile when he promised, in his 2010 letter to investors, that IBM share value would double by 2015, but Ginni Rometty, Palmisano’s successor, has no excuse. The 2012 changing of the guard was a perfect opportunity for Rometty to stand up, say Things Have Changed, and re-jigger the roadmap. Ah well.
On the positive side, IBM’s clients are re-organizing their businesses as a result of the mobile deluge, some late, some early. The smarter ones have realized that mobile devices aren’t just “small PCs” and have turned to broad-range professional services vendors such as IBM to re-engineer their business.
For Apple’s part, the iPhone and the iPad have gained increasingly wider acceptance with large Enterprise customers: “98% of Fortune 500 companies have deployed iOS devices and more than 90% of tablet activations in enterprise environments are iPads.” Of course, a few BYOD devices don’t constitute wholesale adoption inside a company. Apple doesn’t have the manpower and culture to come in, engineer, deploy, and maintain company-wide applications and fleets of devices. That’s IBM’s forte.
What’s new in the arrangement is IBM’s decision to invest in extending its ability to develop applications that fully integrate iOS devices — as opposed to “suffering” them.
On the numbers side, naysayers mistakenly use the “98%” figure quoted above to opine that the partnership won’t create much additional revenue. They’re probably right — at least initially. But the partnership could herald a move from “anecdotal” to systematic deployments that are deep and wide. This will take time and the needle won’t move right away…it will be more like the hour hand on the clock face.
Another more immediate effect, across a wide range of enterprises, will be the corporate permission to use Apple devices. Recall the age-old mantra You Don’t Get Fired For Buying IBM, which later became DEC, then Microsoft, then Sun…and now Apple. Valley gossip has it that IBM issued an edict stating that Macs were to be supported internally within 30 days. Apparently, at some exec meetings, it’s MacBooks all around the conference room table — except for the lonely Excel jockey who needs to pivot tables.
We’ll see if the company whose motto once was Think actually works well with the Think Different squad.
iPad sales are falling – but the sky is not. We’re merely dealing with a healthy case of expectations adjustment.
The tablet computer has always felt inevitable. The desire to harness the power of a computer in the comfortable form of a letter-size tablet with a keyboard, or perhaps a stylus for more natural interaction — or why not both? — has been with us for a very long time. Here we see Alan Kay holding a prototype of his 1972 Dynabook (the photo is from 2008):
Alan prophetically called his invention a personal computer for children of all ages.
For more than 40 years, visionaries, entrepreneurs, and captains of industry have whetted our appetite for such tablets. Before it was recast as a PDA, a Personal Digital Assistant, Steve Sakoman’s Newton was a pen-based letter-size tablet. Over time, we saw the GridPad, Jerry Kaplan’s and Robert Carr’s Go, and the related Eo Personal Communicator. And, true to its Embrace and Extend strategy, Microsoft rushed a Windows for Pen Computing extension into Windows 3.1.
These pioneering efforts didn’t succeed, but the hope persisted: ‘Someone, someday will get it right’. Indeed, the tablet dream got a big boost from no less than Bill Gates when, during his State of The Industry keynote speech at Comdex 2001 (Fall edition), Microsoft’s CEO declared that tablets were just around the corner [emphasis mine]:
“The Tablet takes cutting-edge PC technology and makes it available wherever you want it, which is why I’m already using a Tablet as my everyday computer. It’s a PC that is virtually without limits — and within five years I predict it will be the most popular form of PC sold in America.“
Unfortunately, the first Tablet PCs, especially those made by Toshiba (I owned two), are competent but unwieldy. All the required ingredients are present, but the sauce refuses to take.
Skip ahead to April 2010. The iPad ships and proves Alan Kay right: The first experience with Apple’s tablet elicits, more often than not, a child-like joy in children of all ages. This time, the tablet mayonnaise took and the “repressed demand” finally found an outlet. As a result, tablets grew even faster than PCs ever did:
(Source: Mary Meeker’s regular Internet Trends 2014 presentation, always long, never boring)
In her 2013 report, Meeker showed iPads topping the iPhone’s phenomenal growth, climbing three times faster than its more pocketable sibling:
(Source: Mary Meeker Internet Trends 2013)
There were, however, two unfortunate aspects to this rosy picture.
First, there was the Post-PC noise. The enthusiasm for Android and iOS tablets, combined with the end of the go-go years for PC sales, led many to decree that we had finally entered the “Post-PC” era.
Understandably, the Post-PC tag, with its implication that the PC is no longer necessary or wanted, didn’t please Microsoft. As early as 2011, the company was ready with its own narrative which was delivered by Frank Shaw, the company’s VP of Corporate Communications: Where the PC is headed: Plus is the New “Post”. In Microsoft’s cosmos, the PC remains at the center of the user’s universe while smartphones and tablets become “companion devices”. Reports of the PC’s death are greatly exaggerated, or, as Shaw puts it, with a smile, “the 30-year-old PC isn’t even middle aged yet, and about to take up snowboarding”.
(Actually, the current debate is but a new eruption of an old rash. “Post-PC” seems to have been coined by MIT’s David Clark around 1999, causing Bill Gates to pen a May 31st, 1999 Newsweek op-ed titled: Why the PC Will Not Die…)
Both Bill and Frank are right – mostly. Today’s PC, the descendant of the Altair 8800 for which Gates programmed Microsoft’s first Basic interpreter, is alive and, yes, it’s irreplaceable for many important tasks. But classical PCs — desktops and laptops — are no longer at the center of the personal computing world. They’ve been replaced by smaller (and smallest) PCs — in other words, by tablets and smartphones. The PC isn’t dead or passé, but it is shape-shifting.
There was a second adverse consequence of the iPad’s galloping growth: Expectations ran ahead of reality. Oversold or overbought, it doesn’t matter, the iPad (and its competitors) promised more than they could deliver. Our very personal computers — our tablets and smartphones — have assumed many of the roles that previously belonged to the classical PC, but there are some things they simply can’t do.
For example, in an interview with the Wall Street Journal, we learn that Tim Cook “does 80% of the work of running the world’s most valuable company on an iPad.” Which is to say Tim Cook needs a Mac for the remaining 20%…but the WSJ quote doesn’t tell us how important that 20% is.
We now come to the downward trend in iPad’s unit sales: -2.29% for the first quarter of calendar year 2014 (compared to the same quarter a year earlier). Even more alarming, unit sales are down 9% for the quarter ending in June. Actually, this seems to be an industry-wide problem rather than an Apple-specific trend. In an exclusive Re/code interview, Best Buy CEO Hubert Joly says tablet sales are “crashing”, and sees hope for PCs.
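Year-over-year percentages like these are simple to verify: the change is just the delta over the year-ago base. A minimal sketch (the unit volumes below are hypothetical placeholders, not Apple’s reported figures):

```python
def yoy_change_pct(current: float, year_ago: float) -> float:
    """Percentage change versus the same quarter a year earlier."""
    return (current - year_ago) / year_ago * 100

# Hypothetical iPad unit volumes in millions, for illustration only.
q2_2013_units = 14.6
q2_2014_units = 13.3  # a roughly 9% year-over-year decline

print(f"{yoy_change_pct(q2_2014_units, q2_2013_units):+.1f}%")
```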
Many explanations have been offered for this phenomenon, the most common of which is that tablets have a longer replacement cycle than smartphones. But according to some skeptics, such as Peter Bright in an Ars Technica op-ed, there’s a much bigger problem [emphasis mine]:
“It turns out that tablets aren’t the new smartphone…[t]hey’re the new PC; if you’ve already got one, there’s not much reason to buy a new one. Their makers are all out of ideas and they can’t make them better. They can only make them cheaper.”
Bright then concludes:
“[T]he smartphone is essential in a way that the tablet isn’t. A large screen smartphone can do…all the things a tablet can do… Who needs tablets?”
There is a simpler – and much less portentous – explanation. We’re going through an “expectations adjustment” period in which we’ve come to realize that tablets are not PC replacements. Each personal computer genre carries its own specifics; each instills unique habits of the body, mind, and heart; none of them is simply a “differently sized” version of the other two.
The realization of these different identities manifests itself in Apple’s steadfast refusal to hybridize, to make a “best of both worlds” tablet/laptop product.
Microsoft thinks otherwise and no less steadfastly (and expensively) produces Surface Pro hybrids. I bought the first generation two years ago, skipped the second, and recently bought a Surface Pro 3 (“The tablet that can replace your laptop”). After using it daily for a month, I can only echo what most reviewers have said, including Joanna Stern in the WSJ:
“On its third attempt, Microsoft has leapt forward in bringing the tablet and laptop together—and bringing the laptop into the future. But the Pro 3 also suffers from the Surface curse: You still make considerable compromises for getting everything in one package.”
Trying to offer the best of tablets and laptops in one product ends up compromising both functions. In my experience, too many legacy Windows applications work poorly with my fingers on the touch screen. And the $129 Type Cover is a so-so keyboard and poor trackpad. Opinions will differ, of course, but I prefer using Windows 8.1 on my Mac. We’ll see how the upcoming Windows 9, code name Threshold, will cure the ills of what Mary Jo Foley, a well-connected Microsoft observer, calls Vista 2.0.
If we consider that Mac unit sales grew 18% last quarter (year-to-year), the company’s game becomes clear: The sweet spot on Apple’s racket is the set of customers who, like Tim Cook, use MacBooks and iPads. It’s by no means the broadest segment, just the most profitable one. Naysayers will continue to contend that the prices of competing tablets are preordained to crash and will bring ruin to Apple’s Affordable Luxury product strategy…just as they predicted netbooks would inflict damage on MacBooks.
As for Peter Bright’s contention that “[tablet] makers are all out of ideas and they can’t make them better”, one can easily see ways in which Google, Lenovo, Microsoft, Apple, and others could make improvements in weight, speed, input methods, system software, and other factors I can’t think of. After we get over the expectations adjustment period, the tablet genre will continue to be innovative, productive, and fun – for children of all ages.
With one million titles and no human guides, the Apple App Store has become incomprehensible for mere mortals. A simple solution exists: curation by humans instead of algorithms.
You know the numbers better than anyone — I don’t need to quote them to you — but we all know that the iOS App Store is a veritable gold mine. Unfortunately, the App Store isn’t being mined in the best interests of Apple’s customers and developers, nor, in the end, in the interests of the company itself.
The App Store may be a gold mine, but it’s buried in an impenetrable jungle.
Instead of continuing with this complaint, I’ll offer a suggestion: Let humans curate the App Store.
Instead of using algorithms to sort and promote the apps that you permit on your shelves, why not assign a small group of adepts to create and shepherd an App Store Guide, with sections such as Productivity, Photography, Education, and so on? Within each section, this team of respected but unnamed (and so “ungiftable”) critics will review the best-in-class apps. Moreover, they’ll offer seasoned opinions on must-have features, UI aesthetics, and tips and tricks. A weekly newsletter will identify notable new titles, respond to counter-opinions, perhaps present a developer profile, footnote the occasional errata and mea culpa…
The result will be a more intelligible App Store that makes iOS users happier.
If I’m so convinced, why don’t I drive it myself? You might recall that I offered to do so — for free — in a brief lobby conversation at the All Things D conference a couple of years ago. The ever-hovering Katie Cotton gave me the evil eye and that was the end of the exchange.
I look back on my years at Apple with a certain affection, and would be happy to repay the company for what it did for me, so, yes, I would do it for free… but I can’t bankroll a half dozen tech writers, nor can I underwrite the infrastructure costs. And it won’t pay for itself: As an independent publication (or, more likely, an app) an App Store Guide isn’t financially viable. We know it’s next to impossible to entice people to pay for information and, as the Monday Note proves, I have no appetite for becoming a nano-pennies-per-pageview netwalker.
So, the App Store Guide must be an Apple publication, a part of its ecosystem.
PS: We both understand that ideas are just ideas, they’re not actual products. As Apple has shown time and again — and most vividly with the 30-year-old tablet idea vs. the actual iPad — it’s the product that counts. If you see the wisdom of a human-curated Apple App Guide, and I hope you do, I will not seek credit.
Regular Monday Note readers will remember I already tilted at the App Store curation windmill: Why Apple Should Follow Michelin and the tongue-in-cheek Google’s Red Guide to the Android App Store. Who knows, the third time might be the charm.
To play devil’s advocate, let’s consider a developer’s bad reaction to an Apple App Guide review. Let’s say MyNewApp gets a thumbs down in the Productivity section of the Guide. I’m furious; I write Tim or Eddy Cue an angry letter, I huff and puff, threaten to take my business elsewhere — to Windows Phone, for example. I exhort my friends, family, and satisfied customers to contribute to a letter-writing campaign…
Why risk this sort of backlash? Particularly when today’s formula of “featuring” apps seems to be working:
But…does it really work all that well? Today’s way of choosing this app over that one already upsets the non-chosen. Further, the stars used to “measure” user feedback are known to be less than reliable. A thoughtful, detailed, well-reasoned review would serve customers and developers alike.
This leads us to the Guide’s most important contribution to the app universe: Trust. An Apple-sponsored App Guide can be trusted for a simple reason: The company’s one and only motive is to advance its users’ interests by making the App Store more trustworthy, more navigable. As for developers, they can rely on a fair and balanced (seriously) treatment of their work. The best ones will be happier and the “almost best” others will see an opportunity to get their improved work noticed in a future review cycle.
There is also the temptation to shrug the suggestion off with the customary ‘Don’t fix it, it’s not broken.’ Sorry, no, it is broken. See what Marco Arment, a successful Apple developer, says on his blog [emphasis mine]:
“Apple’s App Store design is a big part of the problem. The dominance and prominence of “top lists” stratifies the top 0.02% so far above everyone else that the entire ecosystem is encouraged to design for a theoretical top-list placement that, by definition, won’t happen to 99.98% of them. Top lists reward apps that get people to download them, regardless of quality or long-term use, so that’s what most developers optimize for. Profits at the top are so massive that the promise alone attracts vast floods of spam, sleaziness, clones, and ripoffs.”
“Quality, sustainability, and updates are almost irrelevant to App Store success and usually aren’t rewarded as much as we think they should be, and that’s mostly the fault of Apple’s lazy reliance on top lists instead of more editorial selections and better search.
The best thing Apple could do to increase the quality of apps is remove every top list from the App Store.”
We can now turn to my own biases.
Why do I care? Good question, I’m now 70 and could just sit in zazen and enjoy the show. And there’s a lot of show to enjoy: The tech industry is more exciting now than when I was a rookie at HP France in 1968. But in today’s app stores, the excitement fades — and I’m not just talking about Apple, Android’s Google Play is every bit as frustrating. I see poorly exploited gold mines where quantity obscures quality and the lack of human curation ruins the Joy of Apps. There are caves full of riches but, most of the time, I can’t find a path to the mother lode.
Is it a lack of courage in anticipation of imagined protests? Hunger sated by too much success too soon? An addiction to solving all problems by algorithm instead of by human judgment?
I hope it’s none of these, and that we’ll soon see a newsletter/blog and a reasoned, regularly enriched guide that leads us to the better App Store titles.
When Apple announced its 64-bit A7 processor, I dismissed the speculation that the Macintosh line could switch away from Intel chips to a homegrown “desktop-class” chip. I might have been wrong.
“I don’t know exactly when, but sooner or later, Macs will run on Apple-designed ARM chips.” Thus spake Matt Richman in a 2011 blog post titled “Apple and ARM, Sitting in a Tree”. Richman explained why, after a complicated but ultimately successful switch from PowerPC chips to Intel processors in 2005, Apple will make a similar switch, this time to ARM-based descendants of the A4 chip designed by Apple and manufactured by Samsung.
Cost is the first reason invoked for the move to an An processor:
“Intel charges $378 for the i7 chip in the new high-end 15 inch MacBook Pro. They don’t say how much they charge for the i7 chip in the low-end 15 inch MacBook Pro, but it’s probably around $300. …When Apple puts ARM-based SoC’s in Macs, their costs will go down dramatically. ”
We all know why Intel has been able to command such high prices. Given two microprocessors with the same manufacturing cost, power dissipation, and computing power, but where one runs Windows and the other doesn’t, which chip will achieve the higher market price in the PC market? Thus Intel runs the table: it tells clone makers which new x86 chips they’ll receive, when they’ll receive them, and, most important, how much they’ll cost. Intel’s margins depend on it.
ARM-based processors, on the other hand, are inherently simpler and therefore cost less to make. Prices are driven even lower because of the fierce competition in the world of mobile devices, where the Wintel monopoly doesn’t apply.
Cost is the foremost consideration, but power dissipation runs a close second. The aging x86 architecture is beset by layers of architectural silt accreted from a succession of additions to the instruction set. Emerging media formats demand new extensions, while obsolete constructs must be maintained for the sake of Microsoft’s backward compatibility religion. (I’ll hasten to say this has been admirably successful for more than three decades. The x86 nickname used to designate Wintel chips originates from the 8086 processor introduced in 1978 – itself a backward-compatible extension of the 8080…)
Because of this excess baggage, an x86 chip needs more transistors than its ARM-based equivalent, and thus it consumes more power and must dissipate more heat.
Last but not least, Richman quotes Steve Jobs:
“I’ve always wanted to own and control the primary technology in everything we do.”
Apple’s leader has often been criticized for being too independent and controlling, for ignoring hard-earned industry wisdom. Recall how Apple’s decision to design its own processors was met with howls of protest, accusations of arrogance, and the usual predictions of doom.
Since then, interest in another Grand Processor Switch has remained alive and well. Googling “Mac running on ARM” gets you close to 10M results. (When you Bing the same query, you get 220M hits — 22x Google’s results. SEO experts are welcome to comment.)
Back to the future…
In September 2013, almost a year ago already, Apple introduced the 64-bit A7 processor that powers new iPhones and iPads. The usual suspects pooh-poohed Apple’s new homegrown CPU, and I indulged in a little fun skewering the microprocessor truthers: 64 bits. It’s Nothing. You Don’t Need It. And We’ll Have It In 6 Months. Towards the end of the article, unfortunately, I dismissed the speculation that Apple An processors would someday power the Mac. I cited iMacs and Mac Pros — the high end of the product line — as examples of what descendants of the A7 couldn’t power.
A friend set me straight.
In the first place, Apple’s drive to own “all layers of the stack” continues unabated years after Steve’s passing. As a recent example, Apple created its own Swift programming language that complements its Xcode IDE and Clang/LLVM compiler infrastructure. (For kremlinology’s sake I’ll point out that there is an official Apple Swift blog, a first in Apple 2.0 history if you exclude the Hot News section of the apple.com site. Imagine what would happen if there was an App Store blog… But I digress.)
Secondly, the Mac line is suspended, literally, by the late delivery of Intel’s Broadwell x86 processors. (The delay stems from an ambitious move to a bleeding edge fabrication technology that shrinks the basic building block of a chip to 14 nanometers, down from 22 nanometers in today’s Haswell chips.) Of course, Apple and its An semiconductor vendor could encounter similar problems – but the company would have more visibility, more control of its own destiny.
Furthermore, it looks like I misspoke when I said an An chip couldn’t power a high-end Mac. True, the A7 is optimized for mobile devices: Battery-optimization, small memory footprint, smaller screen graphics than an iMac or a MacBook Pro with a Retina display. But having shown its muscle in designing a processor for the tight constraints of mobile devices, why would we think that the team that created the most advanced smartphone/tablet processor couldn’t now design a 3GHz A10 machine optimized for “desktop-class” (a term used by Apple’s Phil Schiller when introducing the A7) applications?
If we follow this line of reasoning, the advantages of ARM-based processors vs. x86 devices become even more compelling: lower cost, better power dissipation, natural integration with the rest of the machine. For years, Intel has argued that its superior semiconductor design and manufacturing technology would eventually overcome the complexity downsides of the x86 architecture. But that “eventually” is getting a bit stale. Other than a few showcase design wins that have never amounted to much in the real world, x86 devices continue to lose to ARM-derived SoC (System On a Chip) designs.
The Mac business is “only” $20B a year, while iPhones and iPads generate more than 5 times that. Still, $20B isn’t chump change (HP’s Personal Systems Group generates about $30B in revenue), and unit sales are up 18% in last June’s numbers vs. a year ago. Actually, Mac revenue ($5.5B) approaches the iPad’s flagging sales ($5.9B). Today, an 11” MacBook Air costs $899 while a 128GB iPad Air goes for $799. What would happen to the cost, battery life, and size of an A10-powered MacBook Air? And so on for the rest of the Mac line.
By moving to ARM, Apple could continue to increase its PC market share and scoop much of the profits – it currently rakes in about half of the money made by PC makers. And it could do this while catering to its customers in the Affordable Luxury segment who like owning both an iPad and a Mac.
While this is entirely speculative, I wonder what Intel’s leadership thinks when contemplating a future where their most profitable PC maker goes native.
Postscript: The masthead on Matt Richman’s blog tells us that he’s now an intern at Intel. After reading several of his posts questioning the company’s future, I can’t help but salute Intel management’s open mind and interest in tightly reasoned external viewpoints.
And if it surprises you that Richman is a “mere” intern, be aware that he was all of 16 years old when he wrote the Apple and ARM post. Since then, his blog has treated us to an admirable series of articles on Intel, Samsung, Blackberry, Apple, Washington nonsense – and a nice Thank You to his parents.