
Android vs. Apple. Market Share vs. Profit Share, Part 255

 

Conventional wisdom and badly reconstructed history can lead to seemingly comfortable but in reality fragile conclusions. Prepare to be confused. 

Ever since the Android platform emerged as the only real competitor to Apple’s iOS devices, we’ve been treated to a debate which I’ll oversimplify: If Apple makes all the money but Android gets all the volume, who will win? A cursory survey of tech journals and blogs would lead one to believe that the case is closed: Market Share trumps Profit Share. It always does.

So Apple should call it a day? I’m skeptical. Not about the conclusion — Market Share isn’t exactly a dark horse — but about the arguments that are trotted out. False memories of Apple’s past have become a template for its future. For example, a recent Wall Street Journal article ends thus [and, sorry, you need a subscription to see the entire article]:

“Unfortunately, Apple has seen this movie before. A generation ago, it also had a top product whose market share was undercut by cheap, inferior rivals. It hopes the iPhone’s story isn’t a sequel to the Mac’s.”

(I emailed the WSJ writer asking three simple, clarifying questions. No answer, but that’s standard practice, as witheringly described by Philip Elmer-DeWitt at the end of this post.)

I was there “a generation ago”. In 1981, when IBM introduced the IBM PC, I was starting Apple France. Big Blue had made startling changes to its old ways, boldly calling its new machine The Personal Computer (we thought the “The” was ours). In an even bolder move, IBM loosened its tie and its dress code, and tried (successfully) to speak to the “common man” by using a Charlie Chaplin imitator as a mascot:

An interesting choice, particularly when juxtaposed with the real Chaplin’s cine-commentary on “labor-saving devices”:

The original PC from IBM's Boca Raton group was a faithful homage to the Apple ][, right down to the cassette interface. But it wasn't a cheap imitation. There was one important difference: Where the Apple ][ used an 8-bit 6502 processor, IBM splurged on the much-more-powerful 16-bit Intel chip.

Almost overnight, the pages of InfoWorld, previously replete with salivating reviews of Apple products, were filled with IBM PC articles. The new machine got a major boost with the launch of Lotus 1-2-3, a multi-function spreadsheet that became the gold standard for office applications, especially on desktops that sported hard disks and large color screens. Against the Apple ][, the IBM PC was a superior product -- and deftly marketed.

For the next few years, the Apple ][ family stumbled. The Apple ///, beset by early hardware failures, didn't answer the 16-bit question. It wasn't the modernization of the Apple ][ that the company had promised. The Apple II GS was even worse, not compatible enough with the Apple ][ and not powerful enough to attract developers, particularly Bill Gates, who saw no potential for Microsoft applications.

That brings us to 1984. The Macintosh changed the game, right?

Hardly. At its coming out party, the Mac was two years behind schedule. I recall the "Mac's Last Slip" jibes at company meetings. No one would deny the obvious potential, the elegance, the innovative user interface, the clean square pixels on the bit-mapped screen, the fonts, the LaserWriter connection... But the Mac didn't support external hard drives until 1986, and it would be another year before internal disks, additional modularity, and a great Trinitron color monitor were added.

By that time, IBM had had the market to itself for half a decade, and its PC creation had morphed into the Wintel clone industry.

Contrary to the revisionist WSJ story, the "generation ago" Mac never had a market share to undercut. Apple's flagship product -- innovative, elegant, a generation ahead -- was a dreamer's machine. Down-to-earth market wisdom said the Mac was perfect for Stanford undergrads, but not serious enough for real business use. The common view was that application developers wouldn't be able to afford the investment in time and hardware. Starved of competitive software, the Macintosh was doomed to irrelevance and, ultimately, failure.

It almost happened, especially after Apple's desperate attempt to prop up platform share numbers by licensing Mac clones, a move that resulted in a brutal drop in Apple's margins. Market Share vs. Profit Share...

The Mac was saved by Gil Amelio's unintentionally self-sacrificing decision to hand the Apple reins back to Steve Jobs. What followed was the most amazing turnaround our industry has ever seen, and it started with two controversial moves: Jobs rescinded the Mac OS license, and he made a deal with the Microsoft Devil. He convinced Gates' company to "invest" $150M in non-voting Apple shares and develop new Mac versions of the Explorer browser and Office apps (although, in reality, the agreement was part of a settlement of an older IP dispute).

We know the rest of the story, including a meme-adverse fact: For close to seven years, the Mac has consistently gained market share at the expense of PC clones.

Since the advent of another flagship product, the iPhone this time, the riches-to-rags Mac meme has led to predictions of a similar fate: Death by drowning in a sea of "cheap" Android clones. Apple's high price ($650 per iPhone on average) gives too much low-end room for competitors. The price will be undercut, there will be a decline in unit share that, in turn, will lead to lower profits, lower developer interest, lower ability to invest in future products. The road to irrelevance is paved with high margins and low market share.

Never mind two differences. First, the iPhone has never lacked apps: 750,000 of them at last count. Second, it is immensely profitable; Apple is embarrassingly flush with more cash than all its high-tech colleagues combined. The pundits won't accept evidence as an answer. Market Share will trump Profit Share. Why let facts cloud a good argument?

One is tempted to point to the race to the bottom that PC clone makers have experienced over the past decade. HP enjoys the largest Market Share of all PC makers, but it also "enjoys" less than 4% operating profit for its efforts. Meanwhile, Apple's margin is in the 25% range for its Mac line. That may not be as enjoyable as the 60% margin for the iPhone, but it's a solid business, particularly when you consider that the clone makers, HP and Dell foremost, are angling to get out of the business altogether. (See an earlier MN: Post-PC: Wall Street Likes the View.)

Returning to the iOS vs Android debate, I will state an opinion -- not to be confused with a prediction, let alone The Truth: I think the vertical simplicity of Apple's business will tilt the field in its favor as the complicated Android world devolves into anarchy. Apple vs Google isn't Apple vs Microsoft/Intel/IBM.

Let's back up a bit. Google's 2005 acquisition of Android was a visionary move. (Some say Google's vision was sharpened by Eric Schmidt's presence on Apple's Board as the company worked on the future iPhone. Jobs was furious about Google's decision and summarily asked Schmidt to leave.) Android's unprecedented growth -- more than 50% share of the smartphone market in the US, and even more worldwide -- is a testament to the "open" approach. Google gives away the Open Source Android OS; processors are another kind of "open", custom-designed under ARM licenses open to all payers.

But Android is a "cushion shot": an indirect way for Google to make money. Android is a Trojan horse that infects smartphones so it can install services that collect the user data that feeds Google's true business: advertising.

Now, Google faces several problems. Android's openness leads to incompatibilities between devices, a problem for developers that didn't happen under Microsoft's rule in the PC era. Worse (for Google), the many diverging versions of Android (a.k.a. forks) -- especially those created in China -- carry no Google services. They harvest no data and so they bring no advertising revenue potential back to Google.

This is clearly a concern for Google, so much so that the company now offers "pure" Android smartphones by Samsung (for $650) and HTC (for $599) on its Google Play site.

On the other hand, Android 2013 is a mature, stable OS. It isn't Windows 95, which was nothing more than a shell bolted on top of DOS. While the Mac's system software wasn't fully developed when it first came out, many saw it as superior -- or potentially superior -- to Microsoft's OS. Android is a tougher competitor than Windows was at the same age.

Then there is Google's subsidiary Motorola Mobility and the relationship with Samsung, the most powerful Android handset maker. As discussed last week, Motorola's stated intention is to push Android phone prices well below the $650 (unsubsidized) level. Is Samsung in a position to wag the Android dog? And if so, how will they react to Motorola's moves?

Let's not forget "the small matter of execution", one that might prove more important than lofty "strategic" considerations. And, to further complicate predictions, we have the herd's tendency to assume Company X will make all the mistakes while its competitors will play a perfect game.

Confused? Then I have accomplished one of my goals, to show how unhelpful the old bromides are when trying to guess what will happen next.

-- JLG@mondaynote.com

PS: I'd be remiss if I didn't direct you to the recently discovered articles by John Kirk, who calls himself a recovering attorney and indeed writes tightly reasoned posts on Techpinions. I'll whet your appetite with two quotes. One from Does The Rise Of Android's Market Share Mean The End of Apple's Profits? [emphasis mine]:

Steve Jobs wanted, and Apple wants, market share. But they want the RIGHT market share. Apple wants customers who are willing to pay for their products. And Apple wants customers who are good for their platform. In other words, Apple wants market share in their target demographic. Based on the fact that Apple is taking in 72% of the mobile phone profits with only 8% or 9% of the market share, it sure sounds like they've acquired the right market share to me.

Does the rise of Android’s market share mean the end of Apple’s profits? Hardly. You can argue as loudly as you like that developers and profit share must necessarily follow market share. But the facts will shout you down.

The other is from 4 Mobile Business Models, 4 Ways To Keep Score where he concludes:

And if you’re going to prophesy that market share alone gives Google data that will someday, somehow, be worth something to someone, then you need to go back and re-read how the “razor-and-blades” business model is scored.

What we desperately need in analyzing mobile computing is far more attention paid to profits and far less attention paid to prophets.

—-

Post-PC: Wall Street Likes the View

 

The conventional PC business is now in decline, and yet share prices of key players Microsoft and HP are moving up. Why?

In an April press release, IDC painted a bleak picture for the PC. Compared to last year’s first quarter, worldwide shipments of PCs are down 13.9%, the “steepest decline ever in a single quarter”. US numbers are about the same: -12.7%. On a graph, the trend is unmistakable:

Is this a trend Wall Street likes?

When you consider Microsoft, it seems so. In a corporate blog post titled Windows 8 at 6 months, the company proudly claims to have "recently surpassed the 100 million licenses sold mark for Windows 8." This is an interesting number. A quarter ago, MS announced it had sold 60 million licenses, meaning that only 40 million were sold in the last three months. That's a 33% drop…hardly a rousing success. (The "licenses sold" phrase requires caution: it doesn't only mean "sold with new PCs", there are also updates to existing machines, with or without enthusiasm for the new Windows OS.)

“Ignore the Windows 8 numbers and IDC analysis”, says Wall Street. While the tech-heavy Nasdaq climbed only 6.6% in the last 60 days, Microsoft shares went up by 21%.

The same apparent illogic holds for Hewlett-Packard. Last week, the largest PC maker disclosed its second quarter numbers. Compared to the same quarter last year, they’re not exactly pretty:

  • Revenue down by 10% to $27.6B
  • Operating Margin at 5.8%, down by about 20% (HP prefers "down 1.4 points")
  • EPS (Earnings Per Share) at 55 cents, down 31%

Zeroing in on HP's PC business, things look worse:

  • Revenue down by 20% to $7.6B
  • Operating Margin at 3.2%, down 44% ("down 2.2 points" sounds better)

As one would expect, Wall Street reacted, and HP shares went…up. By 17.8% the day after the announcement:

What was the good news for investors? Resorting to one of the usual bromides, HP "handily beat Street expectations" by posting EPS of $0.55 vs. a projected $0.30 to $0.40.

As discussed in the December 16th Monday Note, Chapter 2 of the Turnaround Artist Manual prescribes exactly what we're seeing: Drastically lower expectations within days of taking on the job. "Things are worse than I was told. We'll have to touch bottom before we bounce back…"

Following the script, HP CEO Meg Whitman called 2013 a “fix and rebuild year”. Everyone should expect a “broad-based profit decline”. But a 17% rebound in the stock price can’t be explained solely by a collective sigh of relief when the actual numbers aren’t as bad as the CEO had led everyone to expect.

(In its earnings release, HP still calls itself “The world’s largest technology company”. I guess they think smartphones and tablets aren’t “technology”, but PCs and printers are…)

As quoted in a VentureBeat post, Whitman thinks that the other US PC maker, Dell, is in no better shape:

“You saw a competitor, Dell, completely crater earnings,” Whitman said in response to a question. “Maybe that is what you do when you are going private. We are setting up the company for the long term.”

Ironically, and without a hint of self-awareness, she accuses Dell of playing the Setting Artificially Low Expectations game:

She implied that Dell did that on purpose, since Michael Dell is motivated to repurchase shares in the company as cheaply as possible, and deliberately lowering earnings is a good way to get the share prices to fall.

Actually, Whitman must envy what Dell is attempting to do: Get out of the PC clone Race To The Bottom. Because PCs account for half of Dell's revenue, getting out of that hopelessly commoditized business would cause trouble if done in public. Going private allows Dell to close the curtain, perform the unappetizing surgery out of view and, later, return to Wall Street with a smaller company endowed with a more robust earnings engine, focused on higher-margin enterprise gear and services.

This helps explain the apparent paradox: Wall Street likes HP and Microsoft shares not despite their lower PC numbers but because of them. Investors want to believe that future earnings (the ones they count on when buying shares today) will come from "Post-PC" products and services instead of being weighed down by shrinking PC volumes and margins. In particular, those who buy HP shares must believe that the company will sooner or later exit the PC clone business. For Microsoft, the bet is that the company will artfully manage a smooth transition to higher Enterprise and Entertainment revenues and their fatter margins.

I'm not fond of the "Post-PC" label; it lacks nuance and it's premature. The desktop and laptop machines we've known for more than three decades may no longer be the sole incarnations of our personal computing -- our affection, time, and money have shifted to smartphones and tablets -- but the PC will continue to live in our offices and homes.

Consider Lenovo, the Chinese company that took over IBM's PC business when Big Blue decided to exit the race. They're doing quite well, posting a record $34B in revenue for this year.

There is life left in the PC business, just not for US incumbents.

JLG@mondaynote.com

 

Otellini’s Striking Confession

 

We know Intel shunned ARM processors and played virtually no role in the smartphone revolution. But we now learn Steve Jobs asked Intel to build the iPhone microprocessor. Paul Otellini, Intel’s departing CEO, admits he should have followed his gut – and made the smartphone world a very different place.

CEO valedictions follow a well-known script: My work is done here, great team, all mistakes are mine, all good deeds are theirs, I leave the company in strong hands, the future has never been brighter… It’s an opportunity for a leader to offer a conventional and contrived reminiscence, what the French call la toilette des souvenirs (which Google crudely translates as toilet memories instead of the affectionate and accurate dressing up memories).

For his farewell, Paul Otellini, Intel’s departing CEO, chose the interview format with The Atlantic Monthly’s senior editor Alexis Madrigal. They give us a long (5,700+ words) but highly readable piece titled Paul Otellini’s Intel: Can the Company That Built the Future Survive It?

Photo: Guardian.co.uk

The punctuation mark at the title’s end refers to the elephantine question in the middle of Otellini’s record: Why did Intel miss out on the smartphone? Why did the company that so grandly dominates the PC market sit by while ARM architecture totally, and perhaps irretrievably, took over the new generation of phones — and most other embedded applications?

According to Otellini, it was the result of Intel’s inertia: It took a while to move the machine.

Madrigal backfills this uneasy explanation with equal unease:

“The problem, really, was that Intel’s x86 chip architecture could not rival the performance per watt of power that designs licensed from ARM based on RISC architecture could provide. Intel was always the undisputed champion of performance, but its chips sucked up too much power. In fact, it was only this month that Intel revealed chips that seem like they’ll be able to beat the ARM licensees on the key metrics.”

Note the tiptoeing: Intel's new chips "seem like" they'll be fast enough and cheap enough. Madrigal charitably fails to note how Intel, year after year, kept promising to beat ARM at the mobile game, and failed to do so. (See these 2010, 2011 and 2012 Monday Notes.) Last year, Intel was still at it, dismissively predicting "no future for ARM or any of its competitors". Tell that to ARM Holdings, whose licensees shipped 2.6 billion chips in the first quarter of this year.

Elsewhere in the article, Otellini offers a striking revelation: Fresh from anointing Intel as the microprocessor supplier for the Mac, Steve Jobs came back and asked Intel to design and build the CPU for Apple's upcoming iPhone. (To clarify the chronology, the iPhone was announced in early January 2007; the CPU conversation must have taken place two years prior, likely before the June 2005 WWDC where Apple announced the switch to x86. See Chapter 36 of Walter Isaacson's Jobs bio for more.)

Intel passed on the opportunity [emphasis mine]:

“We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we’d done it, […]

Indeed, the world would have been different. Apple wouldn't be struggling through a risky transition away from Samsung, its frenemy CPU supplier; the heart of the iPhone would be Made In America; Intel would have supplied processors for more than 500 million iOS devices, and sold even more such chips to other handset makers to become as major a player in the smartphone (and tablet) space as it is in the PC world.

Supply your own adjectives…

Indulging briefly in more What If reverie, compare the impact of Intel's wrong turn to a better one: What would the world look like if, at the end of 1996, Gil Amelio hadn't handed Apple back to Steve Jobs? (My recollection of the transaction's official wording could be faulty.)

So, again, what happened?

At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.

A little later, Otellini completes the train of thought with a wistful reverie, a model of la toilette des souvenirs:

“The lesson I took away from that was, while we like to speak with data around here, so many times in my career I’ve ended up making decisions with my gut, and I should have followed my gut,” he said. “My gut told me to say yes.”

The frank admission is meant to elicit respect and empathy. Imagine being responsible for missing the opportunity to play a commanding role in the smartphone revolution.

But perhaps things aren’t as simple as being a “gut move” short of an epochal $100B opportunity.

Intel is a prisoner of its x86 profit model and Wall Street's expectations. Its dominant position in the x86 space gives Intel the pricing power to command high margins. There's no such thing in the competitive ARM space: prices are lower. Even factoring in the lower inherent cost of the somewhat simpler devices (simpler for the time being; they'll inevitably grow more complex), the profit per ARM chip is too thin to sustain Intel's business model.

(Of course, this assumes a substitution, an ARM chip that displaces an x86 device. As it turns out, the smartphone business could have been largely additive, just as we now see with tablets that cannibalize classical PCs.)

Another factor is the cultural change that would have been required were Intel to have gotten involved in making ARM devices. As both the designer and manufacturer of generation after generation of x86 microprocessors, Intel can wait until they’re good and ready before they allow PC makers to build the chips into their next products. The ARM world doesn’t work that way. Customers design their own chips (often called a System on a Chip, or SoC), and then turn to a semiconductor manufacturer (a foundry) to stamp out the hardware. Taking orders from others isn’t in Intel’s DNA.

And now?

The answer might lie in another French expression: L’histoire ne repasse pas les plats. Google Translate is a bit more felicitous this time: History does not repeat itself. I prefer the more literal image — History doesn’t come around offering seconds — but the point remains: Will there be seconds at the smartphone repast?

Officially, Intel says its next generation of x86 processors will (finally!) topple the ARM regime, that their chips will offer more computing might with no cost or power dissipation penalty. In their parlance “the better transistor” (the basic unit of logic processing) will win.

I doubt it. The newer x86 devices will certainly help Microsoft and its OEMs make Windows 8 devices more competitive, but that won’t prevent the spread of ARM in the legion of devices on which Windows is irrelevant. For these, Intel would have to adopt ARM, a decision Otellini has left to the new tandem leadership of Brian Krzanich (CEO) and Renée James (President). Will they stick to the old creed, to the belief Intel’s superior silicon design and manufacturing technology will eventually overcome the disadvantages of the more complex x86 architecture? Or will they take the plunge?

They might be helped by a change in the financial picture.

In 2006, that is, after throwing Jobs into Samsung's arms (pun unintended), Intel sold its ARM business, the XScale line, to Marvell. The reason was purely financial: For similar capital expenditures (costly fabs), ARM processors achieved much lower per-unit profit, because the ARM scene is much more competitive than the x86 space.

Now, if Intel really wants to get a place at the smartphone table with new and improved x86 devices, the company will have to price those to compete with established ARM players. In other words, Intel will have to accept the lower margins they shunned in 2006. Then, why not do it with the ARM-based custom processors Apple and others require?

JLG@mondaynote.com

—————————-

(I'll confess a weakness for The Atlantic and, in particular, for its national correspondent James Fallows, a literate geek and instrument-rated pilot who took it upon himself to live in Beijing for a while and, as a result, can speak more helpfully about China than most members of the Fourth Estate. Going back to last week's reference to the Gauche Caviar, when my Café de Flore acquaintances fall into their usual rut of criticizing my adopted country for its lack of "culture", I hold out that The Atlantic — which sells briskly at the kiosk next door — is one of many examples of American journalistic excellence.

And, if you're interested in more strange turns, see this other striking Alexis Madrigal piece in the same Atlantic: The Time Exxon Went Into the Semiconductor Business (and Failed). I was there, briefly running an Exxon Information Systems subsidiary in France and learning the importance of corporate culture.)–JLG

This Wristband Could Change Healthcare

 

Jawbone is launching its UP wristband in Europe. Beyond the quirky gadget lies a much larger project: Changing healthcare — for better or for worse.

Hyperkinetic as he is, Hosain Rahman, the Jawbone founder, must be saturating his Jawbone UP wristband with data. The rubberized band, nicely designed by Yves Behar, is filled with miniaturized electronics: accelerometers and sensors monitor your activity throughout the day, recording every motion in your life, from walking in the street to the micro-movements of your hand in a paradoxical sleep phase. For the fitness freak, the UP is a great stimulus to sweat even more; for the rest of us, it's more like an activity and sleep monitoring device. (For a complete product review, see this article from Engadget, and also watch Hosain Rahman's interview by Kevin Rose, it's well worth your time.) Last week in Paris, after my meeting with Hosain, I headed straight to the nearest Apple Store to pick up my UP (for €129), with the goal of exploring my sleeping habits in greater depth.

After using the device for a couple of days, the app that comes with it tells me I'm stuck in a regime of 5 to 6 hours of bad sleep — including less than three hours of slow-wave sleep, commonly known as deep sleep. Interesting: Two years ago, I spent 36 hours covered with electrodes and sensors in a hospital specializing in studying and (sometimes) treating insomnia — after 6 months on a wait list to get the test. At one point, to monitor my sleep at home, doctors lent me a cumbersome wristband the size of a matchbox. The conclusion was unsurprising: I was suffering from severe insomnia, and there was very little they could do about it. The whole sleep exploration process must have cost the French public health care system €3,000, 20 times the price of the Jawbone gadget (or the ones that do a similar job).

I'm not contending that medical monitoring performed by professionals can be matched by a wristband loaded with sensors purchased in an electronics store. But, aside from the cost, there is another key difference: The corpus of medical observations is based on classic clinical tests of a small number of patients. On the other hand, Jawbone thinks of the UP wristband — to be worn 24/7 by millions of people — in a Big Data frame of mind. Hosain Rahman is, or will soon be, right when he says his UP endeavor contributes to the largest sleep study ever done.

Then it gets interesting. As fun as they can be, existing wearable monitoring devices are in the stone age compared to what they will become in three to five years. When I offered Hosain a list of features that could be embedded in future versions of the UP wristband — such as a GPS module (for precise location, including altitude), heartbeat, blood pressure, skin temperature and acidity sensors, a Bluetooth transmitter — he simply smiled and conceded that my suggestions were not completely off-track. (Before going that far, Jawbone must solve the battery-life issue and most likely design its own, dedicated super-low-consumption processor.) But Hosain also acknowledges his company is fueled by a much larger ambition than simply building a cool piece of hardware aimed at fitness enthusiasts or hypochondriacs.

His goal is nothing less than disrupting the healthcare system.

The VC firms backing Jawbone are on the same page. The funding calendar compiled by Crunchbase speaks for itself: out of the stunning $202m raised since 2007, most of it ($169m) has been raised since 2011, the year of the first iteration of the UP wristband (it was a failure due to major design flaws). All the big houses are on board: Khosla Ventures, Sequoia, Andreessen Horowitz, Kleiner Perkins, Deutsche Telekom… They all came with an identical scheme in mind: a massive deployment of the monitoring wristband, and a series of deals with the biggest healthcare companies in America to subsidize the device. All this could result in the largest health-related dataset ever built.

The next logical step would be the development of large statistical models based on customers' recorded data. As far as privacy is concerned, no surprise: Jawbone is pretty straightforward and transparent; see their disclosure here. It collects everything: name, gender, height and weight, location (thanks to the IP address) and, of course, all the information gathered by the device, or entered by the user, such as eating habits. A trove of information.

Big Data businesses focusing on health issues drool over what can be done with such a detailed dataset coming from, potentially, millions of people. Scores of predictive morbidity models can be built, from the most mundane — back pain correlated to sleep deprivation — to the most critical involving heart conditions linked to various lifestyle factors. When asked about privacy issues, Hosain Rahman insists on Jawbone's obsessive protection of its customers, but he also acknowledges his company can build detailed population profiles and characterize various risk factors with substantially greater granularity.

This means serious business for the health care and insurance sectors — and equally serious concerns for citizens. Imagine, just for a minute, the impact of such data on the pricing structure of your beloved insurance company. What about your credit rating if you fall into an at-risk category? Or simply your ability to get a job? Of course, the advent of predictive health models potentially benefits everyone. But, at this time, we don't know if and how the benefits will outweigh the risks.

frederic.filloux@mondaynote.com

Apple Buys Intel

 

Getting rid of Samsung as a processor supplier and, at the same time, capturing the crown jewel of the American semiconductor industry. How could Apple resist the temptation to solve its cash problem and make history again?

Halfway through the second quarter of the 2013 fiscal year, most of Apple's top execs meet at an undisclosed location (Eddy Cue's chair is empty – he's been called away to a Ferrari board meeting). They're joined by a few trusted industry insiders: Bill "the Coach" Campbell, Apple and Intuit Director and adviser to Google's founders, Messrs. Page and Brin; Larry Sonsini, the Silicon Valley consigliere of more than three decades; and Frank Quattrone, the star investment banker with nine lives.

The meeting isn’t about the company’s dwindling profit margins. The smaller margins were expected and invited: The reduced-price iPad and heavy promotion of the “old” iPhone 4 as an entry-level product are part of the long term strategy of guarding Apple’s lower end (so to speak). And no whining about AAPL’s grim slide over the last six months, a problem that has only one solution: Apple needs to record a series of better quarters.

The problem of the day is, once again, what to do with Apple’s obscene pile of cash.

By the end of December 2012, the company held about $137B in cash (or equivalents such as marketable securities), including $23B from operations for the quarter.

CFO Peter Oppenheimer delivers the bad news: It looks like operations will disgorge another $35B this quarter. The stock buy-back and dividend program that was designed to bleed off $45B over the next few years (see this March 2012 Monday Note) won’t be enough if the company continues at this rate.

Apple needs something bigger.

Quattrone has been sitting quietly at the end of the table. He clears his throat and speaks:

Buy Intel.

Well, yes, Frank (says Tim Cook), we’ve been buying Intel processors for the Mac since 2005.

Not the chips. The company. The planets are aligned for Apple to strike a blow that will leave the industry forever changed. Make history, acquire Intel.

Quattrone has their attention. He unfolds the celestial calibration:

  • Apple needs to extract itself from the toxic relationship with Samsung, its ARM supplier.
  • Intel is the best large-scale silicon manufacturer in the world. They have the people, the technology, and the plant capacity to match Apple’s needs for years to come.
  • “But Intel doesn’t do ARM!” you say. Indeed, Intel has no interest in the fierce competition and small margins in the ARM-based SoC market. Joining the ARM fray would severely disrupt Intel’s numbers and infuriate Wall Street. But if Intel were to essentially “go private” as Apple’s semiconductor manufacturing arm (pun intended), catering to all of Apple’s x86 and ARM needs (and whatever else Bob Mansfield is secretly plotting), Wall Street would have no such objection.
  • Intel is flailing. The traditional PC market – Intel’s lifeblood – continues to shrink, yet the company does nothing to break into the ARM-dominated mobile sector. In the meantime, the company makes perplexing investments such as buying McAfee for $7.68B.
  • There’s a leadership vacuum at Intel. Six months after announcing CEO Paul Otellini‘s “retirement”, Intel’s Board has yet to find a replacement who can sail the ship in more competitive waters. Apple could commission Pat Gelsinger, a 30-year Intel veteran and former CTO (Intel’s first) who fled to VMware after his career stalled at Intel. Despite being a bit of a Bill Gates look-alike (once upon a time), Gelsinger is a real technologist who would fit well within Apple, especially if he were given the opportunity to really “go for” the ARM architecture instead of iteratively tweaking x86 devices.
  • Last but not least, Intel’s market cap is about $115B, eminently affordable. The company is profitable and generates a good deal of cash, even after the heavy capital expenditures required by its constant need to build new and expensive manufacturing plants.
  • …oh, and one more thing: Wouldn’t it be fun to “partner” more closely with Microsoft, HP and Dell, working on x86 developments, schedules and… pricing?

A lively discussion ensues. Imagine solving many of Apple’s problems with a single sweeping motion. This would really make Cupertino the center of the high-tech world.

It’s an interesting idea, but there will be obstacles, both cultural and legal.

The Coach goes first: "Knowing both of these companies more than a little bit, I can attest to the pride they have in their respective cultures. They're both disinclined to reconsider their beliefs in any meaningful way. Merging these two dissimilar groups, shedding unnecessary activities such as McAfee and the like, would be dangerously disruptive to Apple's well-honed, cohesive culture. As a general rule, merging two large organizations rarely succeeds… unless you consider merging airlines a success…"

Finally, the Consigliere speaks: “It’s a tempting fantasy, it will mean years of work for my firm and many, many others, but as a friend of the company, as a past confidant of your departed Founder, don’t do it. There will be too much legal trouble with the Feds, with competitors, with Intel partners. Most fantasies aren’t meant to be enacted.”

I won't dwell on the reality of the meeting: I made it up as a way to explain why Apple really has no choice other than to submit to another cash phlebotomy, this time for an additional $60B. And, as with real-world phlebotomies, the procedure will treat the problem, but it won't cure it. With $30B from operations per quarter, the $60B lancing will have to be repeated.

Some read the decision to return gobs of cash to shareholders as an admission of defeat. Apple has given up making big moves, as in one or more big acquisitions.

I don’t agree: We ought to be glad that the Apple execs (and their wise advisers) didn’t allow themselves to succumb to transaction fever, to a mirage of ego aggrandizement held out by a potential “game changing” acquisition.

A final word on taxes. To return the additional $60B (for a total of $100B when including the ongoing program announced last year) through increased dividends and repurchased shares, Apple will have to borrow money.

Borrow? When they have so much cash?

Yes, thanks to our mangled tax code. As explained here, about $100B of Apple’s cash is stored overseas. If repatriated, it would be “heavily” (read “normally”) taxed. Like most US companies that have international operations, Apple plays complicated, entirely legal tax games that allow their international profits to be taxed at very low rates as long as the profits — and the resulting cash — stay outside Uncle Sam’s reach. And thus we have the apparent paradox of borrowing money when cash-rich.

The benefit of these tax code contortions is difficult to explain to normal humans — as opposed to legislators who allowed the loopholes.

All this now makes Apple a different company. Once a fledgling challenger of established powerhouses such as IBM, Microsoft or HP, it now makes “too much cash” and is condemned to a life of paying dividends and buying back shares — like the old fogies it once derided.

JLG@mondaynote.com

 

 

Privacy: You Have Nothing To Fear

 

Pervasive sensors and IP connections, coupled with the “infinite” storage and computing power in the Cloud, threaten our privacy. We need to defend ourselves and get control of our personal data amassed by private companies and government agencies.

Optimists and pessimists may inhabit opposing camps, but they do have one thing in common: Their inclinations lead to behaviors that verify their prophecies. I’ve chosen my side: I’m an optimist and have been rewarded accordingly. As a reminder of my attitude, to make sure that the occasional frustrations don’t derail my determination, I keep a little figurine from the Provençal Crèche (Nativity Scene) on my desk. He’s called Lou Ravi, the Enraptured One:

The traditional characterization is that of a gent who wanders the world, innocently marveling at the simplest of miracles. (At times, I wonder if he isn’t just a polite version of the village idiot.)

Recently, a seemingly trivial incident cast a shadow over my life-long optimism, an event that awakened dark thoughts about technology’s impact on our privacy.

As I’m driving on the A10 not-so-freeway towards the Loire châteaux, I see my license plate displayed on a sign that tells me that I’m exceeding the speed limit (130kph, about 80mph). This is novel… where we used to have an anonymous flashing nag, now we’re individually fingered. On the one hand, it’s certainly more helpful than a broad, impersonal warning; on the other, it’s now personal.

Stirred from my enraptured stupor, I start counting other ways in which we’re targeted.

Staying within the realm of license plates, we have an official, Wikipedia-sanctioned acronym: ALPR, the Automatic License Plate Reader, a device that’s used (or mis-used) by municipalities to scan every vehicle that enters the city’s limits. An ALPR system is already operational in ritzy Tiburon just north of the Golden Gate Bridge, and it’s being considered in ritzier Piedmont, an island of wealth surrounded by Oakland. The NYPD has used mobile license plate readers to build a “database of 16 million license plates, along with locations where the car was spotted”. (A Google search for Automatic License Plate Reader yields more than 1M hits.)

We also have various flavors of “event data recorders” in our cars. Similar to a plane’s black box, an EDR can regurgitate the sequence of events that preceded a crash. According to the NHTSA (National Highway Traffic Safety Administration), 96% of all 2013 vehicles carry such a device and there is a proposal to make them mandatory in 2015 models.

Insurance companies see the EDR as an opportunity to better evaluate risk so they can offer lower premiums to good drivers. Privacy advocates are concerned that the data could be used for less benevolent purposes:

Though the information is being collected with the best of intentions – safer cars or to provide drivers with more services and conveniences – there is always the danger it can end up in lawsuits, or in the hands of the government or with marketers looking to drum up business from passing motorists.

Again, if you Google “car black box recorder”, you get about 6M hits and a wide range of third-party devices. Some come with a dashboard camera such as we see in American patrol cars (and that have been adopted by a huge number of Russian drivers); others plug into the OBD-II (On-Board Diagnostic) connector that’s present on all modern cars. Combined with accelerometers and precision GPS recording, these draw a very accurate picture of everything we do at the wheel, where, when and how.

It’s not all sinister: With appropriate software, weekend track drivers can visualize and analyze their braking, acceleration, and effective use of apexes. Still, the overall picture is one of omnipresent surveillance. And I’m certainly not encouraged when I read that “anyone with a handheld scanner and access to the port under your steering column can download a wealth of information about your vehicle.”

The regard for privacy that’s demonstrated by the public sector — the government agencies that can have an enormous impact on our lives — is also less than encouraging. We now realize that the IRS reads our email without requiring any authorization or judicial supervision; the DEA complains about iMessage encryption; we have National Security Letters that confer broad and little-supervised snooping powers to US government agencies.

On the private side, Google, Facebook, and cellular carriers amass and trade on our personal data, again, with little or no practical oversight. Try asking any of these companies what sort of information they have on you, to whom they sell it, and if you can have a peek at it.

The litany goes on: Escalating healthcare expenditures give insurers equally escalating incentives to acquire personal behavior data in order to improve their risk calculation (and reject claims). We’re photographed, videoed, and, now, face-recognized everywhere. Try counting the cameras that see you on the street, in stores, elevators, offices.

When we worry about such practices, we get the sort of rote retort infelicitously typified by Eric Schmidt: “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.”

Sure, if you have nothing to hide, you have nothing to fear. All you need to do is lead a pristine life. Drive carefully; wait for the green light before you cross the street; eat a balanced diet; don’t take, view, or exchange the wrong pictures; don’t consort with undesirable people; don’t say or write bad words; don’t inhale the wrong smoke…

This is unrealistic.

If there is nowhere to hide, how can disagreements safely ferment in political life, at work, in relationships? By definition, change disturbs something or annoys someone. And, moving to paranoia, or full awareness, the age-old question arises: Who will guard us from the guardians?

Returning to my now slightly-strained optimism, I hope we’ll support the people and organizations, such as the ACLU and many others, who work for our privacy, and that we’ll use our votes to unseat those who sell us out to private and state encroachers. We can start with demanding a handle on who has what data on us. Playing on Habeas Corpus, it’s already called Habeas Data.

I’m curious to see what Google, Verizon, Orange, Facebook, Amazon and many others know about me. Insights await…

JLG@mondaynote.com

Apple is Losing The War – Of Words

 

Besides its ads, Apple says very little, confident the numbers will do the talking. This no longer works, as others have seized the opportunity to drive the narrative.

The day before Samsung’s big Galaxy S4 announcement, Apple’s VP of Marketing, Phil Schiller, sat down for an interview with Reuters and promptly committed what Daring Fireball’s John Gruber calls an unforced error:

“…the news we are hearing this week [is] that the Samsung Galaxy S4 is being rumored to ship with an OS that is nearly a year old,” [Schiller] said, “Customers will have to wait to get an update.”

Not so, as Gruber quickly corrects:

But it ends up the S4 is — to Samsung’s credit — shipping with Android 4.2.2, the latest available version. Not sure why Schiller would speculate on something like this based solely on rumors.

To Samsung’s delight, we can be sure, the interview received wide coverage in publications such as the Wall Street Journal and Bloomberg, just hours before the S4 was unveiled, complete with the month-old Android operating system.

This didn’t go over well. Even before the “year old Android version” was exposed as unfounded conjecture, reactions to Schiller’s trash talk were uniformly negative. Apple was accused of being on the defensive.

But, the true-believers ask, isn’t this something of a double-standard? What about the trash talk Samsung ads that depicted the iPhone as old-fashioned and its users as either cult sheep or doddering golden agers, weren’t they also a form of defensiveness? Why were Samsung’s mean-spirited ads seen as fun and creative, while Schiller’s slight misstep is called “defensive”?

Yes, Apple is held to a (well earned) different standard. Once a challenger with an uncertain future, Apple has become The Man. Years ago, it could productively poke fun at Microsoft in the great I’m a Mac, You’re a PC campaign (the full series of ads is here), but the days of taking potshots at the incumbent are over. Because of its position at the top, Apple should have the grace to not trash its competitors, especially when the digs are humorless and further weakened by error.

Schiller's faux pas will soon be forgotten — it was a minor infraction, a five yard penalty — but it stoked my enduring frustration with a different sort of Apple-speak characteristic: The way Apple execs abuse words such as "incredible", "great", and "best" when they're discussing the company's products and business.

My accusation of language molestation needs examples. Citing a page from W. Edwards Deming’s gospel, In God We Trust, Everyone Else Brings Data, I downloaded a handful of Apple earnings calls, such as this one, courtesy of Seeking Alpha, and began to dig.

(Speaking of language faux pas, Deming's saying was shamelessly and badly appropriated — without attribution — by Google's Eric Schmidt in a talk at MIT.)

Looking just for the words that emanated from the horses' mouths, I stripped the intros and outros and the question parts of the Q&As, and pasted into Pages (which has, sadly, lain fallow since January 2009). Pages has a handy Search function (in the Edit > Find submenu) that compiles a list of all occurrences of a word in a document; here's what I found…

  • Across the five earnings statements, some form of the word “incredible” appears 7, 9, 9, 11 and 9 times. The Search function offers a handy snippet display so you can check the context in which the word was used:

  • “Tremendous”, in its various forms, appears 12 times.
  • Amazing: 8
  • Strong: 58
  • Thrilled: 13
  • Maniacally focused: 2
  • All told, “great” appears 70 times. A bit more than half are pathetic superlatives (“great products”, “great progress”, “we feel great about…”), some are innocuous (“greater visibility”), but there’s an interesting twist: The snippet display showed that six were part of the phrase “Greater China”:

“Greater” or not, China is mentioned 71 times, much more than any other country or region I checked (Korea = 1, Japan = 6, Europe = 12).

(In the interest of warding off accusations of a near-obsessive waste of energy, I used a command line program to generate some of these numbers. Android? Give me a second… 4. Google = 0, Facebook = 4, Samsung = 2.)
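For the curious, here is a minimal sketch of the kind of counting involved. It is not the author's actual command line program, just a plausible reconstruction in Python; the transcript file name and the stem lists are stand-ins:

```python
import re
from collections import Counter

# Word families tallied in the earnings-call transcripts; extend at will.
HAPPY = ["incredible", "tremendous", "amazing", "strong", "thrilled", "great"]
SAD = ["disappoint", "weak", "bad", "fail"]

def count_stems(path, stems):
    """Count words beginning with each stem in a plain-text transcript."""
    text = open(path, encoding="utf-8").read().lower()
    counts = Counter()
    for stem in stems:
        # \b anchors at a word start, \w* catches inflections:
        # "incredible" also counts "incredibly", "great" counts "greater".
        counts[stem] = len(re.findall(r"\b" + stem + r"\w*", text))
    return counts

if __name__ == "__main__":
    # Hypothetical file name; point it at any saved transcript.
    for stem, n in count_stems("earnings_call_q1_2013.txt", HAPPY + SAD).items():
        print(f"{stem}: {n}")
```

The word-boundary anchor is what makes "great" also catch the "Greater China" mentions noted above, and what keeps a suffix occurrence like the "less" in "wireless" from polluting a count of "less".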

Now let’s try some “sad” words:

  • Disappoint: 0
  • Weak: 7. Six of these were part of “weak dollar”; the other was “weak PC market”. By contrast, only five or six of the 58 “strongs” referred to the dollar; the rest were along the lines of “strong iPad sales”.
  • Bad: 0
  • Fail: 0

The dissection can go on and on, but let's end it with a comparison between "more" and "less". Eliminating instances of "less" as a suffix ("wireless"), the result shows a remarkable imbalance: "more" wins each of the five sessions with a consistently lopsided score: 28 to 3… more or less.

But, you’ll object, what’s wrong with being positive?

Nothing, but this isn’t about optimism, it’s about hyperbole and the abuse of language. Saying “incredible” too many times leads to incredulity. Saying “maniacally focused” at all is out of place and gauche in an earnings call. One doesn’t brag about one’s performance in the boudoir; let happy partners sing your praise.

When words become empty, the listener loses faith in the speaker. Apple has lost control of the narrative; the company has let others define its story. This is a war of words and Apple is proving to be inept at verbal warfare.

In another of his sharply worded analyses titled Ceding the Crown, John Gruber makes the same point, although from a different angle:

The desire for the “Oh, how the mighty Apple has fallen” narrative is so strong that the narrative is simply being stated as fact, evidence to the contrary be damned. It’s reported as true simply because they want it to be true. They’re declaring “The King is dead; long live the King” not because the king has actually died or abdicated the throne, but because they’re bored with the king and want to write a new coronation story.

I agree with the perception, but blaming the media rarely produces results; we shouldn't point our criticism in the wrong direction. The media have their priorities, which more often than not veer in the direction of entertainment passed off as fair and balanced information (see Amusing Ourselves To Death by Neil Postman). If Apple won't feed them an interesting, captivating story, they'll find it elsewhere, even in rumors and senseless hand-wringing.

Attacking competitors, pointing to their weaknesses, and trumpeting one's achievements is better done by hired media assassins. A company, directly or through a PR firm, engages oft-quoted consultants who provide the required third-party stats, barbs, and encomiums. This isn't theorizing: I was once a director at a company, one of many, that used such an arrangement to good effect.

A brief anecdote: When Microsoft was Microsoft, Waggener Edstrom, the company’s PR powerhouse, was an exemplary propagandist. I distinctly remember a journalist from a white-shoe East Coast business publication coming to my office more than twenty years ago, asking very pointed questions. I asked my own questions in return and realized that the individual didn’t quite know the meaning of certain terms that he was throwing around. A bit of hectoring and cajoling, and the individual finally admitted that the questions were talking points provided by the Seattle PR firm. A few years later, I got a comminatory phone call from one of the firm’s founders. My offense? I had made an unflattering quip about Microsoft when it was having legal troubles with Apple (the IP battle that was later settled as part of the 1997 “investment” in Apple and Steve Jobs). PR firms have long memories and sharp knives.

The approach may seem cynical, but it’s convenient and effective. The PR firm maintains a net (and that’s the right word) of relationships with the media and their pilot fish. If it has the talent of a Waggener Edstrom, it provides sound strategic advice, position papers, talking points, and freeze-dried one-liners.

Furthermore, a PR firm has the power of providing access. I once asked a journalist friend how his respected newspaper could have allowed one of its writers to publish a fellacious piece that described, in dulcet tones, a worldwide Microsoft R&D tour by the company’s missus dominicus. “Access, Jean-Louis, access. That’s the price you pay to get the next Ballmer interview…”

Today, look at the truly admirable job Frank Shaw does for Microsoft. Always on Twitter, frequently writing learned and assertive pieces for the company’s official blog. By the way, where’s Apple’s blog?

The popular notion is that Apple rose to the top without these tools and tactics, but that’s not entirely true. Dear Leader was a one-man propagandastaffel, maintaining his own small network of trusted friends in the media. Jobs also managed to get exemptions from good-behavior rules, exemptions that seem to have expired with him…

Before leaving us, Jobs famously admonished “left-behind” Apple execs to think for themselves instead of trying to guess what he would have done. Perhaps it’s time for senior execs to rethink the kind of control they want to exercise on what others say about Apple. Either stay the old course and try to let the numbers do the talking, or go out and really fight the war of words. Last week’s misstep didn’t belong to either approach.

One last word: In the two trading days bracketing the Samsung S4 launch Schiller clumsily attempted to trash, Apple shares gained 1% the day before the intro and jumped 2.58% the day after. Schiller could have said nothing before the launch and, today, let others point to early criticism of the S4's apparent featuritis.

JLG@mondaynote.com

More iWatch Fun

 

When looking at the potential for a really smart watch, the idea of an Apple iWatch looks almost sensible. Still, it is a long way from the attractive idea to stuffing the required computing power into a wristwatch.

As I somberly contemplate the death of personal privacy, our being spied upon everywhere, at all times (for our own good, you understand), a tweet from an ex-coworker known for his stiletto wit evokes a welcome smile:

Frank is referring to Nick Hayek Jr., the cigar-wielding head of Swatch Group AG (and Zino Davidoff doppelgänger):

In a Bloomberg article (from which the above photo is extracted), Hayek dismisses the iWatch rumors:

“Personally, I don’t believe it’s the next revolution,” the chief of the largest Swiss watchmaker said at a press conference on annual results in Grenchen, Switzerland. “Replacing an iPhone with an interactive terminal on your wrist is difficult. You can’t have an immense display.”

Hayek’s pronouncement triggered many sharp reactions, such as this history lesson from another sharp tweeter:

As Kontra (a “veteran design and management surgeon”) reminds us, Palm CEO Ed Colligan once famously pooh-poohed the unannounced iPhone:

We’ve learned and struggled for a few years here figuring out how to make a decent phone, […] PC guys are not going to just figure this out. They’re not going to just walk in.

Colligan’s brush-off wasn’t the first time, or the last, that Apple’s “unauthorized intrusions” were decried by industry incumbents and arbiters of business taste:

  • The iPod: A doomed foray into the saturated, profitless market of commodity MP3 players.
  • iTunes: Single tracks for 99 cents? Not a chance against free online music sites.
  • Apple Stores: Another folly, zero experience in the cutthroat and manpower-intensive retail business.
  • iPhone: The status quotidians scoff.
  • Homegrown ARM-based processors: A billion dollar mistake.
  • iPad: Ridiculous name. Steve Ballmer derides its lack of keyboard and mouse.

This isn’t to deny that the Apple Midas Touch is occasionally fat fingered. Prior to its launch, Steve Jobs touted MobileMe as Exchange For The Rest of Us; afterwards, he told the MobileMe team they should “hate each other for letting each other down”. Last year, Tim Cook had no choice but to apologize for the iMaps fiasco (and then showed a couple Apple executives the door).

So how would this hypothetical iWatch play out? Can Apple re-invent a known device à la the iPod, or are they venturing into territory without a map (or, one can’t resist, with an iMap)?

First, a brief look at today’s watches, smart and not.

After five centuries of improvements to their timekeeping mechanisms (or movements), mechanical watches are no longer judged for their temporal accuracy, but for their beauty and, just as important, for the number and ingeniousness of their complications — what non-horologists would call “additional functions”. It’s not enough to just tell the time: watches must display the phases of the moon and the positions of the planets, function as a chronograph, provide a perpetual calendar… The moniker grande complication is applied to the most advanced, such as this one from the Gallet company (founded in 1466):

These complications come at a price: For $300k you can pick up the double-faced Patek Philippe Sky Moon Tourbillon with its 2800-star celestial chart. The Franck Muller Aeternitas Mega 4, which holds the record with 36 complications and 1400 parts, will set you back $2.7M:

These luxury watches function more as engineering marvels than utilitarian timepieces, and, accordingly, they’re worn as adornments — and status symbols.

The more common electronic watch, which uses a precise quartz oscillator and typically has no moving parts, hasn’t entirely killed the mechanical watch, but it hasn’t been for lack of trying. Electronic watchmakers, aided by the tiny microprocessors embedded in many of these devices, have piled on ever more functions — calculators, multiple repeating alarms, even circular slide rules… each addition simply an exercise in the proverbial mere matter of software.

But each new function introduces UI complexity, as this page from the instruction manual for my Seiko multi-function watch establishes:

Most of the manual’s 33 pages are in the same vein. As a result, normal humans find these electronic complications baffling and leave most of the functions unmolested.

And now we have the smartwatch, a true computer that’s strapped to your wrist. Today’s smartwatch will tell you the time and run some rudimentary applications, but its primary role is to act as an extension of the smartphone that you’ve paired through Bluetooth. A phone call comes in, your watch shows you the number; an email message arrives, your watch scrolls the sender’s address; if the music you’re streaming on your phone is too quiet, just tap your watch to turn it up…at least in theory.
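For the technically curious, here’s a minimal sketch of that relay pattern — plain Python standing in for the Bluetooth plumbing, with every name (Phone, Watch, Notification) invented for illustration rather than drawn from any vendor’s actual API:

```python
from dataclasses import dataclass

@dataclass
class Notification:
    kind: str     # "call", "email", "volume"...
    payload: str  # caller number, sender address, etc.

class Watch:
    """The wrist-side endpoint: displays whatever the phone pushes."""
    def receive(self, note: Notification) -> None:
        print(f"[watch] {note.kind}: {note.payload}")

class Phone:
    """The phone-side endpoint: forwards events to paired watches."""
    def __init__(self) -> None:
        self.paired: list[Watch] = []

    def pair(self, watch: Watch) -> None:
        self.paired.append(watch)

    def on_event(self, kind: str, payload: str) -> None:
        # On real hardware this hop would be a Bluetooth LE message;
        # here it's a direct call.
        for watch in self.paired:
            watch.receive(Notification(kind, payload))

phone, watch = Phone(), Watch()
phone.pair(watch)
phone.on_event("call", "+1 555 0100")          # the watch shows the number
phone.on_event("email", "frank@example.com")   # the watch scrolls the sender
```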

These are all good ideas, but, as the NYT’s David Pogue found after test driving a sampling of these devices, their execution leaves something to be desired. His conclusion:

…you have to wonder if there’s a curse on this blossoming category. Why are these smartwatches so buggy, half-baked and delayed?
The Casio and Martian watches are worth considering. But if you ask the other watches what time it is, they’ll tell you: too soon.

So, again, where does the putative iWatch fit into all of this?

Let’s start with the UI. If we just regard the traditional chronological functions (date and time formats, alarms, stopwatch), an iPhone-like touch interface, albeit on a smaller screen, would easily eclipse the clunky buttons-along-the-perimeter controls on my Seiko. For the more advanced “smart” functions, one assumes that Apple won’t be satisfied unless the user experience far exceeds the competition’s. (Of the five smartwatches that Pogue reviews, only one, the Cookoo, has even a hint of touch screen capability.)

Then there’s the matter of overall style. This isn’t a fair fight; there’s something viscerally compelling about a traditional mechanical watch with exposed movement. Even on the low end of the market you can find a mechanical watch that displays its inner beauty. Nonetheless, we can trust Sir Jony to rise to the challenge, to imagine the kind of style we’ve come to expect.

There’s also the battery question. Will the iWatch suffer from having a two- or three-day battery life, as suggested by “[s]ources close to Apples [sic] project team”? Leaving aside conjectures about the anatomical location whence these sources’ information emerged, two thoughts come up…

First, it’s a safe assumption that the target audience for the iWatch is iDevice owners whom Apple has “trained” (subjugated, critics will say) to charge their devices at night. For them, charging the iWatch as well won’t be a dealbreaker. The Lightning connector and charger for an iPhone or iPad should be small enough to fit a largish watch. Or perhaps the addition of the iWatch to the iDevice constellation will convince Apple to incorporate wireless charging (despite the diffidence of Phil Schiller, Apple’s VP of Marketing).

Second, some electronic watches don’t need batteries at all. In Seiko’s Kinetic line, the motion of the wearer’s hand drives a tiny generator that feeds electricity into a capacitor for storage. (For the inert watch wearer, stem winding works as well. In a clever twist, some of the newer models preserve the stored charge by halting the motion of the hands when the watch isn’t being worn.) It’s unclear whether the energy captured from hand movements will suffice to feed an ambitious Apple smartwatch, but the technology exists.
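A back-of-envelope comparison suggests why the question is open. The figures below are assumptions for illustration, not measurements: kinetic harvesters in wristwatches are typically quoted in the microwatt range, while a connected smartwatch draws milliwatts:

```python
# Rough energy budget for a kinetically charged smartwatch.
# Both figures are assumptions for illustration only.
harvested_uw = 10        # assumed average harvest from wrist motion (microwatts)
smartwatch_draw_mw = 5   # assumed average draw of a connected smartwatch (milliwatts)

shortfall = (smartwatch_draw_mw * 1000) / harvested_uw
print(f"Average draw exceeds average harvest by ~{shortfall:.0f}x")  # ~500x
```

If those orders of magnitude are even roughly right, wrist motion alone would cover a fraction of a percent of the budget — fine for moving hands on a dial, far short of driving a radio and a display.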

Turning to more advanced functionality: Will the iWatch be an iOS device? I think it’s very likely. That doesn’t mean that the iWatch will be an iPhone/iPod Touch, only smaller. Instead, and as we see with today’s Apple TV, the iWatch will enrich the iOS ecosystem: Reasonably useful on its own, but most important as a way to increase the value/enjoyment of other iDevices…at least for now.

Eventually, and as I’ve written here several times, I believe the Apple TV will become a first-class citizen: it will have its own versions of apps written for the iPhone/iPad, as well as apps for the TV alone. With iOS as the lingua franca, the iWatch could be treated with the same respect.

There are plenty of examples of apps that would work on a very small screen, either in conjunction with existing data (calendar, address book, stock market, iMessage, weather) or as a remote for other devices, including non-Apple products (the Nest thermostat comes to mind).

We should also consider biometric applications. The intimate contact of the iWatch makes it a natural carrier for the ever-improving sensors we find in today’s health monitors, devices that measure and record heart rate and perspiration during a workout, or that monitor sleep patterns and analyze food intake. What we don’t find, in these existing gadgets, is the ability to download new apps. An iWatch with health sensors coupled with the App Store would open whole new health and wellness avenues.
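To make the point concrete, here’s a toy sketch of why downloadable apps change the picture: the same sensor stream can feed any analysis a developer dreams up. Nothing here reflects a real sensor API; it’s all illustrative:

```python
import random

def heart_rate_stream(samples: int):
    """Stand-in for an on-wrist sensor: yields beats-per-minute readings."""
    for _ in range(samples):
        yield random.randint(55, 170)

def workout_zones(readings) -> dict:
    """One hypothetical 'app': bucket readings into rough training zones."""
    zones = {"rest": 0, "cardio": 0, "peak": 0}
    for bpm in readings:
        if bpm < 100:
            zones["rest"] += 1
        elif bpm < 140:
            zones["cardio"] += 1
        else:
            zones["peak"] += 1
    return zones

# A fixed-function gadget ships with one built-in analysis; an App Store
# lets anyone plug a new one (sleep tracking, arrhythmia alerts...) into
# the very same stream.
print(workout_zones(heart_rate_stream(60)))
```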

Finally, there’s (always) the money question. Would our mythical iWatch sell in sufficient volume — and with a high enough margin — to make it a significant product line for Apple? Given that watches easily sell for hundreds of dollars, and that we would almost certainly use an Apple iWatch more often and for more purposes than an Apple TV, the volume/margin question isn’t too hard to answer.

Back to reality: translating a fantasy into a real product is by no means a sure thing. A pleasant, instantaneous user experience requires computing power. Computing power requires energy; energy means battery drain and heat dissipation. These are challenges for real grown-ups. And sometimes a grown-up has to make the vital No, We Won’t Do This decision that separates bloated demi-failures from truly elegant, genre-creating breakthroughs.

JLG@mondaynote.com

iPad and File Systems: Failure of Empathy

 

The iPad placed a clear bet on simplicity — and was criticized for it. The bet won. But now, can the iPad evolve toward more business applications without sacrificing its simplicity, without becoming a “fridge-toaster”?

Three years ago, the iPad came out. The device was an immediate hit with customers and (most) critics. Steve Jobs’ latest — and, unfortunately, last — creation truly deserved the oft-abused game changer moniker.

But, as always, there were grumblings up in the cheap seats. As Mike Monteiro, co-founder of Mule Design observed:

“Following along on Twitter I was seeing things like ‘underwhelming’, ‘meh’ , ‘it’s not open’, ‘it’s just a big iPhone’, etc. And most of this stuff was coming from people who design and build interactive experiences.”

Monteiro penned a sharp, relevant response to the naysayers. Titled “The Failure of Empathy”, his post is summarized by this picture:

A generation ago, geeks were the arbiters of taste in the world of personal computing. Programmers, designers, hobbyists and tinkerers…these were the inhabitants of “user space”, and we built computers with them in mind. By designing the Apple ][ for himself (and his fellow travelers) Steve Wozniak hit the bull’s eye of a large, untapped target.

Today, geeks are but a smallish subset of computer users. Their (typically exaggerated) negative comments may have some sting if you’re responsible for engineering the “brain dead” backing store for a windowing system, but in the real world, no one cares about “byte sex” or “loop unrolling”. What counts is how non-technical users think, feel, and respond. Again, from Monteiro’s post:

“As an industry, we need to understand that not wanting root access doesn’t make you stupid. It simply means you do not want root access. Failing to comprehend this is not only a failure of empathy, but a failure of service.”

This was written in February 2010; I doubt that anyone at the time thought the iPad would ascend to such heights so quickly: 65.7M sold in 2012, 121M since the 2010 debut, rising even faster than the iPhone.

This is all well and good, but with success comes side effects. As the iPad gets used in ways its progenitors didn’t anticipate, another failure of empathy looms: Ignoring the needs of people who want to perform “complicated” tasks on their iPads.

When the iPad was introduced, even the most obliging reviewers saw the device as a vehicle for consumption, not creation. David Pogue in the New York Times:

“…the iPad is not a laptop. It’s not nearly as good for creating stuff. On the other hand, it’s infinitely more convenient for consuming it — books, music, video, photos, Web, e-mail and so on.”

This is still true…but that hasn’t stopped users from trying — struggling — to use their iPads for more ambitious tasks: building rich media presentations and product brochures, preparing course material, even running a business. Conventional wisdom tells us that these tasks fall within the province of “true” personal computers, but these driven users can’t help themselves: they want to do it all on their iPads. They want the best of both worlds: the power of a PC without its size, weight, (relative) unresponsiveness, and, certainly, price.

The evidence is all around us. Look at how many people in cafés, offices and airport lounges use a keyboard with their iPad, such as this Origami combo:

Or the Logitech Keyboard Cover:

Both keyboards are prominently displayed in the Apple Store. We’ll assume that shelf space isn’t doled out by lottery (or philanthropically), so these devices must be selling briskly.

Of course, this could just be anecdotal evidence. What isn’t anecdotal is that Apple itself claims that the iPad has penetrated a large proportion of Fortune 500 companies. In some of its stores, the company conducts sessions to promote the use of iPads in business applications.

I attended one such gathering last year. There was a very basic demonstration of Keynote, the iPad’s presentation app, plus the testimony of a happy customer who described the usefulness of the iPad in sales situations. All quite pleasant, but the Q&A session that followed was brutal and embarrassing: How do you compose a real-world, mixed-document presentation? No real answer. Why can’t the iPad access all the documents — not just iWork files — that I dropped into iCloud from my Mac? No answer there, either.

This brings us to a major iPad obstacle: On a “real” PC the file system is visible, accessible; on the iPad, it’s hidden. The act of creating, arranging, and accessing files on a PC is trivial and natural. We know how to use the Finder on the Mac and Explorer on Windows. We’re not perplexed by folder hierarchies: The MyGreatNovel folder might contain a lengthy set of “MGN-1”, “MGN-2”, “MGN-3” drafts, as well as subfolders such as ArtWork, Reference, and RejectionLetters, each of which contains further subfolder refinements (RejectedByGrove, RejectedByPenguin, RejectedByRandomHouse…).
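That hierarchy takes a few lines to conjure on any desktop OS — here in Python’s standard library, with names invented to match the example:

```python
from pathlib import Path

root = Path("MyGreatNovel")
root.mkdir(exist_ok=True)
# Drafts at the top level...
for i in (1, 2, 3):
    (root / f"MGN-{i}.txt").touch()
# ...and nested subfolders below.
for sub in ("ArtWork", "Reference",
            "RejectionLetters/RejectedByGrove",
            "RejectionLetters/RejectedByPenguin",
            "RejectionLetters/RejectedByRandomHouse"):
    (root / sub).mkdir(parents=True, exist_ok=True)

print(sorted(p.as_posix() for p in root.rglob("*")))
```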

On an iPad you don’t navigate a file system. Instead, you launch an app that has its own trove of documents that it understands — but it can’t “see” anything else.

For example: Keynote doesn’t let you see the graphics, videos, and PDFs that you want to assemble into your presentation. Unlike on the Mac, there’s no Finder, no place where you can see “everything” at one glance. Even more important, there’s no natural way to combine heterogeneous documents into one.
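Here’s a toy model of the contrast — not Apple’s actual sandbox mechanism, just the shape of the restriction:

```python
# All the documents exist side by side on the device's storage...
device_storage = {
    "Keynote": ["pitch.key"],
    "Photos":  ["chart.png", "team.jpg"],
    "PDFs":    ["spec.pdf"],
}

class SandboxedApp:
    """Each app can enumerate its own container -- and nothing else."""
    def __init__(self, name: str):
        self.name = name

    def visible_files(self) -> list[str]:
        return device_storage.get(self.name, [])

keynote = SandboxedApp("Keynote")
print(keynote.visible_files())  # ['pitch.key'] -- chart.png and spec.pdf
                                # are on the device, but Keynote can't see them
```

A desktop Finder, by contrast, would hand you the whole tree at one glance.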

On the other hand, we all know users who love the iPad for its simplicity. They can download and play music, read books, respond to email and tweets, view photos, and stream movies without having to navigate a file hierarchy. For them, the notion of a “file system” is neither natural nor trivial — it’s foreign and geeky. Why throw them into a maze of folders and files?

Apple’s decision to hide the iOS file system from iPad (and iPhone) users comforts the non-geek and is consistent with Steve Jobs’ idea that applications such as Mail, iTunes, iPhoto, iCal, and Contacts shouldn’t reveal their files and folders. Under the hood, the application stores its data in the Mac’s file system but, on the surface, the user sees appointments, photo albums and events, mailboxes and messages.

Still, some of us see this as the storage equivalent of Seinfeld’s Soup Nazi: No File System For You!

App developers and customers keep trying. iOS apps such as GoodReader and File Manager Pro valiantly attempt to work around the iPad’s strictures. PhoneView will expose and manipulate your iPad’s file system (not recommended). But success with any of these apps is limited and comes at a price: the iPad’s simplicity and fluidity are long gone by the time you achieve the desired result, the multimedia brochure or HR tutorial.

This places Apple at a fork in the road. On the left is the current path: more/better/lighter/faster of the same. Only evolutionary changes to the simple and successful worldview. This is today’s trajectory, validated by history (think of the evolution of the MacBook) and strong revenue numbers.

On the right, Apple could transform the iPad so that power users can see and combine data in ways that are impossible today. This could attract business customers who are hesitant about making the plunge into the world of tablets, or who may be considering alternatives such as Microsoft’s PC/tablet combo or Android devices with Google services.

The easiest decision is no decision. Let’s have two user interfaces, two modes: The Easy mode for my Mother-In-Law, and the Pro mode for engineers, McKinsey consultants, and investment bankers. Such dual-mode systems haven’t been popular so far; the approach has been tried on PCs and Macs without success. (Re-reading this, I realize the Mac itself could be considered such a dual-mode machine: Fire up the Terminal app and you have access to a certified Unix engine living inside…)

The drive to “pervert” the iPad is unmistakable. I think it will prove irresistible in the end. But I have trouble forming a coherent picture of an evolution that would let Apple open the iPad to more demanding users — without sacrificing its great simplicity and falling into the fridge + toaster trap. It’s a delicate balancing act.

JLG@mondaynote.com

 

The Next Apple TV: iWatch

 

Rumors don’t actual Apple products make; witness the perennial Apple TV set — and the latest iWatch rumors. This is an opportunity to step back, look at Apple’s one and only love — personal computers — and use this lens to sift through the rumors.

Every week brings new rumors of soon-to-be-released Apple products. The mythical Apple TV set is always a favorite: Gossip of an Apple buyout of troubled TV maker Loewe has sent the German company’s stock soaring. We also hear of a radio streaming service that will challenge Pandora and Spotify, and there’s the usual gaggle of iPhone, iPad, and Mac variations. More interesting is the racket surrounding Apple’s “stealth” projects: an iWatch and other wearable devices (and “racket” is the right word — see these intimations of stock manipulation).

There is a way to see through the dust, to bring some clarity, to organize our thoughts when considering what Apple might actually do, why the company would (or wouldn’t) do it, and how a rumored product would fit into the game plan.

The formula is simple: Apple engineers may wax poetic about the crystalline purity of the software architecture, execs take pride in the manufacturing chain and distribution channels (and rightly so), marketing can point to the Apple Customer Experience (when they’re not pitching regrettable Genius ads or an ill-timed campaign featuring Venus and Serena Williams). But what really floats their boats, what hardens Apple’s resolve, is designing, making, and selling large numbers of personal computers, from the traditional desktop/laptop Mac, to the genre-validating iPad, and on to the iPhone — the Very Personal Computer. Everything else is an ingredient, a booster, a means to the noblest end.

Look at Apple’s report to its owners: there’s only one Profit and Loss (P&L) statement for the entire $200B business. Unlike Microsoft or HP, for example, there is no P&L by division. As Tim Cook put it:

We manage the company at the top and just have one P&L and don’t worry about the iCloud team making money and the Siri team making money…we don’t do that–we don’t believe in that…

Apple’s appreciation for the importance and great economic potential of personal computers — which were invented to act as dumb servants that help us with data storage, text manipulation, and math operations — may have been, at first, more instinctual than reasoned. But it doesn’t matter; the company’s monomania, its collective passion, is undeniable. More than any other company, Apple has made computers personal, machines we can lift with our hands and our credit cards.

With these personal computer glasses on, we see a bit more clearly.

For example: Is Apple a media distribution company? Take a look at Apple’s latest 10-Q SEC filing, especially the Management Discussion and Analysis (MD&A) section starting on page 21. iTunes, now reported separately, clocked $3.7B for the last quarter of 2012. Elsewhere, Horace Dediu sees $13.5B for the entire year. A big number indeed, and, certainly, iTunes is a key to Apple’s success: Without iTunes there would have been no iPod, Apple’s “halo product”, proof that the company could come up with a winner. Later, iTunes begat the App Store, a service that solidified the App Phone genre.

Some misguided analysts look at the numbers and argue that Apple ought to spin off iTunes. They use the old “shareholder value” gambit, but the “value” simply isn’t there: Horace Dediu puts iTunes margins in the 15% region, well below Apple’s overall 38%. iTunes is a hugely important means to the personal computer end, but it’s not a separate business.

How about Apple as a retail company? The success of the Apple Store is stellar, a word that’s almost too weak: The Apple Stores welcomed three times more visitors than all of the Disney parks, and generated more than $20B in revenue last year — that works out to an astonishing $6,000 per square foot, twice as much as the #2 shop (Tiffany & Co.). But Apple’s 400 stores aren’t a business; they only exist to create an experience that will lead to more sales, enhanced customer satisfaction, and, as a consequence, increased margins.

Apple as a software company? No. The raison d’être for OS X, iOS, iWork, and even GarageBand is to breathe life into Apple hardware. By now, the calls for Apple to see the error of its ways — to atone for the original sin of not licensing the Mac OS by selling iOS licenses to all comers — have (almost) died.

During my first visit to Apple’s hypergalactic headquarters and warehouse in February 1981, I was astonished at the sight of forklifts moving pallets of Apple ][ software. The term “ecosystem” wasn’t part of the industry lingo yet, but I had witnessed the birth of the notion.

Apple had a much harder time building a similarly rich set of applications for the Macintosh, but the lesson was eventually learned, partly due to the NeXT acquisition and the adoption of object-oriented programming. We now have a multi-dimensional macrocosm — a true ecosystem — in which our various forms of personal computing work together and share data, media, and services.

Where does the current Apple TV device (the black puck, not the mythical TV set) fit into this scheme? Apple TV runs on a version of iOS, and it knows how to communicate with a Bluetooth keyboard — but that doesn’t mean the device is a personal computer. Perhaps Apple will (someday) provide a TV Software Development Kit (SDK) so developers can adapt existing iOS apps or write new ones. But I still see it as a lean-back device, as opposed to a lean-forward PC.

In any case, sales of the $100 black puck don’t move the needle. Four million Apple TVs were sold in 2012; even if ten million are sold this year — and that’s a very optimistic estimate — it won’t make a noticeable difference, at least not directly. Apple TV is a neat part of the ecosystem: it makes iPhones, iPads, Macs, and our iTunes libraries more valuable, but it’s still just a member of the supporting cast.

This brings us back to the putative iWatch. Computer history buffs will recall the HP-01 watch. Buoyed by the success of its handheld calculators, including the programmable HP-65 with its magnetic card reader, HP convinced itself it could make a calculator watch, introduced in 1977:

A technology tour de force, fondly remembered by aging geeks, but a market failure: too expensive, too hard to use, ill-fitting distribution channels.

Apple is in a different spot. Today, you can find a number of iPod watchbands such as this one:

It’s hard to imagine that Apple would merely integrate an existing accessory into a new iPod. Sales of the iPod proper are decelerating, so the iPod-as-iWatch could give the line a much-needed boost, but it’s difficult to reconcile the rumors of “100 people” working on the project with a mere retrofit job. Is Apple working on an iWatch that can be experienced as an Even More Personal personal computer — an “intimate computer”? If so, many questions arise: user interface, sensors, iOS version, new types of apps, connection with other iDevices… And, of course, price.

This would be much more interesting than the perennially in-the-future Apple TV set. Of course, the iWatch and Apple TV aren’t necessarily mutually exclusive. If the Loewe buyout rumors are true, Apple could do both — develop its own watch device and repurpose Loewe’s TV sets. (I still doubt the TV set part, as opposed to enhancing the black puck.)

But once we understand what Apple’s only business is, and that the related software, retail, and services are simply part of the supporting cast, Apple’s attitude towards big acquisitions becomes clearer. Apple isn’t looking to buy a big new business; it already owns The Big One. So: no movie studio, no retail chain or cable company, no HP or Dell, no Yahoo!. (But… a big law firm, perhaps?) Integrating a large group of people into Apple’s strong, unbending culture would, alone, prove impossible.

A small acquisition to absorb technology (and talented people) makes sense. The cultural integration risks remain, but at a manageable scale, unlike what happened to Exxon in the early eighties when it burned $4B (that was real money, then) in a failed attempt to become an information systems company — you know, the Oil of the Twenty-First Century.

Let’s just hope Apple doesn’t talk itself into a “because we can” move.

JLG@mondaynote.com