
The Intel Enigma

 

by Jean-Louis Gassée

Intel once turned down the opportunity to become the sole supplier of iPhone processors. Why haven’t they let go of their defocused search for the Next Big Thing and, instead, used All Means Necessary to regain the account?

Intel is a prosperous company. For the quarter ended last September, Intel scored $14.6B in Sales, 65% Gross Margin and $4.5B in Operating Income, a nice progression from the same period a year ago:

[Chart: Intel's quarterly P&L]

A 65% Gross Margin is enviable for any company, and exceptional for a hardware maker: Intel’s GM is up in software territory. By comparison, Apple’s Gross Margin – considered too comfortable by followers of the Church of Market Share – stands at 38.6% for the 2014 Fiscal Year ended last September.
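For readers who want the arithmetic behind the comparison, gross margin is simply

\[
\text{Gross Margin} = \frac{\text{Revenue} - \text{Cost of Goods Sold}}{\text{Revenue}}
\]

so Intel's 65% on $14.6B of quarterly Sales means roughly $9.5B of gross profit, out of which operating expenses leave the $4.5B of Operating Income noted above.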

But when we take a closer look at the numbers, the picture isn’t as rosy. Nearly 90% of Intel’s revenue — $12.9B of the total $14.6B — comes from two groups: PC and Data Center (servers, networking, storage). Intel’s presence in the mobile world? Nonexistent:

[Chart: Intel revenue by segment — Mobile and Communications essentially zero]

Essentially no revenue for Mobile and Communications, and a $1B loss. Looking at the past four quarters, Intel has lost about $4B in the pursuit of the mobile market (Daniel Eran Dilger says $7B in the past two years).

How did Intel handle the problem? By sweeping it under the rug. In November, Intel CEO Brian Krzanich announced that the company was merging Mobile into the PC group and would discontinue its $51 per Android tablet subsidy in 2015. This came just weeks after Krzanich had proclaimed Mission Accomplished in the tablet field:

“‘We’ve made good progress getting into tablets,’ Krzanich told reporters ahead of the annual Intel Developer Forum in San Francisco. ‘We’ve gone from nothing to something where I consider us a real tablet manufacturer.’”

The company’s inability to break into the mobile field — into any field other than PCs and servers — isn’t new, and it has worried Intel for decades. Company execs and strategists aren’t happy being the hardware half of Wintel, yoked to Microsoft’s fortunes. They like the money, but they want a “second source” for their profits, something other than the x86 market, so they’ve embarked on a never-ending quest for the next stage in the Intel rocket.

(Of course, the company isn’t blind to the benefits of the Wintel alliance: Given two processors of equal merit, the one running Windows fetches the higher price, hence the ferocious tactics that have landed the company in court on several occasions.)

In its search for the Next Big Thing, Intel has tried alternatives to the x86 architecture and come up with failures such as the iAPX 432 and the Itanium high-end server processor. The latter, a puzzling co-development with HP meant to succeed its PA-RISC architecture, was quickly dubbed Itanic by tech wags as results failed to match lofty launch projections.

Intel has tried server farms, modems, networking equipment and, I kid you not, toy microscopes, but they somehow never got around to mobile. In the pre-iPhone days of the mobile world, the dominant players — Nokia, Motorola, Palm, Blackberry — all used processors based on the ARM architecture, processors that were too small and inexpensive to interest Intel. No money there: they cost a tenth or less of a PC processor.

Steve Jobs offered Intel a chance to get into the mobile game: He asked the company to bid on an ARM-derivative for the iPhone. As Paul Otellini, Intel’s CEO at the time, wistfully and gallantly recounted, he gave the opportunity a pass, thinking the numbers (price and quantity) were too low. (An ex-Intel acquaintance told me that the business people felt they should go after Nokia, instead, because of its huge volume at the time.)

In 2006, after missing the iPhone, Intel sold its ARM processor business to Marvell.

When iPhones and Android-based smartphones took off, Intel insisted they weren’t concerned, that they would triumph in the end: We will win because our unapproachable manufacturing technology will produce x86 processors that are superior in every way to ARM-based competitors.

We’ve heard this line every year since. The latest version is summarized in this slide from a November Investor Meeting:

[Slide: Intel's claimed transistor-technology lead]

What Intel contends here is that they always have a three-year lead over their competition — it’s just a given. What company execs fail to explain is why smartphone manufacturers have failed to see the light, and why Android tablet makers had to be bribed.

Now it seems that Intel has discovered the Internet of Things… and Wearables, of course. If you have the patience, flip through this 66-slide presentation that tells us that IoT will be huge because the objects around us will all become intelligent (a story we’ve already heard from companies such as Cisco — which is also looking for its Next Big Thing).

[Slide: Intel's "IoT everywhere" vision]

Naturally, wearables are in there:

[Slide: Intel's IoT wearables pitch]

This is painful. The whole presentation is an Everything And The Kitchen Sink assemblage of unoriginal ideas. There’s no focus in Intel’s Theory of Everything, no way to see when, where, and how the company will actually rise above the IoT noise.

As for wearables — now fashionable in more ways than one — Intel touts its new MICA bracelet:

[Photo: Intel's MICA bracelet]

You can “pre-order” yours at Opening Ceremony and have it delivered in time for Christmas.

Let’s not forget Intel’s partnership with Google for the next-gen Google Glass, nor the company’s acquisition of Basis, a maker of fitness wearables.

Certainly, the more “initiatives” Intel throws at the wall, the higher the chances that one of them will stick. But from the outside, it feels like Intel is being driven by courtiers and PowerPoint makers, and that senior management really doesn’t know what to do – and what not to do. (Krzanich says he green-lighted the MICA project because his wife approved of it “after using it for several days”.)

Of all the things Intel should and shouldn’t have done, the Apple element figures mightily. Since Intel offered a whopping $51 Android tablet subsidy, a charity that landed its mobile activities $7B in the red over two years, why didn’t the company offer Apple a $10 or $20 subsidy per processor as a way to get the manufacturing relationship restarted? ‘We’ll beat Samsung’s prices, we’ll be your second source.’ If Intel’s 14nm process is so superior, how come Intel execs didn’t convince Apple to dump frenemy Samsung?

I see three possible answers.

One is that the 14 nanometer process is woefully late. Deliveries of some Broadwell chips (the code name for the next round of x86 processors) are now slated for early- to mid-2015. Apple might feel that Intel’s process needs to mature before it can deliver 300M units.

The second is that Intel’s claim of a three-year technology lead might be less than reliable. Samsung could be closer to delivering 14nm chips than Intel would like us (and itself) to believe.

Or perhaps Intel sees Apple as a real adversary that’s intent on designing all of its own processors, even for laptops and desktops that are currently powered by x-86 chips. But even so, why not become the preferred fabricator?

The Intel enigma remains: There’s no clear, resounding answer to the What’s Next? question, only some lingering puzzlement over What Happened?

JLG@mondaynote.com

Macintel: The End Is Nigh

When Apple announced its 64-bit A7 processor, I dismissed the speculation that this could lead the Macintosh line to switch away from Intel chips to a homegrown “desktop-class” chip. I might have been wrong.

“I don’t know exactly when, but sooner or later, Macs will run on Apple-designed ARM chips.” Thus spake Matt Richman in a 2011 blog post titled “Apple and ARM, Sitting in a Tree”. Richman explained why, after a complicated but ultimately successful switch from PowerPC chips to Intel processors in 2005, Apple would make a similar switch, this time to ARM-based descendants of the A4 chip designed by Apple and manufactured by Samsung.

Cost is the first reason invoked for the move to an An processor:

“Intel charges $378 for the i7 chip in the new high-end 15 inch MacBook Pro. They don’t say how much they charge for the i7 chip in the low-end 15 inch MacBook Pro, but it’s probably around $300. …When Apple puts ARM-based SoC’s in Macs, their costs will go down dramatically. ”

We all know why Intel has been able to command such high prices. Given two microprocessors with the same manufacturing cost, power dissipation, and computing power, but where one runs Windows and the other doesn’t, which chip will achieve the higher market price in the PC market? Thus, Intel runs the table: it tells clone makers which new x86 chips they’ll receive, when they’ll receive them, and, most important, how much they’ll cost. Intel’s margins depend on it.

ARM-based processors, on the other hand, are inherently simpler and therefore cost less to make. Prices are driven even lower because of the fierce competition in the world of mobile devices, where the Wintel monopoly doesn’t apply.

329_A7chip

Cost is the foremost consideration, but power dissipation runs a close second. The aging x86 architecture is beset by layers of architectural silt accreted from a succession of additions to the instruction set. Emerging media formats demand new extensions, while obsolete constructs must be maintained for the sake of Microsoft’s backward compatibility religion. (I’ll hasten to say this has been admirably successful for more than three decades. The x86 nickname used to designate Wintel chips originates from the 8086 processor introduced in 1978 – itself a backward-compatible extension of the 8080…)
Because of this excess baggage, an x86 chip needs more transistors than its ARM-based equivalent, and thus it consumes more power and must dissipate more heat.

Last but not least, Richman quotes Steve Jobs:

“I’ve always wanted to own and control the primary technology in everything we do.”

Apple’s leader has often been criticized for being too independent and controlling, for ignoring hard-earned industry wisdom. Recall how Apple’s decision to design its own processors was met with howls of protest, accusations of arrogance, and the usual predictions of doom.

Since then, interest in another Grand Processor Switch has been alive and well. Googling “Mac running on ARM” gets you close to 10M results. (When you Bing the same query, you get 220M hits — 22x Google’s results. SEO experts are welcome to comment.)

Back to the future…

In September 2013, almost a year ago already, Apple introduced the 64-bit A7 processor that powers new iPhones and iPads. The usual suspects pooh-poohed Apple’s new homegrown CPU, and I indulged in a little fun skewering the microprocessor truthers: 64 bits. It’s Nothing. You Don’t Need It. And We’ll Have It In 6 Months. Towards the end of the article, unfortunately, I dismissed the speculation that Apple An processors would someday power the Mac. I cited iMacs and Mac Pros — the high end of the product line — as examples of what descendants of the A7 couldn’t power.

A friend set me straight.

In the first place, Apple’s drive to own “all layers of the stack” continues unabated years after Steve’s passing. As a recent example, Apple created its own Swift programming language that complements its Xcode IDE and Clang/LLVM compiler infrastructure. (For kremlinology’s sake I’ll point out that there is an official Apple Swift blog, a first in Apple 2.0 history if you exclude the Hot News section of the apple.com site. Imagine what would happen if there was an App Store blog… But I digress.)

Secondly, the Mac line is suspended, literally, by the late delivery of Intel’s Broadwell x86 processors. (The delay stems from an ambitious move to a bleeding edge fabrication technology that shrinks the basic building block of a chip to 14 nanometers, down from 22 nanometers in today’s Haswell chips.) Of course, Apple and its An semiconductor vendor could encounter similar problems – but the company would have more visibility, more control of its own destiny.

Furthermore, it looks like I misspoke when I said an An chip couldn’t power a high-end Mac. True, the A7 is optimized for mobile devices: battery life, a small memory footprint, graphics for screens smaller than an iMac’s or a Retina MacBook Pro’s. But the team has already shown its muscle under the tight constraints of mobile devices. Why would we think that the creators of the most advanced smartphone/tablet processor couldn’t now design a 3GHz A10 machine optimized for “desktop-class” (a term used by Apple’s Phil Schiller when introducing the A7) applications?

If we follow this line of reasoning, the advantages of ARM-based processors vs. x86 devices become even more compelling: lower cost, better power dissipation, natural integration with the rest of the machine. For years, Intel has argued that its superior semiconductor design and manufacturing technology would eventually overcome the complexity downsides of the x86 architecture. But that “eventually” is getting a bit stale. Other than a few showcase design wins that have never amounted to much in the real world, x86 devices continue to lose to ARM-derived SoC (System On a Chip) designs.

The Mac business is “only” $20B a year, while iPhones and iPads generate more than 5 times that. Still, $20B isn’t chump change (HP’s Personal Systems Group generates about $30B in revenue), and unit sales are up 18% in last June’s numbers vs. a year ago. Actually, Mac revenue ($5.5B) approaches the iPad’s flagging sales ($5.9B). Today, an 11” MacBook Air costs $899 while a 128GB iPad Air goes for $799. What would happen to the cost, battery life, and size of an A10-powered MacBook Air? And so on for the rest of the Mac line.

By moving to ARM, Apple could continue to increase its PC market share and scoop much of the profits – it currently rakes in about half of the money made by PC makers. And it could do this while catering to its customers in the Affordable Luxury segment who like owning both an iPad and a Mac.

While this is entirely speculative, I wonder what Intel’s leadership thinks when contemplating a future where their most profitable PC maker goes native.

JLG@mondaynote.com

———-

Postscript: The masthead on Matt Richman’s blog tells us that he’s now an intern at Intel. After reading several of his posts questioning the company’s future, I can’t help but salute Intel management’s open mind and interest in tightly reasoned external viewpoints.

And if it surprises you that Richman is a “mere” intern, be aware that he was all of 16 years old when he wrote the Apple and ARM post. Since then, his blog has treated us to an admirable series of articles on Intel, Samsung, Blackberry, Apple, Washington nonsense – and a nice Thank You to his parents.

 

Legacy Media: The Missing Gene

 

Legacy media is at great risk of losing to tech culture because incumbents lack a key driver: an obsession with their own mortality. This missing paranoia gene damages every aspect of their business.

At the last Code conference (the tech gathering hosted by Walter Mossberg and Kara Swisher), Google co-founder Sergey Brin made a surprising statement (at least to me). Asked by Swisher how Google sees itself, Brin responded in his usual terse manner: “There is the external and the internal view. For the outside, we are Goliath and the rest are Davids. From the inside, we are the Davids.” From someone who co-founded a $378bn market cap company that commands more than 80% of global internet search, this is indeed an unexpected acknowledgement.

Sergey Brin’s statement echoes Bill Gates’ own view when, about fifteen years ago, he was asked about his biggest concern: Was it a decisive move or product by another big tech company? No, said Gates; it was the fact that somewhere, somehow, a small group of people was inventing something that would change everything… With the rise of Google and Facebook, his fears came true on a scale he couldn’t even imagine. Roughly at the same time, Andy Grove, then CEO of Intel, published a book with a straightforward title: “Only the Paranoid Survive“. Among my favorite Grove quotes:

“Business success contains the seeds of its own destruction. The more successful you are, the more people want a chunk of your business and then another chunk and then another until there is nothing.”

Still, Intel wasn’t paranoid enough and completely missed the mobile revolution, leaving to ARM licensees the entire market of microprocessors for smartphones and tablets.

This deep-rooted sense of fragility is a potent engine of modern tech culture. It spurs companies to grow as fast as they can by raising lots of capital in the shortest possible time. It also drives them to capture market share by all means necessary (including the worst ones), to develop a culture of excellence by hiring the best people at any cost while trimming the workforce as needed, and to obsessively maintain a culture of agility, quickly learning from mistakes and adapting to market conditions. Lastly, the ever-present sense of mortality drives rising tech companies to quickly erect barriers-to-entry and to generate the network effects needed to keep incumbents at bay.

For a large part, these drives stem from these companies’ early history and culture. Most started by combining a great idea with clever execution – as opposed to being born within an expensive infrastructure. Take Uber or AirBnB. Both started with a simple concept: harness digital tools to achieve swift and friction-free connections between customers and service providers. Neither gigantic infrastructure nor utterly complicated applications were required. Instead, the future of these companies was secured by a combination of flawless execution and fast growth (read this New York Times story about the Uber network-effect challenge). Hence the rapid-fire rounds of financing that boosted Uber’s valuation to $17bn, allowing it to accelerate its worldwide expansion – and to fight a possible price war, as stated by its founder himself at the aforementioned Code Conference.

Unfortunately, paranoia-driven growth sometimes comes with ugly business practices. Examples abound: Amazon’s retaliation against publishers who fight its pricing conditions; Uber’s bullying tactics against its rival – followed by an apology; Google offering for free what others used to sell, or distorting search results, etc.

Such behaviors leave the analog world completely flummoxed. Historical players had experienced nothing but a cosy, gentlemanly competitive environment with a well-defined map of players. This left incumbents without the genes, the culture, required to fight digital barbarians. Whether they are media dealing with Google, publishers negotiating with Amazon, hotels fighting Booking.com or AirBnB, or taxis confronting Uber, legacy players look like the proverbial deer caught in the headlights. In some instances, they created their own dependency on powerful new distributors (like websites whose traffic relies largely on Google) before realizing that it was time to sue the dope dealer. (This is exactly what the European press is doing by bringing Google before the European Court of Justice for antitrust violations — a subject for a future Monday Note.) The appeal to legislators underlines the growing feeling of impotence vis-à-vis the take-no-prisoners approach of the new digital players: Unable to respond on the business side, the old guard turns to political power to develop a legal (but short-lived) containment strategy.

In the media industry, historic players never developed a sense of urgency. The situation varies from one market to another but, in many instances, “too important to fail” was the dominant belief. It always amazed me: As I witnessed the rise of the digital sector – its obsession with fast growth, and its inevitable collision course with legacy media – incumbents were frozen in the quiet certitude that their role in society was irreplaceable, and that under no circumstances would they be left to succumb to a distasteful Darwinian rule. This deep-rooted complacency is, for a large part, responsible for the current state of the media industry.

Back in 1997, Andy Grove’s book explained how to deal with change:

“The implication was that either the people in the room needed to change their areas of knowledge and expertise or people themselves needed to be changed” 

Instead, our industry made too few changes, too late. Since the first digital tremors hit business models ten years ago, we have been through one or two generations of managers in traditional media companies. It is amazing to see how the same DNA is replicated over and over. Some layers are moving faster than others, though. The higher you go in the food chain, the more people are gripped by a sense of vital urgency. But the rank-and-file and middle management are holding back, unable to exit their comfort zone.

Earlier this year, the French newspaper Liberation chose the outdated slogan “We are a Newspaper” in reaction to its new owners’ ideas (read this story in the NYT). Last week, Liberation opted to appoint as its editor-in-chief one of the strongest opponents of digital media (he has just left the weekly Le Nouvel Observateur, which he gently led into a quiet nursing home, leaving it worth next to nothing).

The gap between the managers of pure digital players and those who still lead legacy media has never been greater. Keenly aware of their own mortality, the former rely more than ever on brutal street-fight tactics, while the incumbents evolve at a different pace, still hoping that the old models will resist longer than feared. For old media, it is time for a radical genetic alteration — one performed down to every layer of the media industry.

frederic.filloux@mondaynote.com

 

Peak PC. Intel Fork.

 

Propelled by Moore’s Law and the Internet, PCs have enjoyed four decades of strong growth, defying many doomsday prophecies along the way. But, with microprocessor performance flattening out, the go-go years have come to an end. Intel, the emperor of PC processors and a nobody in mobile devices, needs to react.

I’m suspicious of Peak <Anything> predictions. Some of us became aware of the notion of a resource zenith during the 1973 OPEC oil embargo, with its shocking images of cars lined up at gas stations (in America!):

[Photo: Gas lines during the 1973 oil embargo]

This was Peak Oil, and it spelled doom to the auto industry.

We know what happened next: Cars improved in design and performance, manufacturers became more numerous. Looking at this bit of history through my geek glasses, I see three explanations for the rebound: computers, computers, and computers. Computer Assisted Design (CAD) made it easier to design new car models as variations on a platform; Volkswagen’s MQB is a good example. Massive computer systems were used to automate the assembly line and manage the supply chain. It didn’t take long for computers to work their way into the cars themselves, from the ECU under the hood to the processors that monitor the health of the vehicle and control the entertainment and navigation systems.

Since then, we’ve had repeated predictions of Peak Oil, only to be surprised by the news that the US will soon become a net oil exporter and, as Richard Muller points out in his must-read Physics for Future Presidents, we have more than a century of coal reserves. (Unfortunately, the book, by a bona fide, middle-of-the-road physicist, can’t promise us that physics will eventually push politics aside when considering the rise of CO2 in the atmosphere…)

I’ve heard similar End of The Go-Go Days predictions about personal computers since 1968 when my love affair with these machines started at HP France (I was lucky enough to be hired to launch their first desktop machine).

I heard the cry again in 1985 when I landed in Cupertino in time for the marked slowdown in Apple ][ sales. The unprecedented round of layoffs at Apple prompted young MBAs, freshly imported from Playtex and Pepsi, to intone the It’s All Commodities Now dirge. I interpreted the cry (undiplomatically – I hadn’t yet learned to speak Californian) as a self-serving It’s All Marketing Now ploy. In the meantime, engineers ignored the hand-wringing, went back to work, and, once again, proved that the technology “mines” were far from exhausted.

In 1988, a Sun Microsystems executive charitably warned me: “PCs are driving towards the Grand Canyon at 100 mph!” A subscriber to Sun’s The Network Is The Computer gospel, the gent opined that heavy-duty computing tasks would be performed by muscular computers somewhere (anywhere) on the network. Desktop devices (he confusingly called them “servers” because they were to “serve” a windowing protocol, X11) would become commodities no more sophisticated or costly than a telephone. He had no answer for multimedia applications that require local processing of music, video, and graphics, nor could he account for current and imminent mobile devices. His view wasn’t entirely new. In 1965, Herb Grosch gave us his Law, which told us that bigger computers provide better economics; smaller machines are uneconomical.

And yet, personal computers flourished.

I have vivid memories of the joy of very early adopters, yours truly included. Personal computers are liberating in many ways.

First, they don’t belong to the institution; there’s no need for the intercession of a technopriest. I can lift my PC with my arms, my brains, and my credit card.

Second, and more deeply, the PC is a response to a frustration, to a sense of something amiss. One of mankind’s most important creations is the symbol, a sign without a pre-existing meaning: X as opposed to a drawing of a deer on a cave wall. Strung together, these symbols show formidable power. The expressive and manipulative power of symbol strings runs through the Song of Songs, Rumi’s incandescent poetry, Wall Street greed, and quantum physics.

But our central nervous system hasn’t kept up with our invention. We don’t memorize strings well; we struggle with long division, let alone extracting cube roots in our heads.

The PC comes to the rescue, with its indefatigable ability to remember and combine symbol strings. Hence the partnership with an object that extends the reach of our minds and bodies.

Around 1994, the Internet came out of the university closet, gave the PC access to millions of servers around the world (thus fulfilling a necessary part of the Sun exec’s prophecy), and extended our grasp.

It’s been great and profitable fun.

But today, we once again hear Peak PC stories. Sales have gone flat, never to return:

[Chart: PC shipment forecast, 2014–2018]

This time, I’m inclined to agree.

Why?

Most evenings, my home-builder spouse and I take a walk around Palo Alto. Right now, this smallish university town is going through a building boom. Offices and three-layer retail + office + residence are going up all around University Avenue. Remodels and raze-and-build projects can be found in the more residential parts of town. No block is left unmolested.

I can’t help but marvel. None of this activity, none of Silicon Valley would exist without Moore’s Law, the promise made in 1965 that semiconductor performance would double every 18 months. And, for the better part of 40 years, it did – and rained money on the tech ecosystem, companies and people. PCs, servers, embedded electronics, giant network routers, cars… they’ve all been propelled because Moore’s Law has been upheld… until recently.

The 1977 Apple ][ had a 1MHz 8-bit processor. Today’s PCs and Macs reach 3.7GHz, a number that hasn’t changed in more than three years. This isn’t to say that Intel processors aren’t still improving, but the days when each new chip brought substantial increases in clock speed seem to be over.

One should never say never, but Moore’s Law is now bumping into the Laws of Physics. The energy needed to vibrate matter (electrons in our case) increases with frequency. The higher the clock frequency, the higher the power dissipation and the greater the heat that’s generated…and a PC can withstand only so much heat. Consider the cooling contraptions used by PC gamers when they push the performance envelope of their “rigs”:

[Photo: A liquid-cooling rig for an overclocked gaming PC]
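Behind the heat wall sits a first-order model of CMOS power consumption (a textbook approximation, not an Intel disclosure):

\[
P_{\text{dynamic}} \approx \alpha \, C \, V^2 f
\]

where \(\alpha\) is the fraction of the circuit switching on a given clock tick, \(C\) the switched capacitance, \(V\) the supply voltage, and \(f\) the clock frequency. The rub: raising \(f\) usually requires raising \(V\) as well, so power grows much faster than linearly with clock speed, which is why clocks stalled near 4GHz even as transistor budgets kept growing.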

To work around the physical limits, Intel and others resort to stratagems such as “multiple cores”: more processors on the same chip. But if too many computations need the result of the previous step before moving forward, it doesn’t matter how many cores you have. Markitects have an answer to that as well: “speculative branch execution”, the use of several processors to execute possible next steps. When the needed outcome appears, the “bad” branches are pruned and the process goes forward on the already-computed good branch. It makes for interesting technical papers, but it’s no substitute for an 8GHz clock speed.
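The limit that multiple cores run into has a name: Amdahl’s Law. If a fraction of the work is inherently serial, speedup saturates no matter how many cores you add. A minimal sketch (the 20% serial fraction below is an illustrative assumption):

    # Amdahl's Law: speedup is capped by the serial fraction of the work.
    def amdahl_speedup(serial_fraction, cores):
        parallel_fraction = 1.0 - serial_fraction
        return 1.0 / (serial_fraction + parallel_fraction / cores)

    for cores in (2, 4, 8, 64, 1024):
        # Assume 20% of the computation must run serially (illustrative).
        print(f"{cores:5d} cores -> {amdahl_speedup(0.2, cores):.2f}x speedup")
    # The output creeps toward 1/0.2 = 5x, no matter how many cores we add.

A workload that is 20% serial will never run more than 5x faster, however cheap transistors become, which is why core counts can’t fully substitute for clock speed.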

If we need confirmation of the flattening out of microprocessor progress, we can turn to Intel and the delays in implementing its Broadwell chips. The move to a 14-nanometer “geometry” — the term here denotes the size of a basic circuit building block — is proving more difficult than expected. And the design isn’t meant to yield faster processors, just less power-hungry ones (plus other goodies such as better multi-media processing).

One possible reaction to this state of affairs is to look at tablets as a new engine of growth. This is what Microsoft seems to be doing by promoting its Intel-inside Surface Pro 3 as a laptop replacement. But even if Microsoft tablets turn out to be every bit as good as Microsoft says they are, they aren’t immune to the flattening out of Intel processor performance. (I don’t have an opinion yet on the product — I tried to buy one but was told to wait till June 20th.)

Does this broaden the opening for ARM-based devices? Among their advantages is a cleaner architecture, one devoid of the layers of backwards-compatibility silt that x86 devices need. ARM derivatives need less circuitry for the same computing task and, as a result, dissipate less power. This is one of the key reasons for their dominance in the battery-powered world of mobile devices. (The other is the customization and integration flexibility provided by the ARM ecosystem.) But today’s ARM derivatives run at lower speeds (a little above 1GHz for some) than Intel chips. The challenge will be to run at higher speeds without hurting battery life or adding the fan that Microsoft tablets need.

With no room to grow, PC players exit the game. Sony just did. Dell took itself private and is going through the surgery and financial bleeding a company can’t withstand in public. Hewlett-Packard, once the leading PC maker, now trails Lenovo. With no sign of turning its PC business around, HP will soon find itself in an untenable position.

Intel doesn’t have the luxury of leaving their game — they only have one. But I can’t imagine that Brian Krzanich, Intel’s new CEO, will look at Peak PC and be content with the prospect of increasingly difficult x86 iterations. There have been many discussions of Intel finally taking the plunge and becoming a “foundry” for someone else’s ARM-based SoC (System On a Chip) designs instead of owning x86 design and manufacturing decisions. Peak PC will force the Intel CEO’s hand.

JLG@mondaynote.com

Intel Is Under New Management – And It Shows

 

Intel rode the PC wave with Microsoft and built a seemingly insurmountable lead in the field of “conventional” (PC and laptop) microprocessors. But, after his predecessor missed the opportunity to supply the CPU chip for Apple’s iPhone, Intel’s new CEO must now find a way to gain relevance in the smartphone world.

In last May’s The Atlantic magazine, Intel’s then-CEO Paul Otellini confessed to a mistake of historic proportions. Apple had given Intel the chance to be part of the smartphone era, to supply the processor for the first iPhone… and Otellini said no [emphasis and light editing mine]:

“The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do… At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.”
“…while we like to speak with data around here, so many times in my career I’ve ended up making decisions with my gut, and I should have followed my gut. [...] My gut told me to say yes.”

That Otellini found the inner calm to publicly admit his mistake — in an article that would be published on his last day as CEO, no less — is a testament to his character. More important, Otellini’s admission unburdened his successor, Brian Krzanich, freeing him to steer the company in a new direction.

And Krzanich is doing just that.

First: House cleaning. Back in March 2012, the Wall Street Journal heralded Intel as The New Cable Guy. The idea was to combine an Intel-powered box with content in order to serve up a quality experience not found elsewhere (read Apple, Netflix, Roku, Microsoft…). To head the project, which was eventually dubbed OnCue, Intel hired Erik Huggers, a senior industry executive and former head of BBC Online.

At the All Things D conference in February, Huggers announced that the TV service would be available later this year. The Intel TV chief revealed no details about how the OnCue service would differ from existing competitors, or how much the thing would cost… but he assured us that the content would be impressive (“We are working with the entire industry”), and the device’s capabilities would be comprehensive (“This is not a cherry-pick… this is literally everything”).

Intel seemed to be serious. We found out that more than 1,000 Intel employees in Oregon had been engaged in testing the product/service.

Then Krzanich stepped in, and applied a dose of reality:

“Intel continues to look at the business model… we are not experts in the content industry and we’re being careful.” [AllThingsD: New Intel CEO Says Intel TV Sounds Great in Theory. But …]

Indeed, to those of us who have followed the uneasy dance between Apple and content providers since the first Apple TV shipped in 2007, the Intel project sounded bold, to say the least.

In late September, the project was put on hold and, last week, the news came that OnCue had been cancelled and allegedly offered to Verizon, whose V Cast media distribution feats come to mind…

Even before OnCue’s cancellation was made official, the well-traveled Erik Huggers appeared to show an interest in the Hulu CEO job. (If Mr Huggers happens to be reading this: I’d be more than happy to relieve you of the PowerPoints that you used to pitch the project to Intel’s top brass, not to mention the updates on the tortuous negotiations for content, and the reports from the user testing in Oregon. These slides must be a fascinating exercise in corpospeak logic.)

Krzanich quickly moved from doubt to certainty. He saw that OnCue would neither make money by itself, nor stimulate sales or margins for its main act, x86 processors. OnCue would never be an Apple TV “black puck”, a supporting character whose only mission is to make the main personal computers (small, medium and large; smartphones, tablets and conventional PCs) more useful and pleasant.

So he put an end to the impossible-to-justify adventure.

That was easy.

Tackling Intel’s failure to gain a significant role in the (no longer) new world of smartphones is a much more complicated matter.

With its x86 processors, Intel worked itself into a more-than-comfortable position as part of the Wintel ecosystem. The dominant position achieved by the Microsoft-Intel duopoly over two decades yielded correspondingly high margins for both.

But smartphones changed the game. ARM processors proved themselves better than x86 at the two tasks that are integral to personal, portable devices: lowering power consumption and customization. The ARM architecture didn’t have to wait for the iPhone and Android handsets to dominate the cell phone business. Just as Windows licensing spawned a large number of PC makers, ARM licensing contributed to the creation of a wide range of processor design and manufacturing companies. The ARM site claims 80 licensees for its newer Cortex family and more than 500 for its older Classic Arm processors. No monopoly means lower margins.

Intel saw the unattractive margins offered by ARM processors and didn’t want to commit the billions of dollars required by a fab (a chip manufacturing plant) for a product that would yield profits that were well below Wall Street expectations.

The prospect of bargain basement margins undoubtedly figured in Otellini’s decision to say no to the iPhone. In 2006, no one could have predicted that it could have been made up in volume, that there would be a billion smartphone sales in 2014. (I’m basing the 1B number for the entire industry on Horace Dediu’s estimate of 250 million iOS devices for 2014.)

Even if the Santa Clara company had had the foresight to accept lower margins in order to ensure their future in the smartphone market, there would still have been the problem of customization.

Intel knows how to design and manufacture processors that are used “as is” by PC makers. No customization, no problems.

This isn’t how the ARM world works. Licensees design processors that are customized for their specific device, and they send the design to a manufacturer. Were Intel to enter this world, they would no longer design processors, just manufacture them, an activity with less potential for profit.

This explains why Intel, which had an ARM license and made XScale processors, sold that business to Marvell in 2006 – a fateful date when looking back on the Apple discussions.

But is Intel’s new CEO rethinking the “x86 and only x86″ strategy? Last week, a specialty semiconductor company called Altera announced that Intel would fabricate some of its chips containing a 64-bit ARM processor. The company’s business consists of offering faster development times through “programmable logic” circuits. Instead of a “hard circuit” that must be designed, manufactured, tested, debugged, modified and sent back to the manufacturing plant in lengthy and costly cycles, you buy a “soft circuit” from Altera and similar companies (Xilinx comes to mind). This more expensive device can be reprogrammed on the spot to assume a different function, or to correct the logic in the previous iteration. Pay more and get functioning hardware sooner, without slow and costly turns through a manufacturing process.
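For the curious, the “soft circuit” idea is easy to sketch: an FPGA’s basic cell is a lookup table (LUT) whose configuration bits determine which boolean function it computes, so changing the logic is a memory write rather than a trip through the fab. A toy model (simplified; real FPGAs use 4- to 6-input LUTs plus a routing fabric):

    # A 2-input lookup table, the atom of "programmable logic".
    class LUT2:
        def __init__(self, truth_table):
            assert len(truth_table) == 4   # one output bit per input pair
            self.table = truth_table       # entries for (a,b) = 00, 01, 10, 11

        def __call__(self, a, b):
            return self.table[(a << 1) | b]

        def reprogram(self, truth_table):
            # A "soft circuit": new function, no new silicon.
            self.table = truth_table

    cell = LUT2([0, 0, 0, 1])      # configured as AND
    print(cell(1, 1))              # -> 1
    cell.reprogram([0, 1, 1, 0])   # now it's XOR, changed on the spot
    print(cell(1, 1))              # -> 0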

With this in mind, what Intel will someday manufacture for Altera isn’t the 64-bit ARM processor that excited some observers: “Intel Makes 14nm ARM for Altera“. The Stratix 10 circuits Altera contracts to Intel manufacturing are complicated and expensive ($500 and up) FPGA (Field Programmable Gate Array) devices where the embedded ARM processor plays a supporting, not central, role. This isn’t the $20-or-less price level arena in which Intel has so far declined to compete.

Manufacturing chips for Altera might simply be work-for-hire, a quick buck for Intel, but I doubt it. Altera’s yearly revenue is just shy of $2B; Intel is a $50B company. The newly announced device, just one in Altera’s product lines, will not “move the needle” for Intel — not in 2014 (the ship date isn’t specified), or ever.

Instead, I take this as a signal, a rehearsal.  250M ARM SoCs at $20 each would yield $5B in revenue, 10% of Intel’s current total…

This might be what Krzanich had in mind when he inked the “small” manufacturing agreement with Altera; perhaps he was weighing the smaller margins of ARM processors against the risk of slowing PC sales.

Graciously freed from the past by his predecessor, Intel’s new CEO will find it hard not to take the plunge and use the company’s superb manufacturing technology to finally make ARM processors.

JLG@mondaynote.com

 

Otellini’s Striking Confession

 

We know Intel shunned ARM processors and played virtually no role in the smartphone revolution. But we now learn Steve Jobs asked Intel to build the iPhone microprocessor. Paul Otellini, Intel’s departing CEO, admits he should have followed his gut – and made the smartphone world a very different place.

CEO valedictions follow a well-known script: My work is done here, great team, all mistakes are mine, all good deeds are theirs, I leave the company in strong hands, the future has never been brighter… It’s an opportunity for a leader to offer a conventional and contrived reminiscence, what the French call la toilette des souvenirs (which Google crudely translates as toilet memories instead of the affectionate and accurate dressing up memories).

For his farewell, Paul Otellini, Intel’s departing CEO, chose the interview format with The Atlantic Monthly’s senior editor Alexis Madrigal. They give us a long (5,700+ words) but highly readable piece titled Paul Otellini’s Intel: Can the Company That Built the Future Survive It?

Photo: Guardian.co.uk

The punctuation mark at the title’s end refers to the elephantine question in the middle of Otellini’s record: Why did Intel miss out on the smartphone? Why did the company that so grandly dominates the PC market sit by while ARM architecture totally, and perhaps irretrievably, took over the new generation of phones — and most other embedded applications?

According to Otellini, it was the result of Intel’s inertia: It took a while to move the machine.

Madrigal backfills this uneasy explanation with equal unease:

“The problem, really, was that Intel’s x86 chip architecture could not rival the performance per watt of power that designs licensed from ARM based on RISC architecture could provide. Intel was always the undisputed champion of performance, but its chips sucked up too much power. In fact, it was only this month that Intel revealed chips that seem like they’ll be able to beat the ARM licensees on the key metrics.”

Note the tiptoeing: Intel’s new chips “seem like” they’ll be fast enough and cheap enough. Madrigal charitably fails to note how Intel, year after year, kept promising to beat ARM at the mobile game, and failed to do so. (See these 2010, 2011 and 2012 Monday Notes.) Last year, Intel was still at it, dismissively predicting “no future for ARM or any of its competitors“. Tell that to ARM Holdings, whose licensees shipped 2.6 billion chips in the first quarter of this year.

Elsewhere in the article, Otellini offers a striking revelation: Fresh from anointing Intel as the microprocessor supplier for the Mac, Steve Jobs came back and asked Intel to design and build the CPU for Apple’s upcoming iPhone. (To clarify the chronology, the iPhone was announced in early January 2007; the CPU conversation must have taken place two years prior, likely before the June 2005 WWDC where Apple announced the switch to x86. See Chapter 36 of Walter Isaacson’s Jobs bio for more.)

Intel passed on the opportunity [emphasis mine]:

“We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we’d done it. […]”

Indeed, the world would have been different. Apple wouldn’t be struggling through a risky transition away from Samsung, its frenemy CPU supplier; the heart of the iPhone would be Made In America; Intel would have supplied processors for more than 500 million iOS devices, and sold even more such chips to other handset makers, becoming as major a player in the smartphone (and tablet) space as it is in the PC world.

Supply your own adjectives…

Indulging briefly in more What If reverie, compare the impact of Intel’s wrong turn to a better one: What would the world look like if, at the end of 1996, Gil Amelio hadn’t returned Apple to Steve Jobs? (My recollection of the transaction’s official wording could be faulty.)

So, again, what happened?

At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.

A little later, Otellini completes the train of thought with a wistful reverie, a model of la toilette des souvenirs:

“The lesson I took away from that was, while we like to speak with data around here, so many times in my career I’ve ended up making decisions with my gut, and I should have followed my gut,” he said. “My gut told me to say yes.”

The frank admission is meant to elicit respect and empathy. Imagine being responsible for missing the opportunity to play a commanding role in the smartphone revolution.

But perhaps things aren’t as simple as being a “gut move” short of an epochal $100B opportunity.

Intel is a prisoner of its x86 profit model and Wall Street’s expectations. Its dominant position in the x86 space gives Intel the pricing power to command high margins. There’s no such thing in the competitive ARM space, where prices are lower. Even factoring in the lower inherent cost of the somewhat simpler devices (simpler for the time being; they’ll inevitably grow more complex), the profit per ARM chip is too thin to sustain Intel’s business model.

(Of course, this assumes a substitution, an ARM chip that displaces an x86 device. As it turns out, the smartphone business could have been largely additive, just as we now see with tablets that cannibalize classical PCs.)
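A back-of-the-envelope sketch makes the margin trap concrete. Every number below is an illustrative assumption, not a disclosed figure:

    # Why ARM volume looks unappealing next to x86 margins (assumed numbers).
    x86_price, x86_margin = 250.0, 0.60   # assumed PC processor price, gross margin
    arm_price, arm_margin = 20.0, 0.30    # assumed mobile SoC price, gross margin

    profit_per_x86 = x86_price * x86_margin   # $150 of gross profit per chip
    profit_per_arm = arm_price * arm_margin   # $6 of gross profit per chip

    # ARM chips needed to replace the gross profit of a single x86 part:
    print(profit_per_x86 / profit_per_arm)    # -> 25.0

Under these assumptions, Intel would need to sell 25 ARM chips to replace the gross profit of one x86 processor. Wall Street can do that arithmetic, too.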

Another factor is the cultural change that would have been required were Intel to have gotten involved in making ARM devices. As both the designer and manufacturer of generation after generation of x86 microprocessors, Intel can wait until they’re good and ready before they allow PC makers to build the chips into their next products. The ARM world doesn’t work that way. Customers design their own chips (often called a System on a Chip, or SoC), and then turn to a semiconductor manufacturer (a foundry) to stamp out the hardware. Taking orders from others isn’t in Intel’s DNA.

And now?

The answer might lie in another French expression: L’histoire ne repasse pas les plats. Google Translate is a bit more felicitous this time: History does not repeat itself. I prefer the more literal image — History doesn’t come around offering seconds — but the point remains: Will there be seconds at the smartphone repast?

Officially, Intel says its next generation of x86 processors will (finally!) topple the ARM regime, that their chips will offer more computing might with no cost or power dissipation penalty. In their parlance “the better transistor” (the basic unit of logic processing) will win.

I doubt it. The newer x86 devices will certainly help Microsoft and its OEMs make Windows 8 devices more competitive, but that won’t prevent the spread of ARM in the legion of devices on which Windows is irrelevant. For these, Intel would have to adopt ARM, a decision Otellini has left to the new tandem leadership of Brian Krzanich (CEO) and Renée James (President). Will they stick to the old creed, to the belief that Intel’s superior silicon design and manufacturing technology will eventually overcome the disadvantages of the more complex x86 architecture? Or will they take the plunge?

They might be helped by a change in the financial picture.

In 2006, that is, after throwing Jobs into Samsung’s arms (pun unintended), Intel sold its ARM business, the XScale line, to Marvell. The reason was purely financial: for similar capital expenditures (costly fabs), ARM processors achieved much lower per-unit profits, because the ARM scene is much more competitive than the x86 space.

Now, if Intel really wants to get a place at the smartphone table with new and improved x86 devices, the company will have to price those to compete with established ARM players. In other words, Intel will have to accept the lower margins they shunned in 2006. Then, why not do it with the ARM-based custom processors Apple and others require?

JLG@mondaynote.com

—————————-

(I’ll confess a weakness for The Atlantic and, in particular, for its national correspondent James Fallows, a literate geek and instrument-rated pilot who took it upon himself to live in Beijing for a while and, as a result, can speak more helpfully about China than most members of the Fourth Estate. Going back to last week’s reference to the Gauche Caviar, when my Café de Flore acquaintances fall into their usual rut of criticizing my adopted country for its lack of “culture”, I hold up The Atlantic — which sells briskly at the kiosk next door — as one of many examples of American journalistic excellence.

And, if you’re interested in more strange turns, see this other striking Alexis Madrigal piece in the same Atlantic: The Time Exxon Went Into the Semiconductor Business (and Failed). I was there, briefly running an Exxon Information Systems subsidiary in France and learning the importance of corporate culture.) – JLG

Apple Buys Intel

 

Getting rid of Samsung as a processor supplier and, at the same time, capturing the crown jewel of the American semiconductor industry. How could Apple resist the temptation to solve its cash problem and make history again?

Halfway through the second quarter of the 2013 fiscal year, most of Apple’s top execs meet at an undisclosed location (Eddy Cue’s chair is empty – he’s been called away to a Ferrari board meeting). They’re joined by a few trusted industry insiders: Bill “the Coach” Campbell, Apple and Intuit Director and adviser to Google’s founders, Messrs. Page and Brin; Larry Sonsini, the Silicon Valley consigliere of more than three decades; and Frank Quattrone, the star investment banker with nine lives.

The meeting isn’t about the company’s dwindling profit margins. The smaller margins were expected and invited: The reduced-price iPad and heavy promotion of the “old” iPhone 4 as an entry-level product are part of the long term strategy of guarding Apple’s lower end (so to speak). And no whining about AAPL’s grim slide over the last six months, a problem that has only one solution: Apple needs to record a series of better quarters.

The problem of the day is, once again, what to do with Apple’s obscene pile of cash.

By the end of December 2012, the company held about $137B in cash (or equivalents such as marketable securities), including $23B from operations for the quarter.

CFO Peter Oppenheimer delivers the bad news: It looks like operations will disgorge another $35B this quarter. The stock buy-back and dividend program that was designed to bleed off $45B over the next few years (see this March 2012 Monday Note) won’t be enough if the company continues at this rate.

Apple needs something bigger.

Quattrone has been sitting quietly at the end of the table. He clears his throat and speaks:

Buy Intel.

Well, yes, Frank (says Tim Cook), we’ve been buying Intel processors for the Mac since 2005.

Not the chips. The company. The planets are aligned for Apple to strike a blow that will leave the industry forever changed. Make history, acquire Intel.

Quattrone has their attention. He unfolds the celestial calibration:

  • Apple needs to extract itself from the toxic relationship with Samsung, its ARM supplier.
  • Intel is the best large-scale silicon manufacturer in the world. They have the people, the technology, and the plant capacity to match Apple’s needs for years to come.
  • “But Intel doesn’t do ARM!” you say. Indeed, Intel has no interest in the fierce competition and small margins in the ARM-based SoC market. Joining the ARM fray would severely disrupt Intel’s numbers and infuriate Wall Street. But if Intel were to essentially “go private” as Apple’s semiconductor manufacturing arm (pun intended), catering to all of Apple’s x86 and ARM needs (and whatever else Bob Mansfield is secretly plotting), Wall Street would have no such objection.
  • Intel is flailing. The traditional PC market – Intel’s lifeblood – continues to shrink, yet the company does nothing to break into the ARM-dominated mobile sector. In the meantime, the company makes perplexing investments such as buying McAfee for $7.68B.
  • There’s a leadership vacuum at Intel. Six months after announcing CEO Paul Otellini‘s “retirement”, Intel’s Board has yet to find a replacement who can sail the ship in more competitive waters. Apple could commission Pat Gelsinger, a 30-year Intel veteran and former CTO (Intel’s first) who fled to VMware after his career stalled at Intel. Despite being a bit of a Bill Gates look-alike (once upon a time), Gelsinger is a real technologist who would fit well within Apple, especially if he were given the opportunity to really “go for” the ARM architecture instead of iteratively tweaking x86 devices.
  • Last but not least, Intel’s market cap is about $115B, eminently affordable. The company is profitable and generates a good deal of cash, even after the heavy capital expenditures required by its constant need to build new and expensive manufacturing plants.
  • …oh, and one more thing: Wouldn’t it be fun to “partner” more closely with Microsoft, HP and Dell, working on x86 developments, schedules and… pricing?

A lively discussion ensues. Imagine solving many of Apple’s problems with a single sweeping motion. This would really make Cupertino the center of the high-tech world.

It’s an interesting idea, but there will be obstacles, both cultural and legal.

The Coach goes first: “Knowing both of these companies more than a little bit, I can attest to the pride they have in their respective cultures. They’re both disinclined to reconsider their beliefs in any meaningful way. Merging these two dissimilar groups, shedding unnecessary activities such as McAfee and the like would be dangerously disruptive to Apple’s well-honed, cohesive culture. As a general rule, merging two large organizations rarely succeeds… unless you consider merging airlines a success…”

Finally, the Consigliere speaks: “It’s a tempting fantasy, it will mean years of work for my firm and many, many others, but as a friend of the company, as a past confidant of your departed Founder, don’t do it. There will be too much legal trouble with the Feds, with competitors, with Intel partners. Most fantasies aren’t meant to be enacted.”

I won’t dwell on the reality of the meeting: I made it up as a way to explain why Apple really has no choice other than to submit to another cash phlebotomy, this time for an additional $60B. And, as with real-world phlebotomies, the procedure will treat the problem, but it won’t cure it. With $30B from operations per quarter, the $60B lancing will have to be repeated.

Some read the decision to return gobs of cash to shareholders as an admission of defeat. Apple has given up making big moves, as in one or more big acquisitions.

I don’t agree: We ought to be glad that the Apple execs (and their wise advisers) didn’t allow themselves to succumb to transaction fever, to a mirage of ego aggrandizement held out by a potential “game changing” acquisition.

A final word on taxes. To return the additional $60B (for a total of $100B when including the ongoing program announced last year) through increased dividends and repurchased shares, Apple will have to borrow money.

Borrow? When they have so much cash?

Yes, thanks to our mangled tax code. As explained here, about $100B of Apple’s cash is stored overseas. If repatriated, it would be “heavily” (read “normally”) taxed. Like most US companies that have international operations, Apple plays complicated, entirely legal tax games that allow their international profits to be taxed at very low rates as long as the profits — and the resulting cash — stay outside Uncle Sam’s reach. And thus we have the apparent paradox of borrowing money when cash-rich.
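The arithmetic behind the apparent paradox is simple. A sketch, with the tax and coupon rates below as assumed round numbers rather than Apple’s actual figures:

    # Repatriating overseas cash triggers US tax; borrowing against it costs
    # only interest. Both rates below are assumptions for illustration.
    overseas_cash = 100e9       # ~$100B held abroad, per the article
    repatriation_tax = 0.35     # assumed statutory US corporate rate
    bond_coupon = 0.02          # assumed coupon on high-grade corporate debt

    one_time_tax_hit = overseas_cash * repatriation_tax   # ~$35B, paid once
    annual_interest = overseas_cash * bond_coupon         # ~$2B per year

    # Years of interest before borrowing costs as much as repatriating:
    print(one_time_tax_hit / annual_interest)             # -> 17.5

At these assumed rates, Apple could pay bond interest for well over a decade before matching the one-time tax bill, so borrowing while cash-rich is the cheaper move.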

The benefit of these tax code contortions is difficult to explain to normal humans — as opposed to legislators who allowed the loopholes.

All this now makes Apple a different company. Once a fledgling challenger of established powerhouses such as IBM, Microsoft or HP, it now makes “too much cash” and is condemned to a life of paying dividends and buying back shares — like the old fogies it once derided.

JLG@mondaynote.com