ARM

Wintel: Le Divorce Part II

 

At CES 2011, Ballmer told the world Windows would “fork”, that it would also run on lower-power ARM chips for mobile devices. This was seen as a momentous breach in the long-standing Wintel duopoly. Two years later, the ARM tine of the fork looks short and dull.

This is what I wrote almost two years ago:

After years of monogamy with the x86 architecture, Windows will soon run on ARM processors.

As in any divorce, Microsoft and Intel point fingers at one another. Intel complains about Microsoft’s failure to make a real tablet OS. They say MS has tried to shoehorn “Windows Everywhere” onto devices with incompatible user-interface, power-management, and connectivity requirements while the competition has created device-focused software platforms.

Microsoft rebuts: It’s Intel’s fault. Windows CE works perfectly well on ARM-based devices, as do Windows Mobile and now Windows Phone 7. Intel keeps telling us they’re “on track”, that they’ll eventually shrink x86 processors to the point where the power dissipation will be compatible with smartphones and tablets. But…when?

Today, a version of Windows (RT) does indeed run on an ARM processor, on Microsoft’s Surface tablet-PC hybrid. Has Microsoft finally served Intel with divorce papers?

Not so fast. The market’s reaction to Redmond’s ambitious Surface design has fallen far short of the heights envisioned in the company’s enthusiastic launch: Surface machines aren’t flying off Microsoft Store shelves. Ballmer himself admits sales are “modest” (and then quickly backpedals); Digitimes, admittedly not always reliable, quotes suppliers who say that Surface orders have been cut by half; anecdotally, but amusingly, field research by Piper Jaffray’s Gene Munster (who can be a bit excitable) shows zero Surfaces sold during a two-hour period at the Mall of America on Black Friday, while iPads were selling at a rate of 11 an hour.

Traditional PC OEMs aren’t enthusiastic either. Todd Bradley, head of HP’s Personal Systems Group, is unimpressed:

“It tends to be slow and a little kludgey as you use it…”

Acer exec Linxian Lang warns:

“Redmond will have to eat ‘hard rice’ with Surface…it should stick to its more readily-chewed software diet.”

To be sure, there are happy Surface users, such as Steve Sinofsky, the former Windows Division President, as captured in lukew’s Instagram picture.

(An aside: I went back to Sinofsky’s 8,000-word blog post that lovingly describes the process of developing “WOA” — Windows on ARM. At the time, WOA was presented as part of the Windows 8 universe. Later, Microsoft dropped the “8” designation and chose to use “RT” instead. These naming decisions aren’t made lightly. Is it any wonder why WOA was moved out of the Windows 8 camp?)

It’s possible that the jury is still out… Surface sales could take off, Windows RT could be embraced by leading PC OEMs… but what are the odds? In addition to the tepid reception from customers and vendors alike, Microsoft must surmount the relentless market conquest of Android and iOS tablets whose numbers (210 million units) are expected to exceed laptop sales next year.

So, no… the Wintel Divorce isn’t happening. Intel’s x86 chips will remain the processors of choice to run Windows. Next month, we’ll have CES and its usual burst of announcements, both believable and dubious (remember when 2010 was declared the Year Of The Tablet PC?). We’ll have to sort the announcements that are merely that from those that will yield an actual device, but in the end I doubt we’ll see many new and really momentous Windows RT products out there.

Microsoft’s lackluster attempt at Post-PC infidelity doesn’t help Intel in its efforts to gain a foothold in the mobile world. Intel’s perennial efforts to break into the mobile market with lower-power, lower-cost x86 chips have, also perennially, failed. As a result, there is renewed speculation about a rapprochement between Intel and Apple, that the Santa Clara microprocessor giant could become an ardent (and high-volume) ARM SoC foundry.

As discussed here, some of this makes sense: Samsung is Apple’s biggest and most successful competitor in the smartphone/tablet space, spending billions more than anyone else in global marketing programs. At the same time, the South Korean company is Apple’s only supplier of ARM chips. Intel has the technology and manufacturing capacity to become an effective replacement for Samsung.

This wouldn’t be an easy decision for Intel: the volumes are high — as high as 415M ARM chips for 2013 according to one analyst — but the margins are low. And Intel doesn’t do low margins. Because of the Wintel duopoly, Intel’s x86 chips have always commanded a premium markup. Take Windows out of the picture and the margin disappears.

(As another aside, the 415M ARM chips number seems excessive. Assuming about 50 million iPhone 5s and 15 million iPads in the current quarter, and using the 4X rule of thumb for the following calendar year, we land somewhere between 250M and 300M ARM chips for Apple in 2013.)
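For the curious, here is that back-of-envelope arithmetic spelled out as a small Python sketch; the quarterly unit figures and the 4X annualization rule of thumb are the assumptions above, not reported data:

# Rough estimate of Apple's 2013 ARM chip volume, using the article's
# assumptions (not reported figures).
iphones_per_quarter = 50_000_000   # assumed iPhone 5 units in the current quarter
ipads_per_quarter = 15_000_000     # assumed iPad units in the current quarter
annualization = 4                  # the "4X the current quarter" rule of thumb

chips_2013 = (iphones_per_quarter + ipads_per_quarter) * annualization
print(f"~{chips_2013 / 1_000_000:.0f}M ARM chips")  # ~260M, i.e. between 250M and 300M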

Also, Intel would almost certainly not be Apple’s sole supplier of ARM chips. Yes, Apple needs to get out of its current and dangerous single source situation. But Tim Cook’s Supply Chain Management expertise will come into play to ensure that Apple doesn’t fall into a similar situation with Intel, that the company will secure at least a second source, such as the rumored TSMC.

The speculation by an RBC analyst that Intel will offer its services to build ARM chips for the iPhone on the condition Apple picks an x86 device for the iPad is nonsensical: Apple won’t fork iOS. Life is complicated enough with OS X on Intel and iOS on ARM.

Historically, a sizable fraction of Intel’s profits came from the following comparison. Take two microprocessor chips of equal “merit”: manufacturing cost, computing output, power dissipation… And add one difference: one runs Windows, the other doesn’t. Which one will get the highest profit margin?

In the ARM world, with its flurry of customized chips and software platforms, the “runs Windows” advantage no longer applies. ARM chips generate significantly lower margins than x86 processors do in the Intel-dominated PC world (where Intel’s only real competitor, AMD, is ailing).

This leaves the chip giant facing a choice: It can have a meager meal at the tablet/smartphone fest, or not dine at all at the mobile table…while it watches its PC business decline.

In other news… Paul Otellini, Intel’s CEO, unexpectedly announced he’ll leave next May, a couple of years ahead of the company’s mandatory retirement age of 65. No undignified exit here. Intel’s Board pointedly stated they’ll be looking outside as well as inside for a successor, another unusual move for a company that has so far stuck to successions orchestrated around carefully groomed execs. This could be seen as a sanction for Otellini missing the mobile wave and, much more important, a desire to bring in new blood willing and able to look past the old x86 orthodoxy.

JLG@mondaynote.com

 

Apple, ARM, and Intel

 

Apple and Samsung are engaged in a knives-out smartphone war, most infamously in the courts but, more importantly, in the marketplace. In its latest ad campaign, Samsung has cleverly “borrowed” a page from Apple’s own marketing playbook, positioning the iPhone as the choice of autumn-aged parents and brainwashed queue sheep.

But when it comes to chips, the two companies must pretend to be civil for the sake of the children: Samsung is the sole supplier of ARM-based processors for the iPhone.

Something has to give.

Since no one sees Samsung getting out of its booming smartphone business, the conclusion is that Apple will assume full custody: it will take its iDevice processor business elsewhere.

But where? There are rumors (which we’ll get to), and none of them so much as hint at Intel.

Except for the rare cameo appearance, Intel is nowhere in the Post-PC world (or, as Frank Shaw, the literate and witty head of Microsoft’s corporate PR obdurately insists, the “PC Plus” world). Becoming Apple’s ARM source wouldn’t just put the Santa Clara company in the race, it would vault them into the lead.

They’ve been there before: Intel scored a coup when Apple switched to the x86 architecture for its Macintosh line in 2005. An iDevice encore would mark an even bigger score as smartphones and tablets have already reached much higher volumes and grow much faster.

So… Why hasn’t Intel jumped at the chance?

The first explanation is architectural disdain. Intel sees “no future for ARM”; it’s a culture of x86 true believers. And they have a right to their conviction: With each iteration of its manufacturing technology, Intel has full control over how to improve its processors. They can reduce x86 power consumption by using smaller building blocks (they’re already down to 22 nanometers wide). They can micro-manage (literally) which parts of a complex chip will be turned on, off, or somewhere in between, in a kind of hibernation.

A further problem is that Intel would need to change roles. Today, the company designs the microprocessors that it manufactures. It tells PC clone makers what these chips will do, how many they will get, when, and for how much. Its development model (called Tick-Tock in industry argot) essentially defines the schedules and finances of hardware makers.

This dictatorial model won’t work for iDevices. Apple crossed the border into Intel’s chipset empire back in the Macintosh era, but, today, it has far too much invested in its ARM design to again surrender complete control. As evidenced by the A6 processor running inside the iPhone 5, Apple goes to great lengths to customize the basic ARM cores, adding graphics processors, memory, and large amounts of support logic, and even resorts to aggressive hand-optimization of the silicon layout — as opposed to just letting CAD software tools do the job.

Intel would have to accept Apple’s design and “pour” it into silicon — it would become a lowly “merchant foundry”. Intel knows how to design and manufacture standard parts; it has little experience manufacturing other people’s custom designs…or pricing them.

Which leads us to the most likely answer to the Why Not Intel question: Money. Intel is a sophisticated business entity that expertly balances both terms of the profit equation. On the one hand, they use brand identity, marketing incentives, and a little strong-arming to keep prices “acceptable”, while on the other, the Tick-Tock technology and product development pushes its costs down.

The company meticulously tunes the price points for its processors to generate the revenue that will fund development as well as the Intel Inside campaigns that have cost hundreds of millions of dollars over the years, to say nothing of the more recent $300M Ultrabook fund.

One way to visualize Intel’s money pump is to think of what the industry calls a Wafer Start. Here, “wafer” refers to the basic silicon “galette” that will go through the manufacturing steps and emerge with thousands of chips ready to be diced out. For Intel, profit comes from the difference between the cost of running a wafer through the $5B manufacturing unit (a “fab” in our argot) and the revenue that the marketplace will grant each chip.

Intel’s published prices range from a “low” $117 for a Core i3 processor to $999 for a top-of-the-line Core i7 device. Of course, these are the publicly advertised price tags, so we can assume that Acer, Lenovo, and HP pay less… but compare this to iSuppli’s estimate for the cost of the A6 processor: $17.50.

Even if more A6 chips could be produced per wafer — an unproven assumption — Intel’s revenue per A6 wafer start would be much lower than with their x86 microprocessors. In Intel’s perception of reality, this would destroy the business model.
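To make the wafer-start comparison concrete, here is a minimal sketch. The $117 and $999 list prices and the $17.50 A6 estimate come from the paragraphs above; the chips-per-wafer counts are purely hypothetical round numbers meant to show the shape of the gap, not actual yields or die sizes:

# Illustrative wafer-start revenue. Prices are from the article;
# the chips-per-wafer counts are hypothetical, not real yield data.
def wafer_revenue(chips_per_wafer, price_per_chip):
    # Gross revenue booked for one wafer start.
    return chips_per_wafer * price_per_chip

x86_low = wafer_revenue(200, 117.0)   # Core i3 list price, assumed 200 good dies
x86_high = wafer_revenue(200, 999.0)  # Core i7 list price, same assumed die count
a6 = wafer_revenue(400, 17.50)        # A6 estimate, even at twice the dies per wafer

print(f"x86 wafer: ${x86_low:,.0f} to ${x86_high:,.0f}")  # $23,400 to $199,800
print(f"A6 wafer: ${a6:,.0f}")                            # $7,000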

In the meantime, the rumor of the day is that Apple will use TSMC, a well-regarded Taiwanese foundry, the world’s largest. TSMC is known to have made test runs of the A4 last year, and is now reportedly doing the same for the A5 processors that power the new iPad. Furthermore, “industry insiders” have reported that Apple attempted to secure exclusive access to TSMC’s semiconductor output but was rebuffed. (Qualcomm tried, as well; same result.)

This raises a big Disruption question for Intel: In the name of protecting today’s business model, will it let TSMC and others take the huge mobile volume, albeit with lower profit per unit? Can Intel afford to shun ARM?

For all of Intel’s semiconductor design and manufacturing feats, its processors suffer from a genetic handicap: They have to support the legacy x86 instruction set, and thus they’re inherently more complicated than legacy-free ARM devices; they require more transistors, more silicon. Intel will argue, rightly, that they’ll always be one technological step ahead of the competition, but is one step enough for x86 chips to beat ARM microprocessors?

JLG@mondaynote.com

 

Intel 3-D Transistors: Why and When?

A few days ago, Intel teased: On May 4th, the company would make “its most significant technology announcement of the year.”

Tongues wagged. Will Intel make ARM chips for Apple? The speculation has roots in reality.

We’ll start with the public breakup of the Wintel marriage. At this year’s CES in January, Steve Ballmer made it clear that x86 exclusivity was done for. With an eye on reentering the tablet market, the next release of Microsoft’s legacy OS, Windows 8, would also run on ARM SoCs. This will “fork” Windows: There’ll be two versions, one on x86 processors, another on ARM chips. Tablets, which introduce UI differences, add a couple more tines to the fork. The impact on application development isn’t clear yet (food for a future Monday Note). Surprisingly, there’s been little talk of Intel “going ARM” to repair the Wintel relationship.

Now let’s consider Intel’s complete absence from the mobile scene. Not a single smartphone contains an x86 processor. Not a single tablet, no GPS device, nothing.

For the past four years Intel has told us we’d see x86 mobile devices Real Soon Now. The company developed its own mobile version of Linux, Moblin, and made a big deal of joining forces with Nokia’s Maemo to create MeeGo. But Nokia’s new CEO, Stephen Elop, kicked MeeGo to the curb, wisely deciding to focus on one software platform, his ex-employer’s Windows Phone 7.

(We’ll see how wise this decision turns out to be. Perhaps Elop should have put his money on the front-running Android horse. Perhaps Microsoft should have “gone Apple” — pardon, “vertical.” They could have acquired Nokia, controlled the hardware and the software. They did so, successfully, with the Xbox and Kinect. Again, more food for future Monday Notes.)

The x86 mobile devices never materialized. Each new low-power processor promise from Intel was matched by ever more attractive ARM development. Now that the PC market is in its twilight, with mobile devices proliferating and stealing growth from the PC, surely Intel has to get into the race.

Then there’s the long-standing relationship between Steve Jobs and Intel — or, more specifically, with Intel co-founder Andy Grove. The relationship flourished at NeXT when Jobs moved the platform to Intel processors. After Jobs returned to Apple, efforts got under way to move the Macintosh away from the PowerPC, which was deemed poorly supported by IBM and Motorola, to the more robust x86 line.

It isn’t hard to imagine Intel offering Apple its advanced 22-nanometer fabs, along with some kind of exclusivity and price advantage. And there’s a bonus: They’d be kicking Samsung, an annoying combination of supplier, competitor, and adversary in IP lawsuits. In return, Apple would give Intel the kind of volume the company likes: 100 million ARM chips in 2012.

From there, the train of thought continues to the terminus: the Macintosh line switches wholly to ARM, and Intel supplies the processors. It’s not impossible. Intel hedges its bets, secures an inexpensive ARM license and uses its technology and marketing prowess to grab their share of the explosive growth.

As the rumor site says: “This is going to cause meetings.”

Now, the reality.

What Intel announced last week is a new “3-D” transistor technology. 3-D here doesn’t refer to images but to a design and manufacturing technique: Making transistors in three dimensions, as opposed to today’s “planar” technology where the microscopic silicon circuitry is laid out on a flat surface. Just as you can store more cars in a multi-storey garage than in a flat parking lot, more circuitry can be packed in three dimensions.

The new 22nm semiconductor manufacturing process also helps. The circuitry building blocks are smaller and waste less electrical power through heat dissipation. All of this — cue the cymbals — is ideal for mobile applications. In plain English: This is Intel’s ARM killer. (Cruelly, Google tells us we heard the same story three years ago. And two years ago. And last year.)

Intel’s press release is firmly planted in hyperbole:

“Intel’s scientists and engineers have once again reinvented the transistor, this time utilizing the third dimension,” said Intel President and CEO Paul Otellini. “Amazing, world-shaping devices will be created from this capability as we advance Moore’s Law into new realms.”

The part about “once again” reinventing the transistor is a bit far-fetched. On Intel’s website, you’ll find the company’s own timeline, replete with innovations, and bowdlerization…but nothing about reinventing the transistor. There’s some dispute as to the transistor’s actual invention: when, where, by whom. Most history books credit the Bell Labs team of William Shockley, John Bardeen, and Walter Brattain with the invention in the late 1940s; the first silicon transistor was produced in 1954 by Texas Instruments. (At my Breton Roman Catholic boarding school, the Prefect of Discipline was a certified geek. In 1955, instead of looking at religious pictures, we were in his office drooling at this incredible Philips OC 71 germanium transistor…)

We’re meant to be impressed by the promised performance and power dissipation improvements:

The 22nm 3-D Tri-Gate transistors provide up to 37 percent performance increase at low voltage versus Intel’s 32nm planar transistors. This incredible gain means that they are ideal for use in small handheld devices, which operate using less energy to “switch” back and forth. Alternatively, the new transistors consume less than half the power when at the same performance as 2-D planar transistors on 32nm chips.

Note the “Alternatively”: it’s either more performance or less power dissipation, not both.

We’ll have to wait a year to see how this markitecture translates into actual devices.

Will this be enough to unseat ARM? Most observers doubt it. The big news was received with an equally big yawn. Wall Street didn’t pay much attention. We’ve been here before: The “product” of the announcement is the announcement. (And there’s the suspicion that “breakthrough” revelations are an attempt to mask a lack of spanking new products.)

But let’s return to the rumor, from SemiAccurate, that the Mac and Intel will soon be “arm-in-ARM.” (That bad pun isn’t mine.)

First, let’s consider the name of the website.

Second, what will Apple do at the high end, for media creation and editing? What about Photoshop, Final Cut, and other applications, including CAD, where the Mac is getting back in the game? There’s no roadmap for ARM chips to beat Intel in these computationally intensive areas.

Today, going ARM is technically feasible on entry-level Macs. Tomorrow, newer multicore ARM chips might work for middle-of-the-line Macintosh products. But will Apple abandon the faster x86 processors at the high end just to avoid the kind of forking that awaits Windows in its own move to ARM? If not, we’ll again see Universal applications (a.k.a. fat binaries: two versions inside the same container), just as we did with the PowerPC to x86 transition. Microsoft is doing it because it must; Apple did it because the PowerPC didn’t have a future. But now?

———

On a related note…and more food for thought: I’d love to know how the iPad line will evolve. For example: will pressure-sensitive stylus input ever happen? Eschewing stylus input in the early days was a thoughtful move. Perhaps it’s time to relax the restriction and thus enable richer media creation applications.

The next iOS and OS X releases will shed more light on the relative roles of Apple’s tablet and PC product lines, how they will coexist, what they’ll have in common and what will keep them apart. We should know in about a month.

JLG@mondaynote.com