
Legacy Media: The Missing Gene

 

Legacy media is at great risk of losing against tech culture. This is because incumbents lack a key driver: an obsession with their own mortality. This missing paranoia gene negatively impacts every aspect of their business.

At the last Code conference (the tech gathering hosted by Walter Mossberg and Kara Swisher), Google co-founder Sergey Brin made a surprising statement (at least to me): Asked by Swisher how Google sees itself, Brin responded in his usual terse manner: “There is the external and the internal view. For the outside, we are Goliath and the rest are Davids. From the inside, we are the Davids”. From someone who co-founded a $378bn market cap company that commands more than 80% of global internet search, this is indeed an unexpected acknowledgement.

Sergey Brin’s statement echoes Bill Gates’ own view when, about fifteen years ago, he was asked about his biggest concern: Was it a decisive move or product by another big tech company? No, said Gates, it was the fact that somewhere, somehow, a small group of people was inventing something that would change everything… With the rise of Google and Facebook, his fears came true on a scale he couldn’t even imagine. Roughly at the same time, Andy Grove, then CEO of Intel, published a book with a straightforward title: “Only the Paranoid Survive“. Among my favorite Grove quotes:

“Business success contains the seeds of its own destruction. The more successful you are, the more people want a chunk of your business and then another chunk and then another until there is nothing.”

Still, Intel wasn’t paranoid enough: it completely missed the mobile revolution, leaving the entire market for smartphone and tablet microprocessors to ARM licensees.

This deep-rooted sense of fragility is a potent engine of modern tech culture. It spurs companies to grow as fast as they can by raising lots of capital in the shortest possible time. It also drives them to capture market share by all means necessary (including the worst ones), to develop a culture of excellence by hiring the best people at any cost while trimming the workforce as needed, and to obsessively maintain a culture of agility that lets them learn quickly from mistakes and adapt to market conditions. Lastly, the ever-present sense of mortality drives rising tech companies to quickly erect barriers-to-entry and to generate the network effects needed to keep challengers at bay.

For a large part, these drives stem from these companies’ early history and culture. Most started by combining a great idea with clever execution – as opposed to being born within an expensive infrastructure. Take Uber or AirBnB. Both started with a simple concept: harness digital tools to achieve swift and friction-free connections between customers and service providers. Gigantic infrastructure or utterly complicated applications weren’t required. Instead, the future of these companies was secured by a combination of flawless execution and fast growth (read this New York Times story about the Uber network effect challenge). Hence the rapid-fire rounds of financing that boosted Uber’s valuation to $17bn, allowing it to accelerate its worldwide expansion – and also to fight a possible price war, as stated by its founder himself at the aforementioned Code Conference.

Unfortunately, paranoia-driven growth sometimes comes with ugly business practices. Examples abound: Amazon’s retaliation against publishers who fight its pricing conditions; Uber’s bullying tactics against its rival – followed by an apology; Google offering for free what others used to sell, or distorting search results, etc.

Such behaviors leave the analog world completely flummoxed. Historical players had experienced nothing but a cosy, gentlemanly competitive environment, with a well-defined map of players. This left incumbents without the genes, the culture, required to fight digital barbarians. Whether they are media dealing with Google, publishers negotiating with Amazon, hotels fighting Booking.com or AirBnB, or taxis confronting Uber, legacy players look like the proverbial deer caught in the headlights. In some instances, they created their own dependency on powerful new distributors (like websites whose traffic relies largely on Google), before realizing that it was time to sue the dope dealer. (This is exactly what the European press is doing by bringing Google before the European Court of Justice for alleged antitrust violations — a subject for a future Monday Note). The appeal to legislators underlines the growing feeling of impotence vis-a-vis the take-no-prisoners approach of new digital players: Unable to respond on the business side, the old guard turns to political power to develop a legal (but short-lived) containment strategy.

In the media industry, historic players never developed a sense of urgency. The situation varies from one market to another but, in many instances, “too important to fail” was the dominant belief. It always amazed me: As I witnessed the rise of the digital sector – its obsession with fast growth, and its inevitable collision course with legacy media – incumbents were frozen in the quiet certitude that their role in society was irreplaceable, and that under no circumstances would they be left to succumb to a distasteful Darwinian rule. This deep-rooted complacency is, for a large part, responsible for the current state of the media industry.

Back in 1997, Andy Grove’s book explained how to deal with change:

“The implication was that either the people in the room needed to change their areas of knowledge and expertise or people themselves needed to be changed” 

Instead, our industry made too few changes, too late. Since the first digital tremors hit business models ten years ago, we have been through one or two generations of managers in traditional media companies. It is amazing to see how the same DNA is being replicated over and over. Some layers are moving faster than others, though. The higher you go in the food chain, the more people are gripped by a sense of vital urgency. But the rank-and-file and middle management are holding back, unable to exit their comfort zone.

Earlier this year, the French newspaper Liberation chose the outdated slogan “We are a Newspaper” in reaction to its new owners’ ideas (read this story in the NYT). Last week, Liberation opted to appoint as its editor-in-chief one of the strongest opponents of digital media (he has just left the weekly Le Nouvel Observateur, which he gently led into a quiet nursing home, leaving it worth next to nothing).

The gap between the managers of pure digital players and those who still lead legacy media has never been greater. Keenly aware of their own mortality, the former rely more than ever on brutal street-fight tactics, while the incumbents evolve at a different pace, still hoping that older models will resist longer than feared. For old media, it is time for a radical genetic alteration, one performed down to every layer of the media industry.

frederic.filloux@mondaynote.com

 

Peak PC. Intel Fork.

 

Propelled by Moore’s Law and the Internet, PCs have enjoyed four decades of strong growth, defying many doomsday prophecies along the way. But, with microprocessor performance flattening out, the go-go years have come to an end. Intel, the emperor of PC processors and a nobody in mobile devices, needs to react.

I’m suspicious of Peak <Anything> predictions. Some of us became aware of the notion of a resource zenith during the 1973 OPEC oil embargo, with its shocking images of cars lined up at gas stations (in America!):

[Photo: cars lined up at gas stations during the 1973 oil embargo]

This was Peak Oil, and it spelled doom to the auto industry.

We know what happened next: Cars improved in design and performance, manufacturers became more numerous. Looking at this bit of history through my geek glasses, I see three explanations for the rebound: computers, computers, and computers. Computer Assisted Design (CAD) made it easier to design new car models as variations on a platform; Volkswagen’s MQB is a good example. Massive computer systems were used to automate the assembly line and manage the supply chain. It didn’t take long for computers to work their way into the cars themselves, from the ECU under the hood to the processors that monitor the health of the vehicle and control the entertainment and navigation systems.

Since then, we’ve had repeated predictions of Peak Oil, only to be surprised by the news that the US will soon become a net oil exporter and, as Richard Muller points out in his must-read Physics for Future Presidents, we have more than a century of coal reserves. (Unfortunately, the book, by a bona fide, middle-of-the-road physicist, can’t promise us that physics will eventually push politics aside when considering the rise of CO2 in the atmosphere…)

I’ve heard similar End of The Go-Go Days predictions about personal computers since 1968 when my love affair with these machines started at HP France (I was lucky enough to be hired to launch their first desktop machine).

I heard the cry again in 1985 when I landed in Cupertino in time for the marked slowdown in Apple ][ sales. The never-before-seen round of layoffs at Apple prompted young MBAs, freshly imported from Playtex and Pepsi, to intone the It’s All Commodities Now dirge. I interpreted the cry (undiplomatically -- I hadn’t yet learned to speak Californian) as a self-serving It’s All Marketing Now ploy. In the meantime, engineers ignored the hand-wringing, went back to work, and, once again, proved that the technology “mines” were far from exhausted.

In 1988, a Sun Microsystems executive charitably warned me: “PCs are driving towards the Grand Canyon at 100 mph!” A subscriber to Sun’s The Network Is The Computer gospel, the gent opined that heavy-duty computing tasks would be performed by muscular computers somewhere (anywhere) on the network. Desktop devices (he confusingly called them “servers” because they were to “serve” a windowing protocol, X11) would become commodities no more sophisticated or costly than a telephone. He had no answer for multimedia applications that require local processing of music, video, and graphics, nor could he account for current and imminent mobile devices. His view wasn’t entirely new. In 1965, Herb Grosch gave us his Law, which told us that bigger computers provide better economics; smaller machines are uneconomical.

And yet, personal computers flourished.

I have vivid memories of the joy of very early adopters, yours truly included. Personal computers are liberating in many ways.

First, they don’t belong to the institution; there’s no need for the intercession of a technopriest; I can lift my PC with my arms, my brains, and my credit card.

Second, and more deeply, the PC is a response to a frustration, to a sense of something amiss. One of mankind’s most important creations is the symbol, a sign without a pre-existing meaning: X as opposed to a drawing of a deer on a cave wall. Strung together, these symbols show formidable power. The expressive and manipulative power of symbol strings runs through the Song of Songs, Rumi’s incandescent poetry, Wall Street greed, and quantum physics.

But our central nervous system hasn’t kept up with our invention. We don’t memorize strings well; we struggle with long division, let alone extracting cube roots in our heads.

The PC comes to the rescue, with its indefatigable ability to remember and combine symbol strings. Hence the partnership with an object that extends the reach of our minds and bodies.

Around 1994, the Internet came out of the university closet, gave the PC access to millions of servers around the world (thus fulfilling a necessary part of the Sun exec’s prophecy), and extended our grasp.

It’s been great and profitable fun.

But today, we once again hear Peak PC stories. Sales have gone flat, never to return:

[Chart: PC shipment forecasts, 2014-2018]

This time, I’m inclined to agree.

Why?

Most evenings, my home-builder spouse and I take a walk around Palo Alto. Right now, this smallish university town is going through a building boom. Offices and three-layer retail + office + residence are going up all around University Avenue. Remodels and raze-and-build projects can be found in the more residential parts of town. No block is left unmolested.

I can’t help but marvel. None of this activity, none of Silicon Valley would exist without Moore’s Law, the promise made in 1965 that semiconductor performance would double every 18 months. And, for the better part of 40 years, it did - and rained money on the tech ecosystem, companies and people. PCs, servers, embedded electronics, giant network routers, cars...they’ve all been propelled because Moore’s Law has been upheld...until recently.

The 1977 Apple ][ had a 1MHz 8-bit processor. Today’s PCs and Macs reach 3.7GHz, but that number hasn’t changed in more than three years. This isn’t to say that Intel processors aren’t still improving, but the days when each new chip brought substantial increases in clock speed seem to be over.
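
A quick back-of-envelope calculation (mine, not anything from Intel) makes the flattening concrete: take the two clock-speed data points above (1MHz in 1977, 3.7GHz today) and ask how often clock speed actually doubled on average.

    import math

    # Two data points from the text: the 1977 Apple ][ at 1 MHz and today's
    # 3.7 GHz machines, a figure that has been stuck for over three years.
    years = 2014 - 1977
    ratio = 3.7e9 / 1e6                  # 3.7 GHz divided by 1 MHz
    doublings = math.log2(ratio)         # about 11.9 doublings of clock speed
    print(f"{doublings:.1f} doublings in {years} years")
    print(f"average doubling period: {years / doublings:.1f} years")
    # Roughly one doubling every 3 years on average, and essentially zero
    # doublings over the last three years: the flattening described above.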

One should never say never, but Moore’s Law is now bumping into the Laws of Physics. The energy needed to vibrate matter (electrons in our case) increases with frequency. The higher the clock frequency, the higher the power dissipation and the greater the heat that’s generated…and a PC can withstand only so much heat. Consider the cooling contraptions used by PC gamers when they push the performance envelope of their “rigs”:
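
For readers who want the first-order physics, here is the standard CMOS dynamic-power approximation, a textbook relation rather than anything quoted in this piece; the component values below are made up purely for illustration.

    # First-order CMOS dynamic power: P ~ C * V^2 * f, where C is the switched
    # capacitance, V the supply voltage, and f the clock frequency.
    def dynamic_power(c_farads, v_volts, f_hertz):
        return c_farads * v_volts ** 2 * f_hertz

    print(dynamic_power(1e-9, 1.2, 2e9))   # ~2.9 W at 2 GHz (illustrative values)
    print(dynamic_power(1e-9, 1.2, 4e9))   # ~5.8 W at 4 GHz: double the heat
    # In practice, higher frequencies also demand higher supply voltages, so
    # the real-world penalty grows faster than linearly. Hence the cooling rigs.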

[Photo: a liquid-cooling block used by PC gamers]

To work around the physical limits, Intel and others resort to stratagems such as “multiple cores”, more processors on the same chip. But if too many computations need the result of the previous step before moving forward, it doesn’t matter how many cores you have. Markitects have an answer to that as well: “speculative execution”, guessing which branch the program will take and computing ahead along the likely path. When the needed outcome appears, the “bad” branches are pruned and the process goes forward on the already-computed good branch. It makes for interesting technical papers, but it’s no substitute for an 8GHz clock speed.
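
The serial-dependency point is usually captured by Amdahl's Law; the article doesn't name it, but a small sketch shows the ceiling it imposes.

    # Amdahl's Law: with a serial fraction s of the work, the best possible
    # speedup on n cores is 1 / (s + (1 - s) / n), capped at 1 / s.
    def amdahl_speedup(serial_fraction, cores):
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

    for n in (2, 4, 8, 64):
        print(n, round(amdahl_speedup(0.10, n), 2))
    # 2 -> 1.82, 4 -> 3.08, 8 -> 4.71, 64 -> 8.77 ... never more than 10x
    # with just 10% serial work. More cores are no substitute for clock speed.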

If we need confirmation of the flattening out of microprocessor progress, we can turn to Intel and the delays in implementing its Broadwell chips. The move to a 14 nanometer “geometry” — the term here denotes the size of a basic circuit building block — is proving more difficult than expected. And the design isn’t meant to yield faster processors, just less power-hungry ones (plus other goodies such as better multi-media processing).

One possible reaction to this state of affairs is to look at tablets as a new engine of growth. This is what Microsoft seems to be doing by promoting its Intel-inside Surface Pro 3 as a laptop replacement. But even if Microsoft tablets turn out to be every bit as good as Microsoft says they are, they aren’t immune to the flattening out of Intel processor performance. (I don’t have an opinion yet on the product — I tried to buy one but was told to wait till June 20th.)

Does this broaden the opening for ARM-based devices? Among their advantages is a cleaner architecture, one devoid of the layers of backwards-compatibility silt that x86 devices need. ARM derivatives need less circuitry for the same computing task and, as a result, dissipate less power. This is one of the key reasons for their dominance in the battery-powered world of mobile devices. (The other is the customization and integration flexibility provided by the ARM ecosystem.) But today’s ARM derivatives run at lower speeds (a little above 1GHz for some) than Intel chips. The challenge will be to run at higher speeds without hurting battery life or adding the fan that Microsoft tablets need.

With no room to grow, PC players exit the game. Sony just did. Dell took itself private and is going through the surgery and financial bleeding a company can’t withstand in public. Hewlett-Packard, once the leading PC maker, now trails Lenovo. With no sign of turning its PC business around, HP will soon find itself in an untenable position.

Intel doesn’t have the luxury of leaving their game — they only have one. But I can’t imagine that Brian Krzanich, Intel’s new CEO, will look at Peak PC and be content with the prospect of increasingly difficult x86 iterations. There have been many discussions of Intel finally taking the plunge and becoming a “foundry” for someone else’s ARM-based SoC (System On a Chip) designs instead of owning x86 design and manufacturing decisions. Peak PC will force the Intel CEO’s hand.

JLG@mondaynote.com

Intel Is Under New Management – And It Shows

 

Intel rode the PC wave with Microsoft and built a seemingly insurmountable lead in the field of “conventional” (PC and laptop) microprocessors. But, after his predecessor missed the opportunity to supply the CPU chip for Apple’s iPhone, Intel’s new CEO must now find a way to gain relevance in the smartphone world.

In last May’s The Atlantic magazine, Intel’s then-CEO Paul Otellini confessed to a mistake of historic proportions. Apple had given Intel the chance to be part of the smartphone era, to supply the processor for the first iPhone… and Otellini said no [emphasis and light editing mine]:

“The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do… At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.”
“…while we like to speak with data around here, so many times in my career I’ve ended up making decisions with my gut, and I should have followed my gut. [...] My gut told me to say yes.”

That Otellini found the inner calm to publicly admit his mistake — in an article that would be published on his last day as CEO, no less — is a testament to his character. More important, Otellini’s admission unburdened his successor, Brian Krzanich, freeing him to steer the company in a new direction.

And Krzanich is doing just that.

First: House cleaning. Back in March 2012, the Wall Street Journal heralded Intel as The New Cable Guy. The idea was to combine an Intel-powered box with content in order to serve up a quality experience not found elsewhere (read Apple, Netflix, Roku, Microsoft…). To head the project, which was eventually dubbed OnCue, Intel hired Erik Huggers, a senior industry executive and former head of BBC Online.

At the All Things D conference in February, Huggers announced that the TV service would be available later this year. The Intel TV chief revealed no details about how the OnCue service would differ from existing competitors, or how much the thing would cost…but he assured us that the content would be impressive (“We are working with the entire industry”), and the device’s capabilities would be comprehensive (“This is not a cherry-pick… this is literally everything”).

Intel seemed to be serious. We found out that more than 1,000 Intel employees in Oregon had been engaged in testing the product/service.

Then Krzanich stepped in, and applied a dose of reality:

“Intel continues to look at the business model… we are not experts in the content industry and we’re being careful.” [AllThingsD: New Intel CEO Says Intel TV Sounds Great in Theory. But …]

Indeed, to those of us who have followed the uneasy dance between Apple and content providers since the first Apple TV shipped in 2007, the Intel project sounded bold, to say the least.

Late September, the project was put on hold and, last week, the news came that OnCue had been cancelled and allegedly offered to Verizon, whose V Cast media distribution feats come to mind…

Even before OnCue’s cancellation was made official, the well-traveled Erik Huggers appeared to show an interest in the Hulu CEO job. (If Mr Huggers happens to be reading this: I’d be more than happy to relieve you of the PowerPoints that you used to pitch the project to Intel’s top brass, not to mention the updates on the tortuous negotiations for content, and the reports from the user testing in Oregon. These slides must make for fascinating corpospeak logic.)

Krzanich quickly moved from doubt to certainty. He saw that OnCue would neither make money by itself, nor stimulate sales or margins for its main act, x86 processors. OnCue would never be an Apple TV “black puck”, a supporting character whose only mission is to make the main personal computers (small, medium and large; smartphones, tablets and conventional PCs) more useful and pleasant.

So he put an end to the impossible-to-justify adventure.

That was easy.

Tackling Intel’s failure to gain a significant role in the (no longer) new world of smartphones is a much more complicated matter.

With its x86 processors, Intel worked itself into a more-than-comfortable position as part of the Wintel ecosystem. The dominant position achieved by the Microsoft-Intel duopoly over two decades yielded correspondingly high margins for both.

But smartphones changed the game. ARM processors proved themselves better than x86 at the two tasks that are integral to personal, portable devices: lowering power consumption and customization. The ARM architecture didn’t have to wait for the iPhone and Android handsets to dominate the cell phone business. Just as Windows licensing spawned a large number of PC makers, ARM licensing contributed to the creation of a wide range of processor design and manufacturing companies. The ARM site claims 80 licensees for its newer Cortex family and more than 500 for its older Classic Arm processors. No monopoly means lower margins.

Intel saw the unattractive margins offered by ARM processors and didn’t want to commit the billions of dollars required by a fab (a chip manufacturing plant) for a product that would yield profits that were well below Wall Street expectations.

The prospect of bargain basement margins undoubtedly figured in Otellini’s decision to say no to the iPhone. In 2006, no one could have predicted that it could have been made up in volume, that there would be a billion smartphone sales in 2014. (I’m basing the 1B number for the entire industry on Horace Dediu’s estimate of 250 million iOS devices for 2014.)

Even if the Santa Clara company had had the foresight to accept lower margins in order to ensure their future in the smartphone market, there would still have been the problem of customization.

Intel knows how to design and manufacture processors that are used “as is” by PC makers. No customization, no problems.

This isn’t how the ARM world works. Licensees design processors that are customized for their specific device, and they send the design to a manufacturer. Were Intel to enter this world, they would no longer design processors, just manufacture them, an activity with less potential for profit.

This explains why Intel, despite having an ARM license and making XScale processors, sold the business to Marvell in 2006 – a fateful date when looking back on the Apple discussions.

But is Intel’s new CEO rethinking the “x86 and only x86″ strategy? Last week, a specialty semiconductor company called Altera announced that Intel would fabricate some of its chips containing a 64-bit ARM processor. The company’s business consists of offering faster development times through “programmable logic” circuits. Instead of a “hard circuit” to be designed, manufactured, tested, debugged, modified and sent back to the manufacturing plant in lengthy and costly cycles, you buy a “soft circuit” from Altera and similar companies (Xilinx comes to mind). This more expensive device can be reprogrammed on the spot to assume a different function, or correct the logic in the previous iteration. Pay more and get functioning hardware sooner, without slow and costly turns through a manufacturing process.

With this in mind, what Intel will someday manufacture for Altera isn’t the 64-bit ARM processor that excited some observers: “Intel Makes 14nm ARM for Altera“. The Stratix 10 circuits Altera contracts to Intel manufacturing are complicated and expensive ($500 and up) FPGA (Field Programmable Gate Array) devices where the embedded ARM processor plays a supporting, not central, role. This isn’t the $20-or-less price level arena in which Intel has so far declined to compete.

Manufacturing chips for Altera might simply be work-for-hire, a quick buck for Intel, but I doubt it. Altera’s yearly revenue is just shy of $2B; Intel is a $50B company. The newly announced device, just one in Altera’s product lines, will not “move the needle” for Intel — not in 2014 (the ship date isn’t specified), or ever.

Instead, I take this as a signal, a rehearsal.  250M ARM SoCs at $20 each would yield $5B in revenue, 10% of Intel’s current total…

This might be what Krzanich had in mind when he inked the “small” manufacturing agreement with Altera; perhaps he was weighing the smaller margins of ARM processors against the risk of slowing PC sales.

Graciously freed from the past by his predecessor, Intel’s new CEO will find it hard not to take the plunge and use the company’s superb manufacturing technology to finally

make ARM processors.

JLG@mondaynote.com

 

Otellini’s Striking Confession

 

We know Intel shunned ARM processors and played virtually no role in the smartphone revolution. But we now learn Steve Jobs asked Intel to build the iPhone microprocessor. Paul Otellini, Intel’s departing CEO, admits he should have followed his gut – and made the smartphone world a very different place.

CEO valedictions follow a well-known script: My work is done here, great team, all mistakes are mine, all good deeds are theirs, I leave the company in strong hands, the future has never been brighter… It’s an opportunity for a leader to offer a conventional and contrived reminiscence, what the French call la toilette des souvenirs (which Google crudely translates as toilet memories instead of the affectionate and accurate dressing up memories).

For his farewell, Paul Otellini, Intel’s departing CEO, chose the interview format with The Atlantic Monthly’s senior editor Alexis Madrigal. They give us a long (5,700+ words) but highly readable piece titled Paul Otellini’s Intel: Can the Company That Built the Future Survive It?


The punctuation mark at the title’s end refers to the elephantine question in the middle of Otellini’s record: Why did Intel miss out on the smartphone? Why did the company that so grandly dominates the PC market sit by while ARM architecture totally, and perhaps irretrievably, took over the new generation of phones — and most other embedded applications?

According to Otellini, it was the result of Intel’s inertia: It took a while to move the machine.

Madrigal backfills this uneasy explanation with equal unease:

“The problem, really, was that Intel’s x86 chip architecture could not rival the performance per watt of power that designs licensed from ARM based on RISC architecture could provide. Intel was always the undisputed champion of performance, but its chips sucked up too much power. In fact, it was only this month that Intel revealed chips that seem like they’ll be able to beat the ARM licensees on the key metrics.”

Note the tiptoeing: Intel’s new chips “seem like” they’ll be fast enough and cheap enough. Madrigal charitably fails to note how Intel, year after year, kept promising to beat ARM at the mobile game, and failed to do so. (See these 2010, 2011 and 2012 Monday Notes.) Last year, Intel was still at it, dismissively predicting “no future for ARM or any of its competitors“. Tell that to ARM Holdings, whose licensees shipped 2.6 billion chips in the first quarter of this year.

Elsewhere in the article, Otellini offers a striking revelation: Fresh from anointing Intel as the microprocessor supplier for the Mac, Steve Jobs came back and asked Intel to design and build the CPU for Apple’s upcoming iPhone. (To clarify the chronology, the iPhone was announced in early January 2007; the CPU conversation must have taken place two years prior, likely before the June 2005 WWDC where Apple announced the switch to x86. See Chapter 36 of Walter Isaacson’s Jobs bio for more.)

Intel passed on the opportunity [emphasis mine]:

“We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we’d done it, […]

Indeed, the world would have been different. Apple wouldn’t be struggling through a risky transition away from Samsung, its frenemy CPU supplier; the heart of the iPhone would be Made In America; Intel would have supplied processors for more than 500 million iOS devices, sold even more such chips to other handset makers to become as major a player in the smartphone (and tablet) space as it is in the PC world.

Supply your own adjectives…

Indulging briefly in more What If reverie, compare the impact of Intel’s wrong turn to a better one: What would the world look like if, at the end of 1996, Gil Amelio hadn’t returned Apple to Steve Jobs? (My recollection of the transaction’s official wording could be faulty.)

So, again, what happened?

At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn’t see it. It wasn’t one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.

A little later, Otellini completes the train of thought with a wistful reverie, a model of la toilette des souvenirs:

“The lesson I took away from that was, while we like to speak with data around here, so many times in my career I’ve ended up making decisions with my gut, and I should have followed my gut,” he said. “My gut told me to say yes.”

The frank admission is meant to elicit respect and empathy. Imagine being responsible for missing the opportunity to play a commanding role in the smartphone revolution.

But perhaps things aren’t as simple as being a “gut move” short of an epochal $100B opportunity.

Intel is a prisoner of its x86 profit model and Wall Street’s expectations. Its dominant position in the x86 space gives Intel the pricing power to command high margins. There’s no such thing in the competitive ARM space: prices are lower. Even factoring in the lower inherent cost of the somewhat simpler devices (simpler for the time being; they’ll inevitably grow more complex), the profit-per-ARM chip is too thin to sustain Intel’s business model.

(Of course, this assumes a substitution, an ARM chip that displaces an x86 device. As it turns out, the smartphone business could have been largely additive, just as we now see with tablets that cannibalize classical PCs.)

Another factor is the cultural change that would have been required were Intel to have gotten involved in making ARM devices. As both the designer and manufacturer of generation after generation of x86 microprocessors, Intel can wait until they’re good and ready before they allow PC makers to build the chips into their next products. The ARM world doesn’t work that way. Customers design their own chips (often called a System on a Chip, or SoC), and then turn to a semiconductor manufacturer (a foundry) to stamp out the hardware. Taking orders from others isn’t in Intel’s DNA.

And now?

The answer might lie in another French expression: L’histoire ne repasse pas les plats. Google Translate is a bit more felicitous this time: History does not repeat itself. I prefer the more literal image — History doesn’t come around offering seconds — but the point remains: Will there be seconds at the smartphone repast?

Officially, Intel says its next generation of x86 processors will (finally!) topple the ARM regime, that their chips will offer more computing might with no cost or power dissipation penalty. In their parlance “the better transistor” (the basic unit of logic processing) will win.

I doubt it. The newer x86 devices will certainly help Microsoft and its OEMs make Windows 8 devices more competitive, but that won’t prevent the spread of ARM in the legion of devices on which Windows is irrelevant. For these, Intel would have to adopt ARM, a decision Otellini has left to the new tandem leadership of Brian Krzanich (CEO) and Renée James (President). Will they stick to the old creed, to the belief Intel’s superior silicon design and manufacturing technology will eventually overcome the disadvantages of the more complex x86 architecture? Or will they take the plunge?

They might be helped by a change in the financial picture.

In 2006, that is after throwing Jobs into Samsung’s arms (pun unintended), Intel sold its ARM business, the XScale line, to Marvell. The reason was purely financial: for similar capital expenditures (costly fabs), ARM processors achieved much lower per-unit profit, a consequence of a market far more competitive than the x86 space.

Now, if Intel really wants to get a place at the smartphone table with new and improved x86 devices, the company will have to price those to compete with established ARM players. In other words, Intel will have to accept the lower margins they shunned in 2006. Then, why not do it with the ARM-based custom processors Apple and others require?

JLG@mondaynote.com

—————————-

(I’ll confess a weakness for The Atlantic and, in particular, for its national correspondent James Fallows, a literate geek and instrument-rated pilot who took it upon himself to live in Beijing for a while and, as a result, can speak more helpfully about China than most members of the Fourth Estate. Going back to last week’s reference to the Gauche Caviar, when my Café de Flore acquaintances fall into their usual rut of criticizing my adopted country for its lack of “culture”, I hold out that The Atlantic — which sells briskly at the kiosk next door — is one of many examples of American journalistic excellence.

And, if you’re interested in more strange turns, see this other Alexis Madrigal piece in the same Atlantic: The Time Exxon Went Into the Semiconductor Business (and Failed). I was there, briefly running an Exxon Information Systems subsidiary in France and learning the importance of corporate culture.)–JLG

Apple Buys Intel

 

Getting rid of Samsung as a processor supplier and, at the same time, capturing the crown jewel of the American semiconductor industry. How could Apple resist the temptation to solve its cash problem and make history again?

Halfway through the second quarter of the 2013 fiscal year, most of Apple’s top execs meet at an undisclosed location (Eddy Cue’s chair is empty – he’s been called away to a Ferrari board meeting). They’re joined by a few trusted industry insiders: Bill “the Coach” Campbell, Apple and Intuit Director and adviser to Google’s founders, Messrs. Page and Brin; Larry Sonsini, the Silicon Valley consigliere of more than three decades; and Frank Quattrone, the star investment banker with nine lives.

The meeting isn’t about the company’s dwindling profit margins. The smaller margins were expected and invited: The reduced-price iPad and heavy promotion of the “old” iPhone 4 as an entry-level product are part of the long term strategy of guarding Apple’s lower end (so to speak). And no whining about AAPL’s grim slide over the last six months, a problem that has only one solution: Apple needs to record a series of better quarters.

The problem of the day is, once again, what to do with Apple’s obscene pile of cash.

By the end of December 2012, the company held about $137B in cash (or equivalents such as marketable securities), including $23B from operations for the quarter.

CFO Peter Oppenheimer delivers the bad news: It looks like operations will disgorge another $35B this quarter. The stock buy-back and dividend program that was designed to bleed off $45B over the next few years (see this March 2012 Monday Note) won’t be enough if the company continues at this rate.

Apple needs something bigger.

Quattrone has been sitting quietly at the end of the table. He clears his throat and speaks:

Buy Intel.

Well, yes, Frank (says Tim Cook), we’ve been buying Intel processors for the Mac since 2005.

Not the chips. The company. The planets are aligned for Apple to strike a blow that will leave the industry forever changed. Make history, acquire Intel.

Quattrone has their attention. He unfolds the celestial alignment:

  • Apple needs to extract itself from the toxic relationship with Samsung, its ARM supplier.
  • Intel is the best large-scale silicon manufacturer in the world. They have the people, the technology, and the plant capacity to match Apple’s needs for years to come.
  • “But Intel doesn’t do ARM!” you say. Indeed, Intel has no interest in the fierce competition and small margins in the ARM-based SoC market. Joining the ARM fray would severely disrupt Intel’s numbers and infuriate Wall Street. But if Intel were to essentially “go private” as Apple’s semiconductor manufacturing arm (pun intended), catering to all of Apple’s x86 and ARM needs (and whatever else Bob Mansfield is secretly plotting), Wall Street would have no such objection.
  • Intel is flailing. The traditional PC market – Intel’s lifeblood – continues to shrink, yet the company does nothing to break into the ARM-dominated mobile sector. In the meantime, the company makes perplexing investments such as buying McAfee for $7.68B.
  • There’s a leadership vacuum at Intel. Six months after announcing CEO Paul Otellini‘s “retirement”, Intel’s Board has yet to find a replacement who can sail the ship in more competitive waters. Apple could commission Pat Gelsinger, a 30-year Intel veteran and former CTO (Intel’s first) who fled to VMware after his career stalled at Intel. Despite being a bit of a Bill Gates look-alike (once upon a time), Gelsinger is a real technologist who would fit well within Apple, especially if he were given the opportunity to really “go for” the ARM architecture instead of iteratively tweaking x86 devices.
  • Last but not least, Intel’s market cap is about $115B, eminently affordable. The company is profitable and generates a good deal of cash, even after the heavy capital expenditures required by its constant need to build new and expensive manufacturing plants.
  • …oh, and one more thing: Wouldn’t it be fun to “partner” more closely with Microsoft, HP and Dell, working on x86 developments, schedules and… pricing?

A lively discussion ensues. Imagine solving many of Apple’s problems with a single sweeping motion. This would really make Cupertino the center of the high-tech world.

It’s an interesting idea, but there will be obstacles, both cultural and legal.

The Coach goes first: “Knowing both of these companies more than a little bit, I can attest to the pride they have in their respective cultures. They’re both disinclined to reconsider their beliefs in any meaningful way. Merging these two dissimilar groups, shedding unnecessary activities such as McAfee and the like, would be dangerously disruptive to Apple’s well-honed, cohesive culture. As a general rule, merging two large organizations rarely succeeds… unless you consider merging airlines a success…”

Finally, the Consigliere speaks: “It’s a tempting fantasy, it will mean years of work for my firm and many, many others, but as a friend of the company, as a past confidant of your departed Founder, don’t do it. There will be too much legal trouble with the Feds, with competitors, with Intel partners. Most fantasies aren’t meant to be enacted.”

I won’t dwell on the reality of the meeting: I made it up as a way to explain why Apple really has no choice other than to submit to another cash phlebotomy, this time for an additional $60B. And, as with real-world phlebotomies, the procedure will treat the problem, but it won’t cure it. With $30B from operations per quarter, the $60B lancing will have to be repeated.

Some read the decision to return gobs of cash to shareholders as an admission of defeat. Apple has given up making big moves, as in one or more big acquisitions.

I don’t agree: We ought to be glad that the Apple execs (and their wise advisers) didn’t allow themselves to succumb to transaction fever, to a mirage of ego aggrandizement held out by a potential “game changing” acquisition.

A final word on taxes. To return the additional $60B (for a total of $100B when including the ongoing program announced last year) through increased dividends and repurchased shares, Apple will have to borrow money.

Borrow? When they have so much cash?

Yes, thanks to our mangled tax code. As explained here, about $100B of Apple’s cash is stored overseas. If repatriated, it would be “heavily” (read “normally”) taxed. Like most US companies that have international operations, Apple plays complicated, entirely legal tax games that allow their international profits to be taxed at very low rates as long as the profits — and the resulting cash — stay outside Uncle Sam’s reach. And thus we have the apparent paradox of borrowing money when cash-rich.

The benefit of these tax code contortions is difficult to explain to normal humans — as opposed to legislators who allowed the loopholes.

All this now makes Apple a different company. Once a fledgling challenger of established powerhouses such as IBM, Microsoft or HP, it now makes “too much cash” and is condemned to a life of paying dividends and buying back shares — like the old fogies it once derided.

JLG@mondaynote.com

 

 

Wintel: Le Divorce Part II

 

At CES 2011, Ballmer told the world Windows would “fork”, that it would also run on lower-power ARM chips for mobile devices. This was seen as a momentous breach in the long-standing Wintel duopoly. Two years later, the ARM tine of the fork looks short and dull.

This is what I wrote almost two years ago:

After years of monogamy with the x86 architecture, Windows will soon run on ARM processors.

As in any divorce, Microsoft and Intel point fingers at one another. Intel complains about Microsoft’s failure to make a real tablet OS. They say MS has tried to shoehorn “Windows Everywhere” onto a device that has an incompatible user interface, power management, and connectivity requirements while the competition has created device-focused software platforms.

Microsoft rebuts: It’s Intel’s fault. Windows CE works perfectly well on ARM-based devices, as do Windows Mobile and now Windows Phone 7. Intel keeps telling us they’re “on track”, that they’ll eventually shrink x86 processors to the point where the power dissipation will be compatible with smartphones and tablets. But…when?

Today, a version of Windows (RT) does indeed run on an ARM processor, on Microsoft’s Surface tablet-PC hybrid. Has Microsoft finally served Intel with divorce papers?

Not so fast. The market’s reaction to Redmond’s ambitious Surface design has fallen far short of the heights envisioned in the company’s enthusiastic launch: Surface machines aren’t flying off Microsoft Store shelves. Ballmer himself admits sales are “modest” (and then quickly backpedals); Digitimes, admittedly not always reliable, quotes suppliers who say that Surface orders have been cut by half; anecdotally, but amusingly, field research by Piper Jaffray’s Gene Munster (who can be a bit excitable) shows zero Surfaces sold during a two hour period at the Mall of America on Black Friday, while iPads were selling at a rate of 11-an-hour.

Traditional PC OEMs aren’t enthusiastic either. Todd Bradley, head of HP’s Personal Systems Group, is unimpressed:

“It tends to be slow and a little kludgey as you use it…”

Acer exec Linxian Lang warns:

“Redmond will have to eat ‘hard rice’ with Surface…it should stick to its more readily-chewed software diet.”

To be sure, there are happy Surface users, such as Steve Sinofsky, the former Windows Division President, as captured in lukew’s Instagram picture:

(An aside: I went back to Sinofsky’s 8,000-word blog post that lovingly describes the process of developing “WOA” — Windows on ARM. At the time, WOA was presented as part of the Windows 8 universe. Later, Microsoft swapped the “8″ designation and chose to use “RT” instead. These naming decisions aren’t made lightly. Is it any wonder that WOA was moved out of the Windows 8 camp?)

It’s possible that the jury is still out… Surface sales could take off, Windows RT could be embraced by leading PC OEMs… but what are the odds? In addition to the tepid reception from customers and vendors alike, Microsoft must surmount the relentless market conquest of Android and iOS tablets whose numbers (210 million units) are expected to exceed laptop sales next year.

So, no… the Wintel Divorce isn’t happening. Intel’s x86 chips will remain the processors of choice to run Windows. Next month, we’ll have CES and its usual burst of announcements, both believable and dubious (remember when 2010 was declared the Year Of The Tablet PC?). We’ll have to sort the announcements that are merely that from those that will yield an actual device, but in the end I doubt we’ll see many new and really momentous Windows RT products out there.

Microsoft’s lackluster attempt at Post-PC infidelity doesn’t help Intel in its efforts to gain a foothold in the mobile world. Intel’s perennial efforts to break into the mobile market with lower power, lower cost x86 chips have, also perennially, failed. As a result, there is renewed speculation about a rapprochement between Intel and Apple, that the Santa Clara microprocessor giant could become an ardent (and high-volume) ARM SoC foundry.

As discussed here, some of this makes sense: Samsung is Apple’s biggest and most successful competitor in the smartphone/tablet space, spending billions more than anyone else in global marketing programs. At the same time, the South Korean company is Apple’s only supplier of ARM chips. Intel has the technology and manufacturing capacity to become an effective replacement for Samsung.

This wouldn’t be an easy decision for Intel: the volumes are high — as high as 415M ARM chips for 2013 according to one analyst — but the margins are low. And Intel doesn’t do low margins. Because of the Wintel duopoly, Intel’s x86 chips have always commanded a premium markup. Take Windows out of the picture and the margin disappears.

(As another aside, the 415M ARM chips number seems excessive. Assuming about 50 million iPhone 5s and 15 million iPads in the current quarter, and using the 4X rule of thumb for the following calendar year, we land somewhere between 250M and 300M ARM chips for Apple in 2013.)
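
For what it's worth, the aside's arithmetic is easy to reproduce; the per-quarter unit numbers are the assumptions stated above, not reported figures.

    # Back-of-envelope: current-quarter unit assumptions times the 4X
    # quarter-to-year rule of thumb mentioned above.
    iphones_per_quarter = 50e6
    ipads_per_quarter = 15e6
    arm_chips_2013 = (iphones_per_quarter + ipads_per_quarter) * 4
    print(f"~{arm_chips_2013 / 1e6:.0f}M ARM chips for Apple in 2013")   # ~260M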

Also, Intel would almost certainly not be Apple’s sole supplier of ARM chips. Yes, Apple needs to get out of its current and dangerous single source situation. But Tim Cook’s Supply Chain Management expertise will come into play to ensure that Apple doesn’t fall into a similar situation with Intel, that the company will secure at least a second source, such as the rumored TSMC.

The speculation by an RBC analyst that Intel will offer its services to build ARM chips for the iPhone on the condition Apple picks an x86 device for the iPad is nonsensical: Apple won’t fork iOS. Life is complicated enough with OS X on Intel and iOS on ARM.

Historically, a sizable fraction of Intel’s profits came from the following comparison. Take two microprocessor chips of equal “merit”: manufacturing cost, computing output, power dissipation… And add one difference: one runs Windows, the other doesn’t. Which one will get the highest profit margin?

In the ARM world and its flurry of customized chips and software platforms, the “runs Windows” advantage is gone. ARM chips generate significantly lower margins than x86 chips do in the Intel-dominated world (where Intel’s only x86 competitor, AMD, is ailing).

This leaves the chip giant facing a choice: It can have a meager meal at the tablet/smartphone fest, or not dine at all at the mobile table…while it watches its PC business decline.

In other news… Paul Otellini, Intel’s CEO, unexpectedly announced he’ll leave next May, a couple of years ahead of the company’s mandatory 65-year retirement age. No undignified exit here. Intel’s Board pointedly stated they’ll be looking outside as well as inside for a successor, another unusual move for a company that has so far stuck to successions orchestrated around carefully groomed execs. This could be seen as a sanction for Otellini missing the mobile wave and, much more important, a desire to bring in new blood willing and able to look past the old x86 orthodoxy.

JLG@mondaynote.com

 

Apple, ARM, and Intel

 

Apple and Samsung are engaged in a knives-out smartphone war, most infamously in the courts but, more importantly, in the marketplace. In its latest ad campaign, Samsung has cleverly “borrowed” a page from Apple’s own marketing playbook, positioning the iPhone as the choice of autumn-aged parents and brainwashed queue sheep.

But when it comes to chips, the two companies must pretend to be civil for the sake of the children: Samsung is the sole supplier of ARM-based processors for the iPhone.

Something has to give.

Since no one sees Samsung getting out of its booming smartphone business, the conclusion is that Apple will assume full custody, it will take its iDevices processor business elsewhere.

But where? There are rumors (which we’ll get to), and none of them so much as hint at Intel.

Except for the rare cameo appearance, Intel is nowhere in the Post-PC world (or, as Frank Shaw, the literate and witty head of Microsoft’s corporate PR obdurately insists, the “PC Plus” world). Becoming Apple’s ARM source wouldn’t just put the Santa Clara company in the race, it would vault them into the lead.

They’ve been there before: Intel scored a coup when Apple switched to the x86 architecture for its Macintosh line in 2005. An iDevice encore would mark an even bigger score as smartphones and tablets have already reached much higher volumes and grow much faster.

So… Why hasn’t Intel jumped at the chance?

The first explanation is architectural disdain. Intel sees “no future for ARM“; it’s a culture of x86 true believers. And they have a right to their conviction: With each iteration of its manufacturing technology, Intel has full control over how to improve its processors. They can reduce x86 power consumption by using smaller building blocks (they’re already down to 22 nanometers wide). They can micro-manage (literally) which parts of a complex chip will be turned on, off, or somewhere in between, in a kind of hibernation.

A further problem is that Intel would need to change roles. Today, the company designs the microprocessors that it manufactures. It tells PC clone makers what these chips will do, how many they will get, when, and for how much. Its development model (called Tick Tock in industry argot) essentially defines the schedules and finances of hardware makers.

This dictatorial model won’t work for iDevices. Apple crossed the border into Intel’s chipset empire back in the Macintosh era, but, today, it has far too much invested in its ARM design to again surrender complete control. As evidenced by the A6 processor running inside the iPhone 5, Apple goes to great lengths to customize the basic ARM cores, adding graphic processors, memory, and large amounts of support logic, and even resorts to aggressive hand-optimization of the silicon layout — as opposed to just letting CAD software tools do the job.

Intel would have to accept Apple’s design and “pour” it into silicon — it would become a lowly “merchant foundry“. Intel knows how to design and manufacture standard parts, it has little experience manufacturing other people’s custom designs…or pricing them.

Which leads us to the most likely answer to the Why Not Intel question: Money. Intel is a sophisticated business entity that expertly balances both terms of the profit equation. On the one hand, they use brand identity, marketing incentives, and a little strong-arming to keep prices “acceptable”, while on the other, the Tick Tock technology and product development pushes its costs down.

The company meticulously tunes the price points for its processors to generate the revenue that will fund development as well as the Intel Inside campaigns that have cost hundreds of millions of dollars over the years, to say nothing of the more recent $300M Ultrabook fund.

One way to visualize Intel’s money pump is to think of what the industry calls a Wafer Start. Here, “wafer” refers to the basic silicon “galette” that will go through the manufacturing steps and emerge with thousands of chips ready to be diced out. For Intel, profit comes from the difference between the cost of running a wafer through the $5B manufacturing unit (a “fab” in our argot) and the revenue that the marketplace will grant each chip.

Intel’s published prices range from a “low” $117 for a Core i3 processor to $999 for a top-of-the-line Core i7 device. Of course, these are the publicly advertised price tags, so we can assume that Acer, Lenovo, and HP pay less… but compare this to iSuppli’s estimate for the cost of the A6 processor: $17.50.

Even if more A6 chips could be produced per wafer — an unproven assumption — Intel’s revenue per A6 wafer start would be much lower than with their x86 microprocessors. In Intel’s perception of reality, this would destroy the business model.
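
To make the wafer-start comparison concrete, here is a rough sketch. The per-chip prices are the ones quoted above; the dies-per-wafer counts are invented for illustration, not Intel or iSuppli data.

    # Revenue per wafer start = dies per wafer * revenue per die.
    # Prices from the article: $117 for a Core i3, $17.50 for the A6.
    # Die counts are hypothetical, chosen only to illustrate the gap.
    def wafer_revenue(dies_per_wafer, price_per_die):
        return dies_per_wafer * price_per_die

    x86_wafer = wafer_revenue(300, 117.00)   # assumed 300 Core i3 dies per wafer
    a6_wafer = wafer_revenue(600, 17.50)     # assume twice as many smaller A6 dies
    print(x86_wafer, a6_wafer)               # 35100.0 vs 10500.0
    # Even granting the A6 twice the dies per wafer, revenue per wafer start
    # is roughly a third of the x86 case: the business-model problem above.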

In the meantime, the rumor of the day is that Apple will use TSMC, a well-regarded Taiwanese foundry, the world’s largest. TSMC is known to have made test runs of the A4 last year, and is now reportedly doing the same for the A5 processors that power the new iPad. Furthermore, “industry insiders” have reported that Apple attempted to secure exclusive access to TSMC’s semiconductor output but was rebuffed. (Qualcomm tried, as well; same result.)

This raises a big Disruption question for Intel: In the name of protecting today’s business model, will it let TSMC and others take the huge mobile volume, albeit with lower profit per unit? Can Intel afford to shun ARM?

For all of Intel’s semiconductor design and manufacturing feats, its processors suffer from a genetic handicap: They have to support the legacy x86 instruction set, and thus they’re inherently more complicated than legacy-free ARM devices, they require more transistors, more silicon. Intel will argue, rightly, that they’ll always be one technological step ahead of the competition, but is one step enough for x86 chips to beat ARM microprocessors?

JLG@mondaynote.com

 

The Apple-Intel-Samsung Ménage à Trois

Fascinating doesn’t do justice to the spectacle, nor to the stakes. Taken in pairs, these giants exchange fluids – products and billion$ – while fiercely fighting with their other half. Each company is the World’s Number One in their domain: Intel in microprocessors, Samsung in electronics, Apple in failure to fail as ordained by the sages.

The ARM-based chips in iDevices come from a foundry owned by Samsung, Apple’s mortal smartphone enemy. Intel supplies x86 chips to Apple and its PC competitors, Samsung included, and would like nothing more than to raid Samsung’s ARM business and make a triumphant Intel Inside claim for Post-PC devices. And Apple would love to get rid of Samsung, its enemy supplier, but not at the cost of losing the four advantages it derives from using the ARM architecture: cost, power consumption, customization and ownership of the design.

At its annual investor day last week, Intel CEO Paul Otellini sounded a bit like a spurned suitor as he made yet another bid for Apple’s iDevices business [emphasis mine]:

“Our job is to insure our silicon is so compelling, in terms of running the Mac better or being a better iPad device, that […] they can’t ignore us.”

This is a bit odd. Intel is Apple’s only supplier of x86 microprocessors; AMD, Intel’s main competitor, isn’t in the picture. How could Apple ‘‘ignore’’ Intel? Au contraire, many, yours truly included, have wondered: Why has Intel ignored Apple’s huge iDevices business?

Perhaps Intel simply didn’t see the wave coming. Steeped in its domination of the PC business — and perhaps listening too much to the dismissive comments of Messrs. Ballmer and Shaw — Intel got stuck knitting one x86 generation after another. The formula wasn’t broken.

Another, and perhaps more believable, explanation is the business model problem. These new ARM chips are great, but where’s the money? They’re too inexpensive; they bring less than a third, sometimes even just a fifth, of the price of a tried and true x86 PC microprocessor. This might explain why Intel sold their ARM business, XScale chips, to Marvell in 2006.

Then there’s the power consumption factor: x86 chips use more watts than ARM chips. Regardless of price, this is why ARM chips have proliferated in battery-limited mobile devices. Year after year, Intel has promised, and failed, to nullify ARM’s power consumption advantage through its technical and manufacturing might.

2012 might be different. Intel claims ‘‘the x86 power myth is finally busted.” Android phones powered by the latest x86 iteration have been demonstrated. One such device will be made and sold in India, in partnership with a company called Lava International. Orange, the France-based international carrier, also intends to sell an Intel-based smartphone.

With all this, what stops Apple from doing what worked so well for their Macintosh line: Drop ARM (and thus Samsung), join the Intel camp yet again, and be happy forever after in a relationship with fewer participants?

There appear to be a number of reasons to do so.

First, there would be no border war. Unlike Samsung, Intel doesn’t make smartphones and tablets. Intel sells to manufacturers and Apple sells to humans.

Second, the patent front is equally quiet. The two companies have suitable Intellectual Property arrangements and, of late, Intel is helping Apple in its patent fights with Samsung.

Third, if the newer generation of x86 chips is as frugal with power as claimed, the power consumption obstacle will be gone. (But let’s be cautious here: not only have we heard these claims before, but nothing says that ARM foundries won’t also make progress.)

Finally, Otellini’s ‘‘they can’t ignore us’’ could be decoded as ‘‘they won’t be able to ignore our prices’’. Once concerned about what ARM-like prices would do to its business model, Intel appears to have seen the Post-PC light: Traditional PCs will continue to make technical progress, but the go-go days of ever-increasing volumes are gone. It now sounds like Intel has decided to cannibalize parts of its PC business in order to gain a seat at the smartphone and tablet table.

Just like Apple must have gotten a very friendly agreement when switching the Mac to Intel, one can easily see a (still very hypothetical) sweet deal for low-power x86 chips for iDevices. Winning the iDevices account would put Intel “on the Post-PC map.” That should be worth a suitable price concession.

Is this enough for Apple to ditch Samsung?

Not so fast, there’s one big obstacle left.

Let’s not forget who Samsung is and how they operate. This is a family-controlled chaebol, a gang of extremely determined people whose daring tactics make Microsoft, Oracle, Google, and Apple itself blush. Chairman Lee Kun-hee has been embroiled in various “misunderstandings.” He was convicted (and then pardoned) in a slush fund scandal. The company was caught in cartel arrangements and paid a fine of more than $200M in one case. As part of the multi-lawsuit fight with Apple, the company has been accused of willfully withholding and destroying evidence — and this isn’t their first offense. Samsung looks like a determined repeat obstructor of justice. My own observations of Samsung in previous industry posts are not inconsistent with the above. Samsung plays hardball and then some.

This doesn’t diminish Samsung’s achievements. The Korean conglomerate’s success on so many fronts is a testament to the vision, skill, and energy of its leaders and workers. But there has been so much bad blood between Samsung and Apple that one has a hard time seeing even an armed peace between the two companies.

And this doesn’t mean Apple will abandon ARM processors. The company keeps investing in silicon design teams, and it has plenty of money, some of which could go into financing part or all of a foundry for one of Samsung’s competitors in Taiwan (TSMC) or elsewhere in the US, Europe, or Israel. If it’s a strategic move and not just an empty boast on PowerPoint slides, $10B for a foundry is within Apple’s budget.

To its adopters, ARM’s big advantage is customization. Once you have an ARM license, you’ve entered an ecosystem of CAD software and module libraries. You alter the processor design as you wish, remove the parts you don’t need, and add components licensed from third parties. The finished product is a SOC (System On a Chip) that is uniquely yours and better suited to your needs than an off-the-shelf processor from a vendor such as Intel. Customization, licensing chip designs to customers: such moves are not in the Intel playbook, nor part of its culture.
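
For a concrete picture, here is a deliberately toy sketch, in Python rather than any real EDA flow, of what the pick-and-choose composition of an ARM-based SOC amounts to. The block names and the choice of licensed core are illustrative assumptions, not a description of any actual Apple design.

# Toy illustration only -- not a real chip-design flow. The point is the
# pick-and-choose nature of an ARM-based SOC: a licensed core plus whatever
# blocks the designer keeps, drops, or adds.
licensed_core = "ARM core (licensed design)"  # hypothetical starting point

soc_blocks = {
    "cpu":       licensed_core,
    "gpu":       "third-party graphics IP",    # licensed from an IP vendor
    "isp":       "in-house image processor",   # your own differentiation
    "voice_dsp": "in-house voice/audio block",
    "modem":     None,                         # dropped: not needed for a Wi-Fi-only tablet
}

# Keep only the blocks this particular SOC needs; the resulting mix is
# uniquely yours, then handed to a foundry (Samsung, TSMC, ...) to manufacture.
kept = {name: ip for name, ip in soc_blocks.items() if ip is not None}
print(f"This SOC integrates {len(kept)} blocks: {', '.join(kept)}")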

I don’t see Apple losing its appetite for customization and ownership, for making its products more competitive by incorporating new functions, such as voice processing and advanced graphics, on its SOCs. For this reason alone, I don’t see Apple joining the x86 camp for iDevices. (Nor do I see competitive smartphone makers dropping their SOCs in favor of an Intel chip or chipset.)

Intel isn’t completely out of the game, but to truly play it would need to join the ARM camp, either as a full licensee designing SOCs or as a foundry manufacturing SOCs engineered by Apple and its competitors.

These are risky times: A false move by any one vertex of the love triangle and tens of billions of dollars will flow in the wrong direction.

JLG@mondaynote.com

2011: Shift Happens

Whatever 2011 was, it wasn’t The Year Of The Incumbent. The high-tech world has never seen the ground shift under so many established companies. This causes afflicted CEOs to exhibit the usual symptoms of disorientation: reorg spasms, mindless muttering of old mantras and, in more severe cases, speaking in tongues, using a secret language known only to their co-CEO.

Let’s start with the Wintel Empire

Intel. The company just re-organized its mobile activities, merging four pre-existing groups into a single business unit. In a world where mobile devices are taking off while PC sales flag, Intel has effectively lost the new market to ARM. Even if, after years of broken promises, Intel finally produces a low-power x86 chip that meets the requirements of smartphones and tablets, it won’t be enough to take the market back from ARM.

Here’s why: The Cambridge company made two smart decisions. First, it didn’t fight Intel on its sacred PC ground; and, second, it licensed its designs rather than manufacture microprocessors. Now, ARM licensees are in the hundreds and a rich ecosystem of customizing extensions, design houses and silicon foundries has given the architecture a dominant and probably unassailable position in the Post-PC world.

We’ll see if Intel recognizes the futility of trying to dominate the new theatre of operations with its old weapons and tactics, or if it goes back and reacquires an ARM license. This alone won’t solve its problems: customers of ARM-based Systems On a Chip (SOC) are used to flexibility (customization) and low prices. The first ingredient isn’t in evidence in the culture of a company used to dictating terms to PC makers. The second, low prices, is trouble for the kind of healthy margins Intel derives from its Wintel quasi-monopoly. Speaking of which…

Microsoft. The company also reorged its mobile business: Andy Lees, formerly President of its Windows Phone division, just got benched. The sugar-coating is that Andy keeps his President title, in “a new role working for me [Ballmer] on a time-critical opportunity focused on driving maximum impact in 2012 with Windows Phone and Windows 8”. Right.

Ballmer once predicted Windows Mobile would achieve 40% market share by 2012. Andy Lees pays the price for failing to achieve traction with Windows Phone: according to Gartner, Microsoft’s new mobile OS got a 1.6% market share in Q2 2011.

Microsoft will have to buy Nokia in order to fully control its destiny in this huge new market currently dominated by Android-based handset makers (with Samsung in the lead) and by Apple. In spite of efforts to “tax” Android licensees, the old Windows PC licensing model won’t work for Microsoft. The vertical, integrated, not to say “Apple”, approach works well for Microsoft in its flourishing Xbox/Kinect business; it could also work for MicroNokia phones. Moreover, what will Microsoft do once Googorola integrates Moto hardware + Android system software + Google applications and Cloud services?

In the good old PC business, Microsoft’s situation is very different: it’s still on top of the world. But the high-growth years are in the past. In the US, for Q2 2011, PC sales declined by 4.2%; in Europe, for Q3 this time, PC sales went down by 11.4% (both numbers are year-to-year comparisons).

At the same time, according to IDC the tablet market grew 264.5% in Q3 (admire the idiotic .5% precision, and consider that tablets started from a small 2010 base). Worldwide, including the newly launched Kindle Fire, 2011 tablet shipments will be around 100 million units. Of that, Microsoft will have nothing, or close to nothing if we count the small number of confidential Tablet PC devices. The rise of tablets is causing clone makers such as Dell, Samsung and Asus (but not Acer) to give up on netbooks.

In 2012, Microsoft is expected to launch a Windows 8 version suited for tablets. That version will be different from the desktop product: in a break with its monogamous Wintel relationship, Windows 8 will support ARM-based tablets. This “forks” Windows and many applications into two different flavors. Here again, the once dominant Microsoft has lost its footing and is forced to play catch-up with a “best of both worlds” (or not optimized for either) product.

In the meantime, Redmond clings to a PC-centric party line, calling interloping smartphones and tablets “companion products’’. One can guess how different the chant would be if Microsoft dominated smartphones or tablets.

Still, like Intel, Microsoft is a growing, profitable and cash-rich company. Even if one is skeptical of their chances to re-assert themselves in the Post-PC world, these companies have the financial means to do so. The same cannot be said of the fallen smartphone leaders.

RIM: ‘Amateur hour is over.’ This is what the company imprudently claimed when introducing its PlayBook tablet. It is an expensive failure ($485M written off last quarter), but RIM’s co-CEOs remain eerily bullish: ‘Just you wait…’ For next quarter’s new phones, for the new BlackBerry 10 OS (based on QNX), for a software update for the PlayBook…

I remember being in New York City early January 2007 (right before the iPhone introduction). Jet-lagged after flying in from Paris, I got up very early and walked to Avenue of The Americas. Looking left, looking right, I saw Starbucks signs. I got to the closest coffee shop and saw everyone in the line ahead of me holding a BlackBerry, a.k.a. CrackBerry for its addictive nature. By mid-December 2011, RIM shares were down 80% from February of this year.

Sammy the Walrus IV provides a detailed timeline of RIM’s fall on his blog; it’s painful reading.

On Horace Dediu’s Asymco site, you’ll find a piece titled “Does the phone market forgive failure?”. Horace’s answer is a clear and analytical No. Which raises the question: What’s next for RIM? The company has relatively low cash reserves ($1.5B) and few friends, now, on financial markets. It is attacked at the low end by Chinese Android licensees and, above, by everyone from Samsung to Nokia and Apple. Not a pretty picture. Vocal shareholders demand a change in management to turn the company around. But to do what? Does anyone want the job? And, if you do, doesn’t it disqualify you?

Nokia: The company has more cash, about €10B ($13B), and a big partner in Microsoft. The latest Nokia financials are here and show the company’s business decelerating on all fronts, this in a booming market. Initial reactions to the newest Windows Phone handsets are not said to be wildly enthusiastic, but it is a bit early to draw conclusions. Wall Street (whose wisdom is less than infinite), however, has already passed judgment.

Let’s put it plainly: No one but RIM needs RIM; but Microsoft’s future in the smartphone (and, perhaps, tablet) market requires a strong Nokia. Other Windows Phone “partners” such as Samsung are happily pushing Android handsets; they don’t need Microsoft the way PC OEMs still need Windows. Why struggle with a two-headed hydra when you can acquire Nokia and have only one CEO fully in charge? Would this be Andy Lees’ mission?

All this stumbling takes place in the midst of the biggest wave of growth, innovation and disruption the high-tech industry has ever seen: the mobile devices + Cloud + social graph combination is destroying (most) incumbents in its path. Google, Apple, Facebook, Samsung and others such as Amazon are taking over. 2012 should be an interesting year for bankers and attorneys.

JLG@mondaynote.com


Intel 3-D Transistors: Why and When?

A few days ago, Intel teased: On May 4th, the company would make “its most significant technology announcement of the year.”

Tongues wagged. Will Intel make ARM chips for Apple? The speculation has roots in reality.

We’ll start with the public breakup of the Wintel marriage. At this year’s CES in January, Steve Ballmer made it clear that x86 exclusivity was done for. With an eye on reentering the tablet market, the next release of Microsoft’s legacy OS, Windows 8, would also run on ARM SOCs. This will “fork” Windows: There’ll be two versions, one on x86 processors, another on ARM chips. Tablets, which introduce UI differences, add a couple more tines to the fork. The impact on application development isn’t clear yet (food for a future Monday Note). Surprisingly, there’s been little talk of Intel “going ARM” to repair the Wintel relationship.

Now let’s consider Intel’s complete absence from the mobile scene. Not a single smartphone contains an x86 processor. Not a single tablet, no GPS device, nothing.

For the past four years Intel has told us we’d see x86 mobile devices Real Soon Now. The company developed its own mobile version of Linux, Moblin, and made a big deal of joining forces with Nokia’s Maemo to create MeeGo. But Nokia’s new CEO, Stephen Elop, kicked MeeGo to the curb, wisely deciding to focus on one software platform, his ex-employer’s Windows Phone 7.

(We’ll see how wise this decision turns out to be. Perhaps Elop should have put his money on the front-running Android horse. Perhaps Microsoft should have “gone Apple” — pardon, “vertical.” They could have acquired Nokia, controlled the hardware and the software. They did so, successfully, with the Xbox and Kinect. Again, more food for future Monday Notes.)

The x86 mobile devices never materialized. Each new low-power processor promise from Intel was matched by ever more attractive ARM development. Now that the PC market is in its twilight, with mobile devices proliferating and stealing growth from the PC, surely Intel has to get into the race.

Then there’s the long-standing relationship between Steve Jobs and Intel — or, more specifically, with Intel co-founder Andy Grove. The relationship flourished at NeXT when Jobs moved the platform to Intel processors. After Jobs returned to Apple, efforts got under way to move the Macintosh away from the PowerPC, which was deemed poorly supported by IBM and Motorola, to the more robust x86 line.

It isn’t hard to imagine Intel offering Apple its advanced 22-nanometer fabs, along with some kind of exclusivity and price advantage. And there’s a bonus: They’d be kicking Samsung, an annoying combination of supplier, competitor, and adversary in IP lawsuits. In return, Apple would give Intel the kind of volume the company likes: 100 million ARM chips in 2012.

From there, the train of thought continues to the terminus: the Macintosh line switches wholly to ARM, and Intel supplies the processors. It’s not impossible. Intel hedges its bets, secures an inexpensive ARM license, and uses its technology and marketing prowess to grab its share of the explosive growth.

As the rumor site says: “This is going to cause meetings.”

Now, the reality.

What Intel announced last week is a new “3-D” transistor technology. 3-D here doesn’t refer to images but to a design and manufacturing technique: Making transistors in three dimensions, as opposed to today’s “planar” technology where the microscopic silicon circuitry is laid out on a flat surface. Just as you can store more cars in a multi-storey garage than in a flat parking lot, more circuitry can be packed in three dimensions.

The new 22nm semiconductor manufacturing process also helps. The circuitry building blocks are smaller and waste less electrical power through heat dissipation. All of this — cue the cymbals — is ideal for mobile applications. In plain English: This is Intel’s ARM killer. (Cruelly, Google tells us we heard the same story three years ago. And two years ago. And last year.)

Intel’s press release is firmly planted in hyperbole:

“Intel’s scientists and engineers have once again reinvented the transistor, this time utilizing the third dimension,” said Intel President and CEO Paul Otellini. “Amazing, world-shaping devices will be created from this capability as we advance Moore’s Law into new realms.”

The part about “once again” reinventing the transistor is a bit far-fetched. On Intel’s website, you’ll find the company’s own timeline, replete with innovations and bowdlerization… but nothing about reinventing the transistor. There’s some dispute as to the transistor’s actual invention: when, where, by whom. Most history books credit William Shockley and his Bell Labs colleagues with the invention; the first silicon transistor was produced in 1954 by Texas Instruments. (At my Breton Roman Catholic boarding school, the Prefect of Discipline was a certified geek. In 1955, instead of looking at religious pictures, we were in his office drooling over this incredible Philips OC 71 germanium transistor…)

We’re meant to be impressed by the promised performance and power dissipation improvements:

The 22nm 3-D Tri-Gate transistors provide up to 37 percent performance increase at low voltage versus Intel’s 32nm planar transistors. This incredible gain means that they are ideal for use in small handheld devices, which operate using less energy to “switch” back and forth. Alternatively, the new transistors consume less than half the power when at the same performance as 2-D planar transistors on 32nm chips.

Note the Alternatively: it’s either more performance or less power dissipation.
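
One way to read those numbers is as two separate operating points rather than one chip delivering both gains at once. The sketch below normalizes the 32nm planar part to 1.0 on both axes; pairing the 37% figure with an unchanged power budget is my simplifying assumption, not Intel’s wording.

# Two distinct operating points implied by the press release, normalized to a
# 32nm planar baseline of (performance = 1.0, power = 1.0). Simplified reading.
baseline  = {"perf": 1.0, "power": 1.0}

faster    = {"perf": 1.37, "power": baseline["power"]}  # up to 37% more performance; power assumed unchanged
thriftier = {"perf": baseline["perf"], "power": 0.5}    # same performance at less than half the power

print("faster option:   ", faster)     # pick this point...
print("thriftier option:", thriftier)  # ...or this one, not both at once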

We’ll have to wait a year to see how this markitecture translates into actual devices.

Will this be enough to unseat ARM? Most observers doubt it. The big news was received with an equally big yawn. Wall Street didn’t pay much attention. We’ve been here before: The “product” of the announcement is the announcement. (And there’s the suspicion that “breakthrough” revelations are an attempt to mask a lack of spanking new products.)

But let’s return to the rumor, from SemiAccurate, that the Mac and Intel will soon be “arm-in-ARM.” (That bad pun isn’t mine.)

First, let’s consider the name of the website.

Second, what will Apple do at the high end, for media creation and editing? What about Photoshop, Final Cut, and other applications, including CAD, where the Mac is getting back in the game? There’s no roadmap for ARM chips to beat Intel in these computationally intensive areas.

Today, going ARM is technically feasible on entry-level Macs. Tomorrow, newer multicore ARM chips might work for middle-of-the-line Macintosh products. But will Apple abandon the faster x86 processors at the high end just to avoid the kind of forking that awaits Windows in its own move to ARM? If not, we’ll again see Universal applications (a.k.a. fat binaries: two versions inside the same container), just as we did with the PowerPC to x86 transition. Microsoft is doing it because it must; Apple did it because the PowerPC didn’t have a future. But now?

———

On a related note…and more food for thought: I’d love to know how the iPad line will evolve. For example: will pressure-sensitive stylus input ever happen? Eschewing stylus input in the early days was a thoughtful move. Perhaps it’s time to relax the restriction and thus enable richer media creation applications.

The next iOS and OS X releases will shed more light on the relative roles of Apple’s tablet and PC product lines, how they will coexist, what they’ll have in common and what will keep them apart. We should know in about a month.

JLG@mondaynote.com