ARM

Peak PC. Intel Fork.

 

Propelled by Moore’s Law and the Internet, PCs have enjoyed four decades of strong growth, defying many doomsday prophecies along the way. But, with microprocessor performance flattening out, the go-go years have come to an end. Intel, the emperor of PC processors and a nobody in mobile devices, needs to react.

I’m suspicious of Peak <Anything> predictions. Some of us became aware of the notion of a resource zenith during the 1973 OPEC oil embargo, with its shocking images of cars lined up at gas stations (in America!):

[Image: gas lines during the 1973 oil embargo]

This was Peak Oil, and it spelled doom to the auto industry.

We know what happened next: Cars improved in design and performance, manufacturers became more numerous. Looking at this bit of history through my geek glasses, I see three explanations for the rebound: computers, computers, and computers. Computer-Aided Design (CAD) made it easier to design new car models as variations on a platform; Volkswagen’s MQB is a good example. Massive computer systems were used to automate the assembly line and manage the supply chain. It didn’t take long for computers to work their way into the cars themselves, from the ECU under the hood to the processors that monitor the health of the vehicle and control the entertainment and navigation systems.

Since then, we’ve had repeated predictions of Peak Oil, only to be surprised by the news that the US will soon become a net oil exporter and, as Richard Muller points out in his must-read Physics for Future Presidents, we have more than a century of coal reserves. (Unfortunately, the book, by a bona fide, middle-of-the-road physicist, can’t promise us that physics will eventually push politics aside when considering the rise of CO2 in the atmosphere…)

I’ve heard similar End of The Go-Go Days predictions about personal computers since 1968 when my love affair with these machines started at HP France (I was lucky enough to be hired to launch their first desktop machine).

I heard the cry again in 1985 when I landed in Cupertino in time for the marked slowdown in Apple ][ sales. The first-ever round of layoffs at Apple prompted young MBAs, freshly imported from Playtex and Pepsi, to intone the It’s All Commodities Now dirge. I interpreted the cry (undiplomatically -- I hadn’t yet learned to speak Californian) as a self-serving It’s All Marketing Now ploy. In the meantime, engineers ignored the hand-wringing, went back to work, and, once again, proved that the technology “mines” were far from exhausted.

In 1988, a Sun Microsystems executive charitably warned me: “PCs are driving towards the Grand Canyon at 100 mph!” A subscriber to Sun’s The Network Is The Computer gospel, the gent opined that heavy-duty computing tasks would be performed by muscular computers somewhere (anywhere) on the network. Desktop devices (he confusingly called them “servers” because they were to “serve” a windowing protocol, X11) would become commodities no more sophisticated or costly than a telephone. He had no answer for multimedia applications that require local processing of music, video, and graphics, nor could he account for current and imminent mobile devices. His view wasn’t entirely new. In 1965, Herb Grosch gave us his Law, which told us that bigger computers provide better economics; smaller machines are uneconomical.

And yet, personal computers flourished.

I have vivid memories of the joy of very early adopters, yours truly included. Personal computers are liberating in many ways.

First, they don’t belong to the institution; there’s no need for the intercession of a technopriest; I can lift my PC with my arms, my brains, and my credit card.

Second, and more deeply, the PC is a response to a frustration, to a sense of something amiss. One of mankind’s most important creations is the symbol, a sign without a pre-existing meaning: X as opposed to a drawing of a deer on a cave wall. Strung together, these symbols show formidable power. The expressive and manipulative power of symbol strings runs through the Song of Songs, Rumi’s incandescent poetry, Wall Street greed, and quantum physics.

But our central nervous system hasn’t kept up with our invention. We don’t memorize strings well, and we struggle with long division, let alone extracting cube roots in our heads.

The PC comes to the rescue, with its indefatigable ability to remember and combine symbol strings. Hence the partnership with an object that extends the reach of our minds and bodies.

Around 1994, the Internet came out of the university closet, gave the PC access to millions of servers around the world (thus fulfilling a necessary part of the Sun exec’s prophecy), and extended our grasp.

It’s been great and profitable fun.

But today, we once again hear Peak PC stories. Sales have gone flat, never to return:

[Chart: PC shipments, 2014-2018]

This time, I’m inclined to agree.

Why?

Most evenings, my home-builder spouse and I take a walk around Palo Alto. Right now, this smallish university town is going through a building boom. Offices and three-layer retail + office + residence buildings are going up all around University Avenue. Remodels and raze-and-build projects can be found in the more residential parts of town. No block is left unmolested.

I can’t help but marvel. None of this activity, none of Silicon Valley would exist without Moore’s Law, the 1965 prediction that transistor density (and, with it, performance) would double every 18 months or so. And, for the better part of 40 years, it did, and it rained money on the tech ecosystem, its companies and people. PCs, servers, embedded electronics, giant network routers, cars...they’ve all been propelled by Moore’s Law...until recently.

The 1977 Apple ][ had a 1MHz 8-bit processor. Today’s PCs and Macs reach 3.7GHz, but that number hasn’t changed in more than three years. This isn’t to say that Intel processors aren’t still improving, but the days when each new chip brought substantial increases in clock speed seem to be over.

One should never say never, but Moore’s Law is now bumping into the Laws of Physics. The energy needed to vibrate matter (electrons in our case) increases with frequency. The higher the clock frequency, the higher the power dissipation and the greater the heat that’s generated…and a PC can withstand only so much heat. Consider the cooling contraptions used by PC gamers when they push the performance envelope of their “rigs”:

[Image: an EK-Thermosphere liquid-cooling block]
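For the physics-minded, a first-order sketch of the constraint, using the standard CMOS dynamic-power model (the symbols below are the textbook ones, not anything from Intel’s datasheets):

$$P_{dynamic} \approx \alpha \, C \, V^2 \, f$$

where α is the fraction of the circuit that switches on a given cycle, C the switched capacitance, V the supply voltage, and f the clock frequency. Since V typically must rise to sustain a higher f, power grows faster than linearly with clock speed, and nearly all of it ends up as heat.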

To work around the physical limits, Intel and others resort to stratagems such as “multiple cores”: more processors on the same chip. But if too many computations need the result of the previous step before moving forward, it doesn’t matter how many cores you have. Markitects have an answer to that as well: “speculative branch execution”, the use of several processors to execute possible next steps. When the needed outcome appears, the “bad” branches are pruned and the process goes forward on the already-computed good branch. It makes for interesting technical papers, but it’s no substitute for an 8GHz clock speed.
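To make the pruning idea concrete, here’s a toy sketch in Python (the function names are invented for the illustration, and real CPUs speculate inside a single core’s pipeline, not across threads): both possible next steps are computed while the branch condition is still unknown, then only the “good” one is kept.

```python
# Toy model of speculative branch execution: start both branches before
# the condition is known, then discard the one that turns out to be wrong.
from concurrent.futures import ThreadPoolExecutor
import time

def slow_condition():
    """Stands in for a result the pipeline is still waiting on."""
    time.sleep(0.1)
    return True

def branch_if_true():
    return "took the TRUE path"

def branch_if_false():
    return "took the FALSE path"

with ThreadPoolExecutor(max_workers=2) as pool:
    # Speculate: run both branches while the condition is unresolved.
    true_path = pool.submit(branch_if_true)
    false_path = pool.submit(branch_if_false)
    # The condition finally arrives; prune the bad branch, keep the good one.
    result = true_path.result() if slow_condition() else false_path.result()

print(result)  # -> took the TRUE path
```

The speculated work on the losing branch is simply thrown away, which is why the trick burns power without guaranteeing speed.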

If we need confirmation of the flattening out of microprocessor progress, we can turn to Intel and the delays in implementing its Broadwell chips. The move to a 14-nanometer “geometry” — the term here denotes the size of a basic circuit building block — is proving more difficult than expected. And the design isn’t meant to yield faster processors, just less power-hungry ones (plus other goodies such as better multi-media processing).

One possible reaction to this state of affairs is to look at tablets as a new engine of growth. This is what Microsoft seems to be doing by promoting its Intel-inside Surface Pro 3 as a laptop replacement. But even if Microsoft tablets turn out to be every bit as good as Microsoft says they are, they aren’t immune to the flattening out of Intel processor performance. (I don’t have an opinion yet on the product — I tried to buy one but was told to wait till June 20th.)

Does this broaden the opening for ARM-based devices? Among their advantages is a cleaner architecture, one devoid of the layers of backwards compatibility silt x86 devices need. ARM derivatives need less circuitry for the same computing task and, as a result, dissipate less power. This is one of the key reasons for their dominance in the battery-powered world of mobile devices. (The other is the customization and integration flexibility provided by the ARM ecosystem.) But today’s ARM derivatives run at lower speeds (a little above 1GHz for some) than Intel chips. The challenge will be to reach higher speeds without hurting battery life or needing the fan that Microsoft’s tablets require.

With no room to grow, PC players exit the game. Sony just did. Dell took itself private and is going through the surgery and financial bleeding a company can’t withstand in public. Hewlett-Packard, once the leading PC maker, now trails Lenovo. With no sign of turning its PC business around, HP will soon find itself in an untenable position.

Intel doesn’t have the luxury of leaving their game — they only have one. But I can’t imagine that Brian Krzanich, Intel’s new CEO, will look at Peak PC and be content with the prospect of increasingly difficult x86 iterations. There have been many discussions of Intel finally taking the plunge and becoming a “foundry” for someone else’s ARM-based SoC (System On a Chip) designs instead of owning x86 design and manufacturing decisions. Peak PC will force the Intel CEO’s hand.

JLG@mondaynote.com

Wintel: Le Divorce Part II

 

At CES 2011, Ballmer told the world Windows would “fork”, that it would also run on lower-power ARM chips for mobile devices. This was seen as a momentous breach in the long-standing Wintel duopoly. Two years later, the ARM tooth of the fork looks short and dull.

This is what I wrote almost two years ago:

After years of monogamy with the x86 architecture, Windows will soon run on ARM processors.

As in any divorce, Microsoft and Intel point fingers at one another. Intel complains about Microsoft’s failure to make a real tablet OS. They say MS has tried to shoehorn “Windows Everywhere” onto devices whose user interface, power management, and connectivity requirements it doesn’t fit, while the competition has created device-focused software platforms.

Microsoft rebuts: It’s Intel’s fault. Windows CE works perfectly well on ARM-based devices, as do Windows Mobile and now Windows Phone 7. Intel keeps telling us they’re “on track”, that they’ll eventually shrink x86 processors to the point where the power dissipation will be compatible with smartphones and tablets. But…when?

Today, a version of Windows (RT) does indeed run on an ARM processor, on Microsoft’s Surface tablet-PC hybrid. Has Microsoft finally served Intel with divorce papers?

Not so fast. The market’s reaction to Redmond’s ambitious Surface design has fallen far short of the heights envisioned in the company’s enthusiastic launch: Surface machines aren’t flying off Microsoft Store shelves. Ballmer himself admits sales are “modest” (and then quickly backpedals); Digitimes, admittedly not always reliable, quotes suppliers who say that Surface orders have been cut by half; anecdotally, but amusingly, field research by Piper Jaffray’s Gene Munster (who can be a bit excitable) shows zero Surfaces sold during a two-hour period at the Mall of America on Black Friday, while iPads were selling at a rate of 11-an-hour.

Traditional PC OEMs aren’t enthusiastic either. Todd Bradley, head of HP’s Personal Systems Group, is unimpressed:

“It tends to be slow and a little kludgey as you use it…”

Acer exec Linxian Lang warns:

“Redmond will have to eat ‘hard rice’ with Surface…it should stick to its more readily-chewed software diet.”

To be sure, there are happy Surface users, such as Steve Sinofsky, the former Windows Division President, as captured in lukew’s Instagram picture.

(An aside: I went back to Sinofsky’s 8,000-word blog post that lovingly describes the process of developing “WOA” — Windows on ARM. At the time, WOA was presented as part of the Windows 8 universe. Later, Microsoft dropped the “8” designation and chose “RT” instead. These naming decisions aren’t made lightly. Is it any wonder that WOA was moved out of the Windows 8 camp?)

It’s possible that the jury is still out… Surface sales could take off, Windows RT could be embraced by leading PC OEMs… but what are the odds? In addition to the tepid reception from customers and vendors alike, Microsoft must surmount the relentless market conquest of Android and iOS tablets whose numbers (210 million units) are expected to exceed laptop sales next year.

So, no… the Wintel Divorce isn’t happening. Intel’s x86 chips will remain the processors of choice to run Windows. Next month, we’ll have CES and its usual burst of announcements, both believable and dubious (remember when 2010 was declared the Year Of The Tablet PC?). We’ll have to sort the announcements that are merely that from those that will yield an actual device, but in the end I doubt we’ll see many new and really momentous Windows RT products out there.

Microsoft’s lackluster attempt at Post-PC infidelity doesn’t help Intel in its efforts to gain a foothold in the mobile world. Intel’s perennial efforts to break into the mobile market with lower-power, lower-cost x86 chips have, also perennially, failed. As a result, there is renewed speculation about a rapprochement between Intel and Apple, that the Santa Clara microprocessor giant could become an ardent (and high-volume) ARM SoC foundry.

As discussed here, some of this makes sense: Samsung is Apple’s biggest and most successful competitor in the smartphone/tablet space, spending billions more than anyone else in global marketing programs. At the same time, the South Korean company is Apple’s only supplier of ARM chips. Intel has the technology and manufacturing capacity to become an effective replacement for Samsung.

This wouldn’t be an easy decision for Intel: the volumes are high — as high as 415M ARM chips for 2013 according to one analyst — but the margins are low. And Intel doesn’t do low margins. Because of the Wintel duopoly, Intel’s x86 chips have always commanded a premium markup. Take Windows out of the picture and the margin disappears.

(As another aside, the 415M ARM chip number seems excessive. Assuming about 50 million iPhone 5s and 15 million iPads in the current quarter, and using the 4X rule of thumb for the following calendar year (65M per quarter × 4 ≈ 260M), we land somewhere between 250M and 300M ARM chips for Apple in 2013.)

Also, Intel would almost certainly not be Apple’s sole supplier of ARM chips. Yes, Apple needs to get out of its current and dangerous single source situation. But Tim Cook’s Supply Chain Management expertise will come into play to ensure that Apple doesn’t fall into a similar situation with Intel, that the company will secure at least a second source, such as the rumored TSMC.

The speculation by an RBC analyst that Intel will offer its services to build ARM chips for the iPhone on the condition Apple picks an x86 device for the iPad is nonsensical: Apple won’t fork iOS. Life is complicated enough with OS X on Intel and iOS on ARM.

Historically, a sizable fraction of Intel’s profits came from the following comparison. Take two microprocessor chips of equal “merit”: manufacturing cost, computing output, power dissipation… And add one difference: one runs Windows, the other doesn’t. Which one will get the highest profit margin?

In the ARM world and its flurry of customized chips and software platforms, the “runs Windows” advantage no longer exists. ARM chips generate significantly lower margins than x86 chips do in the Intel-dominated PC world (where Intel’s only real competitor, AMD, is ailing).

This leaves the chip giant facing a choice: It can have a meager meal at the tablet/smartphone fest, or not dine at all at the mobile table…while it watches its PC business decline.

In other news… Paul Otellini, Intel’s CEO, unexpectedly announced he’ll leave next May, a couple of years ahead of the company’s mandatory 65-year retirement age. No undignified exit here. Intel’s Board pointedly stated they’ll be looking outside as well as inside for a successor, another unusual move in a company that has so far stuck to successions orchestrated around carefully groomed execs. This could be seen as a sanction for Otellini missing the mobile wave and, more importantly, a desire to bring in new blood willing and able to look past the old x86 orthodoxy.

JLG@mondaynote.com

 

Apple, ARM, and Intel

 

Apple and Samsung are engaged in a knives-out smartphone war, most infamously in the courts but, more importantly, in the marketplace. In its latest ad campaign, Samsung has cleverly “borrowed” a page from Apple’s own marketing playbook, positioning the iPhone as the choice of autumn-aged parents and brainwashed queue sheep.

But when it comes to chips, the two companies must pretend to be civil for the sake of the children: Samsung is the sole supplier of ARM-based processors for the iPhone.

Something has to give.

Since no one sees Samsung getting out of its booming smartphone business, the conclusion is that Apple will assume full custody and take its iDevice processor business elsewhere.

But where? There are rumors (which we’ll get to), and none of them so much as hint at Intel.

Except for the rare cameo appearance, Intel is nowhere in the Post-PC world (or, as Frank Shaw, the literate and witty head of Microsoft’s corporate PR obdurately insists, the “PC Plus” world). Becoming Apple’s ARM source wouldn’t just put the Santa Clara company in the race, it would vault them into the lead.

They’ve been there before: Intel scored a coup when Apple switched to the x86 architecture for its Macintosh line in 2005. An iDevice encore would mark an even bigger score: smartphones and tablets already sell in much higher volumes and are growing much faster.

So… Why hasn’t Intel jumped at the chance?

The first explanation is architectural disdain. Intel sees “no future for ARM”; it’s a culture of x86 true believers. And they have a right to their conviction: With each iteration of its manufacturing technology, Intel has full control over how to improve its processors. They can reduce x86 power consumption by using smaller building blocks (they’re already down to 22 nanometers wide). They can micro-manage (literally) which parts of a complex chip will be turned on, off, or somewhere in between, in a kind of hibernation.

A further problem is that Intel would need to change roles. Today, the company designs the microprocessors that it manufactures. It tells PC clone makers what these chips will do, how many they will get, when, and for how much. Its development model (called Tick-Tock in industry argot) essentially defines the schedules and finances of hardware makers.

This dictatorial model won’t work for iDevices. Apple crossed the border into Intel’s chipset empire back in the Macintosh era, but, today, it has far too much invested in its ARM design to again surrender complete control. As evidenced by the A6 processor running inside the iPhone 5, Apple goes to great lengths to customize the basic ARM cores, adding graphic processors, memory, and large amounts of support logic, and even resorts to aggressive hand-optimization of the silicon layout — as opposed to just letting CAD software tools do the job.

Intel would have to accept Apple’s design and “pour” it into silicon — it would become a lowly “merchant foundry”. Intel knows how to design and manufacture standard parts; it has little experience manufacturing other people’s custom designs…or pricing them.

Which leads us to the most likely answer to the Why Not Intel question: Money. Intel is a sophisticated business entity that expertly balances both terms of the profit equation. On the one hand, they use brand identity, marketing incentives, and a little strong-arming to keep prices “acceptable”, while on the other, the Tick-Tock technology and product development cadence pushes its costs down.

The company meticulously tunes the price points for its processors to generate the revenue that will fund development as well as the Intel Inside campaigns that have cost hundreds of millions of dollars over the years, to say nothing of the more recent $300M Ultrabook fund.

One way to visualize Intel’s money pump is to think of what the industry calls a Wafer Start. Here, “wafer” refers to the basic silicon “galette” that will go through the manufacturing steps and emerge with thousands of chips ready to be diced out. For Intel, profit comes from the difference between the cost of running a wafer through the $5B manufacturing unit (a “fab” in our argot) and the revenue that the marketplace will grant each chip.

Intel’s published prices range from a “low” $117 for a Core i3 processor to $999 for a top-of-the-line Core i7 device. Of course, these are the publicly advertised price tags, so we can assume that Acer, Lenovo, and HP pay less… but compare this to iSuppli’s estimate for the cost of the A6 processor: $17.50.

Even if more A6 chips could be produced per wafer — an unproven assumption — Intel’s revenue per A6 wafer start would be much lower than with their x86 microprocessors. In Intel’s perception of reality, this would destroy the business model.
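A back-of-the-envelope illustration in Python. The per-chip prices come from the paragraphs above; the dies-per-wafer figures are invented placeholders, since real die counts, yields, and contract prices are closely guarded:

```python
# Hypothetical wafer-start economics. Only the per-chip prices are from
# published figures; the dies-per-wafer numbers are made-up assumptions.
good_x86_dies_per_wafer = 200   # assumption: large Core-class dies
good_a6_dies_per_wafer = 600    # assumption: much smaller A6 dies

x86_avg_price = 300.00          # somewhere between the $117 i3 and $999 i7
a6_price = 17.50                # iSuppli's estimate for the A6

x86_revenue = good_x86_dies_per_wafer * x86_avg_price   # $60,000 per wafer
a6_revenue = good_a6_dies_per_wafer * a6_price          # $10,500 per wafer

print(f"x86 wafer start: ${x86_revenue:,.0f}")
print(f"A6 wafer start:  ${a6_revenue:,.0f}")
```

Swap in your own guesses; even granting the A6 three times as many good dies per wafer, the A6 wafer brings in roughly a sixth of the revenue.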

In the meantime, the rumor of the day is that Apple will use TSMC, a well-regarded Taiwanese foundry, the world’s largest. TSMC is known to have made test runs of the A4 last year, and is now reportedly doing the same for the A5 processors that power the new iPad. Furthermore, “industry insiders” have reported that Apple attempted to secure exclusive access to TSMC’s semiconductor output but were rebuffed. (Qualcomm tried, as well; same result.)

This raises a big Disruption question for Intel: In the name of protecting today’s business model, will it let TSMC and others take the huge mobile volume, albeit with lower profit per unit? Can Intel afford to shun ARM?

For all of Intel’s semiconductor design and manufacturing feats, its processors suffer from a genetic handicap: They have to support the legacy x86 instruction set, and thus they’re inherently more complicated than legacy-free ARM devices; they require more transistors, more silicon. Intel will argue, rightly, that they’ll always be one technological step ahead of the competition, but is one step enough for x86 chips to beat ARM microprocessors?

JLG@mondaynote.com

 

Intel 3-D Transistors: Why and When?

A few days ago, Intel teased: On May 4th, the company would make “its most significant technology announcement of the year.”

Tongues wagged. Will Intel make ARM chips for Apple? The speculation has roots in reality.

We’ll start with the public breakup of the Wintel marriage. At this year’s CES in January, Steve Ballmer made it clear that x86 exclusivity was done for. With an eye on reentering the tablet market, the next release of Microsoft’s legacy OS, Windows 8, would also run on ARM SoCs. This will “fork” Windows: There’ll be two versions, one on x86 processors, another on ARM chips. Tablets, which introduce UI differences, add a couple more tines to the fork. The impact on application development isn’t clear yet (food for a future Monday Note). Surprisingly, there’s been little talk of Intel “going ARM” to repair the Wintel relationship.

Now let’s consider Intel’s complete absence from the mobile scene. Not a single smartphone contains an x86 processor. Not a single tablet, no GPS device, nothing.

For the past four years Intel has told us we’d see x86 mobile devices Real Soon Now. The company developed its own mobile version of Linux, Moblin, and made a big deal of joining forces with Nokia’s Maemo to create MeeGo. But Nokia’s new CEO, Stephen Elop, kicked MeeGo to the curb, wisely deciding to focus on one software platform, his ex-employer’s Windows Phone 7.

(We’ll see how wise this decision turns out to be. Perhaps Elop should have put his money on the front-running Android horse. Perhaps Microsoft should have “gone Apple” — pardon, “vertical.” They could have acquired Nokia, controlled the hardware and the software. They did so, successfully, with the Xbox and Kinect. Again, more food for future Monday Notes.)

The x86 mobile devices never materialized. Each new low-power processor promise from Intel was matched by ever more attractive ARM development. Now that the PC market is in its twilight, with mobile devices proliferating and stealing growth from the PC, surely Intel has to get into the race.

Then there’s the long-standing relationship between Steve Jobs and Intel — or, more specifically, with Intel co-founder Andy Grove. The relationship flourished at NeXT when Jobs moved the platform to Intel processors. After Jobs returned to Apple, efforts got under way to move the Macintosh away from the PowerPC, which was deemed poorly supported by IBM and Motorola, to the more robust x86 line.

It isn’t hard to imagine Intel offering Apple its advanced 22-nanometer fabs, along with some kind of exclusivity and price advantage. And there’s a bonus: They’d be kicking Samsung, an annoying combination of supplier, competitor, and adversary in IP lawsuits. In return, Apple would give Intel the kind of volume the company likes, 100 million ARM chips in 2012.

From there, the train of thought continues to the terminus: the Macintosh line switches wholly to ARM, and Intel supplies the processors. It’s not impossible. Intel hedges its bets, secures an inexpensive ARM license and uses its technology and marketing prowess to grab their share of the explosive growth.

As the rumor site says: “This is going to cause meetings.”

Now, the reality.

What Intel announced last week is a new “3-D” transistor technology. 3-D here doesn’t refer to images but to a design and manufacturing technique: Making transistors in three dimensions, as opposed to today’s “planar” technology where the microscopic silicon circuitry is laid out on a flat surface. Just as you can store more cars in a multi-storey garage than in a flat parking lot, more circuitry can be packed in three dimensions.

The new 22nm semiconductor manufacturing process also helps. The circuitry building blocks are smaller, they waste less electrical power through heat dissipation. All of this — cue the cymbals — is ideal for mobile applications. In plain English: This is Intel’s ARM killer. (Cruelly, Google tells us we heard the same story three years ago. And two years ago. And last year.)

Intel’s press release is firmly planted in hyperbole:

“Intel’s scientists and engineers have once again reinvented the transistor, this time utilizing the third dimension,” said Intel President and CEO Paul Otellini. “Amazing, world-shaping devices will be created from this capability as we advance Moore’s Law into new realms.”

The part about “once again” reinventing the transistor is a bit far-fetched. On Intel’s website, you’ll find the company’s own timeline, replete with innovations and bowdlerization…but nothing about reinventing the transistor. There’s some dispute as to the transistor’s actual invention: when, where, by whom. Most history books credit William Shockley and his Bell Labs colleagues with the invention itself; the first commercial silicon transistor was produced in 1954 by Texas Instruments. (At my Breton Roman Catholic boarding school, the Prefect of Discipline was a certified geek. In 1955, instead of looking at religious pictures, we were in his office drooling at this incredible Philips OC 71 germanium transistor…)

We’re meant to be impressed by the promised performance and power dissipation improvements:

The 22nm 3-D Tri-Gate transistors provide up to 37 percent performance increase at low voltage versus Intel’s 32nm planar transistors. This incredible gain means that they are ideal for use in small handheld devices, which operate using less energy to “switch” back and forth. Alternatively, the new transistors consume less than half the power when at the same performance as 2-D planar transistors on 32nm chips.

Note the Alternatively: it’s either more performance or less power dissipation.

We’ll have to wait a year to see how this markitecture translates into actual devices.

Will this be enough to unseat ARM? Most observers doubt it. The big news was received with an equally big yawn. Wall Street didn’t pay much attention. We’ve been here before: The “product” of the announcement is the announcement. (And there’s the suspicion that “breakthrough” revelations are an attempt to mask a lack of spanking new products.)

But let’s return to the rumor, from SemiAccurate, that the Mac and Intel will soon be “arm-in-ARM.” (That bad pun isn’t mine.)

First, let’s consider the name of the website.

Second, what will Apple do at the high-end, for media creation and editing? What about Photoshop, FinalCut, and other applications, including CAD where the Mac is getting back in the game? There’s no roadmap for ARM chips to beat Intel in these computationally intensive areas.

Today, going ARM is technically feasible on entry-level Macs. Tomorrow, newer multicore ARM chips might work for middle-of-the-line Macintosh products. But will Apple abandon the faster x86 processors at the high end just to avoid the kind of forking that awaits Windows in its own move to ARM? If not, we’ll again see Universal applications (a.k.a. fat binaries: two versions inside the same container), just as we did with the PowerPC to x86 transition. Microsoft is doing it because it must; Apple did it because the PowerPC didn’t have a future. But now?
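As a concrete aside, here’s what “two versions inside the same container” looks like on disk. A minimal Python sketch, reading the public Mach-O fat-header layout (architecture table abbreviated, error handling omitted):

```python
import struct
import sys

# Layout of the Mach-O "fat" header, from <mach-o/fat.h> on OS X.
FAT_MAGIC = 0xCAFEBABE          # big-endian magic of a Universal binary
CPU_TYPES = {7: "i386", 0x01000007: "x86_64", 12: "arm", 18: "ppc"}

def universal_architectures(path):
    """List the architectures packed inside a Universal (fat) binary."""
    with open(path, "rb") as f:
        magic, narch = struct.unpack(">II", f.read(8))
        if magic != FAT_MAGIC:
            return []           # "thin" binary: a single architecture
        return [CPU_TYPES.get(struct.unpack(">iiIII", f.read(20))[0], "?")
                for _ in range(narch)]

if __name__ == "__main__":
    print(universal_architectures(sys.argv[1]))
```

Run it against any Universal application binary and it prints the architectures stuffed side by side in the one file.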

———

On a related note…and more food for thought: I’d love to know how the iPad line will evolve. For example: will pressure-sensitive stylus input ever happen? Eschewing stylus input in the early days was a thoughtful move. Perhaps it’s time to relax the restriction and thus enable richer media creation applications.

The next iOS and OS X releases will shed more light on the relative roles of Apple’s tablet and PC product lines, how they will coexist, what they’ll have in common and what will keep them apart. We should know in about a month.

JLG@mondaynote.com