Intel backs away from Itanium, plans Xeon drop-in chips

Intel's future Xeon chips will be drop-in compatible with Itanium, suggesting the company is finally looking to phase out the IA-64 architecture.

Intel finally appears to be admitting defeat on its ill-fated Itanium processor architecture, announcing that future generations of the chip will be socket-compatible with Xeon - a first step towards phasing out the architecture altogether.

Originally developed by Hewlett-Packard around the EPIC (Explicitly Parallel Instruction Computing) design philosophy as a replacement for its RISC (Reduced Instruction Set Computing) architectures, Itanium had been in development since 1989. It wasn't until Intel got involved, however, that a commercial product appeared: the Merced-core Itanium.

By the time the chip launched in 2001, the delay in bringing it to market had left it with some serious performance issues. Intel's first attempt at a true 64-bit chip, the IA-64 architecture was incompatible with existing x86 code, and its performance was inferior to that of existing CISC and RISC designs. Having missed the boat for mainstream use, Intel repositioned the chip for high-performance computing (HPC) and tried again with the Itanium 2 in 2002. Sadly, while more successful than its predecessor, the Itanium 2 was overshadowed by the 2003 launch of the AMD Opteron and its x86-64 architecture. Offering the same 64-bit benefits as the Itanium but on a chip fully compatible with legacy 32-bit code, the Opteron proved popular - so much so, in fact, that Intel copied the approach with its own x86-64 Xeon processors.

The launch of x86-64 chips by AMD and Intel was bad news for Itanium, which was beginning to live up to its Titanic-inspired nickname of 'Itanic.' Rather than rewriting all their code from scratch, users preferred to buy Xeon and Opteron chips - largely ignoring the expensive and poorly performing Itanium line completely.

Seeing a market trend towards x86-64, many companies have moved away from IA-64 architecture development. Microsoft, which supported the platform in Windows Server 2003 and 2008, has announced that future products won't include IA-64 support, while previous announcements from Canonical and Red Hat make commercial Linux support a problem. Even HP, co-developer of the IA-64 architecture and Intel's biggest ally in keeping it afloat, has been gently moving away from the platform.

Intel, however, has insisted that IA-64 has a place in the high-performance computing market, despite the fact that not a single entry in the TOP500 list uses the architecture. The announcement that future Itanium products will be pin-compatible with Xeon suggests that even Intel is finally seeing the writing on the wall.

There's little chance that Intel is hoping for users to move from its x86-64 Xeon chips to IA-64 Itanium products, meaning the pin compatibility exists for one reason: to offer customers a graceful migration path from Itanium to Xeon without having to replace their systems outright. Customers relying on Itanium-based servers today - and they are few and far between - will be able to buy the new-generation version, port their code to x86-64, and then replace the Itanium chips with Xeons.

For consumers, that could be good news: the resources Intel is currently ploughing into a niche platform will instead be focused on the x86-64 Xeon products, meaning the consumer Core range - or its future replacement - will benefit as well.

5 Comments

fellix_bg 9th November 2012, 14:25 Quote
The fundamental issue with Itanium is that its EPIC architecture philosophy is simply too far removed from the modern trends in the industry, like power efficiency, uniform top-down scalability (from mobile to server) and broad software compatibility. And the thing is just too expensive to build, since to be an efficient ISA it needs a ton of cache memory to keep the statically scheduled wide pipeline fed with data and instructions.
Alecto 9th November 2012, 22:35 Quote
And a ton of cache is what it does have. Since these are premium chips it doesn't matter that Intel has to spend more die area on cache - the customer pays for it anyway.
fluxtatic 10th November 2012, 09:18 Quote
Holy crap, Larry Ellison was right!?
greigaitken 10th November 2012, 18:47 Quote
holy crap, i thought this thing was buried years ago
Griffter 12th November 2012, 08:56 Quote
just plain old holy crap!