There’s an interesting article over at Ars Technica, titled “What processor should I buy: Intel’s crazy pricing makes my head hurt”. That might seem a silly question at first: as the author points out, surely you just buy the most expensive CPU in the LGA1155 range. However, Peter Bright is no fool; looking closer at the specs and his requirements, he struggles to make sense of Intel’s strategy on new features, performance and compatibility.
The problem stems from Bright’s desire to build a future-proof, fast PC that can run Visual Studio and Battlefield 3 with ease. A Core i7-2600 is a no-brainer, but it comes in three flavours, with the S model running at a slower stock speed to save 30W of power (though it Turbo Boosts to the same 3.8GHz as the other i7-2600 CPUs).
Then there’s the toss-up between the i7-2600 and the i7-2600K: the former has some interesting virtualisation and security features that Bright wants, while the latter has a better GPU and an unlocked multiplier for overclocking. So which one is better? Both seem compromised, and yet there’s a £10 ($23) price difference between them. The real question is why Intel has disabled the useful VT-d and the potentially useful TXT logic in the i7-2600K.
Sure, TXT could be seen as a way to introduce hardware-based DRM to a home PC, but as Bright points out, it could also be very useful in preventing rootkits from slaving your PC to their nefarious desire (my melodramatic wording, not his).
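Part of what makes this SKU maze so frustrating is that you can’t always tell from the box which features you actually got. On Linux, the CPU-side flags are at least visible in /proc/cpuinfo: “vmx” is VT-x and “smx” is the Safer Mode Extensions underpinning TXT. A minimal sketch of parsing those flags might look like this (the sample flags line is made up for illustration; note that VT-d lives in the chipset/IOMMU rather than the CPU flag list, which is exactly why buyers end up digging through Intel’s spec sheets):

```python
# Sketch: report which virtualisation/security features a CPU advertises,
# given the space-separated "flags" line Linux exposes in /proc/cpuinfo.
# Flag names follow Linux conventions; VT-d is a chipset feature and
# never appears in this list.

FEATURES = {
    "vmx": "VT-x (hardware virtualisation)",
    "smx": "SMX (CPU side of Trusted Execution Technology)",
    "aes": "AES-NI (hardware AES acceleration)",
}

def report_features(flags_line: str) -> dict:
    """Map each feature of interest to whether the flags line advertises it."""
    flags = set(flags_line.split())
    return {desc: name in flags for name, desc in FEATURES.items()}

# Made-up flags line loosely resembling a Sandy Bridge i7's:
sample = "fpu vme msr pae vmx smx est aes sse4_2 avx"
for desc, present in report_features(sample).items():
    print(f"{desc}: {'yes' if present else 'no'}")
```

On a real machine you’d feed in the flags line from /proc/cpuinfo itself; on an i7-2600K, “smx” would simply be absent.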
Bright finds a solution to his quandary in the Xeon world, where there is a CPU that fits his needs, but then he’s stymied by the lack of Smart Response support on official Xeon chipsets. He’ll therefore have to opt for the not-officially-supported combination of a Z68 motherboard with a Xeon processor. This should work fine, but for a PC you absolutely rely on for work (I assume), it isn’t a comfortable arrangement.
So why does Intel feel the need to disable potentially useful features in its supposedly top-end CPU when doing so will only slow uptake? And is the absence of Smart Response technology from every Xeon chipset a tacit admission that it’s not 100 per cent reliable? Conspiracy theories below, please!