Enterprise-class desktop computing

Written by Wil Harris

October 2, 2006 // 12:08 p.m.

Tags: #amt #idf #intel #virtualisation #virtualization #vt

IDF is always a very interesting week. It's a very in-your-face reminder that Intel is far more than just a company that makes processors. Whilst CPUs are undoubtedly at the heart of what Intel does, its staff have created a company that is the leader in many related fields, and an up-and-coming player in many others. Whilst it might sell Core 2s to gamers, it's also looking at how it can enhance old age care for pensioners. Whilst low-power chips are great for server racks, they're also great for providing laptops to the third world. The sessions and keynote speeches at IDF are a bombarding, unmissable message that Intel is a firm that cares about far more than just die size.

"There are always themes at IDF..."

There are always themes at IDF - the things that you hear about all week, or that you can infer from the speeches and material around you. This year there have been a few - the multi-core, tera-scale future is quite a big one, as is the personality of mobility, where we see mobile computing becoming more ubiquitous.

But the big one for me, this time around, was one that wasn't really explicitly enunciated - rather, it was a theme that I inferred from the sessions and the speeches I heard all week. The theme is what I have called 'Enterprise on Desktop is the new Mobile on Desktop'.

Mobile on desktop
Over the past couple of years, we have seen Intel bring Mobile on Desktop into the mainstream. First, it created the Pentium M, a chip which has given a new lease of life to laptops. It has driven the massive growth in laptop sales which now sees laptops outselling desktops - in a true sense, mobiles are now on desks everywhere. But in a chip sense, too, the phrase holds up - the Pentium M was the blueprint for the Core Duo and Core 2 Duo, chips which now make up the bulk of Intel's processor line for desktop and gaming PCs. Mobile technology, brought into mainstream desktop PCs.

That transition is now all but done, as the Core microarchitecture permeates Intel's lineup, all the way from the E6300 to the QX6700 (that's quad-core, if you didn't know). With the transition we have seen lower power consumption, better performance and quieter PCs, advantages that the mobile technology has added to the desktop form factor.

But that was 2005/2006. The coming years, 2007/2008, will see a new trend - enterprise on the desktop. What does that mean? Well, it means that features that were designed for server and enterprise computing are going to emerge on your desktop, much as mobile features have emerged. The two massive, massive features are VT and iAMT.

Virtualise this
So far, so many new acronyms. What are these things, and what do they mean?

VT is short for Virtualisation Technology, and it's a hardware feature built into Intel's new top-end processors. So far, it's little used, but it will be increasingly so over the next couple of years. Virtualisation is a more sophisticated relative of emulation. I'm quite sure that many of you have run ZSNES, a funky emulator that creates a virtual SNES machine inside your Windows desktop, then allows SNES software to interact with it, allowing you to play retro games on your PC. Well, virtualisation creates another, virtual, PC inside your Windows desktop, allowing you to do with it as you will - with the crucial difference that the guest's code runs directly on the real CPU, rather than being simulated instruction by instruction.

This means you could run an instance of Linux within Windows. Or, you could run an instance of Windows within Linux, allowing you to use the Open Source software as your main configuration but still maintain a Windows install for gaming - and access the two simultaneously, without having to reboot.

Traditionally, emulation like this has come with a massive performance hit, as the virtual machine has to emulate hardware. However, with Intel's new Virtualisation technology built right into the CPU, as well as advances in software such as VMware, which facilitates the creation of virtual machines, 'guest' machines can be created with near-native performance.
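If you're curious whether your own CPU has this hardware support, the processor advertises it as a feature flag. Here's a minimal sketch that parses the flags as Linux reports them in /proc/cpuinfo - `vmx` for Intel's VT, `svm` for AMD's equivalent. The helper function is my own illustration, not an official Intel tool.

```python
# Sketch: detecting hardware virtualisation support from CPU feature flags.
# On Linux, /proc/cpuinfo lists "vmx" for Intel VT and "svm" for AMD's
# equivalent. Illustrative only - not an official Intel utility.

def has_hw_virtualisation(cpuinfo_text):
    """Return True if any 'flags' line advertises vmx (Intel) or svm (AMD)."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            if "vmx" in flags or "svm" in flags:
                return True
    return False

if __name__ == "__main__":
    try:
        with open("/proc/cpuinfo") as f:
            print("VT capable:", has_hw_virtualisation(f.read()))
    except OSError:
        print("No /proc/cpuinfo on this platform")
```

Note that the flag only tells you the silicon is capable - some motherboards ship with VT disabled in the BIOS, so it may need switching on there too.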

In enterprise situations, this has many advantages. Drop a dual-core CPU with virtualisation into an existing server system, and you can immediately double the capacity of your server farm - running double the number of cores and double the copies of Windows, or Linux, or whatever. We are currently pioneering virtualisation here at bit-tech - we're using two quad-core Clovertown CPUs for our gaming server, and we are currently experimenting with virtualising copies of Linux, one for each core, to run game servers on - effectively making a quad-core CPU capable of running four instances of Linux and four different game servers, all at near-native speed. That's pretty impressive.
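To make the one-guest-per-core idea concrete, here's a small sketch of how you might generate launch commands that pin each guest to its own core. The qemu and taskset command lines, image names and ports are my own illustrative assumptions - not the exact tooling used on the bit-tech server.

```python
# Sketch: one virtual machine pinned per core, as in the game-server
# experiment described above. Commands, image names and ports are
# hypothetical examples, not the article's actual setup.

def per_core_commands(cores, image_fmt="guest{n}.img", base_port=27015):
    """Build one guest launch command per core, pinned with taskset."""
    cmds = []
    for n in range(cores):
        cmds.append(
            f"taskset -c {n} qemu-system-x86_64 -enable-kvm "
            f"-m 512 -hda {image_fmt.format(n=n)} "
            f"-net user,hostfwd=tcp::{base_port + n}-:27015 -net nic"
        )
    return cmds

for cmd in per_core_commands(4):
    print(cmd)
```

Each guest gets its own disk image and its own forwarded port, so four game servers can sit side by side on one physical box without treading on each other.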

This technology is already starting to make its way into the desktop mainstream. The most high-profile consumer app for virtualisation is Parallels, an application for Mac OS X which allows you to run a copy of Windows right within the Mac operating system, at near-native performance. This allows Mac users to use Windows software with no hassle at all, without having to reboot.

Security conscious users are also starting to realise the benefits of virtualisation. Nothing from the 'guest' machine can escape into the 'host' OS without your explicit say-so. Running your everyday operating system 'within' another operating system means that should you get a crash, or contract a virus, you can just shut down the window to your everyday system and restart a new instance of the OS with a double-click of the virtualisation software. OS 'images' can be saved as files, so you can simply restart the OS, configured with all your software and files, at will. You can even copy the OS configuration to a DVD and transfer it to another machine, if you want.
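The key point above is that a guest OS is just a file on the host's disk, so snapshotting and restoring it is essentially a file copy. Here's a minimal sketch of that idea, with hypothetical file names - real products like VMware and Parallels manage their own image formats and do this for you.

```python
# Sketch: a guest OS image is an ordinary file, so a restore point is
# just a copy of it. File names are hypothetical; real virtualisation
# software manages its own image formats.

import shutil
from pathlib import Path

def snapshot(image, snap_dir):
    """Copy a guest disk image into snap_dir as a numbered restore point."""
    snap_dir = Path(snap_dir)
    snap_dir.mkdir(parents=True, exist_ok=True)
    n = len(list(snap_dir.glob("snap-*.img")))
    target = snap_dir / f"snap-{n:03d}.img"
    shutil.copy2(image, target)
    return target

def restore(snap, image):
    """Overwrite the working image with a previously saved snapshot."""
    shutil.copy2(snap, image)
```

Catch a virus, roll back to yesterday's snapshot, carry on - and because the snapshot is just a file, burning it to a DVD or moving it to another machine is equally trivial.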

"Security conscious users are also starting to realise the benefits of virtualization"

This is going to have big ramifications for home users. Imagine keeping your browser in a virtual machine, keeping any internet nasties from getting at your main box. How about running Windows games from within Linux or Mac at full speed? Keeping a different OS configuration for everybody in your household, without worrying about lame Windows user profiles? The possibilities are massive, and you can bet that Intel, and AMD for that matter, will be exploiting this over the coming years, in conjunction with the software makers.

A Major Technology
The second big thing is AMT, which stands for Active Management Technology. It's another combination of Intel hardware - this time built into the chipset rather than the CPU itself - and software running alongside the OS.

AMT uses software that creates a separate section of the computer 'alongside' the OS. It is inaccessible to Windows programs (which means viruses can't get at it) and it communicates directly with the system hardware. What does this mean?

It means that, given the right system and power settings, administrators can remotely access the PC even when the user has turned it off. This means that IT managers can do system administration whenever they need to.

It allows admins to log in and get to grips with the machine remotely, even when Windows has crashed. If a user has a major error that kills the system, an admin can log on and push all the right buttons to reset the OS back to a usable state, all over a network connection. The AMT software can also run custom tasks that automate backups and desktop OS imaging, meaning that should a user lose data or corrupt the OS, restoring back to the last good state is a cinch - and, you guessed it, you can do it all remotely.
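Because the AMT interface lives outside the OS, it answers on the network even when Windows is down. AMT exposes its management interface on TCP ports 16992 (plain HTTP) and 16993 (TLS), so a minimal first step for an admin tool is simply probing those ports - a sketch of that idea below, using a plain TCP connect rather than Intel's actual management protocol.

```python
# Sketch: checking whether a machine exposes AMT's out-of-band interface.
# AMT answers on TCP 16992 (HTTP) and 16993 (TLS) independently of the
# OS. This is a bare TCP probe, not Intel's management protocol itself.

import socket

AMT_PORTS = (16992, 16993)

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def probe_amt(host):
    """Return the list of AMT ports that accepted a connection."""
    return [p for p in AMT_PORTS if port_open(host, p)]
```

Real management consoles then speak to those ports to power the box on, redirect the console, or kick off a reimage - the probe just tells you the out-of-band channel is there.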

This is being hailed as a breakthrough for big businesses, which have a lot of system admins managing computers for even more users, most of whom are technically lacking. Rather than having to send an admin to the desk-side to sort out whatever the problem is, or to administer patches and so on, all of this can be done remotely, without having to rely on Windows.

This is another feature that we're going to see coming to the desktop fairly soon. If you buy a PC from a vendor, it's possible that you could buy technical support for when your system goes wrong - and the vendor could log in to your machine remotely and fix whatever the problem is, even if Windows is dead. If you build your parents a PC, using an AMT-compliant machine means that rather than having to come running because your mum broke the internets, you can just fire up the net connection and fix whatever the problem is from wherever you happen to be.

Simply put, AMT could make the life of the technical troubleshooter a heck of a lot easier, and the life of a consumer a heck of a lot simpler.

Currently, AMT is supported in new Core 2 Duo processors, and comes set up with PCs bearing Intel's vPro platform badge. But, since Core 2 Duo processors are soon going to be the bulk of Intel's desktop shipments too, expect to see some easy, consumer-grade software to manage AMT features next year, quite probably at Intel's behest.

So, the application of these enterprise technologies to consumer hardware was really the thing that stood out to me this year as the big, upcoming trend. I can't wait to see what companies do with these new hardware features - I suspect we're going to see some really cool applications taking advantage of them. If you think you spotted something better at IDF, why not let me know over in the bit-tech forums?
