Your games might get smarter

Written by Brett Thomas

September 7, 2006 | 19:39

Tags: #ai #artificial-intelligence

In our review of F.E.A.R., one of the things that the bit-tech staff praised most was the incredibly impressive AI. Indeed, if you need a challenge, just crank the difficulty up to medium and get ready for a wild ride... enemies hunt in packs, use suppressing fire, and flank. But what if they could be even smarter?

AISeek has thought about just that, and has developed a processor devoted solely to artificial intelligence. Much like the Ageia PhysX processor, the AISeek Intia processor is designed to give the CPU some breathing room by doing the AI computations separately - and, much like the PhysX chip, the claim is that it does its specialised job considerably better than a general-purpose CPU can. AISeek particularly boasts about the chip's pathfinding skills, which should eliminate artifacts and outright failures to find a path. You know those moments where we all laugh as an enemy stares at you blankly, never moving despite you pounding on him? Well, apparently that's not supposed to happen.
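AISeek hasn't published how Intia's pathfinding actually works, but to give a sense of the workload being offloaded, here is a minimal A* search over a tile grid in Python. Everything here - the grid format, the function name, the 4-way movement - is an illustrative assumption, not anything from AISeek's SDK. Note that it returns None when no route exists, which is precisely the case a game must handle instead of leaving an enemy frozen in place:

```python
from heapq import heappush, heappop

def astar(grid, start, goal):
    """A* search on a 4-connected tile grid where '#' cells are walls.

    Returns a list of (row, col) cells from start to goal, or None
    when the goal is unreachable, so the caller can react instead
    of letting the enemy stall in place.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Manhattan distance: an admissible heuristic for 4-way movement
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    # Priority queue entries: (estimated total cost, cost so far, cell, path)
    open_set = [(h(start), 0, start, [start])]
    best_cost = {start: 0}

    while open_set:
        _, cost, cell, path = heappop(open_set)
        if cell == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = cell[0] + dr, cell[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] != '#':
                new_cost = cost + 1
                if new_cost < best_cost.get((r, c), float('inf')):
                    best_cost[(r, c)] = new_cost
                    heappush(open_set,
                             (new_cost + h((r, c)), new_cost,
                              (r, c), path + [(r, c)]))
    return None  # no path: report failure rather than freeze
```

Even this toy version gets expensive fast when dozens of enemies re-plan every frame on a large map, which is where the appeal of dedicated hardware comes in.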

Along with the pathfinding, the Intia setup is designed to greatly improve 'line of sight' understanding, so that enemies behave more like they're supposed to. Here's hoping this fixes games where every enemy knows your exact location and shoots like a trained sniper - even in absolute blackness. The other major avenue for improvement is terrain utilisation, where enemies can understand the concept of cover. We've seen hints of this in the upcoming title "Gears of War," and it would be great to see it make more of an appearance in other upcoming games.
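Again, the chip's actual line-of-sight method isn't public. A common software approach is simply to trace the straight line between observer and target and check whether anything solid sits on it. The sketch below does that with Bresenham's line algorithm over a hypothetical tile grid where '#' marks cover; the grid format and function name are assumptions for illustration only:

```python
def line_of_sight(grid, a, b):
    """Return True when no wall ('#') lies on the Bresenham line
    between cells a and b (each a (row, col) tuple).

    Endpoints are checked too: an observer inside a wall sees nothing.
    """
    r0, c0 = a
    r1, c1 = b
    dc = abs(c1 - c0)
    dr = -abs(r1 - r0)
    sc = 1 if c0 < c1 else -1
    sr = 1 if r0 < r1 else -1
    err = dc + dr

    while True:
        if grid[r0][c0] == '#':
            return False  # sight line hits cover
        if (r0, c0) == (r1, c1):
            return True   # reached the target with a clear line
        e2 = 2 * err
        if e2 >= dr:
            err += dr
            c0 += sc
        if e2 <= dc:
            err += dc
            r0 += sr
```

An AI that runs this check honestly - rather than cheating with the player's known coordinates - is exactly the kind of enemy that loses track of you when you break line of sight behind cover.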

According to the specs, the Intia processor can do AI calculations about 200 times faster than a CPU can. It interfaces through an SDK that game developers have to use when writing their engines, though, so it could suffer from the same issues as the PhysX chip - a chicken-and-egg scenario where nobody will license a product that consumers don't buy, but consumers won't buy a product that nobody has licensed for use.

While we talk about the possibility of graphics moving back onto the CPU, it appears yet another function wants a chip of its own. Will it be another must-have item, or is it yet another solution to a problem that's not bad enough to fix yet? Let us know your thoughts in our forums.