bit-gamer.net

Chronicles of Narnia: Developer Q&A

Many of our readers from over the pond will have absolutely no idea what this strange combination of words means, or how it relates to English pop-lit and children's TV. If you haven't previously been exposed to the story of The Lion, The Witch and The Wardrobe, a classic children's tale by renowned English author C.S. Lewis, you're about to become a whole lot more familiar with it.

Disney has taken the story and is turning it into a Christmas blockbuster - and is counting on it to be its most successful movie ever. The book was originally published in 1950, and was turned into a hugely popular TV series in the 1980s.

Of course, as with all big licenses these days, there is a video game to go along with the film release. The game is being handled by the chaps at Traveller's Tales, the team responsible for the utterly sublime Lego Star Wars.

The game engine is an evolution of the Lego Star Wars engine, although the artwork is obviously substantially different! As such, it works on PC, PS2 and Xbox - but, perhaps obviously, it looks best on the PC.

The game follows the film storyline, from the time that the four children leave London to the time they become Kings and Queens of Narnia. Some sections are ripped straight from the film, others take a small scene and expand it into a whole level of gameplay. We've noted some of our gameplay impressions on page 2.

We spoke to Dave Dootson, Lead Programmer on the project, about the game and the graphics engine that will power it.

bit-tech: What version of the DirectX / Shader model spec is Narnia written to? What shader paths are available, and how do those relate to the graphical features rendered?

Dave Dootson: We are using DX9.0c and require a video card that supports at least vertex and pixel shaders 1.1 (e.g. GeForce3 or greater). We use an automatic shader builder to construct shaders that match the specification of the machine the game is run on. If a graphics card does not support certain features because its shader model is too low, an alternative shader without those features is compiled.

(Note: Shader 1.1 is the fallback path for games like Splinter Cell: Chaos Theory. As noted below, we expect the game to feature some extra eye candy for higher-end boards using SM 2.0+, which enables a large number of funky features.)

bit-tech: What differences, if any, does the game render between ATI and Nvidia cards in terms of shader paths, on current hardware?

Dave Dootson: Currently there are no differences between Nvidia and ATI cards in the way shader paths are set up. However, it may be possible to compile slightly more optimal shaders for Nvidia cards if detected using the 2.0a option, given that shaders are now compiled at run-time.
