We patch Crysis to v1.21 and run it in DirectX 10 mode with High detail settings. We test at 1,680 x 1,050 with 2x AA and no AF to give a reasonably real-world workload without the risk of the graphics card, rather than the CPU, becoming the limiting factor.
We load a save game in the Relic level and play the game for roughly three minutes, following a strictly defined sequence of actions and movements. We repeat this test three times, or until a reliable set of results is achieved. The consistent results are then averaged to give the figures below.
This is the best way to test how a CPU affects game performance, as the game engine is genuinely generating AI, physics and game logic for the CPU to process. A timedemo stresses a CPU in a noticeably different way.
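The repeat-until-reliable averaging described above can be sketched in code. This is a hypothetical illustration, not the testing harness actually used: `run_benchmark` stands in for one three-minute gameplay run, and the consistency threshold (coefficient of variation below 3%) is an assumption, as the article does not state how result reliability is judged.

```python
import statistics

def average_reliable_runs(run_benchmark, min_runs=3, max_runs=6, max_cv=0.03):
    """Repeat a benchmark until results are consistent, then average them.

    run_benchmark is a hypothetical callable returning one run's average fps.
    Results count as reliable once the coefficient of variation
    (stdev / mean) across runs drops below max_cv (assumed threshold).
    """
    results = []
    for _ in range(max_runs):
        results.append(run_benchmark())
        if len(results) >= min_runs:
            mean = statistics.mean(results)
            if statistics.stdev(results) / mean <= max_cv:
                return mean  # consistent set achieved: report the average
    # Fall back to averaging whatever was collected.
    return statistics.mean(results)
```

With three closely grouped runs (say 30.1, 30.4 and 30.2fps) the loop stops at the minimum of three repeats and reports their mean; a wildly inconsistent run forces extra repeats instead.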
Crysis: 1,680 x 1,050, DX10, 64-bit, High, no AA, no AF
Frame rate (fps), higher is better. Blue: stock speed.

- Intel Core i5-750 (4.15GHz)
- AMD Phenom II X4 965 Black Edition (3.99GHz)
- Intel Core i5-661 (4GHz)
- Intel Core 2 Duo E8400 (4.25GHz)
- Intel Core i3-530 (3.5GHz)
- Intel Core i5-661 (3.33GHz)
- Intel Core 2 Quad Q6600 (3.7GHz)
- AMD Phenom II X4 965 Black Edition (3.4GHz)
- AMD Athlon II X4 630 (3.71GHz)
- Intel Core i5-750 (2.66GHz)
- Intel Core 2 Duo E8400 (3GHz)
- AMD Athlon II X4 630 (2.8GHz)
- Intel Core i3-530 (2.93GHz)
- Intel Core 2 Quad Q6600 (2.4GHz)