Google working to make Stadia games 'more responsive' than local ones

October 10, 2019 | 10:57

Tags: #cloud-gaming #game-streaming #lag #latency #madj-bakar #negative-latency #predictive-input #project-delorean #project-xcloud #stadia

Companies: #google #microsoft

Google is confident it can solve one of the two biggest issues with cloud-powered game streaming on its Stadia platform, promising that games streamed via the service will eventually be 'faster and more responsive' than the same games played locally.

Cloud gaming is, on its face, a tempting proposition: Instead of shelling out on high-end hardware that needs constant upgrading to play the latest games at their best, players can rent time on powerful servers in remote data centres and stream the resulting video to low-cost, low-power, and even pocket-sized devices. There are really only two problems with the concept: Sending input over the network to a physically remote data centre, having it processed into in-game actions, and streaming the video result back to the user adds measurable latency - enough to make twitch-gamers throw down their controllers in disgust - and high-quality video streams require relatively high-bandwidth connections, chewing through gigabytes of data for every hour of play.
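
To put rough numbers on that first problem, every stage of the round trip adds delay, and the delays are additive. A minimal sketch follows; the figures are illustrative assumptions rather than measurements of Stadia or any other service, but they show how an input-to-photon budget accumulates.

```python
# Illustrative, assumed stage delays (not measurements of any real service)
# for a single input-to-photon round trip on a cloud streaming platform.
STAGES_MS = {
    "input upload to data centre": 20,
    "server-side game simulation": 16,  # roughly one frame at 60fps
    "video encode": 5,
    "encoded video download": 20,
    "client decode and display": 10,
}

for stage, ms in STAGES_MS.items():
    print(f"{stage:<30} {ms:>3} ms")
print(f"{'total added latency':<30} {sum(STAGES_MS.values()):>3} ms")
```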

Google, however, thinks it has the solution to at least one of those two problems. In an interview with Edge magazine, spotted and summarised by PCGamesN, vice president of engineering Madj Bakar claims that 'in a year or two' games played via the company's soon-to-launch Stadia cloud gaming platform will be 'running faster and feel more responsive in the cloud than they do locally, regardless of how powerful the local machine is.'

The secret, Bakar claims, comes in the form of 'negative latency', in which a buffer is built up using measurements of the network latency between the player and the server. This 'lag mitigation' buffer is populated through a range of techniques, including increasing the framerate at the server side to reduce input lag, and predicting the player's inputs so that the game can begin responding before the input signal is even received.
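
Bakar gives no detail on how that input prediction works, so the following is only a minimal sketch of the general idea rather than anything Stadia-specific: a toy first-order predictor guesses the next input from past behaviour, speculatively applies it before the real input arrives over the network, and rolls back on a mispredict. Every name and the toy game state here are hypothetical.

```python
from collections import Counter, defaultdict

class PredictiveServer:
    """Toy sketch of 'predict the input, simulate early, roll back if wrong'."""

    def __init__(self):
        self.transitions = defaultdict(Counter)  # last input -> next-input counts
        self.last = None
        self.state = 0  # toy game state: player x-position

    def _predict(self):
        # Best guess at the next input: whatever most often followed the last one.
        history = self.transitions.get(self.last)
        return history.most_common(1)[0][0] if history else None

    def _apply(self, button):
        self.state += {"left": -1, "right": +1, "jump": 0}[button]

    def tick(self, actual_input):
        guess = self._predict()
        snapshot = self.state       # save state before speculating
        if guess is not None:
            self._apply(guess)      # begin responding before the input arrives
        if guess != actual_input:   # mispredict: roll back and redo for real
            self.state = snapshot
            self._apply(actual_input)
        if self.last is not None:   # learn from the input that actually arrived
            self.transitions[self.last][actual_input] += 1
        self.last = actual_input
        return guess == actual_input

server = PredictiveServer()
inputs = ["right", "right", "right", "jump", "right", "right", "right"]
hits = sum(server.tick(i) for i in inputs)
print(f"predicted {hits}/{len(inputs)} inputs; final state = {server.state}")
```

Whenever the guess is right, the simulation work for that tick is already done by the time the real input lands, so the network delay is hidden; a mispredict costs a rollback and a redo, which is why prediction accuracy matters.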

If the latter sounds familiar, it's because it's exactly what Microsoft proposed more than half a decade ago with Project DeLorean, since renamed following a complaint from the holder of the trademark. Demonstrated at the time with twitch-shooter Doom 3 and action role-playing game Fable 3, Microsoft's technology worked by rendering multiple possible outcomes depending on user input, then discarding all but the one corresponding to the input actually received. When activated, the company claimed, it could hide up to 250ms of latency from a connection - making a game played over a network with up to 250ms of lag feel as responsive as one played locally, and shaving 250ms off anything over that.
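
The paragraph above already describes the core loop - render one candidate frame per plausible input, then throw away all but the one matching the input that eventually arrives - and that loop can be sketched in a few lines. This is an illustration of the technique as described, not Microsoft's implementation; `render` and `POSSIBLE_INPUTS` are hypothetical stand-ins for a full simulate-and-render pass and a real input model.

```python
POSSIBLE_INPUTS = ["idle", "move_left", "move_right", "fire"]

def render(state, player_input):
    # Hypothetical stand-in for a full game-simulation and render pass.
    return f"frame(state={state}, input={player_input})"

def speculative_frame(state, actual_input):
    # Render every candidate outcome ahead of the input's arrival...
    candidates = {i: render(state, i) for i in POSSIBLE_INPUTS}
    # ...then discard all but the one matching the real, late-arriving input.
    return candidates[actual_input]

print(speculative_frame(state=42, actual_input="move_right"))
```

The obvious cost is that the server does several frames' worth of rendering to deliver one, which is part of why the approach suits data-centre hardware rather than local machines.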

Google has not confirmed whether its own latency reduction techniques follow those of DeLorean, which is likely to make an appearance as part of Microsoft's rival Project xCloud game streaming service.

