bit-tech.net

Microsoft hides cloud latency with DeLorean

Microsoft's DeLorean software uses input prediction and pre-rendering to mask up to 250ms of network latency, making twitch gaming possible on cloud platforms.

Microsoft's research arm claims it may have found a way to overcome the biggest barrier to mass adoption of cloud gaming: the latency between the server and the client.

Cloud gaming has been poised to be the next big thing for a number of years now. While Nvidia has enjoyed some success with its Grid platform - servers packed with GeForce GPU hardware - and client-facing platforms like OnLive continue to roll on, cloud gaming is still a poor second best to local gaming. The premise - using remote server hardware to do the heavy lifting so you can play the latest PC games on low-end, even mobile, hardware - is sound, but the delay between inputting your actions at your end and seeing the server's response means that twitch gaming is out of the question.

Microsoft Research's answer? Time travel, via what its creators have dubbed DeLorean. In a white paper published this week, a team led by Kyungmin Lee has detailed what they are calling a 'speculative execution system' which is capable of hiding up to 250ms of network latency from the user - perfect for high-latency mobile networks. DeLorean works by rendering possible outcomes ahead of time, then displaying only the correct one to the user.
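
As a rough illustration of the general idea - a minimal sketch assuming a simple table of pre-rendered frames, not DeLorean's actual implementation:

```python
# Hypothetical sketch of speculative pre-rendering (illustrative only).
# The server renders a frame for each input it predicts the player might
# make; the client displays whichever frame matches the real input.

def server_speculate(game_state, predicted_inputs, render):
    """Render one frame per predicted input and ship them all to the client."""
    return {inp: render(game_state, inp) for inp in predicted_inputs}

def client_select(speculative_frames, actual_input):
    """Show the pre-rendered frame matching the player's real input.
    Unused frames are simply discarded; None signals a misprediction."""
    return speculative_frames.get(actual_input)

# Example: the server guesses the player will turn left or right.
frames = server_speculate(
    game_state={"pos": (0, 0)},
    predicted_inputs=["turn_left", "turn_right"],
    render=lambda state, inp: f"frame for {inp} at {state['pos']}",
)
print(client_select(frames, "turn_left"))  # rendered before the keypress arrived
```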

'To evaluate the prediction and speculation techniques in DeLorean, we use two high quality, commercially-released games: a twitch-based first person shooter, Doom 3, and an action role playing game, Fable 3,' the team writes in the paper's abstract. 'Through user studies and performance benchmarks, we find that players overwhelmingly prefer DeLorean to traditional thin-client gaming where the network RTT is fully visible, and that DeLorean successfully mimics playing across a low-latency network.'

The paper itself (PDF warning) goes into the technical details, but it's not clear when - or if - the technology will reach consumers. The team has, however, published the binaries for the DeLorean software, along with videos demonstrating its use.

16 Comments

ChaosDefinesOrder 22nd August 2014, 11:16 Quote
sounds more like selectable parallel (or divergent) timelines than time travel...
Gareth Halfacree 22nd August 2014, 11:17 Quote
Quote:
Originally Posted by ChaosDefinesOrder
sounds more like selectable parallel (or divergent) timelines than time travel...
I know, but c'mon: It's called DeLorean, Marty!
Big_malc 22nd August 2014, 11:26 Quote
Do they do this using a flux capacitor running at 1.21 gigawatts?
Corky42 22nd August 2014, 11:35 Quote
Forgive my lack of understanding, but isn't the problem with streaming not the time it takes to render what is happening, but the time it takes to transmit what has happened? In other words, isn't the latency down to the network being used and not the hardware doing the rendering?
Gareth Halfacree 22nd August 2014, 11:43 Quote
Quote:
Originally Posted by Corky42
Forgive my lack of understanding, but isn't the problem with streaming not the time it takes to render what is happening, but the time it takes to transmit what has happened? In other words, isn't the latency down to the network being used and not the hardware doing the rendering?
Read the white paper (or just re-read the article): DeLorean works by guessing what you're going to do. A crude example, far coarser than what the team is actually doing: you're at the end of a corridor and can turn left or right. The cloud server renders frames for both possibilities and sends them to your client machine before you've pressed a key; when you make a decision and press left, your client software shows you the already-rendered 'he turned left' frame and discards the 'he turned right' frame. Et voila: latency, she dun gone vanished.

In reality, the system is working on teeny-tiny per-frame differences, but that's the general principle as I understand it. There's a lot more to it, naturally - and it's all explained in the white paper. The result is that network latency of up to 250ms can be hidden entirely, making the game feel as responsive as if it were being played on a local machine, and latency higher than that - on a mobile network, for example - can be effectively reduced by 250ms (so a 500ms latency becomes a 250ms latency - still high, but half what it was).
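
A back-of-the-envelope sketch of that arithmetic, using only the 250ms figure quoted in the article (not anything from DeLorean's own code):

```python
# Perceived latency under speculation: anything up to the speculation
# window is hidden; anything beyond it still shows through.
SPECULATION_WINDOW_MS = 250  # the maximum RTT the paper claims to hide

def perceived_latency(network_rtt_ms):
    return max(0, network_rtt_ms - SPECULATION_WINDOW_MS)

print(perceived_latency(100))  # 0   -> feels like local play
print(perceived_latency(500))  # 250 -> still high, but half what it was
```
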
Bungletron 22nd August 2014, 12:24 Quote
Quote:
Originally Posted by Gareth Halfacree
Read the white paper (or just re-read the article)

Burn.

Since all the possible outcomes exist until determined by user interaction, I would have called this technology Schrödinger rather than DeLorean.

Totally ingenious, but I assume the processing power and bandwidth required must be humongous? It's not just turn left or turn right; it's actually perform any minute amount of movement in any axis, plus any other action, then draw and send every possible frame (one will be used, the rest is waste, lol!)? It is very much like a grand master in chess predicting all the possible moves for several turns ahead - maybe they should call it Kasparov...
Umbra 22nd August 2014, 13:00 Quote
Quote:
Originally Posted by Bungletron
Burn.
Totally ingenious, but I assume the processing power and bandwidth required must be humongous? It's not just turn left or turn right; it's actually perform any minute amount of movement in any axis, plus any other action, then draw and send every possible frame (one will be used, the rest is waste, lol!)?

Multiply that by the 1,000s of people playing at the same time at different points in the game, and surely that would require a ridiculous amount of hardware. Maybe the likes of Nvidia could afford to build it and then lease bandwidth to others - probably not AMD, though.
Corky42 22nd August 2014, 13:01 Quote
Quote:
Originally Posted by Gareth Halfacree
Read the white-paper (or just re-read the article):
<Snip>

Thanks, I was thinking the rendered frames would just sit on the server until it knew what to send; it didn't occur to me that it just sent all possible outcomes for the next so many milliseconds to the client and let it choose which frame to display out of all the possibilities.

Sorry I didn't have time to read the white paper, and maybe didn't read the article thoroughly enough (or I'm just a bit dense) :D

I guess this would use more bandwidth if it was sending loads of frames to the client, even if only some of those frames are ever used.
ChaosDefinesOrder 22nd August 2014, 13:29 Quote
Quote:
Originally Posted by Bungletron
Since all the possible outcomes exist until determined by user interaction, I would have called this technology Schrödinger rather than DeLorean

Heisenberg, surely?
Gareth Halfacree 22nd August 2014, 17:18 Quote
Quote:
Originally Posted by Corky42
I guess this would use more bandwidth if it was sending loads of frames to the client, even if only some of those frames are ever used.
There's bandwidth-management stuff in DeLorean too. Long story short: yes, it uses more bandwidth, but not *massively* more. Bear in mind that OnLive recommends a stable 5Mb/s connection for 720p, so you could triple that and still be in spitting distance of Ofcom's May 2013 national average of 14.7Mb/s.
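
Quick arithmetic on the figures quoted above (OnLive's recommendation and Ofcom's average come from this thread; tripling is just the rough multiplier mentioned):

```python
# Rough bandwidth estimate for a speculative stream, using the thread's figures.
ONLIVE_720P_MBPS = 5.0   # OnLive's recommended stable connection for 720p
UK_AVERAGE_MBPS = 14.7   # Ofcom's May 2013 national average download speed

speculative_cost = 3 * ONLIVE_720P_MBPS  # tripled to cover speculative frames
print(f"{speculative_cost:.1f}Mb/s needed vs {UK_AVERAGE_MBPS}Mb/s average")
```
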
rollo 22nd August 2014, 18:41 Quote
Doubt every ISP will be happy to have you drain 5Mb or 15Mb from them every second - more so after 6pm at night, when throttling occurs across most UK ISPs.
Gareth Halfacree 22nd August 2014, 18:56 Quote
Quote:
Originally Posted by rollo
Doubt every ISP will be happy to have you drain 5Mb or 15Mb from them every second - more so after 6pm at night, when throttling occurs across most UK ISPs.
Netflix uses 25Mb/s for its "Ultra HD" streams, so...
rollo 22nd August 2014, 21:40 Quote
25megabit or megabyte?
Gareth Halfacree 22nd August 2014, 21:58 Quote
Quote:
Originally Posted by rollo
25megabit or megabyte?
Bit. Hence the lower case b.
fluxtatic 23rd August 2014, 07:55 Quote
Quote:
Originally Posted by Gareth Halfacree
Quote:
Originally Posted by Corky42
I guess this would use more bandwidth if it was sending loads of frames to the client, even if only some of those frames are ever used.
There's bandwidth-management stuff in DeLorean too. Long story short: yes, it uses more bandwidth, but not *massively* more. Bear in mind that OnLive recommends a stable 5Mb/s connection for 720p, so you could triple that and still be in spitting distance of Ofcom's May 2013 national average of 14.7Mb/s.

In Britain, sure. There's still plenty of the US that's missing out. Actually, the US might be the odd man out on this one, I suppose. The rest of Europe and the parts of Asia that matter market-wise are likely just fine. Australia's probably pretty screwed too, though, I suppose.
Gareth Halfacree 24th August 2014, 09:09 Quote
Quote:
Originally Posted by fluxtatic
In Britain, sure. There's still plenty of the US that's missing out. Actually, the US might be the odd man out on this one, I suppose. The rest of Europe and the parts of Asia that matter market-wise are likely just fine. Australia's probably pretty screwed too, though, I suppose.
This suggests that the national average download speed for the US is 28.7Mb/s. Australia gets 16.2Mb/s. Granted, these are averages that will be skewed upwards by small-scale but extremely high-speed services like Google Fibre, but by those figures more homes in the US and Australia would be able to use a 15Mb/s streaming service than would not.

That said, I'd be tempted to halve the figures on offer from that site: it claims the average for the UK is nearly 30Mb/s, double that of Ofcom's average. I guess more people use Speedtest to verify that they have teh fastz0rz than to check if their connection is slow.

EDIT: Here, this might be a bit better: Akamai's State of the Internet report for Q4 2013. That says that average broadband speeds in the US broke 10Mb/s for the first time - meaning you could double the bandwidth required of a game-streaming service like OnLive and still suit more homes than not. Still haven't been able to find any official figures, mind: does the FCC not produce a report like Ofcom's?