Gaming in 2020: what the next decade holds

Industry plans motion and mind control, 3D MMOs and more

Moral quandaries aside, AMD's Richard Huddy is 100 per cent convinced that "2020 is the point where we will be able to fully render reality in a realistic way and we will have the majority of scenarios that we experience in this world rendered convincingly.

"So I will be able to take you out onto a desert, for example, which is a nice easy thing, or I'll be able to put you in a far more complex environment such as a busy shopping mall with hundreds of people bustling around you. Or I'll put you in a historical battle scene from 1000 AD and render all of that completely convincingly."

Right now such scenarios are difficult to picture, because the vast majority of us are playing our games on relatively small, low-resolution TVs and monitors.

But, as the AMD man puts it: "The more I can see, the more I can do. That is the nub of it. I love being able to see lots of stuff. And when it comes to controlling information, clearly having a second or third display is a good thing. But what I really want to do, at times, is to fill my whole field of view with computer rendered stuff.

"Whether that be a desktop environment to make me more productive at work, or whether that be inside a gaming environment, in which my peripheral vision really is being used in the same way that I use it in the real world – to locate stuff, and to be able to react accordingly," says Huddy.


MORE TO SEE: AMD's multi-screen Eyefinity tech points us in the direction of where displays are going

"We've done some interesting stuff already with Eyefinity – and we've taken that tech to LAN parties to see how professional, highly competitive gamers coped with and without their peripheral vision in the game," Huddy continues.

"And the results were completely consistent: the more you can see, the better you can do. If the game is working hard to give you that extra information then you are simply better off."

Huddy offers the example of a racing game, in which your field of view is horizontally bound – i.e. you are not really interested in looking up and down – which "is the epitome of that kind of experience, where three or even five monitors will continue to bring you benefits and when you can clearly see if and when there is a car on your right-hand side jostling to get past you."
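To get a feel for why extra monitors widen that racing-game view, here is a rough back-of-the-envelope sketch. The numbers (60 cm-wide monitors viewed from 70 cm, arranged in an arc) are illustrative assumptions, not figures from the article:

```python
import math

# Hypothetical setup, not from the article: 60 cm-wide monitors viewed
# from 70 cm away, arranged in an arc around the player so the angles
# of adjacent screens simply add up.

def monitor_fov_deg(width_cm=60, distance_cm=70):
    """Horizontal angle (in degrees) that one monitor subtends at the eye."""
    return math.degrees(2 * math.atan(width_cm / (2 * distance_cm)))

single = monitor_fov_deg()
for n in (1, 3, 5):
    # Five screens at these distances approach the ~200-degree
    # horizontal field of human vision, peripheral vision included.
    print(n, "monitor(s):", round(n * single), "degrees of horizontal view")
```

On these assumed numbers a single screen covers roughly 46 degrees, three cover around 140, and five all but fill the horizontal field of view – which is exactly the peripheral-vision benefit Huddy describes.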

Seeing more stuff = better games

Put simply, gaming is just better if you can see more stuff. And AMD's recent three- and six-monitor support "is really our first step towards this; we most certainly want to get to the stage where it is rather more like the 'holodeck', where everything around you that you can imagine is displayed, and also where the display is updated so quickly that you are not aware of it updating itself."

Right now, the human eye has better resolving power than the screens we are using. "So at AMD we would certainly like to be able to cope with that and the field of view that I tend to fill with my monitor should take around 25 megapixels, ideally – because at that point we are at the resolution of the human eye. And if we extend that outwards to include the rest of our vision, including the peripheral vision and so on, then we would probably need something like 100 megapixels to give a really convincing view. So we would need a lot of computing horsepower in order to achieve that job!"
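Huddy's 25- and 100-megapixel figures can be roughly reproduced from one common assumption about the eye: visual acuity of about one arcminute, i.e. around 60 pixels per degree of field of view. The field-of-view angles below are illustrative assumptions, not figures from the interview:

```python
# Rough sketch of the pixel-count arithmetic behind Huddy's figures.
# Assumption (not from the article): human visual acuity of roughly
# 1 arcminute per pixel, i.e. about 60 pixels per degree.

PIXELS_PER_DEGREE = 60

def megapixels(h_fov_deg, v_fov_deg, ppd=PIXELS_PER_DEGREE):
    """Pixels needed to match eye resolution across a given field of view."""
    return (h_fov_deg * ppd) * (v_fov_deg * ppd) / 1e6

# A large monitor filling roughly 100 x 70 degrees of the view:
print(megapixels(100, 70))   # ~25 megapixels
# The whole visual field, peripheral vision included (~200 x 135 degrees):
print(megapixels(200, 135))  # ~97 megapixels, close to Huddy's 100
```

Both of his numbers drop out of the same simple model, which suggests the figures are acuity-limited estimates rather than marketing round numbers.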

But this is the 'tipping point' and the stage at which things "will be more convincing and we will no longer have to worry about fiddling with the detail – anti-aliasing and so on – because everything will look just right."

And the most exciting thing is that this is not all far-off future-tech dreaming: AMD has recently been showing off some beautiful graphics experimentation on its new DX11 hardware, performing 'global illumination' in real time in relatively simple game scenes. (Global illumination, for those unfamiliar with it, is the lighting technique Shrek 2 was rendered with, whereas Shrek 1 was rendered with more traditional 'rasterization' and 'ray tracing' techniques.) And Huddy also admits that AMD is currently talking to a games company that is looking at putting that tech into a game that may even ship later in 2010.