We take a look at some of the biggest gameplay changes the PS5 and Xbox Series X could bring about.
It’s crazy how fast time flies. We are now just a few months from the arrival of the ninth console generation. Barring any last-minute delays, both the PlayStation 5 and Xbox Series X should hit shelves before the end of 2020. The arrival of a new console generation always means a significant change in the way games look and how they play. With the increase in graphics and CPU capabilities come new features that just weren’t possible a generation earlier.
This is something we see in every generation. Back in the mid-2000s, when the PlayStation 3 and Xbox 360 had just hit the market, ragdoll physics, real-time shadowing, and light bloom became important parts of the visual makeup of many games. Some games leveraged the “next-gen” consoles’ superior processing capabilities to deliver an unprecedented number of NPCs on screen, each with unique patterns of behavior. In the eighth gen, we again saw advances in graphics. Physically-based rendering and GPU-accelerated particles were introduced right at launch. And interestingly, a handful of cross-gen titles like Metal Gear Solid 5 even managed to deliver some of these features on the PlayStation 3 and Xbox 360.
And, as we sit on the cusp of the ninth-gen, we’re about to see another leap in progress and fidelity. This generation, the increase in CPU capabilities is a key highlight. Both the PlayStation 5 and Xbox Series X offer CPU horsepower that’s an order of magnitude greater than their predecessors, alongside the expected boost to graphics capabilities. The new consoles also feature SSD storage, radically faster than anything we’ve seen before, even in the PC space. And, quite apart from the console and games industry, we’ve seen advances in fields like artificial intelligence that could have significant implications for games. Let’s take a look at five ways that the PlayStation 5 and Xbox Series X are expected to transform games in the years to come.
Artificial intelligence and creative boosts to image quality
Sony recently patented an artificial intelligence-based technology solution for frame reconstruction that’s very similar to NVIDIA’s DLSS. What does this have to do with next-generation graphics? A whole lot, it turns out.
While both the PlayStation 5 and Xbox Series X are inarguably powerful devices, Sony and Microsoft have made much of the point that their consoles will support 4K/60 FPS gaming experiences. From what we know about the PlayStation 5 and Xbox Series X GPUs, and what we’ve seen in the PC space, it should be technically possible for the PS5 and Xbox Series X to deliver a 60 FPS experience in most current-gen AAA titles. There is, of course, an important caveat there. Running eighth-gen games at 4K/60 FPS is one thing. But what about technologically advanced ninth-gen games? How exactly will consoles handle visuals like those seen in the Lumen in the Land of Nanite demo at 4K/60 FPS? This is where AI-based frame reconstruction comes into the picture.
While checkerboard rendering delivered reasonable results last gen, its output was noticeably worse in quality than native 4K. NVIDIA’s DLSS technology on PC, though, is something different. By using a neural network trained on game imagery, DLSS reconstructs detail from a lower-resolution frame buffer, effectively inferring the missing pixels. In some implementations, for instance in Death Stranding on PC, DLSS actually delivers better results than native-resolution rendering. And, remarkably, it boosts performance by as much as 50 percent.
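To make the idea concrete, here’s a deliberately simplified Python sketch (the function names are ours, and this is nothing like Sony’s or NVIDIA’s actual pipelines). It “renders” a tiny low-resolution frame and upscales it naively; an ML reconstructor like DLSS replaces that naive step with a trained network that infers plausible missing detail from training data and motion vectors.

```python
# Illustrative sketch only -- real reconstruction (DLSS, or Sony's patented
# approach) feeds a trained neural network with the low-res frame plus
# motion vectors. This shows the naive baseline such a network improves on.

def render_low_res(width, height):
    """Stand-in for a game renderer: a (height x width) brightness grid."""
    return [[(x + y) % 256 for x in range(width)] for y in range(height)]

def upscale_2x(frame):
    """Nearest-neighbour 2x upscale: each pixel becomes a 2x2 block.
    An AI reconstructor would instead *infer* new detail at this step."""
    out = []
    for row in frame:
        wide = [pixel for pixel in row for _ in (0, 1)]
        out.append(wide)
        out.append(list(wide))
    return out

# Render at half resolution on each axis (a quarter of the GPU work),
# then reconstruct a full-size frame.
low = render_low_res(2, 2)
full = upscale_2x(low)
```

The payoff in the real techniques is that running the reconstruction network is far cheaper than rendering the extra pixels natively, which is where those reported performance gains come from.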
The implications for the PlayStation 5 and Xbox Series X are obvious. Both these consoles have finite GPU resources. If next-gen machine learning-based upscaling technology is used, owners of 4K displays could get native-quality images without compromising on the frame rate or other aspects of the visuals. This would mean that PlayStation 5 and Xbox Series X games could deliver real 4K quality while also pushing the envelope forward in terms of gameplay and graphics.
A return to micro-destruction
There was a brief point in gaming history in the late 2000s when everyone seemed to focus on environmental micro-destruction. NVIDIA’s PhysX suite at the time emphasised particle effects and dynamic, interactive debris, as seen in Borderlands 2. Red Faction: Guerrilla allowed console players to wreck virtually any standing structure in-game. Battlefield: Bad Company let players put holes in walls with a grenade launcher. Sadly, because high-end physics features were CPU-intensive, they were mostly shunted aside in favour of newer graphics rendering techniques. While micro-destruction is somewhat more prominent in current-gen titles, we’ve yet to see anything quite on the scale of those earlier games.
With the massive boost to console CPU horsepower in the PlayStation 5 and Xbox Series X, developers might finally have enough headroom to bring back micro-destruction. Advanced destruction physics could allow players to strategically destroy parts of the environment — to create and break cover on the fly, for example. One area where destruction could go beyond what we’ve seen to date is soft-body physics. Soft-body physics models how objects bend and contort when forces are applied to them. There aren’t a lot of games out there right now that deploy it. However, BeamNG.drive, a soft-body physics sandbox, gives us a good idea about the possibilities. Next-gen racing titles and open-world action games could leverage soft-body physics to model more accurate car crashes and deformation damage, for instance.
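As a rough illustration of what soft-body simulation involves — and why it eats CPU time — here’s a toy one-dimensional Python sketch of our own devising, far simpler than anything BeamNG or a shipping engine runs: two point masses joined by a damped spring, stepped with semi-implicit Euler integration. A real engine connects thousands of such spring “beams” in 3D so a car body can crumple and rebound.

```python
# Toy 1-D soft body: two nodes joined by a damped spring. On "impact",
# the spring compresses, oscillates, and damping settles it back to rest.

def simulate_impact(steps=2000, dt=0.001, k=200.0, c=5.0, rest=1.0, m=1.0):
    xa, xb = 0.0, rest       # node positions along one axis
    va, vb = 0.0, -3.0       # node b slams toward node a: the "crash"
    min_len = rest           # deepest compression seen so far
    for _ in range(steps):
        stretch = (xb - xa) - rest
        force = k * stretch + c * (vb - va)   # spring + damper, acting on node a
        va += (force / m) * dt                # semi-implicit Euler: velocities first...
        vb -= (force / m) * dt
        xa += va * dt                         # ...then positions
        xb += vb * dt
        min_len = min(min_len, xb - xa)
    return xb - xa, min_len

final_len, crush_depth = simulate_impact()
```

Multiply this by thousands of nodes, every frame, and it’s easy to see why last-gen CPUs pushed developers away from it — and why the new consoles’ CPUs change the calculus.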
Audio-only indie titles
Several years ago, a little-known iOS game, Papa Sangre, demonstrated that it was possible for developers to implement immersive game worlds with no visuals at all. Papa Sangre’s use of binaural technology and relatively detailed samples created the illusion of a vibrant underworld environment solely through audio.
While we don’t think AAA studios will focus on audio-only experiences, we think that indie devs will almost certainly look at features like the PlayStation 5’s Tempest audio engine as a way to deliver next-gen audio-only experiences. It’ll be fascinating to see what audio-only titles could do on PS5, with Tempest supporting hundreds of dynamic sound sources at a time. Together with binaural HRTF and a good pair of earphones, we see all-new audio vistas taking shape in the years to come. This would be a big win for the accessibility community, too, since audio-only games would offer value to visually-impaired gamers.
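To give a sense of the cues binaural rendering encodes, here’s a small Python sketch (our own simplification — real HRTF processing convolves audio with measured per-ear responses rather than computing closed-form delays). It estimates the interaural time difference, one of the key cues the brain uses to localise sound, via the classic Woodworth spherical-head approximation.

```python
import math

HEAD_RADIUS = 0.0875      # metres, roughly an average adult head
SPEED_OF_SOUND = 343.0    # m/s in air at about 20 C

def itd_seconds(azimuth_deg):
    """Interaural time difference for a distant source.
    azimuth 0 = straight ahead, 90 = directly to one side."""
    theta = math.radians(azimuth_deg)
    # Woodworth approximation: extra path to the far ear = r*(theta + sin(theta))
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source dead ahead produces no delay; one hard to the side reaches the
# far ear roughly 0.65 ms late -- enough for the brain to place it in space.
```

An engine like Tempest applies cues like this (plus level and spectral differences) to hundreds of sources simultaneously, which is what makes a purely audio-driven world plausible.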
Procedural content that doesn’t suck
As far back as 2011 and Skyrim’s Radiant quest technology, developers have implemented contextual, procedurally generated in-game content with varying degrees of success.
A major limitation of procedural content generation is that it’s simply not feasible with current technology to implement procedural quests that go beyond fetch missions within the existing game world. With recent advances in artificial intelligence and related fields, we think that procedural content generation could see something of a comeback with the PlayStation 5 and Xbox Series X. AI-based procedural generation could be leveraged across different aspects of the mission design process. It could even be used to author unique assets, like buildings, NPCs, and weapons.
We’ve seen games like No Man’s Sky do this already. However, the limited number of variables means that mission structures — and indeed procedurally generated environments and characters — are never that different from each other. AI-enhanced procedural generation could create unique experiences that approach the quality of handcrafted quests. Better yet, AI-based audio rendering technology could be used to turn procedural conversation text into in-game voice lines. This could transform the “padding” content that bulks out open worlds: developers wouldn’t have to sacrifice quality for quantity.
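The core mechanic all of these systems share is seeded generation, and it fits in a few lines. Here’s a miniature Python sketch (the templates and names are invented; real systems like Radiant draw on live game state, and an AI-enhanced version would swap the word lists for a generative model):

```python
import random

# Invented templates -- a stand-in for the hand-authored building blocks
# a Radiant-style system mixes and matches.
VERBS   = ["Retrieve", "Escort", "Defend", "Sabotage"]
TARGETS = ["the merchant's caravan", "an ancient relic", "the watchtower"]
PLACES  = ["Ashen Hollow", "the Sunken Keep", "Gale Ridge"]

def generate_quest(seed):
    """Deterministic: the same seed always yields the same quest, so a
    'unique' quest can be regenerated or shared as a single integer."""
    rng = random.Random(seed)
    verb = rng.choice(VERBS)
    target = rng.choice(TARGETS)
    place = rng.choice(PLACES)
    reward = rng.randint(50, 500)
    return f"{verb} {target} at {place} ({reward} gold)"
```

The weakness the article describes is visible right in the sketch: with only a handful of variables, every quest is a shuffle of the same parts. The AI-enhanced version would replace the fixed lists with generated text, dialogue, and assets while keeping the same seed-driven skeleton.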
Cloud-powered cross-platform experiences
Both Microsoft and Sony are already clear about their long-term play in the gaming space: moving everything to the cloud. Microsoft’s xCloud is set to release in a couple of months and will allow gamers to play Xbox titles on their smartphones and other devices. However, the infrastructure is still not there for purely cloud-powered experiences in the here and now. In the months and years to come, we think that Sony and Microsoft might ease the transition by leveraging the power of hybrid solutions that connect to a host console.
Effectively, we’re talking about supercharged versions of the “companion apps” that many games ship with right now, such as Fallout 4’s Pip-Boy app. Chunks of gameplay — especially sections that are more amenable to mobile controls, like driving sections — could be streamed to other devices. Players could use a mobile-friendly interface — much like the Wii U’s tablet functionality — to play these sections or perform other actions that would impact the in-game world. And thanks to super-fast SSD storage and always-on connectivity, players could pick up the main console experience seamlessly.
We expect to see major changes in the console space over the next couple of years. The PlayStation 5 and Xbox Series X will almost certainly deliver graphics and gameplay experiences that aren’t possible on current tech. But, as we’ve seen here, there are a number of other ways we expect developers to harness the power of the next-gen consoles.