Shadow Warrior 2 Dev: PS4 Pro GPU is “Beast”, Has “Interesting Hardware Features”

Flying Wild Hog talks about PS4 Pro’s powerful performance.

Posted on January 8, 2017 under News


Flying Wild Hog’s Shadow Warrior 2 was pretty cool when it first released on PC, and now the first-person shooter is set to release on Xbox One and PS4 in April or May.

Speaking to WCCFTech, Flying Wild Hog’s Tadeusz Zielinski spoke about the PS4 Pro and what it could help the studio achieve.

With regards to whether it was powerful enough for 4K gaming, Zielinski noted, “PS4 Pro is a great upgrade over the base PS4. The CPU didn’t get a big upgrade, but the GPU is a beast. It also has some interesting hardware features, which help with achieving 4K resolution without resorting to brute force.”

As for system architect Mark Cerny’s claim that the console can perform two 16-bit operations simultaneously instead of a single 32-bit operation, thus “radically” increasing performance, Zielinski noted, “Yes. Half precision (16 bit) instructions are a great feature. They were used some time ago in the GeForce FX, but didn’t manage to gain popularity and were dropped.

“It’s a pity, because most operations don’t need full float (32 bit) precision and it’s a waste to use full float precision for them. With half precision instructions we could gain much better performance without sacrificing image quality.”
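Zielinski’s point can be illustrated with a quick numeric sketch (Python/NumPy here purely as an illustration; nothing below is from the interview): typical shading values live in [0, 1], where a round trip through half precision introduces an error far smaller than a single 8-bit color step.

```python
import numpy as np

# Sample a range of typical shading values in [0, 1] at full (32-bit) precision.
color32 = np.linspace(0.0, 1.0, 1000, dtype=np.float32)

# Round-trip them through half precision (16-bit).
color16 = color32.astype(np.float16)

# Worst-case rounding error introduced by the fp16 round trip.
max_err = np.max(np.abs(color32 - color16.astype(np.float32)))
print(max_err)  # below one 8-bit color step (1/255 ≈ 0.0039)
```

In other words, for math on values in this range, the extra bits of a full float buy nothing visible on screen, which is the “waste” Zielinski is describing.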

Shadow Warrior 2 will support PS4 Pro when it launches, though no specific enhancements have been planned. Also, it most likely won’t support HDR on PS4 and Xbox One S “due to time constraints.”



  • Mark

    “Most operations don’t need full floats (32-bit), so we can use half floats (16-bit) for better performance without losing image quality”. Interesting stuff. I’ve been hearing modern games usually only use 32-bit floats….maybe they found some rendering stuff unknown to most

    • kma99

      He’s doing his PSA for sony. No developer making big budget games would say that.

    • Living While Alive

      Exactly!!! Funny how this comes out when it’s discovered, Scorpio may have Ryzen/Vega.

      Paid for by Sony Computer Entertainment

    • Riggybro

      I’ve read that most developers are forced to use 32 bit for everything when for some things 16 bit is all that is required (but is unavailable to them).

    • Mark

      Ok, so maybe devs are starting to look at 16-bit floats more for extra performance where they can. I don’t think he’s the only one lookin at half floats more either

    • Luke Skywalker

      not sure where you read that, but from what I know, 16-bit calculations are available in the x86 instruction set on the CPU… not sure about AMD GPUs, but Nvidia has been doing this since the GeForce FX, just no one uses it. it’s not some brand new thing or a revolution in computing, it’s more like “hey, you remember this? how about we start using it again? we might be able to save just a little bandwidth in only some situations, but every little counts, right?”

      the problem with 16-bit is that you use it at the cost of precision, I mean do you like to see banding in your images or not?
      maybe you can use it for shadows, but then you’ll have some low quality shadows (to be honest most shadows in current games are low quality anyway, so using 16-bit and saving a little bandwidth might not be a bad idea if 16-bit wasn’t being used already)
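The banding worry raised here is easy to demonstrate with a rough sketch (Python/NumPy, illustrative only, not from the thread): casting a smooth 0-to-1 gradient down to half precision leaves far fewer distinct levels than single precision.

```python
import numpy as np

# A smooth gradient sampled finely at 32-bit precision.
gradient = np.linspace(0.0, 1.0, 100000, dtype=np.float32)

# Count how many distinct values each precision can represent among the samples.
levels32 = len(np.unique(gradient))                     # every sample stays distinct
levels16 = len(np.unique(gradient.astype(np.float16)))  # far fewer distinct levels

print(levels32, levels16)
```

Fewer distinct levels means visible steps (“bands”) in smooth gradients, which is why half precision is risky for things like sky or lighting gradients but harmless for many intermediate calculations.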

    • Clint Hoffman

      Some basics for you:
      1. CPU and GPU instructions are not the same.
      2. The fact that the Pro chip handles two 16-bit instructions in the same time as one 32-bit instruction is a huge performance increase. (Despite previous posts on this where users (Xbox fans) were insisting this was a lie and it would only appear in the official AMD chips coming in 2017.)
      3. Game engines handle more than graphics. And of the graphics, there is a lot that absolutely does not need 32-bit instructions.
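Point 2 comes down to simple arithmetic: if each 32-bit ALU lane can instead execute two packed 16-bit operations per cycle, peak FP16 throughput is exactly double the FP32 figure. (The 4.2 TFLOPS number below is Sony’s published FP32 peak for the PS4 Pro GPU; the rest is just the doubling.)

```python
# Sony's published FP32 peak for the PS4 Pro GPU, in TFLOPS.
fp32_peak_tflops = 4.2

# With packed half-precision ("double-rate FP16") math, each lane does
# two 16-bit ops in place of one 32-bit op, so the peak simply doubles.
fp16_peak_tflops = fp32_peak_tflops * 2

print(fp16_peak_tflops)  # 8.4
```

This is a theoretical peak, of course — real shaders only approach it for the portions of the workload that can actually be expressed in half precision.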

    • Luke Skywalker

      Points 1 and 3 I already know. Which is why I said Nvidia has been doing it on their GPUs at least since the GeForce FX line.
      You said there is a lot that doesn’t need 32-bit instructions, can you name some? Thanks.

    • Clint Hoffman

      Current game engines are collections of different subsystems. I like to think of them as engines that have different tasks, and a good game engine is the infrastructure that ties them all together into one cohesive piece.
      In answer to your question, the two largest areas that come to mind that house potential are:
      How about subsystem instructions that deal with physics?
      How about all the instructions that deal with anything image related that removes graphics that are not visible in the player’s line of sight?
      How about all image related instructions where the player can’t actually see much but the engine still has to draw something (e.g. think a dark room).
      How about all image related instructions where the distance of the subject makes it impossible to resolve detail?
      How’s that? 🙂

    • Luke Skywalker

      “How about all the instructions that deal with anything image related that removes graphics that are not visible in the player’s line of sight?”
      do you mean occlusion?
      physics and occlusion the more precision the better.

      so from the examples you gave, that leaves low light, shadows and
      “How about all image related instructions where the distance of the subject makes it impossible to resolve detail?”
      isn’t there a name for that? you render less detail the farther the player is from an object, and the closer the player gets the more detail you render… sounds familiar?

      anyway, not to say 16bit instructions are no good, i’m sure they have uses but not in a general sense.
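The technique being alluded to above is level-of-detail (LOD) selection: detail drops as distance grows. A minimal sketch of the idea (the function name, base distance, and thresholds are illustrative, not from any engine mentioned in the article):

```python
import math

def select_lod(distance: float, base_distance: float = 10.0, max_lod: int = 4) -> int:
    """Return 0 (full detail) near the camera, higher (coarser) LOD levels farther away."""
    if distance <= base_distance:
        return 0
    # Each doubling of distance past the base drops one detail level.
    lod = int(math.log2(distance / base_distance))
    return min(lod, max_lod)

print(select_lod(5.0))    # 0: close up, full detail
print(select_lod(45.0))   # 2: two doublings past the base distance
print(select_lod(500.0))  # 4: clamped to the coarsest level
```

Distant, low-detail objects are also exactly the kind of work where reduced precision is least likely to be noticed, which connects this back to the half-float discussion.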

    • Clint Hoffman

      They weren’t “forced”… It just made more sense to standardize on the 32-bit instructions, since 16-bit took the same time to process as 32-bit on older architectures.
      As for the people that worry about perceived “loss” of quality… you can ignore them. There are a lot of instructions that have zero impact on any visuals.

  • Tactical Lag-fighting tips

    That’s true.
    #SickBoom

  • Clint Hoffman

    Curious where the idiotic people went… You know, the ones that kept trying to deny the fact that the Pro could process two 16-bit instructions at once.


  • Mr Xrat

    Time for facts to trigger the Xgimps.

    The Pro is powerful.

    The Shitpio will also be powerful.

    Both will be held back by the base hardware.

    The Shitpio has no exclusives.

    It’s time to deal with this as soon as you can. 🙂



Copyright © 2009-2017 GamingBolt.com. All Rights Reserved.