DX12’s Ability To Unify GPUs Is “A Great Move”, Xbox One Will Benefit From An Engineering Point of View

Celtoys founder Don Williamson talks about the potential of DirectX 12.

Posted By | On 18th Nov 2015 Under News


DirectX 12

Even though DirectX 12 is now available with Windows 10, and the operating system has begun rolling out to the Xbox One via the New Xbox One Experience, it will still take time before games are widely developed with the API in mind. That being said, DirectX 12 has a ton of potential, especially when it comes to Microsoft’s gaming push.

GamingBolt spoke to Celtoys founder Don Williamson, an ex-Lionhead member who headed up the Fable engine and who currently specializes in performance and engine optimization, about some of the factors that will determine whether we see more DirectX 12-built games in the coming years. Williamson stated: “Nobody really cared about DirectX 11 because there was assumed to be no money in building your engine for it. With Xbox 360, you already had a lot of the features of DX11 and more, with lower-level access and a DX9-like API. Porting this to DX9 for PC was simple enough, with the lion’s share of work going into compatibility and performance – adding a new API was just a recipe for trouble.

“The Xbox One and PC seem to have more financial impact on sales than the previous generation and if Microsoft manage the Xbox One DirectX 12 API effectively, the up-take should be good. It’s more complicated to develop for than DirectX 11 but more and more people are using engines like UE4 and Unity so the impact should be lessened.”

As for the belief that DirectX 12 could be used for better visuals on the Xbox One and how it could benefit the hardware overall, Williamson said, “The innovation in DirectX 12 is its ability to unify a large number of GPUs with a single, non-intrusive API; it’s such a great move from Microsoft. I think the Xbox One software team will already have made their version of DirectX 11 very close to the metal so the main benefit will be from an engineering point of view, where you can realistically drop your DirectX 11 version altogether. The simpler platform will allow more time to be invested in optimizing the game.”
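Williamson’s point about dropping the DirectX 11 path rests on DirectX 12’s lower CPU overhead: work is recorded once into reusable command lists instead of being re-validated by the driver on every call. The toy Python model below illustrates that cost shape only; the names and cost constants are invented for illustration and are not the real D3D12 API.

```python
# Toy cost model of an immediate-mode API vs. a command-list API.
# The constants are made up; only the relative shape matters.
VALIDATE_COST = 10  # pretend driver state-validation cost per draw call
SUBMIT_COST = 1     # pretend cost of pushing one command to the GPU queue

def immediate_mode_cost(draw_calls, frames):
    """Every draw call pays validation + submission, every frame."""
    return frames * draw_calls * (VALIDATE_COST + SUBMIT_COST)

class CommandList:
    """Commands are validated once at record time, then replayed cheaply."""
    def __init__(self, draw_calls):
        self.draw_calls = draw_calls
        self.record_cost = draw_calls * VALIDATE_COST  # paid once

    def replay_cost(self):
        return self.draw_calls * SUBMIT_COST  # per-frame cost only

def command_list_cost(draw_calls, frames):
    cl = CommandList(draw_calls)
    return cl.record_cost + frames * cl.replay_cost()

print(immediate_mode_cost(1000, 60))  # 660000
print(command_list_cost(1000, 60))    # 70000
```

Under this model, a scene recorded once and replayed for 60 frames costs far less CPU time than re-submitting it each frame, which is the headroom a developer can spend on optimizing the game instead.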

What are your thoughts on all this? Let us know in the comments below and stay tuned for our full interview with Don in the coming days.



  • Terminator

    Oh boy…”gets popcorn ready”

    The Basement Dweller Armchair Engineers aren’t far away.

    Another day, another DX12 article where one developer says it will help the Xbox One, or another says it won’t help at all. Only time will tell who is correct; maybe both are right, and it will help in some things and not in others. When DX12 is truly put to the test, then we will know.

    • Mav6

      Sits down right next to @Terminator with my own popcorn w/extra butter and Root Beer, this will be entertaining to say the least. At least today it’s a developer who we know is actually working on a game that supports DX12 on XB1.

    • Terminator

      Pass me a Root Beer! The show will start soon.

    • Rodney Patrick

      While you’re at it, can you pass me one too. Lol

    • Mav6

      Damn straight, no problem with that, a cold one to you.

    • GrumpyGamr

      Is it the people who don’t make money off Microsoft and know what they’re talking about vs the XboxOne fanboy show

    • Rodney Patrick

      Lmao!!!

    • Rodney Patrick

      EXACTLY!!!! lol

    • hvd hvd

      i guess devs who make games are wrong and a bum like you is right…lol

    • Terminator

      Are you talking to me? Hope you are not because what you said doesn’t apply to me since I am not one of those online fake engineers who thinks they know more than the developers.

    • hvd hvd

      well you can’t use common sense, so yeah, I’m talking to you. Doesn’t the Xbox One use an 8-core CPU? Answer: yes. Doesn’t the Xbox One use a GPU? Answer: yes. Does the Xbox One use DX12 and Win 10? Answer: yes.

      So isn’t the Xbox One like a PC from a hardware and software standpoint? Answer: yes. Here is where the common sense comes into play.

      In layman’s terms, doesn’t the DX12 API help the GPU and CPU talk to each other better? Answer: yes. So therefore the GPU doesn’t have to wait for the CPU, and vice versa, right? Answer: yes.

      So, using common sense: since the PC and Xbox One use about the same software (DX12 and Win 10), and PC – meaning AMD – gets a big boost right now, and the Xbox One uses an AMD GPU and CPU, it will get a boost. Not as good as PC, but a boost nonetheless.

      Like I said, common sense.

    • theduckofdeath

      It has been said before and is said again by the dev speaking in the article — XB1 already benefits from many of the major advances brought to the PC by DX12.

      Previous DX versions didn’t give devs low-level access to memory allocators and other points of optimization. Now devs have much greater control, new features and improvements, along with the ability to run mismatched GPUs in PCs.

      All this gives devs more incentive to develop under DX12, where many skipped DX11 and stuck with DX9. It will promote and ease “porting” to and from PC, allowing more time for optimization…making & saving devs more money.

    • hvd hvd

      that’s right, ALL the devs that are using DX12 are saying there is a boost to Xbox One, and most people who aren’t game devs are saying it’s not going to get a boost. So who do you believe, the guys that actually are making games, or Joe Schmo who says it’s not? I think we know the answer..lol

    • Terminator

      That’s a cool story what does all that have to do with me?

    • hvd hvd

      my post is showing some common sense. You aren’t a game dev that makes games. You are a Joe Schmo who doesn’t. I just thought I’d give a little bit of enlightenment since you said fake devs..

    • Terminator

      As much as I want DX12 to be a hit and prove the naysayers wrong you aren’t a dev either, you may have a bit of an understanding on the topic but that’s it. You can use all the “common sense” you like but that still doesn’t prove anything, especially when you haven’t worked with the software unlike the dev of this article that has and knows what he is talking about.

    • hvd hvd

      i am not a dev, I play games. I just have enough common sense to know how DX12 works. It’s not that hard to see how it works.
      Like I said, I’ll take a developer’s word over some Joe Schmo like you. And a lot of developers are saying a boost; I’ll take their words over yours any day…lol

    • Mirimon

      Idk, even msft said it won’t… sounds more like bad journalists are incorrectly sourcing the wrong people.

    • kstuffs

      MS also said Cloud Compute is possible when they launched the XB1, but all the naysayers said it was a gimmick and BS PR stuffs. MS isn’t going to make any performance claims until they have the games to show it.

    • Terminator

      Sounds more like you just read what you wanted to read. Phil Spencer said that DX12 would help in some way.

    • XbotMK1

      “Sounds more like you just read what you wanted to read”

      You just described yourself. Congratulations.

    • Terminator

      LOL, that’s what you just did, typical TrollBotFailMK1.

    • Mirimon

      Selective reading and imagination on top?…

      He stated it would not help in any of those major ways… such as frame rates, resolutions, you know, the hard gaming performance functions. Did it tidy up some headroom for one of the three active operating systems? Sure. Will it help the XB1 UI be a tad less slow but still lethargic? Yes.

      Changes, yes, significant or meaningful? Not really.

    • Terminator

      It has begun.

    • kreator

      Best response yet!

  • Donald Graham

    This developer clearly doesn’t know anything. DX12 won’t do anything. There are only so many gnomes that can fit in one Xbox. The PS4 will always have better looking graphics and will only get better when you squint really hard.

    • Rodney Patrick

      One word for you,SALTY!!!!

    • Donald Graham

      It was meant to be sarcasm. I just wanted to make fun of the whole console war thing. I promise that I don’t actually think there are gnomes in the Xbox.

    • Mirimon

      But, there is plenty of room in there for a gnome or two.

    • Mark

      Lol

    • red2k

      Xbox One has the best graphics on the market. They launched with Ryse (visually a masterpiece) and Forza 5 locked at 1080p/60fps, and now Forza 6, ROTR and then QB. From a visual or performance point of view, Xbox One has the best games on the market.

    • Sindin78

      You got it.

    • hvd hvd

      don’t forget Halo 5 at 1080p/60fps. Xbox One has more 1080p/60fps exclusives than the PS4 as well, the Forzas and Halo 5. Wait till Fable and Gears of War with DX12 come out. I can’t wait to see what they have to say after that.

    • Mirimon

      You and hvd are being sarcastic right?

    • Mark

      I have to say, my top 5 for graphics are;

      1) Star Wars
      2) The Order
      3) Rise of the Tomb Raider
      4) Killzone
      5) Ryse

    • ShowanW

      as sucky as KillZone:SF is, it is beautiful to look at.

  • Sindin78

    Where is xbotmk1. He is real defensive about this.

    • Rodney Patrick

      Lmfao,that he is

    • JerkDaNERD7

      He cares more than Xbox fans for some odd reason

    • Mr Xrat

      You gimps were saying the Xbone was supposed to have closed the gap by now. Where is it? What happened? Looks like you idiots got led up the garden path again.

    • Jacob S

      Maybe you should ask yourself why more devs are not using the ESRAM and are instead bypassing it and lowering the resolution.
      Why hasn’t one PS4 exclusive matched the graphics/physics/player/AI counts of Xbox One exclusives?
      Why does PSN suck, with server farms spread out far enough to double the ping times of X1? You haven’t seen DX12, and the only cloud computing that has been done is on X1 MP exclusives.

    • Satsumo

      A game takes 2-3 years to make.
      Only MS studios had DX12 access for past year or so – hence it will take longer for non-MS studios.
      We always said DX12 was coming around now – but that games actually developed for it would take a year or two after that.
      Common sense really.

    • snOOziie

      At the end of the day the developers have chosen the main platform, really can’t do much about it.. But with DirectX 12 it makes it easier for them now; they only have to make one game and port it over, and the software does the rest……

    • snOOziie

      The developers have the tools and they are not using them. Just look at the first party games!

    • Mr Xrat

      Yeah, they look crap. Never mind.

    • Sindin78

      Mental boy. Mad sheep disease is curable by buying an Xbox One.

  • Graeme Willy

    DX12 will definitely help the Xbox One, especially in games that use dynamic resolution handling. You’ll be able to hit 1080p more often, but not necessarily full-time.
    We have to understand that the GPU in the Xbox One is a sub-1080p GPU in the PC world…a Radeon R7 260X, to be precise, whereas the PS4’s is an entry-level 1080p GPU (the R9 270X). Anyone in the PC community using a 260X is likely hitting between 720p-900p in most PC games, depending on the trade-offs selected. Meaning, the resolution performance depends on what you are lowering the texture quality to in your games and/or whether or not you are minimizing/foregoing post processing, decreasing draw distance, etc. An R7 260X is considered an MMO light-duty card. Most people that run a card like this are probably playing older games, like WoW or GW2. Now, there is a difference between PC and Xbox One. In the Xbox One we’re talking an APU and not a card, firstly, so you do have better performance since there is less latency involved when the CPU side has to receive draw call requests from the GPU. Also, the Xbox’s R7 260X is sitting on a 256-bit memory bus, whereas the PC variant sits on a 128-bit bus. This is probably how the Xbox can manage to hit 1080p more often than that same GPU could on PC.
    DX12 will help…however, “help” is subjective. How will it help? Help with better frame rates? Help with pushing more post processing and/or graphical effects? Or help with resolution? Meaning, a developer may not choose to leverage DX12 gains for resolution, or even FPS. A dev might use any resource gains by throwing in more vegetation, increased draw distance, more shadow effects, better post processing…etc.

    We also have to remember that the PS4 has an equally great API that developers love, as it does pretty much everything DX12 does anyway…in that it has minimal overhead and low-level access. So in terms of the whole graphics war, you’re still separated by hardware, ultimately. Microsoft would have to create some magicware to make an R7 260X outperform an R9 270X…and that would only be until Sony counters with a new and improved API that seals that gap once again. However, there is one final area: CPU. The GPU waits on instruction from the CPU. Sometimes, if a game is too CPU-intense, it will send CPU-bound tasks to the GPU. The Xbox One does the latter slightly less than the PS4, on account of a slightly faster processor, so there is less offload that needs to occur. Also, last I heard, the PS4 CPU still locks away two cores for the OS (for the moment). The Xbox One reserves only one core for the OS. Having an extra core is loads of help, since now games can send even less CPU-bound work to the GPU, essentially giving you more GPU. This is how we are not seeing the 40% hardware disparity between the two consoles that we should be seeing…on account of there being literally a 40% performance gap between the two GPUs. We should be seeing noticeably less vegetation, noticeably less draw distance, in addition to lower resolution in titles…but the only detected difference is ever in pixels, and not a 40% difference in pixel count. So what the X1 manages is impressive.
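The dynamic resolution handling mentioned in the comment above is usually a simple feedback loop on frame time: drop the render height when a frame blows its budget, raise it again when there is headroom. A minimal sketch, with made-up thresholds and step sizes (this is not any console’s actual heuristic):

```python
# Frame-time-driven resolution scaler; bounds and steps are illustrative.
TARGET_MS = 16.7          # frame budget for 60 fps
MIN_H, MAX_H = 720, 1080  # render-height bounds
STEP = 36                 # pixels to move per adjustment (5% of 720)

def next_render_height(current_h, last_frame_ms):
    if last_frame_ms > TARGET_MS * 1.05:   # over budget: step down
        return max(MIN_H, current_h - STEP)
    if last_frame_ms < TARGET_MS * 0.85:   # clear headroom: step back up
        return min(MAX_H, current_h + STEP)
    return current_h                       # within band: hold steady

h = 1080
for ms in (20.0, 19.0, 14.0, 13.0):  # two slow frames, then two fast ones
    h = next_render_height(h, ms)
print(h)  # back to 1080 after recovering from the slow frames
```

This is why such games hit 1080p often but not constantly: the output resolution follows load rather than being fixed.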

    • Psionicinversion

      actually the PS4’s is more of a 270, not the X, but it’s also massively underclocked

    • Graeme Willy

      True. My mistake. I just like to put “x” after everything 😉

    • kreator

      I was going to say the same thing. 🙂

    • red2k

      I don’t know guys…. I saw the Xbox One hardware and something doesn’t fit. I doubt that Microsoft spent money and resources building a chip with all of the similarities of a next-gen GPU but with the performance that it actually has. Something is not ok.

      PS: I don’t want to prove anything with my comment, it’s just an observation…

    • theduckofdeath

      I think where Microsoft got into trouble with XB1 was:

      1. Don Mattrick
      2. Inability to predict GDDR5 availability years into the future

      Number 2 led to using the embedded RAM buffer approach again. The 32 MB buffer takes up a huge portion of the APU and still has less capacity than many devs would like. It cannibalized much of the real estate that could have gone to the GPU. This configuration effectively complicated and limited game development.

      Sony got lucky in that enough GDDR5 was available to have 8 GB per console, and Mark Cerny didn’t have to deviate from their original vision.

    • GHz

      I agree. But it’s also true that the tech in your box shouldn’t have to follow the rule of thumb from the last 30 yrs. How can you deliver added performance w/o worrying about what’s inside the box…to a certain degree. I see two companies following separate philosophies in introducing us to what next-gen gaming will look like.

    • red2k

      Yes, that is one part of the history, but when you see the number of layers on the chip, something is not ok. Those layers are made of transistors etc. If you put it there, it is not for ornament, because that costs money. You as a company are not going to use a design that has an excess of material, because that is not cost-efficient. Anyway, Xbox is a great console with a virtual GPU, and the architecture is there to have great performance with async compute and tiled resources using ESRAM, with the specs that they already showed us.

    • Xbox one 2econd gpu unlocking

      Some of the layers are still under professional lockout

    • theduckofdeath

      That’s my point — Microsoft could not just make the chip as big as possible, so they made concessions.

    • Graeme Willy

      The Xbox was designed around tiled resources. Parallel processing of tiled graphics that are not much bigger than 150x150 pixels in dimension, assembling only what is on screen, and as you pan, shuffling off that data and pulling in more tiles. Therefore, the Xbox isn’t rendering in a traditional manner, like PC or PS4. Meaning, it does not rely on pulling 5GB+ of massive textures into memory at one time and crunching it all out in one go. You’d need a hefty amount of CUs and a large pool of high-bandwidth memory for that, like the PS4, or a higher-end graphics card. Tiled resources, instead, need ultra-low-latency memory, so the scene can be assembled without producing noticeable pop-in. Think of it as like putting a 1GB file on a memory stick vs. several 5MB files. Were this not the case, the ESRAM would be an entire cache pool of 32MB, and not 4*8MB. They are perfectly structured for handling tiles in parallel. Tiles also don’t require the massive amount of CUs that traditional rendering methods require. So, if executed the way MS envisioned it in their tiled resource demonstrations, it should actually have left over GPU resources to put into other things, since the Xbox One is only concerned with rendering what is in view and not the things beside and/or behind you. For years PC games used a similar thing, called culling. This is a bit more sophisticated, and is the only reason why you aren’t seeing the huge 40% disparity between the consoles…but rather, just the difference of a few extra pixels…in most titles. This is where DX12 will help.
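The "assemble only what is on screen" idea described above can be sketched as a visibility computation over fixed-size tiles: only tiles intersecting the viewport need to be resident, and panning swaps a thin strip of tiles rather than reloading everything. A hypothetical example using the ~150x150-pixel tile figure from the comment:

```python
# Sketch of the "only fetch what's on screen" idea behind tiled resources:
# the world texture is divided into fixed-size tiles, and only tiles that
# intersect the current viewport are resident. Sizes are illustrative.
TILE = 150  # tile edge in pixels, per the comment's ~150x150 figure

def visible_tiles(view_x, view_y, view_w, view_h):
    """Return (col, row) indices of tiles overlapping the viewport."""
    first_col, first_row = view_x // TILE, view_y // TILE
    last_col = (view_x + view_w - 1) // TILE
    last_row = (view_y + view_h - 1) // TILE
    return {(c, r)
            for r in range(first_row, last_row + 1)
            for c in range(first_col, last_col + 1)}

# A 1920x1080 viewport at the world origin touches 13 x 8 = 104 tiles,
# instead of the whole world texture being resident at once.
tiles = visible_tiles(0, 0, 1920, 1080)
print(len(tiles))  # 104

# Panning right by one tile swaps one column out and one column in.
panned = visible_tiles(150, 0, 1920, 1080)
print(len(panned - tiles), len(tiles - panned))  # 8 8
```

The small, constantly-churning working set this produces is the kind of access pattern a small low-latency pool is suited to, which is the commenter’s argument about ESRAM.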

    • theduckofdeath

      We have been hearing about tiled resources on XB1 for two years. Has the technique been implemented in a game to this point?

      The difference is more than a few pixels…closer to 662,000 (1080p vs. 900p). You can say the PS4 and PCs use a brute force approach, but it is working. What is the point of tiled resources and a proprietary embedded memory scheme and rendering techniques if your hardware implementation can’t match traditional schemes? There is little reason why the changes coming to PC with DX12 shouldn’t have popped up on XB1 by now.

    • Xbox one 2econd gpu unlocking

      No, we asked for a system to use tiled resources.
      And that’s why we wanted EDRAM

    • Michael

      Lol…GDDR5 isn’t much different than DDR. You achieve the same goals but take different routes to get there. If GDDR5 was that much better you would see more 1080p 60 fps for PS4, but you don’t. That’s because it has bottlenecks just like DDR. Some of the most powerful gaming computers still don’t use GDDR5.

    • theduckofdeath

      But that is a large part of the problem — the approaches are NOT achieving the same goals. Not as implemented. Another part being the GPU disparity. The API for XB1 seems to be cleaned up at this point.

      EVERY powerful “gaming” computer has GDDR5 or better on its graphics card. My 7950 from 3 years ago (seems longer) had 3 GB GDDR5, 28 CUs, 32 ROPs and was clocked near 1 GHz.

      If you mean to say system RAM, that is different — I still use DDR3 @ 2400 MHz.

      Sony learned their lesson after PS3 and followed Microsoft’s lead with unified RAM. That is why in consoles you will find one or the other these days.

    • bewareofZ

      6 months later, the proof that it does matter is clear for all to see.
      720p Quantum Break.

    • Mark

      Good stuff Graeme. I have a picture from a DX12 book concerning PC and Xbox, and it does say “Increased draw call performance” on the X1. Imo, that’s exactly what we’re seeing in games like Scalebound and Quantum Break..

    • kreator

      Not entirely true, but good response none the less.

  • Psionicinversion

    @Xbotmk1… it’s ok, they aren’t saying it’s going to magically increase power. They’re talking about an engineering point of view: porting between Xbox and PC will be really fast, and at some point in the future, when the dinosaurs upgrade to Win10, devs can drop DX11 in games altogether and save time and money through vastly reduced port times, allowing more time to optimise the game.

    of course it also makes it very easy for the next Xbox to potentially have an additional GPU unit plugged in. Not saying they will, because it would be kind of stupid, but the functionality is there

    • Mark

      True. He’s saying the biggest savings will be time, mainly which will be for optimization and possibly any extra features that can be plugged in….sounds good to me.

    • Xbox one 2econd gpu unlocking

      Yes, that’s what we have. It’s designed for an extra GPU to be plugged in.
      100% we have the most powerful console, even without that anyway

    • Psionicinversion

      Oh shut up; for all your incessant ramblings you have never said that before, so you just lie based on other people’s observations. Where are they going to stick this GPU? There’s no socket, no power connector, no PCIe lanes running off it????? Nothing. It’s something that can only be put in the next Xbox

  • hvd hvd

    well, if this doesn’t help the Xbox One, nothing from here on out will. So I hope for the best.

  • d0x360

    What nonsense. All it takes is one game to push the envelope with a new DX and soon every dev follows suit.

    Best example I have is Far Cry. It used DX9, and once people saw it in action everyone used DX9, and then 10, and now 11. DX12 is a game changer for PC. It takes time because it’s not plug and play; it requires a total rewrite of the render path

  • Sindin78

    Even though Battlefront is a shallow, repetitive bore of a shooter, its graphics are about the best that I have seen this gen. Therefore, I can safely say that resolution does not equal graphical quality. At this point resolution is meaningless to me…DX12 sounds as if it will offer some performance increase to resolution, but more so it’s focused on effectively rendering higher quality textures, filtering and frame rate increases, which suits me just fine. Resolution means nothing.

    • Mark

      Been tryin to say this, ur on point.

    • Mr Xrat

      “Resolution means nothing” the Xbox gimp tells himself, just like he’s been telling himself for the last two years and will tell himself until the end of the gen.

    • Sindin78

      Shut your mouth sheeptard. Dont talk to me…your kind belong in a mental institution.

    • This Guy

      Bah it’s another one of those second place console delusions like how the Cell processor will change gaming or the emotion engine is the greatest advancement in graphics. lol.

    • Corey

      Resolution isn’t meaningless but it’s DEFINITELY not a very big deal. Not anywhere near as big a deal as gamers like to make it. It’s very low on the totem pole for what makes a game incredible/graphically impressive. I will never play games on a big screen nor do I have any real desire to….so resolution doesn’t matter much at all at this point. Most gamers I know play on monitors or small TVs anyway…playing games on 50in TVs and ridiculousness like that is the worst!

      Battlefront is by far the most gorgeous game yet this generation. It’s given me the “new gen wow factor.” The only other game that came close was Witcher 3. They wow me with their ability to create atmosphere. (not very many games do that).

      All Battlefront needs is dynamic weather and procedural destruction and the game would be mind blowing. 8) But that would probably require a ridiculous amount of power and bandwidth, etc. that consoles just don’t have. Those are the games I dream about.

      DICE is about the only company that’s made FPS or multiplayer with that kind of atmosphere. Their production values are for the most part way above everyone else.

      I can’t even wait/imagine what their next game will look like when they make it. Till then I’ll definitely be enjoying Battlefront.

    • Sindin78

      Agree with the impressiveness of the graphics. Furthers my point that resolution is either meaningless or very close to it, as you suggest. Either way it puts the resolution debate to bed for me. Unfortunately I get bored with Battlefront after about 15 minutes. So delivering these graphics while at the same time making the game engaging and fun is still not there. I turned Battlefront off after 20 minutes last night and put in Halo BTB and couldn’t get off until 2AM.

    • Corey

      Fair enough. I have yet to get bored with Battlefront. Right now sniping is loathsome and clunky though. Battlefront could definitely have a bit more depth but the quality is superb. If they take this foundation and just build on it it’s gonna be fabulous.

      Halo 5 is definitely fun and between Battlefront, Halo, and Hardline….I’m set. But I don’t enjoy Halo very much at all unless I’m playing with friends.

    • Sindin78

      Roger that…lots of great games. Peace out

  • Mirimon

    Corrections:

    The PC has more financial impact, not the Xbox One. Not a single Xbox since the first one has made any money. The brand name as a whole has been costing MSFT 2 billion dollars annually, now reaching 3 billion, just to keep it in operation. This is after earnings. The biggest sources of income for the brand are subscriptions, microtransactions, and Minecraft, all of which barely put a dent into the operating costs.

    Second correction: Williamson seems to know little about DX12 and its effect on XB1. It has been stated, several times over, by Xbox and MSFT brass that DX12 will have little to no effect on the performance of the Xbox One and will at most just throw another wrench into the gears of development until that gets relearned (yes, even Phil and Nadella said as much).

    • GHz

      1st of all, selling hardware in the gaming console business is tough and is not an exclusive experience to only MSFT. Though Sony is doing better in that department (selling more PS4’s) it’s still not enough to guarantee them a significant profit, in which they can return monies earned to better their networks, invest in bigger games etc. It’s quite the balancing act. Its been said, going into the console hardware business is madness.

      2nd, it’s true, so far we don’t know of too many devs/engineers who know what DX12 will do for XB1. But what about Brad Wardell? He has full access, and what did he say? He admitted that he doesn’t do XB, but he had seen things that made him draw conclusions. He also told us that top brass @ MSFT told those who have access to play down the performance gains at the time, because the percentage increases would be too hard to believe on PC. Why would MSFT tell them that? Phil first told us that DX12 on XB1 would allow us to see a graphics upgrade comparable to the jump from the 1st Halo game to Halo 4. People said BS! So he started answering differently. He started to say, “not so much.”

      Like the situation on PC, we’ll wait on the XB1 data, cause MSFT is tired of having the media and fanboys turning their claims into click bait misinformation.

      Remember when they first told us about the cloud + XB1? They first told us 3 XB1’s in the cloud for every 1 physical XB1. The media turned that into a horrendous clickbait story, with their armchair developers claiming to know better, and proceeded to “educate” us on why it’s not going to work. Now that we have data in the form of the Crackdown 3 demo, today we know that number can be up to as much as 13, depending on the type of game you want to build. So even with the cloud they were holding back on the truth in 2013-2014. What now?

    • Corey

      The cloud is gonna be awesome for gaming when it can come to fruition. I don’t think it will this generation though…..we need better internet infrastructure to really make the incredible advancements that I can see coming. I mean….perhaps they can find a work around….but dang that’s gonna take some brain power and some magic hands.

    • GHz

      I think we’ll see it this gen. They demoed Crackdown 3 on XB1 hardware + XBLCloud. The claim is that together they do rendering that a beefed-up PC cannot do. Now that’s XB1 tech. If my XB1 is giving me experiences in PRE ALPHA STAGE demos, in 2015, that a high-end PC cannot duplicate, then I can’t wait to see what it’ll do by the end of its lifespan.

    • Corey

      true. The crackdown demo was pretty cool and promising. But it’s young, and that’s more of an uncomplicated arena game. I’m not sure if we could see something like it implemented into something complex like a Battlefront or Battlefield game with all that foliage and the awesome realistic textures, etc. I certainly hope so but the sheer amount of resources that that would take makes me think that there probably isn’t a company willing/capable of doing that just yet.

      I want crackdown 3 destruction (tweaked to be a LITTLE less chaotic)…..Battlefront detail/fidelity/map quality….and Battlefield 3-esque gameplay…..with dynamic weather and atmospheric effects. 😀 Dream game right there. But who knows….maybe that could happen this generation. But it probably won’t till way later in the generation if possible.

    • GHz

      “uncomplicated arena game. ”

      How do you figure? You see big-scale destruction that you can’t do on a high-end PC rig, but because there’s no foliage, and the graphics are in pre-alpha, Crackdown 3 is uncomplicated? My expectations for XB1 are high, but d@mn. LOL…Give the soup a little time to cook dude.

      Yeah great wish list. I’m hoping that EA , UbiSoft, Activision use the cloud to the fullest. No parity. It would be a disgrace to all gamers.

    • Corey

      from what I’ve seen of the demo, the overlaying gameplay looks like a rather simplistic arena shooter without complex animations, realistic textures, etc. Don’t get me wrong, not dissing it or anything. Just saying….it looks more like a giant sandbox full of legos…..not a super realistic…game. (im probably mainly commenting/reacting to the art style more than anything I guess)….I just find myself wondering if it could handle the same type of thing with all kinds of complex, realistic assets, animations, etc like a Battlefield game. And how all that would work, interrelate, etc…..and whether a company like DICE would be able to produce something with it. Cause that would be incredible.

    • GHz

      “im probably mainly commenting/reacting to the art style more than anything I guess”

      Yeah you are. Pre-alpha textures @ that. As far as animation on that scale, from characters, weapons, explosions, deformation, interactive building demolition, that is as complex as it can probably get for now. It has certainly never been accomplished on that scale on console or PC before. This is a 1st. Your expectations of what the XB1 should be doing are very high & that’s good.

      I can’t see why DICE cannot do it. We all know consoles can render the “art style” locally. We can have a Battlefront game doing Crackdown-style large-scale demolition. But it’s up to DICE if they want to use XBLCloud to accomplish that.

    • Corey

      Indeed. I guess that’s my main thing. I wonder if other devs will take the time to use it or learn how to use it. Especially 3rd party devs. They’d have to build two different versions of the game if they did…which I don’t think they’d do.

      So cloud stuff might have to just be used for exclusive games..which stinks because my favorite games like Battlefield aren’t exclusive games.

      Can you even imagine a Battlefield game with that level of destruction along with physical based rendering and weather effects, 20 vs 20…..with vehicles…..etc. etc. AH! I so hope that can happen this generation at some point. But honestly I’m still not convinced it can or will.

    • GHz

      Yeah, I sort of agree. It’s hard for me to see 3rd party jumping in. That’s because I’m not knowledgeable about the other cloud solutions though. We know Cloudgine does the problem solving for devs so that they can concentrate on just building the games. So as far as the difficulty goes, that’s supposed to be solved. How Sony fits in is one of the main questions. Another is, will the techniques discovered by MSFT for cloud gaming be adopted by companies who have their own servers, such as EA, Activision etc? Cause if they do, then PS4 gamers can also enjoy Crackdown 3 style multiplayer gaming.

      Keep in mind though that big companies are trying to move all their games into the cloud. Cause when done right it’s actually more cost effective in the long run.

      A bigger FX heavy Battlefield is always a great idea. We wait n see.

    • Jeffrey Byers

      The CLOUD as a rendering tool is problematic in the extreme. There are not only the obvious latency issues and the bandwidth needed to stream video, but also the cost to render the extra content, which is significant.

      Offloading client calculations for the other people in MMOs to minimize your CPU bottleneck and reduce some round-trip latency helps those games, but visual processing is a long way off except as a niche experiment.

    • GHz

      What do you think about https://www.shinra.com/us then?

    • kreator

      The armchair PS3.5 dev’s on here are HILARIOUS!

    • Mirimon

      I would say so… making games for an imaginary console that is also likely more powerful than the xb1 is pretty funny. You should start a small comedy oriented graphic novel.

    • kreator

      Nope, but I’d start a 50 Shades of Grey starring someone you know 😉

  • GHz

    “so the main benefit will be from an engineering point of view, where you can realistically drop your DirectX 11 version altogether. The simpler platform will allow more time to be invested in optimizing the game.”

    Here it is again and this time coming from Don Williamson who currently specializes in performance and engine optimization. What does that mean? Remember what Brad Wardell said back then?

    ” None of this has to do with getting close to the hardware. It’s all about the cores. Getting “closer” to the hardware is relatively meaningless at this point. We’re way beyond that.” – Brad Wardell
    Here’s the example.

    • MrZweistein

      The picture is wrong (someone made that from the original of Brad Wardell but didn’t understand the original diagram very well)!

      First of all: all GPU cores are used in all cases (DX9, DX11 and DX12). The differences between the APIs lie in how the graphics commands get from the code (CPU) to the GPU, and especially in how the API handles them.

      With DX9 only one CPU core can send commands to the API, which then sends the commands sequentially to the GPU.

      With DX11 several CPU cores can send commands to the API, but the API will still send those commands sequentially to the GPU. That’s a bottleneck, because commands have to wait for their predecessors in the command queue.

      With DX12 several CPU cores/threads can send commands to the API. The API also can send those commands to several command processors of the GPU in parallel. There is no bottleneck but more work on the code side to deal with dependencies.

    • GHz

      Well I took the above diagram to mean that engineers will now have complete control of the GPU under DX12.

      And I found Brads original 🙂
      https://uploads.disquscdn.com/images/d386fbc46763545ff71b0ac188541988aec14f2ec5dc9845f2bf6158d6090f78.png

    • MrZweistein

      Well, that diagram is indeed correct but doesn’t really show why you have more control over the GPU.

      With DX12 the code itself has to sync the tasks, because a generic API cannot do this, due to the fact that it doesn’t know the logical order of the commands that the code requires. So the developers themselves need to implement those validation checks and sync mechanisms that were part of the APIs (DX11 & DX9) before. Much more work and more complexity on the dev’s side, but better efficiency as a result.

    • GHz

      So legit diagram but my interpretation of it is wrong? Gotcha. But it is accurate to say that DX12 can grant you more control of the GPU right?
      I mean I’m not as technical as you but that’s what you’re saying in the end. Am I correct? And thanks for your input. 🙂

    • MrZweistein

      The Wardell diagram was originally made to show the differences in draw-call output and why there was this significant increase of performance with DX12 over predecessor versions on PCs. Indirectly it explains what you wrote, but only if you understand the architecture as a whole and the consequences of it on the software side.

      The multi-threading paradigm of DX12 forces you to move responsibility for the validation/sync/memory-alloc mechanisms in the direction of the developer and his own code. It’s more like a “need” or “demand”, because it’s the only feasible solution to handle the parallelism of things happening now. As I said earlier, a general API that doesn’t know the exact logic of your code could not do that.

      Of course this doesn’t make coding easier, because you have to write additional code, but on the other hand debugging gets easier, because you no longer have the uncertainty of how the API handles things, which was always a reason for weird behavior.

      Which brings me back to your original point/question. Putting all this into one conclusion, you can say that if a developer has mastered DX12, or is using an engine that masters DX12 – I really mean master and not some half-assed implementation – there will be more time for optimization for the platforms (PC, Xbox).

  • Michael

    We all know xbox was the better engineered machine. Sony took cheap parts and threw in some GDDR5 and said they have more power but have yet to produce any proof to back it up. They can’t even put together resources for BC and other features xbox has been doing since day one.

  • Failz

    Good to hear DX12 is still looking promising. A free update like this is always welcome. Soon PC gaming will conquer the planet once again, if it hasn’t already, muhahaha.

  • Xbox one 2econd gpu unlocking

    We have the hardware for an expansion port for a plug-in GPU.
    We haven’t agreed on when we want the upgrade yet.
    But DX12 is for unifying our GPUs and will be used for this.
    Our layer GPUs
    Our future upgrade GPU
    Cloud GPU processing centre.
    All linked
    We are beta testing for our future system updates and upgrading

    And we game on +12

  • Godzilla

    Bottom line, it still won’t change the public’s perception of the Xbone

    • GHz

      Games will. Talking XB tech makes no sense w/o data to back it up. The fact is, even w/o DX12 in the picture, graphics on console will improve. That’s the natural order of things. What we know is that MSFT downplayed results of DX12 on PC until they had the data to prove their astronomical results. They downplayed how many XB1s in the cloud for every physical XB1, minimizing it to only 3. Today we know it can be up to 13 or more, thanks to the Crackdown 3 demo. And that was pre-alpha! AMD told us that all benefits that PC gets from DX12 will be the same on XB1. We have the slides to prove it. In the beginning Phil gave us a hint. He told us the XB1 graphics jump with DX12 will be like going from PDZ to Halo 4, before he started downplaying it because of the backlash he received. So we watch as these games improve. Games built specifically for XB1 do just that, surpassing their 3rd party cross-platform counterparts.

    • XbotMK1

      Xbox One won’t get a performance boost from DX12 like PC will, because Xbox One already uses low-level DX12 features. This was stated by Phil Spencer.

      If DX12 was going to boost Xbox One you would see the benefits by now, but it hasn’t. DX12 will only make it easy to port between Xbox One and PC. The cloud doesn’t make Xbox One stronger either. That was nothing but a lie Microsoft used as a marketing trick.

      The PS4 has a lower-level API of its own as well, but the PS4 is simply a stronger console, which is why nearly every game looks and runs better on the PS4.

    • GHz

      “Xbox One won’t get a performance boost from DX12 like PC will because Xbox One already uses low level DX12 features. ”

      Of course not like PC, but @ the same time the experts also told us that it has nothing to do with API lower level access. In fact they said that’s irrelevant.

      And this is what Phil really said…..

      “On the DX12 question, I was asked early on by people if DX12 is gonna dramatically change the graphics capabilities of Xbox One and I said it wouldn’t. DX12 makes it easier to do some of the things that Xbox One’s good at, which will be nice and you’ll see improvement in games that use DX12″

      Simply put, it means there will be a change & you’ll see improvements, but it won’t be dramatic. But what does Phil actually mean when he says you’ll see improvements that aren’t dramatic? Let’s go to the source.

      @nickomatic20 When you think about start of gen to end of gen teams learn a lot. DX12 will help as well, think PDZ to Halo 4.— Phil Spencer (@XboxP3) April 7, 2014

    • This Guy

      Wow, I almost “Liked” one of your comments until you went off the rails with the cloud ranting once again. So close.

    • Tech junkie

      It’s not our fault you are a retarded broke fool. Again take your meds

    • snOOziie

      You are probably right…. but you never know.

  • Tech junkie

    Hey look, the same thing I’ve been saying forever.
    Where’s my good buddy XbotMK1. Maybe he found his meds

  • Mr Xrat

    I remember Xbox gimps saying DX12 was supposed to be around by now. All those power differences were supposed to be nullified by now. That unlocked core a year ago was going to make all the difference by now, they said.

    Xgimps sure must be tired of being wrong by now.

    • snOOziie

      They still have six cores just for games. What it comes down to is the developers are lazy, and all the proof you need is PlayStation is only running 3 cores for games at a slower clock speed.

    • Mr Xrat

      Xgimps making stuff up again. The only difference between the CPUs is the clock speed and that’s made sod all difference.

    • snOOziie

      Oh ok newbie, you are wrong. We have known for a long time now, I think 3 years! The developers have been asking for access to the 4 cloud cores! Yeah that’s right, cloud cores…… lol The reason why they call them cloud cores is because they run at a totally different speed to the main 4, no one actually knows what speed…. that is the difference between a $3 billion chip and a $3 hundred million one. I hope you learnt something today… Please come again!

    • Mr Xrat

      There are no cloud cores, Xgimp. Lay off misterxmedia’s crack and join us in reality.

    • snOOziie

      For a fanboy you are not that knowledgeable about your brand! Let me guess, you’re 15….. This is old news, and Sony are the ones that call them cloud cores….

    • Mr Xrat

      Might want to try citing this crap, friend. 🙂

    • Truth™

      Reality, where the PC dominates and will use DX12 and Vulkan effectively to annihilate the consoles.

      Poor 900p PS4 owners. They were so sure of async compute and hUMA making all their games run at 1080p 60FPS, and then there are none. Not even Unsharted 4, which they botched so bad they fired the programming staff quietly so NeoGAF wouldn’t find out.

      It’s going to be a long generation. 720p PS2 games and 720p VR to defend after the Uncharted 900p disaster. Top kek

      https://uploads.disquscdn.com/images/d57eb59f107f4d8c10a25ff3cf0aaad5a23109a102cdf97421e04628dd417d09.jpg

    • red2k

      Yeah yeah…. Put the CPU of the PS4 at a faster clock speed and use more cores to add CPU power and you will kill your GPU bandwidth… That’s something surely you knew.

    • XbotMK1

      Microsoft fanboys are getting really desperate and defensive now that more and more developers are coming out and basically stating what we’ve already known, and what I’ve been stating for the past year about DX12 not helping Xbox One. Developers, myself, even Phil Spencer and basically everyone who isn’t a Microsoft fanboy have been saying the same thing for the past year, but Microsoft fanboys continue to stay in denial and cling to false hopes. The API inside the PS4 and Xbox One can change but the hardware does not. Anyone who uses a proprietary Microsoft API such as DirectX to justify their purchase of an Xbox One is either a corporate slave or a fanboy.

      The reason why these Microsoft fanboys cling to DX12 is so they can use it as an excuse or cop out whenever someone says the PS4 is stronger, for example…

      David to Mr. Microc0ck: “I recommend buying a PS4 because it is cheaper, has more games, and it is stronger so most games run better on it”

      Mr. Microc0ck: “No I bought an Xbox One because DX12 is going to be huge”

      That is the only reason Microsoft fanboys talk about DX12. It’s so they can use it as ammo, a social defense mechanism, and FUD. These Microsoft fanboys are just lying about DX12 because they want something to cling to. Microsoft fanboys need security and DX12 is what they use when they can’t accept facts. Microsoft fanboys feel guilty that they bought a console that is both weaker and more expensive than the PS4, so these Microsoft fanboys fabricate their own beliefs about DX12.

    • GHz

      “Microsoft fanboys are getting really desperate and defensive now that more and more developers are coming out and basically stating what we’ve already known, and what I’ve been stating the past year about DX12 not helping Xbox One.”

      @nickomatic20 When you think about start of gen to end of gen teams learn a lot. DX12 will help as well, think PDZ to Halo 4.— Phil Spencer (@XboxP3) April 7, 2014

      QuestoTuto if i had to pick 1 console it'd be the xbo.— Brad Wardell (@draginol) February 15, 2015
      What’s that you said about Phil & game engineers sharing your point of view? O_O

    • bewareofZ

      Windows developer picks Windows platform *shock*

    • This Guy

      “Microsoft fanboys are getting really desperate and defensive now”
      And yet PS4 fanboys are staying quiet about the whole thing right? Lol, sure they are.

    • Graeme Willy

      It’s possible for DX12 to close the gap, but only until the competition updates their API lol. Any software strides can be mimicked and/or bested. All Sony has to do is update their own API for any shortcomings (if any), and/or simply use Vulkan or Mantle, and we are back to square one. You can’t update software and pretend the competition isn’t going to retaliate, even if the competition doesn’t really even need to, on account of having the better hardware in the first place. Microsoft enhanced their ESRAM and gave more GPU to developers by axing the Kinect; Sony made more system memory available for gaming use. Microsoft unlocked their 7th core and so did Sony, in response. Pretty soon, Microsoft will unlock the remainder of their CUs, for the full 896 stream processors, and Sony will unlock their remaining CUs for the full 1200 stream processors. Then we are back to square one, which is about a 40% difference lol.
      Plain and simple, the PS4 uses an R9 270 and the Xbox uses an R7 260x. You can’t close the gap, but you can change the way software runs/how it is processed, and that’s only until the competition does the same.

    • red2k

      Yeah, that’s why we have Forza 6 locked at 1080p 60fps and TR using async compute, but in the meantime PS4 owners have only a last-gen remaster for this season. Do you need more proof than that… The best thing is, even if we don’t touch DX12, DX11.3 gives a good performance bonus, and when the engines properly use the new API there will be more gain for X1 performance. I remember when Sony fans said we have the most powerful console and at the same time can’t run a single next-gen exclusive locked at 1080p 60fps. That’s sad… Anyway, if someone thinks that the cloud or DX12 has no impact on Xbox One, it’s because of an absence of gray matter.

    • Mr Xrat

      If you’re gonna try and talk up your crap games, don’t forget the last gen textures, crap AA, crap filtering and other shortcuts to get to your 1080p60.

      The Xbone isn’t special. Your sekrit sauce won’t do anything.

  • MACDUFF

    Gamingbolt i hope gets paid a visit by isis wannabes like in france and kill everyone in your office

    • GHz

      -_________________-
      You’re an @ssh0le. We do NOT need gamers like you. Just STFU.

    • This Guy

      Wow if there’s ever been a use of the “Flag” button on a comment, it’s now. Amazing how tough people act when they hide behind a keyboard.

  • YOUDIEMOFO

    “Ability to UNIFY GPUs…….” Funny though, as there is nothing to unify on the Xbone anyways….. there is only one GPU.

    The point of DX12 is the ability to scale all of the workload across all of your hardware as if each graphics card could handle its own respective task. Before DX12 your four GPUs would have to run the exact same code for the exact same frame while using the same amount of memory, for less and less improvement in scaling the further you go into it with more and more GFX cards.

    Scaling is not a factor for the Xbone…… it’s simply raw power. All in due time; I assume we will see if this is all smoke or not.

  • Dirkster_Dude

    Fewer lines of code to do the same task makes the task more efficient, and if that means faster or prettier, then it will be faster or prettier. How much so is the question. The fact that there will be some improvement is a given, but not how much.

  • Graeme Willy

    DX12 is a major step forward. It brings DX back to its DX9-like roots, but with the efficiency of DX11… and then some. When DX11 was unveiled, it sought to make development easier by automating certain tedious functions in order to simplify development. It did well in the area of simplification. The thing is, however, some developers prefer to have complete control, for the sake of efficiency. It also takes resources to automate anything, so DX11 carried a bit more software overhead than it needed to. DX12 will forego these automations and give developers complete control, but at the cost of increased difficulty and tediousness in development, given the lack of automation. So, while seasoned developers prefer as much manual control as they can be given, others who preferred a simple means of development, such as independent developers, may soon find DX12 a little overwhelming. So it’s a win/lose… it just depends on the development studio.

    How will this help? Well, while you are ultimately limited by hardware, you can tighten up efficiency and make sure it’s performing at a near-theoretical 100% performance point… but no hardware is leveraged at 100%, despite claims you see by developers who state they are pushing something 100%. First, there’s a human factor, and software is overhead in itself. So, even if you did push said hardware 100%, an update to the tools and API/OS will come out that allows further optimization, and that theoretical margin just increased. The goal is to design software that hinders as little as possible.

    The second reason is that if you push the CPU and GPU too hard there are TDP concerns. If you run a benchmark on your PC’s CPU and/or GPU you are pushing those more than an actual game ever will… aside from titles like Crysis, which come close. So you have to be extra particular about cooling, perhaps ditching OEM cooling for something better, like water cooling, or massive heatsinks and fans. When DX12 first promised to aim at pushing GPUs 50% harder, chipmakers were concerned that existing GPUs would not be able to handle or sustain the temperatures and that a lot of GPUs would be fried, because GPUs, up until now, have been designed under the idea that no GPU is ever pushed to such extreme performance points, other than benchmarks. So, while you could theoretically run an R7 260x (Xbox GPU) at or even slightly past the default performance point of the PS4’s R9 270, including any software overclocks, I highly doubt that the Xbox One was designed to handle those thermal levels. The heat sinks aren’t that massive, for one… but I have always been perplexed as to why Microsoft left the GPU underclocked. An R7 260x should, by default, run at 1100 MHz… the same with the PS4’s GPU. There should really be no thermal difference between underclocking by 300 MHz versus the default speeds.

    However, DX12 is not the only API out there. The PS4 uses a great one too, and so while DX12 may seek to narrow any disparities, by any margin, that margin can and will grow again. Both sides are fine-tuning their hardware at this point. Look at the 360 and Rise of the Tomb Raider, for instance. It does better in certain places than the Xbox One. This is because the Xbox 360 is a technologically mature platform.


 
