Stardock CEO On Xbox One’s ‘Crummy’ Bandwidth Affecting Number of Draw Calls, More Talk On DX12

More details regarding draw calls and batches using DirectX 12.

Posted on February 9th, 2015, under News



Stardock CEO Brad Wardell has been rather vocal in his support for Microsoft’s upcoming API, DirectX 12. Once again, he has shared some rather interesting views on Twitter regarding DirectX 12, draw call batches, and how the Xbox One’s bandwidth affects the number of draw calls it can make.

Wardell talked about the number of batches that can be rendered using DX12 compared to DX9. According to him, DirectX 12 on current hardware can do over 75,000 batches, 15 times what DX9 could manage at its max (roughly 5,000). However, he further stated that such batch counts “depend on the scene”, and that the Naboo battle from Star Wars Episode I: The Phantom Menace could be done with 75K batches. Of course, as Wardell argues, it ultimately boils down to whether 75K batches can deliver movie-like quality, as there is a certain number of batches the hardware has to hit for realism.
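To see what those figures imply, here is a minimal sketch in C++ of why lower per-call driver overhead translates directly into a higher batch ceiling at 60fps. The per-batch CPU costs are back-derived purely from the 5,000 and 75,000 figures above; they are illustrative, not measured numbers:

```cpp
#include <cstdio>

// Minimal sketch: if every batch costs the CPU a fixed slice of the
// ~16.7 ms frame budget, the maximum batch count is just budget / cost.
// The per-batch costs below are back-derived from the 5,000 (DX9) and
// 75,000 (DX12) figures quoted above.
int main() {
    const double frame_budget_ms = 1000.0 / 60.0;             // ~16.7 ms at 60fps
    const double dx9_cost_ms     = frame_budget_ms / 5000.0;  // DX9 ceiling
    const double dx12_cost_ms    = frame_budget_ms / 75000.0; // DX12 ceiling

    std::printf("Implied per-batch CPU cost, DX9 : %.5f ms\n", dx9_cost_ms);
    std::printf("Implied per-batch CPU cost, DX12: %.5f ms\n", dx12_cost_ms);
    std::printf("Overhead reduction              : %.0fx\n",
                dx9_cost_ms / dx12_cost_ms);  // the "15x" from Wardell's tweet
    return 0;
}
```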

Wardell also talked about the Xbox One, stating that it should be able to do around 50,000 batches, comparing it to the Xbox 360, which is capable of rendering 5,000 using the Unreal Engine. But Wardell seemed unsure due to the console’s “crummy bandwidth”, although its architecture, with 8 CPU cores and a mid-range GPU, should otherwise be able to handle it.

What are your thoughts on Wardell’s opinions regarding the number of draw calls possible on the Xbox One? Let us know in the comments below.


Comments

  • rodney patrick

    Gamingbolt you’ll talk about the same thing but try to put a different dress on it…..smh

  • Starman

    Now where did you dig up this CEO of a basement INDIE COMPANY …. to come up with this cock-&-bull story …lol

    • DarthDiggler

      @disqus_tpOqHGvHlR:disqus

      I guess if you are going to use the word crummy, it had better be about the PS4, right?

      God forbid anyone have a contrary opinion to yours and express it on Twitter!

      Which company are you CEO of BTW? Cock-&-Bull Enterprises?

  • trfe

    Click bait article. Dude was overly positive and headline paints it as negative. This site sucks.

    • DarthDiggler

      @trfe:disqus

      “Crummy” was a quote. 🙂 He was praising DX12 and dissing XBO bandwidth. You can read English, right?

    • d0x360

      Mmm yes, except the bandwidth isn’t crummy. Less than PS4? Sure. Crummy? No. Poor choice of words.

    • corvusmd

      Well, even saying “less than PS4” isn’t always true. PS4 achieving max bandwidth just means that the GPU is going 100% for 100% of the time. X1 achieving max bandwidth means devs are properly using eSRAM in addition to the X1’s GPU…in which case, if they do, the X1 has a much higher bandwidth.

      Although in his defense, maybe he just thinks all console bandwidth is crummy…he may be thinking compared to high-end PCs and/or what is needed for movie-like effects. This quote sounds like it’s a bit out of context (Twitter quotes usually leave a lot of room for imagination).

    • d0x360

      Yes, you are correct. With full hardware utilization the X1 is capable of higher total bandwidth and lower latency, BUT most games are just ports from PC and PS4 and don’t make proper use of the esram. They pretty much ignore it or outright use it improperly. Some day engines will ditch cross-gen support and be better set up to make use of each specific console, but we aren’t there yet. Probably won’t be till 2016, unfortunately. Hopefully Microsoft rolls esram usage tricks from forza 6 and halo 5 right into the SDK so any dev can use them with little effort.

    • corvusmd

      Word 🙂 This is actually the aspect of DX12 that has me intrigued, not directly the API aspects…it’s that DX12 will apparently make using eSRAM much easier and more enticing for devs…and provide more tools and options when using it.

    • d0x360

      It certainly can’t hurt. The best thing about dx12 is it will change the way games are made for everything. It will allow way more objects, way more detail, etc., and the change in coding style will mean improvements to games regardless of platform. The ps4 doesn’t support dx12, but it has a similar api, as all consoles always have; due to the general way games are made, though, the benefits weren’t often seen, but they will be now.

      The downside to dx12 is consoles always had that low level hardware access which allowed a console to punch waaaay above its weight compared to PC. With that advantage gone PC is set to leave consoles in the dust by a massive margin without needing nvidia titans.
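A toy model of the coding-style change d0x360 describes: under DX12, worker threads can each record their own command list in parallel, with one cheap submission point at the end, whereas DX9/DX11-era drivers effectively serialized submission on a single thread. The sketch below is plain portable C++ (std::thread and vectors standing in for the real D3D12 command-list objects), not actual D3D12 code:

```cpp
#include <cstdio>
#include <thread>
#include <vector>

// Stand-ins for ID3D12GraphicsCommandList and its recorded commands.
struct Command { int draw_id; };
using CommandList = std::vector<Command>;

int main() {
    constexpr int kThreads = 4;
    constexpr int kDrawsPerThread = 5;
    std::vector<CommandList> lists(kThreads);

    // Each worker records its own command list in parallel -- the part
    // DX12 makes cheap and thread-friendly.
    std::vector<std::thread> workers;
    for (int t = 0; t < kThreads; ++t) {
        workers.emplace_back([t, &lists] {
            for (int d = 0; d < kDrawsPerThread; ++d)
                lists[t].push_back({t * kDrawsPerThread + d});
        });
    }
    for (auto& w : workers) w.join();

    // One inexpensive submission point, analogous to ExecuteCommandLists.
    int submitted = 0;
    for (const auto& list : lists)
        submitted += static_cast<int>(list.size());
    std::printf("Submitted %d draws recorded across %d threads\n",
                submitted, kThreads);
    return 0;
}
```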

    • DarthDiggler

      @d0x360:disqus

      The best thing about dx12 is it will change the way games are made for everything…….

      That quote is so obtuse you could apply it to ANY DirectX update and it would still be valid. 🙂

      The downside to dx12 is consoles always had that low level hardware access which allowed a console to punch waaaay above its weight compared to PC. With that advantage gone PC is set to leave consoles in the dust by a massive margin without needing nvidia titans.

      Wow you are just full of MS press release bullet-points today. 🙂

      How about an original thought? Are you ready?

      PC isn’t at quite the same “low level” a console is. There are still a great many functions inside of a PC that, even when they remain “idle”, may still be sipping some resources or memory. That doesn’t exist on a console. All functions are dedicated to its design. Which gives it quite a leg up on the general-purpose PC.

      The single specification that consoles employ provides developers a great deal of uniformity. So they don’t have to target a “range of hardware”; they know the hardware and specifically CODE for that hardware. Which is why games that come late in the generation tend to surprise people with how good they look even when the consoles are long in the tooth. Developers have matured their tools and can sometimes replicate popular hardware features of the day via software. That sort of development just NEVER happens on the PC; it doesn’t make economic sense.

      Not trying to diminish the PC, just pointing out that due to its general nature it’s just as suited to play games as it is to do word processing.

    • bardock5151

      All about that damage control, aren’t you. Just shut up. You keep banging on about devs who supposedly have bashed the X1, yet you won’t put up any sources. Also, show me some devs who work with DX12 and DX11 who say DX12 will have no impact.

    • TheWayItsMeantToBePlayed

      You mean how games like Alien Isolation, Mordor and Far Cry are easily running 1080P 60FPS on PC with higher settings as part of its “general” nature.

      Please, continue. I want to hear more about how all this will “Never” happen on PC even when it’s more advanced than consoles now.

    • DarthDiggler

      @corvusmd:disqus

      DX12 will not be a ground-breaking event on XBOX ONE. It will bring some improvements, but most of the improvements will be for the PC. Of course, MS can muddy the waters by saying stuff about DX12 and not specifying the platform.

      IMHO the whole reason XBL is coming to the PC and PC Streaming is coming to the Xbox is so they can show PC screenshots in the living room and further fool their audience.

    • corvusmd

      I’m not sure why you are saying this to me, but ok

    • DarthDiggler

      @d0x360:disqus

      most games are just ports from PC and PS4 and don’t make proper use of the esram. They pretty much ignore it or outright use it improperly.

      You don’t have any evidence for this statement at all. Unless you are privy to the inner workings of the games being developed on XBOX ONE, you are just speculating.

      It’s not like developer diaries are going to say things like “Oh we didn’t feel like coding for ESRAM so we got lazy and just pushed the game through looking like poo on XBOX ONE”.

      Seriously there is no way you could prove that unless you were the guy at XBOX certifying all the games for XBOX ONE.

      Some day engines will ditch cross-gen support and be better set up to make use of each specific console, but we aren’t there yet.

      No, they won’t, because developers don’t want to develop a game more than once if they can help it. Just because a game engine can handle multiple platforms doesn’t mean there are ZERO optimizations for those platforms. Granted, some engines may have better support than others.

      Hopefully Microsoft rolls esram usage tricks from forza 6 and halo 5 right into the SDK so any dev can use them with little effort.

      Yeah, 32MB of memory is going to make DDR3 memory faster than GDDR5. Google Deep Silver’s response to ESRAM on the XBONE. They said the bandwidth wasn’t wide enough for 1080p 60fps gaming.

      I am sure MS will have great things to say about the ESRAM at all times, but clearly what they say is different from what developers say, and you can’t just blame the developers for saying it.

    • DarthDiggler

      @corvusmd:disqus

      Well, even saying “less than PS4” isn’t always true. PS4 achieving max bandwidth just means that the GPU is going 100% for 100% of the time. X1 achieving max bandwidth means devs are properly using eSRAM in addition to the X1’s GPU…in which case, if they do, the X1 has a much higher bandwidth.

      Please cite a source for this. You are suggesting that the XBONE designers got their system perfectly balanced and that the PS4 designers did not. Both systems have been praised for their balance.

      How did MS fool all of you XBOX fans into thinking that 32 megabytes (yes, MEGA not GIGA) is some sort of silver bullet in game design? I am not saying it is useless, but benchmarks show that DDR3 / ESRAM is slower (and more complex to code for) than GDDR5.

      Although in his defense, maybe he just thinks all console bandwidth is crummy…he may be thinking compared to high-end PCs and/or what is needed for movie-like effects. This quote sounds like it’s a bit out of context (Twitter quotes usually leave a lot of room for imagination).

      He used 124 characters in that twitter post. He had 16 characters left. He could have said XB1 & PS4 and still had plenty of head room. He could have said Consoles instead of XB1 and still had plenty of room. You are grasping at straws with this argument.

    • corvusmd

      “Please cite a source for this. You are suggesting that the XBONE designers got their system perfectly balanced and that the PS4 designers did not. Both systems have been praised for their balance.

      How did MS fool all of you XBOX fans into thinking that 32 megabytes (yes, MEGA not GIGA) is some sort of silver bullet in game design? I am not saying it is useless, but benchmarks show that DDR3 / ESRAM is slower (and more complex to code for) than GDDR5.”

      How did you have time to count that he used 124 characters in a Twitter quote, but not have enough time for a 10-second Google search? Then you went on to ramble about stuff that you were ASSUMING I was saying that I didn’t say…..sounds like you are in full-blown fear mode trying to run defense force. It’s failing miserably.

      “He used 124 characters in that twitter post. He had 16 characters left. He could have said XB1 & PS4 and still had plenty of head room. He could have said Consoles instead of XB1 and still had plenty of room. You are grasping at straws with this argument.”

      When being asked a question about X1 and DX12 and how it will relate to creating Star Wars or Lord of the Rings-like visuals…why would he stop and talk about the effects of DX12 on PS4? Also, the definition of “crummy” here is a little bit taken out of context. It’s not listed in this article, but the bandwidth he was talking about was JUST the GPU without eSRAM. It also does not rule out that he is talking just in regards to making that level of visuals and that he DOESN’T think the same about PS4…esp since X1 has a higher bandwidth with the inclusion of the ultra-fast 32MB of eSRAM, whose sole purpose is to move information quickly.

    • TheWayItsMeantToBePlayed

      Same PR they used convincing you that slow, laggy and bottlenecked 8GB GDDR5 would produce the same results as PC ultra.

      It’s one thing to chastise the other fanbase for codswallop. It’s another to parrot your own MisterCernyMedia antics like GPGPU (which requires a much larger GPU than either console has).

    • DarthDiggler

      @d0x360:disqus

      This isn’t the first developer to bad-mouth MS on their design of the XBONE. Many have remarked on the complexities of ESRAM / DDR3 memory vs full GDDR5. It is Brad Wardell’s opinion that the XBONE bandwidth is crummy. Given that what he says isn’t exactly a unique opinion, there is likely a great deal of truth in what he was saying.

      Good gawd you XBONERS sure are sensitive this generation.

    • corvusmd

      Except he isn’t exactly badmouthing them for sure; it seems like you’re reading into this MUCH more than you need to.

    • corvusmd

      Yeah, after reading it, it was a pretty retarded choice for a headline. He was painting a picture of good things that DX12 would help achieve, and even specifically said that it would do the same for X1 to a degree. Then he even specifically said he wasn’t sure that bandwidth would even be an issue…then on Twitter (where being descriptive is limited) he did say X1 bandwidth is crummy…but compared to what? High-end PCs/what is needed for movie effects? Compared to his ideal situation? (This would also imply that he thinks all console bandwidth is crummy, seeing as the X1 has the ability to have the highest bandwidth.)

      …yet out of all the positive comments, the headline was the ONE THING that could even be taken (maybe even out of context) as something negative about X1…are we even trying to pretend there isn’t a bias anymore?

    • JerkDaNERD7

      Bias or not, these are positive things for developers, at least among their own conversations. It’s just sad, but telling, that these types of articles are even prevalent.

  • Dirkster_Dude

    We have heard the same thing over and over about what DX12 will do for the XB1 and PC. On games that use DX12, it will improve what is possible using fewer resources. This can only be a good thing regardless of how much improvement it provides.

    • Mark

      Boom

  • red2k

    To summarize… Xbox One will see an improvement of 300 to 500% with DDR3. GB had the opportunity to make an article from the original source http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm . But again… they try to interpret a Twitter message…. typical Sony fanboys. ESRAM is an additional tool, and the Microsoft engineers have already talked about what they want to achieve with the combo.

    • otherZinc

      Well said.

    • DarthDiggler

      @otherZinc:disqus

      The problem is the article he cites is about DX12 on the PC, not the XBONE. So it’s a misrepresentation of the facts. I can’t even find the 300% – 500% figures he quoted, so it may be well said, but he didn’t provide any evidence to back up those numbers.

    • Guest

      No, it’s not well said! It’s a misrepresentation of what the article actually says. Nowhere in that AnandTech article does it state that the X1 will get those gains. He’s just inserting his hopes and wishes into it and you’re stupidly falling for it.

    • DarthDiggler

      @red2k:disqus

      Xbox One will see an improvement of 300 to 500% with DDR3. GB had the opportunity to make an article from the original source http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/

      Where did you get that information? You have cited an AnandTech article which doesn’t mention the word XBOX in any of its 9 pages. That article is all about what DX12 will do for the PC. Xbox One isn’t a PC; it has a PC processor and GPU, the motherboard

      This business about DX12 and the XBONE has got to stop. Everything that I have read from developers has been that DX12 will make some marginal improvements but will not give the XBOX ONE parity with the PS4. Feel free to link me to any exact quotes from developers that suggest otherwise. Any improvements quoted at 300% to 500% for the XBONE are not based on any benchmarks and sound like a bullet point for a press release.

    • John Doe

      DX12 won’t help the Xbox One reach the PS4. When Wardell refers to max batch counts and performance boosts, he is referring to specific things, not overall performance. The Xbox One won’t get an overall performance boost. DX12 will help 3rd party devs use techniques that first party devs already use, as well as new techniques compared to DX9.

      Also devs aren’t going to use DX12 unless you can port from it.

    • bardock5151

      Devs are going to use DX12 because you can port from it. Do some research.

    • Guest

      Yeah, from PC to X1 (or vice versa), but how about to PS4, PS3, Vita, Wii U, Android, OS X, iOS, and Linux? They’re all OpenGL-derived. Do some research.

    • bardock5151

      Do some research. The big 4 (who gives a f&ck about the Vita, Sony sure doesn’t) are the Xbox 360, X1, PC and PS4: three of them share the same basic architecture, while another three share the same environment.

      The next point is game engines like UE4, Unity, CryEngine and so on all have built-in support for OpenGL and DX, therefore negating any smaller studio’s need to get too technical. Do some research.

    • Fweds

      Notice “John Doe” using his multiple Disqus accounts to upvote his other username’s comments.

    • Fweds

      DirectX 12 boosts frames per second by 600% on AMD hardware

      “An early look at Microsoft’s DirectX 12 graphics API has shown that AMD graphics cards – and potentially Xbox One could be in line for a significant performance boost”

      http://www.mcvuk.com/news/read/directx-12-boosts-frames-per-second-by-600-on-amd-hardware/0144973

    • TheWayItsMeantToBePlayed

      You seem paranoid about this tech. Shouldn’t you be more worried about Sony’s lack of AF or other console bottlenecks instead?

      I mean, if what Wardell is saying is true, the PS4 and Xbone will be obsolete by the end of the year anyway.

    • Guest

      And you seem to care too much for a supposed PC fanboy. Shouldn’t you just care that it’s going to improve the vastly unoptimized PC? You’re just a bot.

    • Guest

      The eSRAM is not “an additional tool”, it’s a crutch, a bottleneck! It got put there because MS (you know, the big bad rich company) was too cheap to put in GDDR5: they thought there would not be enough 8GB densities in time for release and were afraid it would end up being too expensive if that didn’t happen, so they instead opted to put in the minuscule 32MB of eSRAM to compensate. Sony was considering doing the same thing (but with EDRAM that would have had over 1TB of bandwidth, cuz it was cheaper) but also decided against it, cuz they knew it would cause devs problems. You revisionist fanboys need to stop making stuff up and passing it off as fact when it’s not. Other even dumber and more desperate fanboys believe the ish and then start parroting it around like it’s true, and then the echo chamber starts and dumb fanboys think it’s fact.

    • Fweds

      NOTICE “JOHN DOE” IS USING HIS MULTIPLE DISQUS ACCOUNTS TO UPVOTE HIS AND OTHERS’ PS4-BIASED COMMENTS.

  • Illusive Man

    Sounds like he was not factoring in ESRAM. When that is factored in, real-world bandwidth using concurrent read/write speeds of ESRAM+DDR3 is pretty similar to the PS4’s real-world bandwidth; in fact, it is slightly higher. Xbox One real-world bandwidth is around 150 GB/s, and real-world numbers for the PS4 as stated by Sony are around 140 GB/s. The PS4’s, however, is easier to tap into since it’s just one unified pool of high-speed memory.

    josh hanlon @23hanlon
    @draginol so you still think 50k is possible on X1 even with the bandwidth prob but shouldn’t the ESRAM help out with the x1 bandwidth
    Brad Wardell @draginol
    @23hanlon was just talking about that with someone. Not sure if the bandwidth issue will be an issue or not.
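As a sanity check on the numbers in the comment above: the commonly published peak figures are 68.3 GB/s for the Xbox One’s DDR3, up to 204 GB/s for its 32 MB of eSRAM (simultaneous read+write), and 176 GB/s for the PS4’s GDDR5, so the combined X1 figure does exceed the PS4’s on paper. A quick sketch of the arithmetic (peak figures only; real-world throughput is lower, as the comment notes):

```cpp
#include <cstdio>

int main() {
    // Commonly cited peak figures (GB/s); real-world numbers are lower.
    const double ddr3_peak  = 68.3;   // Xbox One, 8 GB DDR3-2133 on a 256-bit bus
    const double esram_peak = 204.0;  // Xbox One, 32 MB eSRAM, simultaneous read+write
    const double gddr5_peak = 176.0;  // PS4, 8 GB GDDR5

    // The combined X1 figure only applies when eSRAM traffic actually
    // overlaps DDR3 traffic -- the "if devs use it properly" caveat
    // running through this whole thread.
    std::printf("X1 combined peak: %.1f GB/s\n", ddr3_peak + esram_peak);
    std::printf("PS4 peak        : %.1f GB/s\n", gddr5_peak);
    return 0;
}
```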

    • d0x360

      The problem today is nobody…well, almost nobody…is actually using the esram effectively. Right now the X1 gets ports from PS4 and PC, and they are using the esram to hold the frame buffer, which is why we see sub-1080p games. Were developers to actually take the time and use a little finesse, the difference would be negligible. The PS4 will always have an edge, there is no question there, but until developers actually make use of the X1’s unique capabilities we will never see its actual potential.

      I’m hoping any techniques used in Halo 5 and Forza 6…or any other first party title…just get rolled into the SDK so developers have ready-made resources and paths of optimization. I also hope Epic makes some changes in UE4 to take advantage of the esram.
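The footprint arithmetic behind the frame-buffer point in the comment above is easy to verify. A sketch, with illustrative render-target counts and formats (real engines vary), of why a full 1080p target set overflows 32 MB of eSRAM:

```cpp
#include <cstdio>

// How much eSRAM a render target eats: width x height x bytes per pixel.
static double target_mib(int w, int h, int bytes_per_pixel) {
    return static_cast<double>(w) * h * bytes_per_pixel / (1024.0 * 1024.0);
}

int main() {
    const double budget_mib = 32.0;                // total eSRAM

    double color = target_mib(1920, 1080, 4);      // RGBA8 back buffer, ~7.9 MiB
    double depth = target_mib(1920, 1080, 4);      // D24S8 depth/stencil, ~7.9 MiB
    double gbuf  = 3 * target_mib(1920, 1080, 4);  // modest 3-target G-buffer

    std::printf("Color + depth         : %.1f MiB of %.0f MiB\n",
                color + depth, budget_mib);
    std::printf("Plus 3-target G-buffer: %.1f MiB -- over budget\n",
                color + depth + gbuf);
    // Hence the common compromises: drop to ~900p, or spill some
    // targets out to the slower DDR3 pool.
    return 0;
}
```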

  • Brad’s a good guy who has shed a lot of info on the CPU side of DX12-enhanced performance. He’s also the first to admit that he doesn’t know a lot about the GPU features that DX12 will introduce to development. So in other words, GamingBolt, there’s no reason to write an article about bandwidth when GDC is in a few weeks, where Microsoft will shed more light on the Xbox and its real-time performance enhancements coming with DirectX 12.

    • DarthDiggler

      @deeboy17:disqus

      Oh yea MS is always very honest and upfront in their press events. LOL

      Need I remind you of Project Natal, Milo, Illumiroom? 🙂

      I will admit HoloLens looks kind of cool, but I wonder what kind of cuts the capabilities get prior to retail release, because no one can afford it in its prototype form?

    • corvusmd

      What’s wrong with tech demos…esp if they say up front they are tech demos? That doesn’t make them dishonest in the slightest. Saying you weren’t hacked when you know your network was, or that your game is native 1080p when you know it’s not…THAT is being dishonest.

    • Project Natal is Kinect 2.0. HoloLens is Illumiroom and Kinect combined, and better than both, although I do think Kinect will be compatible with HoloLens, judging by the hologram they displayed at their Windows 10 press conference of the executive making himself a holographic message. Milo was a project cancelled by Lionhead, so that’s not Microsoft’s fault. The icing on the cake is that you are referring to the old Microsoft, not the new-age one currently.

    • TheWayItsMeantToBePlayed

      Toy Story Graphics. “Power Of The Cell”.

      You’re welcome.

  • corvusmd

    So several comments about how DX12 will directly improve the X1…yet the title of the article is “X1 has crummy bandwidth”…and I fell for the clickbait.

    My question, though: one of the main aspects of DX12 that is so exciting is actually how it makes eSRAM easier to take advantage of, and allows more tools for it that weren’t there before…thus allowing devs to take some load off GPU bandwidth. So if that is the case…eSRAM bandwidth destroys all other console bandwidth levels…so combining the API aspects with the “tools” aspects, wouldn’t it start to work itself out? …maybe that’s why he said he wasn’t totally sure if bandwidth would even be an issue.

  • bleedsoe9mm

    nice to hear a dev so stoked about where the industry is going

  • marc Berry

    I will go with AMD on this one, it’s their GPU and CPU. No one said it will do this overnight: 50K now, 100K later. One more time, he left out the ESRAM, just like Ubisoft did.
    http://image.slidesharecdn.com/introductiontodx12-ivannevraev-140609111737-phpapp02/95/introduction-to-direct-3d-12-by-ivan-nevraev-31-1024.jpg?cb=1402332303

    That’s 100k! 50k now 100k later.
    Full AMD PDF here:
    http://www.slideshare.net/DevCentralAMD/introduction-to-dx12-by-ivan-nevraev

  • Gentleman Lover

    Gotta love this site. This guy says “crummy” in one response and the author throws it in as a headline.
    Straight to N4G you go…you’re welcome.

    • DarthDiggler

      @disqus_wNOY6bPJom:disqus

      You mad bro? 🙂

  • XanderZane41

    This is about DX12 increasing the performance of the XB1 by 200-300%, not about crummy bandwidth, which really isn’t all that crummy anyway. Amazing how this site focused more on the negative part of what Brad says and not the positive.

    • DarthDiggler

      @XanderZane41:disqus

      You are living in a fantasy; DX12 will not deliver a “200%-300%” performance increase on the XBONE. It will do great things for the PC, but they are even being a little coy about the improvements. Notice how they don’t give you performance differences between DX11 and DX12? They’re comparing it to DX9. I would hope that DX12 could perform much better than DX9, which was released in 2002!!!! 🙂

      The bandwidth of the XBONE is kind of crummy; Deep Silver remarked the ESRAM bandwidth is not high enough for 1080p 60fps.

      32MB of ESRAM is not a silver bullet, guys. You can keep wishing it was, but it is a far cry from developers unlocking the power of the SPEs in the PS3’s Cell processor. The Wii, Wii U and X360 all have ESRAM; developers should have a handle on it by now.

    • bardock5151

      Did you notice the benchmarks done by anandtech between DX12, 11 and mantle? http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm

      Notice how they aren’t being coy about the 300-500% increases in performance.

      1080/60 is possible with esram. The only platform that isn’t struggling to hit a solid 1080/60 is the PC.

    • Jecht_Sin

      Yeah, on PC indeed. The one time the article mentioned the consoles was to state this: “instead gave developers a means to access the GPU in a low-level, game console-like manner”.

    • bardock5151

      He incorrectly stated that there were no DX12 vs DX11 benchmarks.

      He also incorrectly stated that they are being coy about the gains DX12 will bring. 300-500% was a huge win against the naysayers out there, and if anything they have been very reserved as to the expected performance increase. All they have been repeating is 40%, 40%, 40%, yet here there is a 300-500% increase in performance over DX11.

      Do consoles provide 300-500% better performance than like-for-like hardware on PC running DX11? No, they don’t. At most 150% on like-for-like…. last generation. This gen the consoles are struggling to hit their counterparts’ performance, let alone even 50% better.

      Who knows, maybe Phil is also being very reserved as to what kind of impact DX12 will have on the X1 too.

      Did you also notice the command processor became the bottleneck? It’s a complete reversal: no longer is the GPU suffering from a weak CPU or poor CPU optimisation. A single portion of the standard GPU architecture has held it back. I wonder what AMD/NVIDIA and Microsoft have done to solve this issue while attempting to keep the system balanced. Just a thought.

    • TheWayItsMeantToBePlayed

      Wii, Wii U and 360 were EDRAM. Significant difference. Also, the Cell was a server processor jammed into a games console and never had any potential in the first place, so you can knock off that MisterCernyMedia garbage as well; the PS3 produced most of its games sub-720P.

      Funny how devs stopped talking about the benefits of “8GB GDDR5” as soon as it became apparent it wouldn’t help the PS4 hit 1080P 60FPS either, and games that did, like MGS GZ, were on medium-low PC settings. The more you learn. Instead you’re parroting the same 1080p PR Sony makes, like you are claiming MS does.

    • Jecht_Sin

      I’d love to remind you that there are games on PS3 running at 1080p…

    • Northsidelunatic

      Why do RPGs suck so bad when they’re on PS3? I know Wii games like Xenoblade that have game mechanics in them that make a PS3 game seem like a PS2 game.

    • efnet

      There are no games running at native 1080p on either the PS3 or 360.

  • Guest

    Sounds like this guy is selling you fools a ton of BS. Good luck with all the major letdown this is going to bring. The X1 and PS4 are really weak systems, deal with it. And the PC is only going to get a lot better because of it (DX12), so the gap between the two will only widen. MS fanboys this gen are every bit as pathetic as Nintendo fanboys. But I guess that’s what happens when you’re the weakest next-gen system (well, not including the Wii U, that poor excuse of a system doesn’t even count!)

  • Guest

    Wardell is obviously an MS plant, a secret undercover MS PR guy. And MS is marketing DX12 hard! It’s all they (and their fanboys) have talked about for the last year. Talk about PR. Every new DX iteration is hyped up like this and they all let down. But suckers just keep falling for it. But then again, there’s a new sucker born every day.

  • Berin Meha

    Error in the article – DX9 can do 5k batches. It says 15 times (x) in his Tweet: “Dx12 on current HW can do over 75k batches. 15x where dx9 maxed.”

  • Mark

    Just a quick summary of the YouTube TIC podcast with Brad on DX12 and the X1, in light of this “crummy” statement lol.

    1) He thinks the X1 can “easily” bang out 1080p 60fps
    2) MOST developers don’t optimise for the ESRAM for multi-plats, due to release schedules
    3) Here’s the boom; Brad was asked if the X1 would see the performance gains like PC in the Star Swarm demo (300-500% gain in fps), to which he said “Yes I think it would”. Boom goes the dynamite!

    Also, this was missed by people wondering whether he’s referring to optimising for the ESRAM or not;

    josh hanlon @23hanlon
    @draginol so you still think 50k is possible on X1 even with the bandwidth prob but shouldn’t the ESRAM help out with the x1 bandwidth
    Brad Wardell @draginol
    @23hanlon was just talking about that with someone. Not sure if the bandwidth issue will be an issue or not.

    Sounds like he’s UNSURE here.


