Microsoft Looking To Improve Xbox One’s DRAM Performance

Microsoft seeking staff to improve the Xbox One’s performance further.

Posted By | On 10th, Jan. 2015 Under News | Follow This Author @GamingBoltTweet


For recap purposes, the Xbox One’s Memory Management Unit features three memory paths that run through three different places and perform three different tasks, working independently of each other. The first two are the coherent and non-coherent DRAM paths: CPU cache-coherent accesses run at 30 GB/s, while non-coherent accesses run at 68 GB/s. For those who are unaware, coherency is the consistency of reads and writes to the same memory location; it resolves the problem of stale data caused by caches.

The last type of memory is the eSRAM, which is divided into four blocks of 8 MB and has a bandwidth of 204 GB/s. The eSRAM also has much lower latency than DRAM. With the basic introduction out of the way, let us focus on what this article is about. Microsoft may be improving the performance of the Xbox One’s DRAM, according to a new job listing posted on the company’s official careers page under the Xbox Gaming division.
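
The figures above can be summarized in a short sketch. The numbers are the publicly quoted peaks from the recap, not measured values:

```python
# Publicly quoted Xbox One memory figures from the recap above (peaks, not
# measured throughput).

ESRAM_BLOCKS = 4
ESRAM_BLOCK_MB = 8
ESRAM_PEAK_GBPS = 204        # combined read+write peak

DRAM_COHERENT_GBPS = 30      # CPU cache-coherent path
DRAM_NONCOHERENT_GBPS = 68   # non-coherent path

esram_total_mb = ESRAM_BLOCKS * ESRAM_BLOCK_MB
print(f"eSRAM capacity: {esram_total_mb} MB at {ESRAM_PEAK_GBPS} GB/s peak")
print(f"DRAM paths: {DRAM_COHERENT_GBPS} GB/s coherent, "
      f"{DRAM_NONCOHERENT_GBPS} GB/s non-coherent")
```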

The company is looking for a Console Memory Validation Engineer who will take care of “the development, test and qualification of DRAM memory products functioning within Microsoft products.” Furthermore, the candidate will need to “evaluate different solution options for performance, functionality, stability, cost and risk for the memory subsystem within the platform.”

The candidate should have prior experience in “debugging, validating and bringing up DRAM memory subsystems.” Recently, Microsoft also unlocked an extra CPU core on the Xbox One, so it’s intriguing to see the company pushing to optimize the console’s performance even further. During the last quarter of 2014, the performance gap between the PS4 and Xbox One did seem to narrow. Will Microsoft close the gap even further in 2015? That remains to be seen.


  • d0x360

    So the idea would be in two areas. First they would pore through the system and see if there are any software tweaks that can be made to improve efficiency. Second they might be looking into overclocking the DDR3, in which case they need to make sure thermal limits aren’t hit, but also see how far they can push the clock without introducing stability issues in games.

    Interesting. Microsoft had a similar job position on the OG Xbox. They used RAM from two separate suppliers. This caused performance issues in about 30% of the Xbox consoles because one supplier made slightly more efficient RAM. That RAM was also what was in dev kits, so when testing games it might have run perfectly smooth on one Xbox but then have a lower frame rate on others. That’s why if you browsed forums about original Xbox games you would always see a small number of posts with people saying the frame rate of game X was inconsistent or bad but then other people had no issues at all. Quite interesting how a difference that on paper doesn’t even exist can affect performance so much.


      Interesting to say the least, I’d say as well…

    • Guest

      Ugh! You again, do you live on these forums? Anyways, I had the original Xbox and never knew that (if it’s even true). But all this really means is that they want somebody to find them cheaper replacement RAM that won’t cause any problems. Both you and Gamingdolt are reading way too much into this. But it’s nice to dream.

    • d0x360

      Yes, how terrible a person, to be a regular at a website. Oooh, the terror…

      Is it your goal in life to ram that Sony **** as far in as possible? Do you post about people reading into things too much in every story about Microsoft? How exactly is a comment about two possible scenarios reading too much into something?

      Here’s my post condensed down so we can analyse the depth together… ready?
      Microsoft is probably looking to optimize software and overclock ram.

      Wow! You are so right I definitely went way crazy in depth with speculation!

      It’s always easy to tell when someone has zero knowledge of hardware and software. You fit that bill nicely. Using just a teeny tiny nugget of common sense, you can make an educated guess that they are aiming for performance gains, not a new supplier. You don’t post new job listings to look for suppliers. You are reading into my OG Xbox comment TOO FAR. The issue that happened then couldn’t happen today. Switching suppliers wouldn’t cause any performance loss or gain because, unlike the RAM in the original Xbox, there are now set standards with minimum performance thresholds. There weren’t back then; of course, anyone with basic knowledge of hardware would have known that.

      Ty for making me waste 2 min of my life.

    • theCharlieSheen2

      This guy doesn’t get irony apparently.

    • Will

      “Quite interesting how a difference that on paper doesn’t even exist can affect performance so much.”

      Especially in an APU, since the entire system is dependent on the same pool of memory.

  • coolgamer

    all these PR stunts, announcements, etc…
    by the time one xbox one game fully uses ALL the directx12 features, we will be in 2017-2018…

    just look at the pc scene… we’re in 2015… how many PC games TRULY use 100% of directx11’s features, fully use all the ram in the computer (16 or 32GB), fully use each of the cpu cores, hyperthreaded cores, at 100%, fully use the 6-12GB of gddr5 available on the graphics cards, and use 70-100GB of ultra-highres, perfectly life-like photorealistic textures stored on the hard drive?

    how many PC games do that in 2015?
    just a couple of years ago, many games would only use 1 cpu core, and many times, would solely rely on the gpu, to do all the stuff, and gamers could have a 30GHz cpu, the game wouldn’t run faster or look better, and everything would be calculated on the gpu.

    any possible improvement will give 1 or 2 little %, in the best cases.

    even microsoft engineers claimed that with the esram, IN LAB, UNDER OPTIMUM CONDITIONS, with perfect 50% read-50% write code, they could achieve around 150-160 GB/s.

    no game ever does a perfect 50/50 split. that’s why in reality, devs maybe have a permanent 100-120 GB/s available.
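
The point above can be sketched with a toy model. The per-direction limit below is a pure assumption for illustration (half the quoted 204 GB/s peak), not Microsoft's figure:

```python
# Toy model of why the eSRAM's quoted 204 GB/s peak is rarely reached.
# Assumption (illustration only): reads and writes each have their own
# ~102 GB/s path, so the full 204 GB/s needs a perfect 50/50 read/write mix.

PER_DIRECTION_GBPS = 204 / 2  # hypothetical per-direction limit

def effective_bandwidth(read_fraction: float) -> float:
    """Bandwidth achievable when the busier direction saturates first."""
    busier = max(read_fraction, 1.0 - read_fraction)
    return PER_DIRECTION_GBPS / busier

for r in (0.5, 0.7, 1.0):
    print(f"{int(r * 100)}% reads -> {effective_bandwidth(r):.0f} GB/s")
```

Under this sketch, a 70/30 read/write mix lands in the mid-140s GB/s range, which is roughly where the real-code figures quoted in the thread sit.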

    it’s almost funny, when a new article pops up claiming the esram will be optimized, and suddenly there are 8 million xbox one fans thinking that devs will, by miracle, be able to run a game at 1080p60fps instead of 720p30fps.

    by 2015, fans should have understood that the xbox one power and architecture were designed with low-to-average performance and high profit in mind.

    the xbox one alone, without a ps4 in the market, with its powerful scaler, etc, would be releasing quite good looking 720p games, with good antialiasing processing, high res textures, etc etc, and everything would look quite good, even at only 720p. nobody would be able to compare it to another system, everybody would be happy about its graphics.

    but with some 1080p60fps ps4 games around, any X1 720p game will just look mehhh.

    without the ps4, the x1 would be doing great, games would look great with clear textures and heavy antialiasing-scaling processing, and all the graphics would look polished, crisp, beautiful, even at 720p, and microsoft would be making lots of money with it.

    but because devs started using the ps4 hardware and making 1080p games, devs had to really push the x1 hardware, something that wasn’t supposed to happen. why do you guys think microsoft only put 32mb of esram on the system ?

    because that would be more than enough to make 720p games, and ultimately use the hardware scaler to upscale them to 1080p, and all those 720p games would be sold as “1080p”, and nobody would see the difference, as the output would be 1080p.
    32mb of esram was more than enough for what microsoft wanted to do with the system. i am sure never, i mean never, was microsoft expecting they would need to push the x1 hardware to make native 1080p games. hence all these esram optimizations, directx12 needs, and plenty of other small tunings, to try to get a few extra cpu cycles. they even needed to ditch kinect, so they could free some of the 7th cpu core’s power. they desperately try to find ways to get a little extra processing power, to help make 900p games, or 1080p (with low quality textures, etc).

    OTHERWISE, if microsoft had wanted to make 1080p games from the start, i guess using 48 or 64mb of esram would have solved the problem.
    with those 64mb, the xbox one wouldn’t have any trouble reaching 1080p and high def visuals, and probably the overall bandwidth would even be superior to the ps4’s.

    but NO. definitely, microsoft was aiming for 720p. hence the 32mb of esram, which clearly isn’t enough for those demanding 1080p visuals.

    it’s crazy, how 32mb of esram can make all the difference.
    maybe it was a matter of 10-20 extra dollars to move to 64mb of esram. they could have saved a little on the kinect or other components, and paid those extra $ for more esram.

    i imagine how many hundreds of brainstorming meetings microsoft staff and engineers must have had about the x1 architecture, and the 32mb-vs-64mb esram question. and finally, they chose the 32mb path.

    oh man, how they must regret that decision.

    any engineers here to explain how much of a difference using 64mb of esram would make vs 32mb? huge difference on bandwidth and everything else?
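
The 32 MB vs 64 MB question above can at least be ballparked with simple framebuffer arithmetic. The formats assumed here (4 bytes per pixel per target, a hypothetical 5-target deferred setup) are illustrative only; real engines vary widely:

```python
# Rough framebuffer-size arithmetic for the 32 MB vs 64 MB eSRAM question.
# Assumes 4 bytes per pixel per render target (e.g. RGBA8, or a 32-bit
# depth buffer). Illustration only, not actual engine numbers.

MB = 1024 * 1024

def target_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one render target in MB."""
    return width * height * bytes_per_pixel / MB

# One color target + one depth target:
fb_720p = 2 * target_mb(1280, 720)    # comfortably inside 32 MB
fb_1080p = 2 * target_mb(1920, 1080)  # still fits, but with little room

# A hypothetical deferred renderer: 4-target G-buffer + depth at 1080p:
gbuffer_1080p = 5 * target_mb(1920, 1080)  # exceeds 32 MB of eSRAM

print(f"720p color+depth: {fb_720p:.1f} MB")
print(f"1080p color+depth: {fb_1080p:.1f} MB")
print(f"1080p 5-target G-buffer: {gbuffer_1080p:.1f} MB")
```

So a simple 720p setup fits in 32 MB with room to spare, while heavier 1080p configurations overflow it and spill into DRAM, which is consistent with the tradeoff the comment describes.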

    • Israel Lopez

      Too long, didn’t read. Just skimmed over it. Look at history: console graphics suck their first year, look great their 2nd and 3rd, and shock everyone the last 3-4 years with unexpected graphics performance and visual quality. IE: EA said: “Crysis will never run on consoles”. 2-3 years later it was running on 360, then we got Crysis 2 and Crysis 3 on both 360/PS3 (even if the PC version looked better).

      P.S.: No one can tell the difference between 1080p and 720p on consoles with a TV. Anyone who claims they can is obsessing over the issue. You can only dramatically tell in a PC+monitor setup. Xbox One lately has been matching and even surpassing the PS4’s performance, visuals and resolution in some 3rd party games.

    • aeris bueller

      “No one can tell the difference between 1080p and 720p on consoles with a TV”

      Wrong. Pretty much anyone can. Even on the damn PS3 XMB I can see a huge difference in sharpness if I switch it to 720p. The blind should really stop speaking for the sighted.

      Crysis 2 and 3 had a different engine. I agree the games will continue to look better as time goes on, and even the lowly Wii U proves that hardware isn’t everything, with Mario Kart 8 being one of the best looking games of this generation.
      But for god’s sake please stop downplaying the hardware difference. PS4 is more powerful, period, and Xbox One doesn’t surpass PS4’s performance on anything, except notoriously horribly coded garbage that isn’t indicative of anything, like the latest Assassin’s Creed.
      I can’t believe you fanboys are still stuck on this nonsense. If you want the more powerful console go buy it (the PS4, in case you’re still not caught up), and if you just prefer Xbox, whether for its exclusives or some idiotic sense of corporate loyalty, learn from the Nintendo fans, and shut up about your console catching up with hardware performance.

    • Will

      “IE: EA said: “Crysis will never run on consoles”. 2-3 years later it was running on 360, then we got Crysis 2 and Crysis 3 on both 360/Ps3 (even if PC version looked better)”

      This is because consoles always have been and (likely) always will be way more optimized than PCs. They have fixed hardware, which gives engineers and developers a lot of time to tweak tools and code specifically for the system. Also, game sales are MUCH higher on consoles, so the status quo of graphics output is largely determined by console developers. A PC will usually only add pixels and draw distance, but not polygons or higher quality textures, which are the “meat” of the graphics presentation. Also, since PCs are open and feature mix-and-match hardware, it would be difficult, if not impossible, to find a game that really utilizes the capability of your specific PC setup. A lot of the power of a PC is “lost” in a game’s performance due to the lack of low-level hardware optimization.

    • Guest

      Crysis 2 and 3 couldn’t hold a candle to the original Crysis in terms of resource demand. Crysis was a BEAST to run and far ahead of its time.

    • RjK311jR

      I appreciate your long-winded answer and opinion, but you don’t have any technical facts whatsoever in any of this… you’re using words that any person who games knows about. Before you go into a huge spiel about this, that and the other, get your facts straight… did you work there? Do you know someone who worked there and broke NDA to discuss trade secrets with you? The system info you’re talking about can sound awesome to PS4 fans, but if you truly mean what you said, state facts and cite sources so others can learn, like myself.

      I’m no know-it-all, but this outright hate on one console over the other has to stop… A gamer games because he loves gaming. If you’re so strongly for Sony, why read an MS article? Ok, you read the title, thought uh oh, and got nervous? Why? Or did it look interesting… but to bash it speaks volumes about you! Don’t worry bud, our PS4 is fine and is not magically going to be outdone by the Xbox One… But grow up man. The Xbox One is an amazing console with awesome features. It is very capable of playing many of the games the PS4 does at almost very similar quality, give or take here and there… All of us know this already… but when someone wants to write about an update and give you, me and all the other gamers out there GAMING NEWS or any other info on future updates to video gaming hardware and software, then let it ride… Don’t let your ignorance, arrogance, and immaturity get the best of you… My advice: thank the author, because without them we wouldn’t get some of this info. You can do whatever you want… smh, it just sucks that this is what’s become of the gaming industry’s players… We all love competition. Heck, console wars have been around for years… Hate to tell you this, but the only thing I got out of coolgamer’s comment is obvious troll, and that’s the first time I have ever used that word, because I despise it!

    • GHz

      You’re one brave dude, bypassing everything experienced devs had to say
      about this matter to evangelize your opinion on hardware you don’t have a clue about.

      “ALL the directx12 features, we will be in 2017-2018”

      DX12 will officially be released late this year, 2015, and it makes more
      sense that 1st party will have 1st dibs on its use. I’m pretty sure big games like TR and Halo 5 are probably using a hefty amount of DX12 features already behind closed doors. Remember, XB1 is the only device that is 100% DX12 compliant, meaning it is the only platform of its kind. AMD and NVidia are up next by the middle of this year. Your timeline applies to devs who don’t have full access or who don’t have the resources to keep up with other well established companies. The pacing and use will be different for everybody for all kinds of reasons.

    • GHz

      “just look at the pc scene… we’re in 2015… how many PC games TRULY use 100% of a direct11 features”

      I think in the case of PC, it was a mix of things. AMD may have features
      already on their graphics cards, but maybe the APIs weren’t ready to exploit them fully, and vice versa. I’ve heard AMD talk about such situations. Plus consoles are really different this time around. Especially in the case of MS, they have the benefit of having created their hardware and software. Again, 1st party will fully exploit what the XB1 can do. Just look at QB. If a 2nd party dev can do that now on XB1, just imagine what 1st party will do by year’s end. And your talk about PC game development doesn’t apply here. Plus the playing field is changing rapidly with the introduction of Mantle and DX12. Don’t be stuck in the past. Things will be done differently now.

    • GHz

      That stuff you said about ESRAM: MS engineers actually said you can achieve 140-150GB/s out of an application, real code! That is what is available to devs if they wish to exploit it. XB1 was 1st out the gate with 1080p 60fps on a next gen console. FM5 did this under extreme
      conditions via its physics engine. Remember, that game is one big simulator. Real at that. Do some research on the development of that game. Find out what kind of computers are needed to run such realistic simulations. And look up which company Turn 10 was involved with while developing that game. That game, just like every other major XB1 game, was testing certain aspects of XB1 tech. In this case, it was realistic physics and simulation.
      Devs went for simulation @ 1080p 60fps over graphics, and it worked, because we still got a pretty looking game in the end. Not bad for a launch title!

    • GHz

      “but with some 1080p60fps ps4 games around”

      Which game on the PS4 is doing this locked, w/o dips and various issues?
      Please name them. Now I don’t count 2-frame dips, but any game that dips well into the 40’s during gameplay is not a 60 fps game. 60 unlocked maybe, but definitely not locked. 1080p 60 should not be used so lightly. We’re not talking Guerrilla Games’ definition of 1080p 60. What we found out is that the PS4 struggles with 1080p 60. Just because you use those numbers in the same sentence as PS4 doesn’t mean that the PS4 does it effortlessly. Am I wrong?

    • GHz

      “but because devs started using the ps4 hardware and
      making 1080p games, devs had to really push the x1 hardware, something that wasn’t supposed to happen. why do you guys think microsoft only put 32mb of esram on the system ?”

      Devs were able to push XB1 because the API got better. SIMPLE! And every console since the 1st console in history has always been pushed for better results. So what do you mean by “devs had to really push the x1 hardware, something that wasn’t supposed to happen”?

      “why do you guys think microsoft only put 32mb of esram on the system ?because that would be more than enough to make 720p games”

      You just keep making stuff up. Everyone agrees that as
      the APIs improve, 1080p will be more easily achieved on the XB1. Easier, meaning that it was achievable before, just difficult for some devs who are on a multiplatform budget. That’s mostly because of time, and of course the state of the API at the time. We have plenty of 1080p games on XB1 as of late.

    • GHz

      “microsoft was aiming the 720p. hence the 32mb of esram,
      which clearly aren’t enough for those demanding 1080p visuals.”

      Yet FM5 was the 1st next gen game to achieve 1080p 60fps locked! No dips! That’s what esram was built for! But the truth is, 32mb of ESRAM has nothing to do with 1080p, because you don’t need to use esram to
      achieve 1080p if you’re building small games. Different devs have different
      needs. If you are building huge games with big worlds, then you’d think of using esram, because it’ll help you achieve 1080p 60fps given the scale of your project. 3rd party devs are getting better at this. We will be
      seeing MORE 1080p games locked at 60fps. It’s inevitable.

      “it’s crazy, how 32mb of esram can make all the difference.”

      You have no idea how it will. Tiled Resources! They explained all that already. It solves the issue of huge data taking up space on mediums. One dev already said that what esram does will be the FUTURE trend!
      Guess who said that? Answer that, being that you know so much about esram. Who are you again? 0_O

      Stop misinforming people, please. It’s not cool. These systems are already outputting great games as the year ends. We can all look forward
      to improvements. One developer who has major access already said that XB1 will be the biggest beneficiary when DX12 hits. One said coding 1st for esram is better, because it saves time if you are a multiplatform developer, plus it’s the trend for the future. All that sounds like good news to me, and I’ll trust those guys over you any day.

    • aeris bueller

      “One developer who has major access already said that XB1 will be the biggest beneficiary when DX12 hits.”

      “On the DX12 question, I was asked early on by people if DX12 is gonna dramatically change the graphics capabilities of Xbox One and I said it wouldn’t. I’m not trying to rain on anybody’s parade, but the CPU, GPU and memory that are on Xbox One don’t change when you go to DX12. DX12 makes it easier to do some of the things that Xbox One’s good at, which will be nice and you’ll see improvement in games that use DX12, but people ask me if it’s gonna be dramatic and I think I answered no at the time and I’ll say the same thing.”
      -Phil Spencer – Head of XBox division at Microsoft

      It’s crazy how much you don’t know what you’re talking about.

    • GHz

      “if DX12 is gonna dramatically change the GRAPHIC CAPABILITIES of Xbox One”

      And the answer to that is a big NO, because XB1 can achieve great graphics w/o DX12. That sort of thing is independent of DX12, and it will naturally evolve over time as developers get comfortable with the platform. This is true for every console. Again, you don’t need DX12 to enhance graphics. You can achieve that in various other ways, like using middleware such as Graphine’s software tech that fully exploits streaming textures. And that’s just one example. Great looking games are here to stay, and graphics will improve! But in the case of XB1, that’s NOT DX12’s job. It never was. That’s why games like Quantum Break exist, which is built on DX11!

      You guys are so caught up in graphics you’re not listening, and miss the whole point.

      What they should’ve asked Phil was, “What KIND OF NEW FEATURES will devs have access to on XB1 because of DX12?” To that, Phil would’ve answered, “I’m sorry, but I’m under NDA.”

    • aeris bueller

      Uh, no. DX12 is not some secret architecture, and the most important aspects of it were discussed on MSDN a very long time ago, but I’m glad you’re speaking for the man. The reason it won’t make a huge difference on Xbox One is that its main purpose is to (like Mantle) give PC developers lower-level hardware access, which consoles already have with their fixed hardware configurations. This is indeed about graphics as well as other computing functions.

      One of the demos they used to show off DX12 was Forza Motorsport 5 running on PC, and it emphasized how DX12 was bringing some of the benefits of fixed-hardware console development to the PC. They also specifically mentioned how Xbox One was already enjoying those benefits when it ran Forza.

      ESRAM isn’t some great secret sauce; it’s there as a crutch to make up for the slower RAM, since graphics cards usually use GDDR instead of DDR. It’s also been described as a pain to use and a bottleneck by several developers, as I’m sure you’re aware if you follow these articles closely. What the SDK improvements can do is make ESRAM easier to use, so it feels a little more like having an abundance of fast RAM the way a PS4 or a PC does.

    • GHz

      DX12 was kept a secret all the way up to a few weeks before GDC 2014. It was so top secret, even AMD reps believed that MS was calling it quits on the API just 2 yrs ago, in 2013. When it was finally revealed, the discussion switched to it only being for PC. Then when MS revealed that it was also for XB1, the convo switched to “well, XB1 won’t really take advantage of it.” When GDC 2014 finally came about, DX12’s main focus was how it’ll affect XB1 games. When that was clear, people were ready to discuss how. When MS said that DX12 will introduce new rendering features and other goodies, people wanted to know it all. MS’s response was: not until later this year! Some features are under NDA!

      Please stop making stuff up. Everything I’ve mentioned is in the history books.

      ESRAM was NEVER a mystery. Internet BS from people like you made it seem so. Now here you are, parroting what you’ve heard yet again. MS even said that they were surprised at the internet reaction to esram, being that it was an evolution of what the 360 was doing and wasn’t so different. While some devs had problems with it, others came forward and said the opposite, which led us to realize that not all devs are created equal. Some who had problems with it at least clarified it by explaining how XB1’s graphics drivers were not so good, and that they were a moving target. This was the case from the get go. But you guys wanted to believe it was a GPU thing. One dev even challenged other devs on their conclusions about the use of eSRAM and proved them wrong! Look that up!

      More experienced developers recognised that esram encouraged code optimization, and that when you code 1st for esram, both PS4 and PC benefit! This is all old news that you won’t hear about, because you choose not to look for such great news about the XB1. You just want it to be a failure. So you make up stuff that goes against what every experienced dev has to say about the matter. It’s 2015, and it’s time to stop with the lies and the misinformation. Please stop lying.

    • aeris bueller

      AMD had every reason to downplay DirectX 12, seeing as they were developing their own API, and yes, it was revealed at GDC 2014. That’s March 2014; as I said, a very long time ago (at least by tech standards).

      “ESRAM encourages code optimization” is another way of saying it’s more difficult to pull performance out of ESRAM. Amazingly, you tried to spin this into a good thing. Being forced to optimize to get decent performance out of the Xbox One helps out the other platforms; great, but that still doesn’t make ESRAM a good thing.

      More experienced devs? You mean more experienced than Crytek? Legendary engine devs, and developers of Ryse, the game XBox fans loved to throw around as the ‘best looking next gen game’? Because they were one of the first to call ESRAM a bottleneck.

      That tiny bit of ESRAM is nothing compared to the entire RAM pool being far faster. The Xbox One’s DDR has lower latency, making it better at multitasking, which is great if you care more about your console doing background tasks than about it doing gaming. The PS4’s GDDR has higher bandwidth, which is why it runs circles around the XB1 graphically and in GPU computing applications, which benefits those gamers who care more about their console being good at running games than they do about snapping apps.
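
      The latency-vs-bandwidth tradeoff described above can be sketched with a toy model. All numbers below are placeholders chosen for illustration, not the consoles' real timings:

```python
# Toy model of the DDR-vs-GDDR tradeoff: total time to service a request
# = latency + bytes / bandwidth. Numbers are illustrative placeholders,
# not actual console memory timings.

def access_time_us(bytes_moved: int, latency_us: float, gbps: float) -> float:
    """Time in microseconds to move bytes_moved at gbps after latency_us."""
    return latency_us + bytes_moved / (gbps * 1e3)  # GB/s -> bytes per us

DDR = dict(latency_us=0.10, gbps=68)    # lower latency, lower bandwidth
GDDR = dict(latency_us=0.25, gbps=176)  # higher latency, higher bandwidth

small = 4 * 1024          # small scattered access (latency-bound)
large = 8 * 1024 * 1024   # big texture copy (bandwidth-bound)

for name, mem in (("DDR3", DDR), ("GDDR5", GDDR)):
    print(name,
          f"small: {access_time_us(small, **mem):.3f} us,",
          f"large: {access_time_us(large, **mem):.1f} us")
```

      Under these made-up numbers, the low-latency memory wins on small scattered accesses while the high-bandwidth memory wins on large streaming transfers, which is the shape of the tradeoff the comment is describing.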

      I own neither an Xbox One nor a PS4. I actually own a Wii U and a gaming PC until the great games start coming out for PS4 (always preferred their exclusives). I also think the Kinect 2 has great potential, and in my work as a graphics middleware developer I need to use Linux, OS X, and Windows, and I think Windows blows away the competition by such a huge margin there’s no hope of them catching up as far as usability and likeability. I had been praying they would axe Ballmer and Mattrick for the longest time, because I’d like Microsoft to do great things instead of ruining themselves and the game industry. So please stop putting words in my mouth. You don’t know me any more than you know what you’re talking about.

    • GHz

      “AMD had every reason to downplay DirectX 12, seeing as how they were developing their own API”

      Stop with the lies. DX12 was on a need-to-know basis. Not everyone had clearance. AMD didn’t downplay DX12. Reps who didn’t have that level of access DIDN’T know about DX12. One in particular spoke out of turn. That was all. But that proved that DX12 was under guard. So please stop lying!

      DX12 is NOT competing with Mantle, for the simple fact that XB1 was designed side by side with DX12. That’s one of the things that makes them different. And consider the fact that AMD sits on the board that advises on which direction DX12 should go; that tells us that AMD and MS are partners! Meaning the relationship MS shares with AMD is NOT the same as Sony’s. Sony is a customer, while MS and AMD invest together. Big difference. So AMD downplaying tech they help advise on makes no sense! Stop with the lies!

      Yup! Crytek is knee deep in it. And please don’t exaggerate what they meant. The API was bad, period. Not having updated graphics drivers can do that. So please tell the whole story. If we were to go by how you want to exaggerate the problem, we wouldn’t have parity on 3rd party titles these days. So please don’t lump Crytek in with your frame of thought. They know better. And why undermine what they have achieved on the XBOX ONE despite working in a bad environment? SIGGRAPH is made up of a panel of experts who’ve created many of the technologies that drive computer graphics forward. If you’re nominated by them, it’s for good reason. Neither you nor I is more qualified for a sound opinion than them. Your opinion can never trump theirs! DONE!

      “Being forced to optimize to get decent performance out of XBox”

      !!!????? WHUT!!? O_0

      How is supporting code optimization a bad thing? What dev would say that they were forced to optimize their code when that IS THE TREND MOVING FORWARD!? That’s what devs said. But you’re saying that they are being forced. WOW. What nerve you have. Where is the bottleneck in FH2? Where is the bottleneck in Quantum Break? There is a timeline to these interviews, and these devs are honest when sharing their experience. And their experiences differ, based on the version of the API they worked under and a whole host of other things. But look at the games now, and what do you see in the 3rd party space as the year ends? Parity! Why!? A better API!

      The BS in you is strong!

      “The XBox One’s DDR has lower latency making it better at multitasking, which is great if you care more about your console doing background tasks”

      You’re a straight up clown at this point, cause those are very specific words which lead to a very specific conclusion. Cause now you have to explain how FM5 and FH2 run on XB1. And how all that physics is possible in Quantum Break. The games are the proof!

      As for your suddenly agreeable tone, I don’t buy it. You just want to make it seem like you’re open minded. What you’ve proved, however, is that you’re just a liar.

      Bruh! I don’t need to know your gaming life story. Just stop talking nonsense, and stop spreading lies.

    • aeris bueller

      “DX12 was on a need to know basis.”

      The tiny bit of relevance this holds to our conversation is that you said Phil Spencer would decline to answer questions, and I said DX12 isn’t some big secret. Obviously that depends on when he was asked. I don’t even know what your rant on this is supposed to prove.

      The fact you think Mantle and DX12 are not in competition, or that AMD as a hardware manufacturer (for their console, no less) wouldn’t be obligated, along with MS, to be frenemies at the very least, or to have a nuanced relationship involving both competition and cooperation, represents such ignorance that I just don’t have time to explain it to you.

      Having parity on some third party titles does not mean the hardware is equal, and half the time the reason is political. By your logic, every lazy PC port over the last ten years proves 360 and PS3 are as powerful as gaming PCs.

      “How is supporting code optimization a bad thing?”
      It’s clear once again you have no idea what you’re talking about, and I also can’t tell if you have no reading comprehension or you’re just trying to use logical fallacies to prove whatever point you think you’re making. Having some crappy design that forces you to make optimizations to do mundane tasks isn’t ‘supporting optimization’; it’s having slow crap that needs extra optimization. Did you seriously misunderstand that, or are you being intentionally hard-headed?

      Optimization is not a ‘trend moving forward’, it’s always existed, and it’s always prioritized with other factors, such as deadlines and manpower.

      Your questions about the bottlenecks in various games are so ignorant they barely merit a response, but I’ll bite. I didn’t say the XBox One can’t run games. I said the PS4 is more powerful. A bottleneck doesn’t mean the whole system shuts down; it means there’s an area that won’t allow other, faster parts of the system to perform at their optimum capacity, because they have to wait on that thing, in this case the ESRAM.

      I don’t have to explain how those games run on XB1. Again, I didn’t say DDR doesn’t allow the XBox One to do anything. The tradeoffs between DDR and GDDR, and why they don’t just use DDR on graphics cards or GDDR on CPUs in PCs, come down to the reasons I stated above (in a nutshell). If you don’t believe me, go research it for yourself.
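      Since we’re arguing bandwidth numbers anyway, here’s a quick back-of-envelope sketch in Python, using the peak bandwidth figures quoted in the article above (30/68/204 GB/s). This is purely illustrative, not a real memory model: actual performance also depends on latency, access patterns, and contention, not just peak bandwidth.

```python
# Back-of-envelope: time to move one 32-bit 1080p frame buffer through
# each memory pool, at the peak bandwidths quoted in the article.

FRAME_BYTES = 1920 * 1080 * 4  # ~8.3 MB per 32-bit 1080p buffer

pools_gb_per_s = {
    "coherent DRAM": 30,      # CPU cache coherent path
    "non-coherent DRAM": 68,  # non-coherent path
    "eSRAM": 204,             # 4 x 8MB blocks
}

for name, bw in pools_gb_per_s.items():
    ms = FRAME_BYTES / (bw * 1e9) * 1000.0
    print(f"{name}: {ms:.3f} ms per frame copy")

# The bottleneck point: a pipeline runs at the speed of its slowest
# stage. Nothing "shuts down"; the faster stages just wait.
def pipeline_throughput(stage_rates_gb_per_s):
    return min(stage_rates_gb_per_s)

print(pipeline_throughput([204, 68, 30]))  # the 30 GB/s path gates the chain
```

      Swap in whatever buffer sizes you like; the point is only that the slowest pool in the chain gates everything behind it.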

      As for my agreeable tone – I sincerely apologize if I came off as agreeable, as I don’t agree with anything you say, and I find your passionate ignorance highly distasteful. It’s been an exercise in profound willpower to avoid insulting your intelligence. I gave you my ‘gaming life story’ to refute your claim that I hope MS fails, but it’s clear you don’t want to engage in any type of rational conversation.

      My OCD compels me to respond to each and every profoundly incorrect thing you say, so do us both a favor and respond a little more succinctly from here on out. Also, I apologize if you’re actually a 12-year-old, and not just an adult with the mentality of one. If you’re as young as your childish desperation and cavalier disregard for accuracy would suggest, I didn’t mean to rain on your parade, or injure your exclamation mark finger.

    • GHz

      You have OCD! Ooooh ok! That explains all the lies you said, and the pointless babbling you do afterwards when they are brought to your attention. My advice, don’t think too hard and just enjoy your Wii u. AND NO MORE LYING! Cause games like FH2 & QB will always prove you wrong. The games don’t lie.

    • aeris bueller

      Whatever helps you sleep at night, friend.

      Speaking of OCD, you may wanna get yourself checked out, seeing as you’re the one who left 800 comments and 1600 ‘guest’ upvotes on this one article.

      see you in the next secret sauce article

    • corvusmd

      “all these PR stunts, announcements, etc..”

      So….hackers getting into an XDK is a PR stunt? A job listing is a PR stunt?

    • TheWayItsMeantToBePlayed

      >how many PC games do that, in 2015 ?

      Skyrim, Metal Gear Solid GZ, Assassin’s Creed Unity (As hilariously incompetent as it is), Far Cry 4, The Witcher 3, Elite: Dangerous, Wreckfest, Star Citizen, Ryse…

      Actually we’d be here all day if we discussed the number of games already using the potential of the PC and far surpassing what consoles can do, while you fanboy about the PS4 and its 1080p 30FPS (a realistic number, because developers are starting to shy away from the sacrifices needed to hit 60FPS, especially after MGS GZ showed just how bad the PS4’s LOD gets at 60FPS).

      Both consoles are high profit margin, low-end hardware no matter how you spin it or how much you listen to MisterCernyMedia, and it’s unsurprising the PS4 can’t even hold a stable 30FPS on most titles given how little power it has. 4.5GB of VRAM is only barely better than 32MB of ESRAM, so your thought experiment is already a bust, trying to compare two low-powered APUs with horrible VRAM selections. 64MB wouldn’t do squat for 1080p 60FPS if the engine isn’t built for low-level GPU access anyway, as you see with the FOX Engine, which only shines on high-end PC chipsets: with low-range budget chipsets like the ones in the XB1 and PS4, it all comes down to engine optimisation, not raw VRAM.

      Ipso facto. You know jack squat about how GPUs work in the first place. Please stop pretending to know before you hurt someone with your ignorance.

    • Will

      While I agree that PCs, if powerful enough, certainly far surpass what consoles can do in technical capability, I disagree that games are really utilizing a PC’s potential. I don’t think games are nearly as optimized for PC as they are for consoles.

      If PC sales were better, or at least if a company really put their sweat and tears into making a game specifically for a powerful PC configuration (which is also part of the problem, since it would likely be a VERY specific PC setup), I think we would truly see a game that would be leaps and bounds beyond what consoles are doing now in terms of polygon count, textures, and physics effects, instead of simply having more pixels and seeing further into the game landscape.

    • bardock5151

      I haven’t seen such a long hopelessly incorrect wall of text since derp disappeared. Please do the same.

  • Brad

    This is meaningless, as sales have proven. The only reason MS is even remotely relevant this generation is that they reduced their price. No one is buying the XB1 for its power or design.

  • theCharlieSheen2

    Microsoft has been desperately looking to improve the Xbone since the reveal: always looking, but not too much improving. You guys want to improve the Xbone? Just take the L, forget about it, and move on to your next console.

  • waltc4

    The only reason “performance seemed to narrow” between the two consoles is because developers have made an effort to write to the lowest common denominator, the xBone, in order to sell as many copies of their software as possible (developers don’t care which platform is the strongest). Microsoft needs to can the set-top box/TV portion of the xBone and devote more and better resources to its gaming hardware (meaning a redesigned xBone 2); it’s the only way they will ever best Sony in this market. Lesson for Microsoft to absorb: when you design a gaming console, make sure that gaming is the number 1 priority of the device. That’s just what Sony did, and it’s why the PS4 has beaten the socks off the xBone in unit sales.

  • Mark

    Anytime there’s a new article about Xbox hardware, it pisses some people off to the point where they want to drown out the piece’s information. Humans must’ve been doing this since we were here lol. I mean, it’s not like u can scream away Microsoft’s job posting……..can u? Maybe, who knows.
