Xbox One Dev: GDDR5 Is Uncomfortable To Work With, ESRAM Provides High Bandwidth At Low Power

ESRAM strikes the right balance of power and performance.


Sony have been praised by several developers for their decision to include 8GB of GDDR5 RAM in their upcoming next-generation console, the PlayStation 4. The faster memory reportedly gives developers more bandwidth budget to render high-resolution textures and free up other resources. Microsoft opted for 8GB of DDR3 RAM for the Xbox One, but according to the company the on-board ESRAM will provide the right balance of bandwidth and power.

Nick Baker, who heads the Xbox One’s architecture team, believes that GDDR5 is uncomfortable to work with and that ESRAM gives the Xbox One the right balance of bandwidth and power consumption.

“In terms of getting the best possible combination of performance, memory size, power, the GDDR5 takes you into a little bit of an uncomfortable place. Having ESRAM costs very little power and has the opportunity to give you very high bandwidth. You can reduce the bandwidth on external memory – that saves a lot of power consumption as well and the commodity memory is cheaper as well so you can afford more. That’s really a driving force behind that. You’re right, if you want a high memory capacity, relatively low power and a lot of bandwidth there are not too many ways of solving that,” he said in an interview with Eurogamer.

However, back in July the PS4’s lead architect, Mark Cerny, defended Sony’s decision to use GDDR5 by stating that its latency isn’t particularly higher than that of DDR3. Microsoft have continuously emphasized that they are targeting ‘balance’ in the Xbox One, and that ESRAM is simply an evolution of the eDRAM found in the Xbox 360, so developers will be able to harness the power of the Xbox One more efficiently. Ultimately, for the core gamer, the games will matter, and both systems will have enough of those at launch. But just like the current generation, it will take a year or two before we start seeing what these expensive toys are really capable of.
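
For reference, the headline bandwidth figures both camps cite fall out of simple bus arithmetic, assuming the commonly reported 256-bit memory buses on both consoles. A rough sketch in Python (peak numbers only; the measured figures quoted elsewhere are lower):

    # Peak bandwidth (GB/s) = effective transfer rate (MT/s) x bus width (bits) / 8 / 1000
    ddr3_peak = 2133 * 256 / 8 / 1000    # Xbox One main RAM (DDR3-2133, 256-bit)  -> ~68.3 GB/s
    gddr5_peak = 5500 * 256 / 8 / 1000   # PS4 main RAM (GDDR5-5500, 256-bit)      -> ~176.0 GB/s
    # Xbox One ESRAM: Microsoft quotes ~204-218 GB/s peak for combined read+write,
    # and roughly 140-150 GB/s in real testing (per the Digital Foundry interview).
    print(ddr3_peak, gddr5_peak)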


  • Axe99

    The heading of the article is off – the devs were saying it was an uncomfortable place (for them) for power and cost. They weren’t passing comment on what it was like to work with.

  • daeryl scort

    Links for the names of these developers please ?????

    • JumpIf NotZero

      This wasn’t from developers. Rather, it was two Xbox hardware engineers who gave interviews to EuroGamer / Digital Foundry.

  • TheFanboySlayer

    -____- no one understands what the esram does and it pisses me off.

    watch this video. No fanboyism….no bull…these are just facts.

    http://www.youtube.com/watch?v=gJW3mwIPzJc

    • Jeremiah Enrile

      No one understands GDDR5 either.

    • Kevyne Collins

      Look at that link’s date: May 21st, 2013. In June of 2013, Microsoft found out that eSRAM can read and write at the same time. Not to mention the Digital Foundry deep dive into the Xbox One. So things have changed. And the PS4’s CPU does not run at 2GHz. It’s 1.6.

    • Dakan45

      and the Xbox One CPU is 1.7GHz actually, which goes even better with DDR3 rather than GDDR5.

    • Eric Michael Fadriga

      True, fewer bottlenecks… If you’ve got a 1.6GHz CPU but GDDR5 RAM <- take note, it’s RAM not GPU, then most likely there will be bottlenecks…

    • Dakan45

      The CPU is a low-power tablet-class CPU. It’s an 8-core Jaguar of which only 4 cores are real and the rest are threaded. A powerful Phenom II quad-core can beat it.

  • afta

    what a Joke?

  • Bobby Griffin

    I agree with MS on this. Anyone with any sense will tell you that ESRAM is the way to go, and it will show its true colors in the next few years.

    • Mitchings

      I’d agree if it were an addition and a cache (and larger), but it’s a band-aid and a scratchpad. An overflowing framebuffer is not a feature; it’s a fuck up.

  • N4GCrossingEden

    Xbox is the BEST. Kinect 2.0 will ensure all games be 1080P and 60 FPS. Cloud Computing will provide near-instantaneous loading and.. of course, the exclusives poop all over what the Wah-Wah WiiU and Sony POS has.

    Once XBox Live “2.0” kicks off, even the PC “elites” will cower in fear. Time to pony up the money and get an Xbox One, you POOR fanboys! Microsoft FTW!

    • Lacerz

      Ignorance. Ryse has already been announced at 900p. And Kinect wouldn’t impact that anyway. This is all just stupidity. Go to sleep!

      EDIT: Sorry if I missed the total sarcasm.

    • JumpIf NotZero

      Lol, the fanboy can’t spot the troll.

    • Stranger On The Road

      Assuming that you are not trolling, let me correct you a bit:

      Kinect has nothing to do with graphics improvement; it is an input device. Anyway, Microsoft has already said that they will use 10% of the GPU’s time for running the system… including processing the Kinect images. So technically the Kinect reduces the overall power of the system.

      1080p60 isn’t enforced; it is up to the developer to decide (which is a good thing, mind you).

      The Cloud has nothing to do with the local load time of a game.

      XBL 2.0…. you are kidding right?

  • ahlun

    I had a great time reading the comments, thanks guys.

  • Jonam

    xbox dev…enough said.

    • Dakan45

      Right, brah, whatever MS says is BS but Sony is 100% fact-proven correct.

    • Stranger On The Road

      No, exactly; this is a comment by an employee regarding the merits of a competitor’s design. You really have to take it with a grain of salt :-)

    • Mitchings

      The problem is that Sony (Mark Cerny in particular) has spoken truth about the PS4 and MS have continually spouted crap about their system that doesn’t hold up to technical scrutiny.

      It is fact, not bias.

    • Dakan45

      Mark Cerny downplayed the latency of GDDR5 as not an issue; he also downplayed the weak CPU and didn’t specify just how much RAM is available to developers.

      This is Mark Cerny: Atari, 3DO, Sega… all dead. Nice record there.

    • JumpIf NotZero

      Except that THIS ARTICLE IS WRONG. This was not a developer. The two guys who gave the Digital Foundry interview were Xbox hardware engineers. Not some dev that hopped on Twitch saying some nonsense about something anonymous devs might have said.

      This is about as reliable a source as you’re going to get from MS. But I guess it’s okay to shit on what they say, as long as you eat up anything Cerny says, right?

  • Toysoldier82

    Based on the comments there’s going to be a lot of disappointed folks when both consoles are released next month.

  • kreator

    XBO FTW!

  • datdude

    Ya gotta love Microsoft. Deny, deny, deny in the face of all reason and logic. Remember folks, as a wise man once said, “It’s not a lie, if you believe it”.

  • Solid Snake

    Propaganda!

    • JumpIf NotZero

      ACTUALLY…. This quote came from a much larger and more in-depth article where two hardware engineers from MS talked to Digital Foundry (Eurogamer) about the hardware of the X1 and some of the reasons they chose one approach over another. You should really read the whole article. You’d even see that MS has more real-world bandwidth with ESRAM+DDR3 (200GB/s real world) and Sony has less bandwidth and more latency (176GB/s theoretical peak). So… Propaganda! Right?!?

      MS has been exceptionally transparent with their hardware decisions. Sony has been absolutely silent. Almost as if MS has nothing to hide.

      MS has some stunning-looking games: Forza5, DR3, Ryse, Titanfall. BF looks great judging by the latest build, absolutely on par with the PS4 version. So… where is that hardware advantage Sony apparently has again? Oh right, Knack is reported as blurry with sub-30fps.

      MS has proof with games, and being very open with hardware. Sony has shown only Killzone as their “pretty” game but the gameplay looks like a Halo/FarCry Clone. So tell me again which company is pushing “propaganda!”

    • Mitchings

      Xbox One: 16x Micron DDR3-2133MHz 512MB/4Gb Chips (CL14 / Base Clock 1066MHz) = More than 13ns latency.

      PS4: 16x Hynix GDDR5-5500 512MB/4Gb Chips (CL15 / Base Clock 1375MHz) = Less than 11ns latency.

      DDR3 being better for latency than GDDR5 is a myth perpetuated by citing traditional PC configurations, and somehow it’s spread as some sort of last hope that the XB1 can keep up.

      Now the eSRAM will in itself be capable of better latency, but its configuration in the Xbox One lends little extra to the CPU. It’s there almost entirely for boosting bandwidth to the GPU; and the GPU isn’t latency-sensitive, so it will hold no advantage there either.

      The Xbox One has no real-world latency advantage and in fact; I’d wager that the PS4 has a tiny advantage.

      Regarding bandwidth, 200GB/s is not a real-world number. The setup will make it highly variable; with the DDR3 and eSRAM combined on very specific operations it may push past the PS4’s RAM speeds, but on average the whole thing is going to be running slower.

      You can’t just combine near-peak numbers from different pools, especially with the XB1’s DDR3 UMA + eSRAM Scratchpad setup.

      XB1 is, as they say, “balanced”; it’s balanced at a lower level of performance.

      It’s balanced around its primary market goals… multimedia, Kinect and games; and the engineers did the best they could with that top-down goal in mind.
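
      For what it’s worth, the CL and base-clock figures quoted at the top of this comment do work out roughly as stated. A minimal check (this is chip-level CAS latency only, not the end-to-end latency a CPU sees through its memory controller):

        # first-word CAS latency (ns) = CAS cycles / I/O base clock (GHz)
        ddr3_cas = 14 / 1.066    # DDR3-2133, CL14 at 1066MHz -> ~13.1 ns
        gddr5_cas = 15 / 1.375   # GDDR5-5500, CL15 at 1375MHz -> ~10.9 ns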

    • Solid Snake

      “The absence of evidence is not the evidence of absence.”

  • Nick Albright

    If you can afford more RAM (ie “commodity memory is cheaper as well so you can afford more”), shouldn’t your console have more RAM then?

    • JumpIf NotZero

      That doesn’t mean MORE RAM, it means MORE. As in MORE features, like ESRAM, which provides MORE bandwidth in real-world applications (X1 200GB/s real world, PS4 176GB/s theoretical peak). MORE data move engines that allow simultaneous access to the ESRAM and the DDR3 from the SHAPE processor, CPU, and GPU.

      By choosing DDR3 they got cheap RAM with lower latency, and could afford ESRAM that brought the bandwidth up past comparable levels.

      There is no “complexity” issue. While it’s great to have a single pool of GDDR5 as in a video card, it becomes a caching/miss issue when you consider that the GPU and CPU both have to access it, and you have to be exceptionally stringent with allocation. Otherwise you will have crashing. For what that’s worth, Knack, the ONLY game to be shown running on actual PS4 hardware and not a dev kit, crashed a LOT at EuroExpo.

      So, yes, MS got MORE with DDR3. It’s a trade-off with other factors, but being an evolution of the eDRAM in the 360, it should be a simple transition for devs.

      See, you would know all of this if you read the Digital Foundry articles that MS allowed interviews for, and a lot from HotChips conference that MS gave a keynote at. Where is ANY of the Sony tech discussion? Oh, they’ve been completely silent on hardware since E3? Oh, how fascinating.

    • Nick Albright

      Well, I’ve read some of those articles, and I think your 200GB “real world” is off. From this article (http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects) they say:

      ” Of course if you’re hitting the same area over and over and over again, you don’t get to spread out your bandwidth and so that’s one of the reasons why in real testing you get 140-150GB/s rather than the peak 204GB/s”

      Yeah, I’ll be curious to see what the measured latency figures are. I’ve only seen guesses based on hardware for the PS4. And those guesses are based on current video cards, which don’t have a need for low latency. From what I can tell, the latency comes from the controller, and not some inherent limitation of the GDDR5 RAM itself. So there is a good chance the PS4’s controller doesn’t make the same assumptions that are made for a video card, and may not have the high latency typically associated with a graphics card. I imagine at some point these numbers will be released.

      Oh, I will be willing to bet that neither the PS4 nor the XBOne retail boxes will have crashing issues due to inherent hardware design. I wanted to say that MS’s showings of PC vs XBOne dev kit vs XBOne retail haven’t been any better. But again, I don’t think they will have any issues. (Though, if the past is any indication, you’d think there is a greater chance for MS to have issues.)

      You kidding? It’s definitely more complex to manage more memory pools. And I don’t think adding up all the bandwidth for comparison’s sake is accurate. I mean, do you think developers would prefer to have 8GB of RAM running at 200GB/s, or 4 pools of 2GB each running at 200GB/s? The latter gives us a huge number of 800GB/s! While the former is just 200GB/s. So while 800GB/s is bigger than 200GB/s, I think programmers would prefer the single 200GB/s pool.

      Heck, using that logic the X360 is the most powerful, as it had a total memory bandwidth of 278GB/s! 22GB/s for the main memory, and 256GB/s for the eDRAM. But the key stat is how fast the main memory is, not the smaller memory pools. And for the 360 that is 22GB/s. Vs the XBOne’s 68GB/s. Vs the PS4’s 176GB/s.

      (360 #s from: http://majornelson.com/2005/05/20/xbox-360-vs-ps3-part-4-of-4/)

      Sony even gave a presentation talking about this very issue, and they decided to go with a single pool at 176GB/s instead of a main memory at 88GB/s plus a smaller eDRAM pool running at 1000GB/s, for a total of 1088GB/s. Even though 1088 > 176, they felt the trade-off in extra complexity wasn’t worth it.

      From: http://www.gamechup.com/ps4-sony-earlier-thought-about-slow-gddr5-edram-1088gbs/

      Oh, I want to say that Sony did a lot of their work beforehand, i.e. at the game developer conferences they gave several talks on the PS4 for developers. They just did theirs first.

      It seems like if you were going to “get more” elsewhere, maybe they are talking about the Kinect? But they have ended up with a less powerful system for a higher price. Just seems like a bad combo.
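
      A toy model of the “don’t just add the pools” point above (purely illustrative; it assumes transfers from the two pools don’t overlap, while real hardware can overlap them to some degree): effective bandwidth ends up dominated by whatever share of the traffic has to come from the slow pool.

        # effective bandwidth when a frame's traffic is split across two pools
        def effective_bw(traffic_gb, pool_bw_gbs):
            total_gb = sum(traffic_gb)
            total_time = sum(t / bw for t, bw in zip(traffic_gb, pool_bw_gbs))
            return total_gb / total_time

        # Xbox 360 example from the comment: 22GB/s main RAM + 256GB/s eDRAM.
        # With 90% of traffic hitting main RAM you get ~24GB/s, nowhere near 22+256=278GB/s.
        print(effective_bw([0.9, 0.1], [22, 256]))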

    • JumpIf NotZero

      You realize that by having the GPU, CPU, and GPGPU all drawing from one pool, they need to segment off blocks of it for each section? It’s identical to having separate blocks anyhow. The difference is, it’s possible for the GPU to overwrite/corrupt data for the CPU based on the timing of which is supposed to write or read first. A couple of CPU misses because of bandwidth limitations and the GPU will write or read something it’s not supposed to.

      Sony at EuroExpo was having crashes on their first party titles. Take that for what you will.

      Yeah, Sony can sure talk a big game about 1088GB/s on a design that they didn’t go with. The truth is, they could not afford the complex SoC. They went with the less customized SoC and GDDR5 they could easily purchase, albeit at a higher part cost. A cost that will keep them from competing on price drops and margins. (Please don’t even begin to argue Sony financials without first looking at their past five years of financials; they haven’t posted a profit in 5 years, including a 5.5 billion dollar loss just last year.)

      I really don’t care what you believe from MS engineers or don’t. If you look at REAL WORLD usage of X1 you see 200GB/s. If you look at JUST ESRAM you have 150GB/s. If you look at Sony’s 176GB/s and consider the same 25% overhead applies, that’s 132GB/s Sony would have for the same real world.

      If MS decided that 150-200GB/s wasn’t enough to properly feed 14CUs over 12CUs that moved faster… How would one assume that Sony is feeding 18CU with 132GB/s???

      The answer is they aren’t. Which is why Ryse, Titanfall, BF4, Forza5, DR3 all look EXCELLENT on Xbox. There isn’t a major difference in hardware – get over it. Look at the games, MS is killing it. Sony has a fine machine, but if there was a 50% hardware advantage… You’d have to be asking yourself – WHERE IS IT?

      This generation is not going to be won on exclusive games; both will have great exclusive games. It’s going to be won on features and services, and that’s something that Sony is fearing.

    • Nick Albright

      Do you have any sources for your claims?

      I’ve already provided a source saying 140-150GB/s real world performance, vs your 200GB/s.

      Do you have any sources on the complications of the GPU & CPU drawing from the same pool using hUMA? And how do you think the XBOne escapes these pitfalls? If it doesn’t, then it has the same issues. If you are saying that the 32MB they have segmented off solves the issue, it seems like the GPU/CPU accessing it still has the same issue as you would in the main memory.

      Do you think Sony explored having a 1088GB/s architecture without being able to afford it? Do you know how much more it would cost to design such a thing? $10M? $100M? $1B? $10B? Seems a bit of a stretch for you to be drawing such conclusions IMHO.

      Actually, looks like they posted a profit last year: https://www.google.com/finance?fstype=ii&q=NYSE:SNE

      Sure, and if you look at “just” the eDRAM of the 360, it’s 256GB/s. Which is bigger than both next-gen systems! However, I don’t think people are going to say it is more powerful than the PS4 or XBOne. That high-speed access is just to a relatively small RAM block. (32MB is much less than 8GB: approximately 1/256th the size, or 0.4%.) The access to the huge pool of 8GB of RAM is what is key, and that’s where you see the 360 only had 22GB/s, the XBOne has 68GB/s and the PS4 176GB/s.

      As for your assumption that MS Hardware Designers know best… I don’t know if MS is really known for great hardware.

      It’s hard to judge games that aren’t even out yet.

      As for the XBOne it’s $100/25% more expensive, and isn’t a more powerful machine.

    • Mitchings

      The GDDR5 ‘overhead’ as you call it won’t be 25%; it’ll be around 5-10% and much more consistent.

      The DDR3 will be around 10% and the eSRAM around 30% on average, with greater inherent variability; and only a handful of calculations will be able to take advantage of the combined bandwidth of both.
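
      Applying those rough efficiency estimates to the peak figures gives numbers close to the ones quoted elsewhere in this thread; a quick sketch (the percentages are estimates from the comment above, not measured values, and 204GB/s is the combined read+write ESRAM peak cited in the Digital Foundry interview):

        # real-world bandwidth ~= peak * (1 - estimated overhead)
        gddr5_real = 176 * (1 - 0.075)   # ~163 GB/s (5-10% overhead)
        ddr3_real = 68 * (1 - 0.10)      # ~61 GB/s
        esram_real = 204 * (1 - 0.30)    # ~143 GB/s (highly workload-dependent)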

    • Mitchings

      Xbox One: 16x Micron DDR3-2133MHz 512MB/4Gb Chips (CL14 / Base Clock 1066MHz) = More than 13ns latency.

      PS4: 16x Hynix GDDR5-5500 512MB/4Gb Chips (CL15 / Base Clock 1375MHz) = Less than 11ns latency.

      DDR3 being better for latency than GDDR5 is a myth perpetuated by the citing of traditional PC configurations.

      Now the eSRAM will in itself be capable of better latency, but its configuration in the Xbox One lends little extra to the CPU. It’s there almost entirely for boosting bandwidth to the GPU; and the GPU isn’t latency-sensitive, so it will hold no advantage there either.

      The Xbox One has no real-world latency advantage and in fact; I’d wager that the PS4 has a tiny advantage with latency too.

      Regarding the ‘real world 200GB/s’: that’s going to be a good bit lower in reality, and much more variable too, due to the nature of the setup. The eSRAM isn’t a ‘cache’ either like the 360’s eDRAM, but more of a ‘scratchpad’, which is essentially a step back in its usage.


      PS4:
      ——-
      x86-64 & GCN1.1+ Based APU
      w/ HSA-like Features & hUMA-like UMA (Unified Address Space, Full Coherency etc.)

      AMD Jaguar x86-64 8-Core @ 1.60GHz (?)

      AMD Radeon Custom @ 800MHz Clock / 1.84TFlops
      18 Compute Units
      1152 Shaders
      32 ROPs
      72 Texture Units
      Fillrate: 25.6 GPixels/s & 57.6 GTexels/s
      8 ACES * 8 Queues = 64 Compute Queues

      8GB GDDR5 @ 176GB/s
      Real World Performance: Approx. 170GB/s (Consistent)
      5500MHz Clock
      11ns Latency

      Additional Buses:
      20GB/s Coherent CPU Read/Write + 10GB/s Coherent GPU Read + 10GB/s Coherent GPU Write

      Known Additional Hardware:
      Custom ARM Chip for Background Processing, Basic Audio DSP etc.

      XB1:
      ——
      x86-64 & GCN1.0+ Based APU
      w/ HSA-like Features & UMA + Scratchpad

      AMD Jaguar x86-64 8-Core @ 1.75GHz

      AMD Radeon Custom @ 853 MHz Clock / 1.31TFlops
      12 Compute Units
      768 Shaders
      16 ROPs
      48 Texture Units
      Fillrate: 13.6 GPixels/s & 40.9 GTexels/s
      2 ACES * 8 Queues = 16 Compute Queues

      8GB DDR3 @ 68GB/s & 32MB 6T-eSRAM ‘Scratchpad’ @ 218GB/s
      Real World Respective Performance: Approx. 60GB/s (Consistent) & 140GB/s (Variable)
      2133MHz Clock (DDR3)
      13ns Latency (DDR3)

      Additional Buses:
      30GB/s Shared for Coherent CPU Read/Write & Coherent GPU Read

      Known Additional Hardware:
      SHAPE Audio DSP, 8GB NAND Flash, ‘Data Move Engines’ etc.
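
      The TFlops and fillrate lines in the spec dump above follow directly from the shader/ROP/texture-unit counts multiplied by the GPU clock; a quick sanity check, assuming the usual 2 FLOPs per shader per clock (fused multiply-add):

        def gpu_derived(shaders, rops, tmus, clock_ghz):
            tflops = shaders * 2 * clock_ghz / 1000   # GFLOPs -> TFLOPs
            gpixels = rops * clock_ghz                # pixel fillrate, GPixels/s
            gtexels = tmus * clock_ghz                # texture fillrate, GTexels/s
            return tflops, gpixels, gtexels

        print(gpu_derived(1152, 32, 72, 0.800))   # PS4 -> (~1.84, 25.6, 57.6)
        print(gpu_derived(768, 16, 48, 0.853))    # XB1 -> (~1.31, ~13.6, ~40.9)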

  • Jerry Hu

    Could you be quiet? Could you just make some good games? Could you take care of the Chinese market? You know… the Xbox 360 is almost a dead console in the Chinese market.

    • Eric Michael Fadriga

      lol, you can’t have a PS3 or Xbox 360 in China; they’re banned there.

    • Jerry Hu

      Sorry, I am living in Taiwan, so I have my PS3 and Xbox 360. But Sony is doing a much better job than Microsoft. The PS3 has a lot of exclusive games now, and the PS3 also has Chinese versions of games; I can play them with Chinese subtitles. But the Xbox 360 now… nothing. Microsoft just abandoned the Xbox 360 in Taiwan. Many players regret buying the Xbox 360 now.

  • Blayne Staley

    Once again. PS4 for the win.

  • Michael Norris

    Oh MS you silly bastard..you

  • jlcurtis

    So many people think they know what they are talking about because they read something somewhere. Haha, comment sections over this are so funny. Take a few college-level computer classes (not typing) and then maybe you can give your 2 cents.
    Think of DDR3 and GDDR5 RAM as two cars.
    One has less power but has good brakes and good tires.
    While the other is more powerful but cannot turn or stop very well.
    Now add the ESRAM; it’s like a supercharger on the less powerful car. Depending on how efficiently the supercharger is designed and how well a tune can be created will decide which car is faster in a straight line, but the car with the better brakes and tires will always be faster through the turns.
    I asked a professor at school for easy ways to explain things to some friends and this is what he said. Me being a car guy, I thought this was awesome.

    • YnotNDalton

      LMFAO … you’re one of these ppl you refer to, man

    • jlcurtis

      Prove me wrong then, if you can’t you have no ground to stand on

  • Junior

    It’s crazy that a Micro$oft developer is saying this without having the PS4 dev kit… Pathetic…
