Geomerics Founder On Xbox One eSRAM: ‘I’d Be Wary of Rushing to Snap Judgments About The New Hardware’

The PS4 and Xbox One have similar features, but each offers some unique points of its own, says Chris Doran.

Posted on 7th August, 2014 under News



Unless you have been living under a rock, you will know that the Xbox One’s eSRAM has been the subject of much debate and controversy over the console’s struggles to render games at 1080p. Several developers have been critical of their experience working on the Xbox One, with one notable developer outright stating that Microsoft cheaped out on the RAM. GamingBolt got in touch with Chris Doran, founder of Geomerics, a middleware company working closely with both the PS4 and Xbox One. When we raised the question of eSRAM, Chris stated that it’s too early to pass judgment on the new consoles.

“I’d be wary of rushing to snap judgments about the new hardware. You cannot really say anything until the first games are out and consumers have got their hands on both devices,” he said to GamingBolt.

“From what we are seeing both PS4 and Xbox One are well thought out pieces of kit, and consumers are having a hard time choosing between them. Both have similar features, but both also have their unique points, which is great for choice. The important thing is that they have re-invigorated the console space. I remember having discussions with investors who were convinced that the console market was dead. It’s great seeing them proved so dramatically wrong!”

Geomerics’ lighting middleware, Enlighten, is being used in several upcoming games, including Dragon Age: Inquisition, Star Wars Battlefront and Mirror’s Edge.

Thoughts? Let us know in the comments section below.



  • Guest

    This is nothing more than PR speak, nothing to see here. It’s obvious the guy just wants both consoles to sell so he can hopefully make more money. He said nothing technical about either system.

    • bardock5151

      That’s right mate, keep those damage control “skills” in form. You have to try to downplay this as quickly as possible with another generic “just PR speak” line your lot loves to spout so much.

  • GHz

    Forza is 1080p and is a beautiful game. Destiny is 1080p, Diablo III will be 1080p on both platforms, the upcoming Halo collection will be 1080p/60fps, and there are a few other games on both platforms that hit that mark. eSRAM seems to garner different opinions based on the skill level of the developer in question. One dev snuffed out the belief that eSRAM is a bottleneck and went as far as to say that coding for eSRAM first benefits all GPUs with GCN, implying that if you want better performance in multiplats, you should code for eSRAM first (see “Alien Isolation Dev Had No Issues With Xbox One’s eSRAM”).

    At this point in time, we shouldn’t be questioning whether the XB1 can do 1080p when there are games on the system that run at that resolution. By now the evidence supports this: eSRAM was just a matter of getting used to for the more skillful developers, and for the less skillful, of course we expect them to have problems. My question is, why do you point to the less skillful ones as the leading authority? Isn’t that backwards? All in all, the fact is that coding for eSRAM first benefits all GPUs with GCN. This has been a fact for some time now. How about exploring that myth-busting statement, Rashid?

    One more thing, Rashid. What does it mean when a game running at 900p (Ryse) wins the SIGGRAPH award for best real-time graphics? What does that say about games on the PS4 running at 1080p? How about exploring that fact?

    “A composite is a euphemism for a lie. It’s disorderly. It’s dishonest and it’s not journalism.” ~Fred W. Friendly

    • demfax

      Some XB1 games are still set to arrive at 900p or possibly lower. Even if XB1 games reach 1080p, they will need visual-effect or framerate compromises compared to the PS4, because the GPU is weaker, among other things.

      The SIGGRAPH award was for 2013 games, meaning Infamous wasn’t considered. The 2014 award will likely go to Infamous or Driveclub.

    • Charles – The Great and Powerf

      Some games on PS4 are set to arrive at 900p as well. Don’t be an idiot. Some are set to arrive at 30fps on both consoles as well. It’s just obvious that the Xbox One can achieve the performance of the PS4 at this point. And I’ll repeat another reply here as well: “Think of graphics processing like food processing. A microwave (eSRAM) at home (GPU) helps do small meals super fast, but you can always go out to pick up the big ones. The bus to the DDR3 RAM is just as fast as the bus to the GDDR5 RAM; the GDDR5 just has a wider bus. If you keep the traffic down to the DDR3 RAM by nuking smaller graphics jobs, there’s no reason you can’t leverage the DDR3 RAM for larger-scale graphics. I think DirectX 12 will bring that to the Xbox One. Right now, they are shoving everything into the microwave and it’s working. It’s on par with the lame PS4’s GDDR5 junk. Just think if they leverage that DDR3. We are talking 4K graphics capabilities then! And I for one can’t wait! They are talking about Ryse being 4K on PC. I bet it will update to 4K on Xbox One soon as well. So exciting, the future is! Yes. Very exciting.”
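      For what it’s worth, the numbers behind that bus claim are easy to check: both consoles actually use 256-bit memory buses, so the difference is the effective transfer rate, not the width. A minimal back-of-the-envelope sketch in C++, using the commonly quoted launch specs (treat the figures as assumptions, not vendor-verified data):

          #include <cstdio>

          // Peak bandwidth (GB/s) = bus width in bytes * effective rate in GT/s.
          // Figures are the commonly quoted launch specs, taken as assumptions.
          int main() {
              const double ps4Gddr5 = (256.0 / 8.0) * 5.5;   // 256-bit @ 5.5 GT/s   -> 176 GB/s
              const double xb1Ddr3  = (256.0 / 8.0) * 2.133; // 256-bit @ 2.133 GT/s -> ~68 GB/s
              const double xb1Esram = 204.0;                 // quoted eSRAM peak, read+write combined
              std::printf("PS4 GDDR5: %.0f GB/s\n", ps4Gddr5);
              std::printf("XB1 DDR3:  %.1f GB/s\n", xb1Ddr3);
              std::printf("XB1 eSRAM: %.0f GB/s peak\n", xb1Esram);
              return 0;
          }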

    • Psionicinversion

      Are you two the same guy? How can a PS4 or X1 game run at 4K, when running 4K well with high graphical detail requires something like 7-8 TFLOPS, I reckon? Neither console is capable.
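      The arithmetic behind that intuition is simple: 4K has exactly four times the pixels of 1080p, so at matched settings the per-frame shading work scales by roughly the same factor. A quick sketch:

          #include <cstdio>

          int main() {
              const double pixels1080p = 1920.0 * 1080.0; // ~2.07 million pixels
              const double pixels4k    = 3840.0 * 2160.0; // ~8.29 million pixels
              std::printf("4K / 1080p pixel ratio: %.0fx\n", pixels4k / pixels1080p); // exactly 4x
              // At the same per-pixel cost, native 4K needs roughly 4x the shading
              // throughput of 1080p, far beyond either console's ~1.3-1.8 TFLOPS GPU.
              return 0;
          }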

    • Reddz Foxx

      Well, both will do 4K, but it will be upscaled from 1080p. Sony hasn’t spoken much about upscaling, or even mentioned whether the right hardware is there to do it, so the assumption is that they will let 4K TVs use their onboard hardware/software to upscale the 1080p image. MS said they have the hardware/software in place, so they can control the results of the upscaling.

      Both companies were well aware that 4K was around the corner, so we will see how that plays out. I personally would rather the console manufacturers control the upscaling, as TV manufacturers won’t give two damns about improving the picture quality.

    • Psionicinversion

      Actually, they’re more than likely to give a damn. 4K TVs aren’t going to achieve mass penetration into people’s homes for something like five years, so the TV manufacturers need to sell the TVs, and some massive ones cost over $100,000. Nobody’s going to buy the TV if the upscaling from HD content is complete garbage, are they? I’d prefer to turn the console’s upscaler off and let the TV do it; it would probably be better.

    • There is no freaking way Xbox One will have 4K without a hardware upgrade, and that’s not gonna happen.

    • GHz

      So there were no 1080p games on PS4 last year? SMH.
      Fact is, 900p games can look stellar, surpassing games that only have 1080p to boast about. What about shading, shadowing, lighting, animation, etc.? What about overall presentation? Countless NPCs, all moving around while all those other effects are turned on? What’s the point of pushing 1080p if your environment is sterile? Wouldn’t you prefer a more immersive game, or a game that can offer more social gaming interaction via modes and so on? As a dev, why strip your games of what matters just for the sake of 1080p? The only answer I can think of is that your company mandates it.

      Take, for example, the screenshots below: two great-looking games, but one looks a whole lot more immersive than the other. Can you count the number of NPCs in SSOD compared to ISS?

    • demfax

      Sony does not mandate 1080p.

      Infamous: Second Son can have a dozen-plus enemies and civilians on screen. You’re comparing an in-game screenshot to a bullshot.

      Every console or gaming device has a power budget that can be put towards resolution, framerate, or visual effects. PS4 has a higher total budget than Xbox, and good PCs have an even higher budget.

      That means PS4 would run Sunset Overdrive better, and Xbox would run Infamous 2nd Son worse.

    • Michael Norris

      The Xbox One should be able to hit 1080p; these are first-gen titles that use older graphics engines. The problem is that even when developers “figure out” the eSRAM, it will always be limited to 32MB. Also, Sniper Elite, a 1080p Xbox One game, has vomit-inducing screen tear and runs at a lower frame rate. The PS4 has more legroom regardless.
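      To put rough numbers on the 32MB point: a single 1080p render target at 4 bytes per pixel is already about 8MB, so only around four such targets fit in eSRAM at once. A minimal sketch, assuming uncompressed 32-bit targets (real G-buffer formats vary):

          #include <cstdio>

          int main() {
              // One 1080p render target at 4 bytes per pixel, in MiB.
              const double targetMiB = 1920.0 * 1080.0 * 4.0 / (1024.0 * 1024.0); // ~7.9 MiB
              const double esramMiB  = 32.0;
              std::printf("One 1080p 32-bit target: %.1f MiB\n", targetMiB);
              std::printf("Such targets fitting in eSRAM: %.1f\n", esramMiB / targetMiB); // ~4
              // A deferred G-buffer plus depth usually wants more than four targets,
              // which is why engines tile work through eSRAM or keep some targets in DDR3.
              return 0;
          }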

      You should take your fanboyism elsewhere.

    • GHz

      “The problem is that even when developers ‘figure out’ the eSRAM, it will always be limited to 32MB.”

      So after they figure out how to use the eSRAM efficiently, the 32MB limit will be a problem? Is that only as long as they use older engines, or with true next-gen engines designed to implement TR? Because they haven’t implemented that yet.

      In regards to Sniper Elite, you answered why it struggled on the X1: it ran on an older graphics engine, which the Xbox One wasn’t designed for, while the PS4 handles such engines better. The point here is, like you said, that these are first-gen titles; they are only getting started. The logic is that it’ll get better, right? eSRAM will get plenty of help as the tools improve.

      I’ll support the words of an experienced developer over yours anytime.

      “They (Microsoft) are releasing a new SDK that’s much faster and we will be comfortably running at 1080p on Xbox One. We were worried six months ago and we are not anymore, it’s got better and they are quite comparable machines.” – Jean-Baptiste Bolcato, Senior Producer at Rebellion

      How is quoting these devs fanboyism? Why are you taking what they said so personally? And why are you projecting your hate my way with your labels because of that?

      I recognize that the PS4, for what it is designed for, is a remarkable machine. Never before has there been a console so programmer-friendly. In fact, when you look at the history of consoles, the PS4’s achievement is abnormal; the Xb1’s position, however, is the norm, meaning developers are expected to struggle with a system for quite a while before getting used to it. The PS4 somehow avoided that transition.

    • Charles – The Great and Powerf

      Think of graphics processing like food processing. A microwave (eSRAM) at home (GPU) helps do small meals super fast, but you can always go out to pick up the big ones. The bus to the DDR3 RAM is just as fast as the bus to the GDDR5 RAM; the GDDR5 just has a wider bus. If you keep the traffic down to the DDR3 RAM by nuking smaller graphics jobs, there’s no reason you can’t leverage the DDR3 RAM for larger-scale graphics. I think DirectX 12 will bring that to the Xbox One. Right now, they are shoving everything into the microwave and it’s working. It’s on par with the lame PS4’s GDDR5 junk. Just think if they leverage that DDR3. We are talking 4K graphics capabilities then! And I for one can’t wait! They are talking about Ryse being 4K on PC. I bet it will update to 4K on Xbox One soon as well. So exciting, the future is! Yes. Very exciting.

    • Matt

      *sigh*

    • Psionicinversion

      Your other comment is awaiting moderation. I think they just mean that it can upscale content to 4K; I suppose they never said native 4K, so maybe they can get around it that way. You’d need two 290(X)s, or two 780s/780 Tis or Titans, to run at 4K well.

  • Mark

    I cannot wait till the X1 and PS4 use Tiled Resources/Partially Resident Textures… minds will be blown. Just look at that Glider demo from Build 2013: it was using only 16MB of memory for what conventionally would have been 3GB. I had a little Q&A with GraphineSoft (who make Tiled Resources/PRT middleware) on YouTube, and they claim that games using this tech may look up to four times better than our launch games. It is why Wolfenstein maintained 1080p/60fps on both systems (id Tech 5), except id Tech uses the software-based version. And it has nothing to do with the graphics not being “intensive”. GraphineSoft announced that the first games using TR/PRT will drop in 2015. I expect to see more games running 1080p/60fps, and we’re only in year one.
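    The rough intuition behind numbers like that: a 1080p frame can only display about two million pixels, so the texture data a single frame actually needs is bounded by the screen resolution, not by the size of the texture library on disk. A minimal sketch, with illustrative figures that are not taken from the Glider demo itself:

        #include <cstdio>

        int main() {
            // Upper bound on unique texels visible in one 1080p frame: one per pixel
            // (ignoring filtering borders and mip overhead).
            const double visibleTexels = 1920.0 * 1080.0;  // ~2.07 million
            const double bytesPerTexel = 4.0;              // uncompressed RGBA8, illustrative
            const double residentMiB   = visibleTexels * bytesPerTexel / (1024.0 * 1024.0);
            std::printf("Worst-case visible texture data: ~%.0f MiB\n", residentMiB); // ~8 MiB
            // So a resident set in the tens of megabytes can service a multi-gigabyte
            // texture library, provided pages stream in before they are needed.
            return 0;
        }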

    • demfax

      Wolfenstein doesn’t maintain 1080p on either next-gen console (the PS4 dips down to 1720×1080 and the Xbox dips down to 960×1080 to maintain 60fps), and some of the textures are rather last-gen looking.

      Hardware texture streaming is a nice trick, but it won’t result in mind-blowing performance gains like you’re suggesting.

      Actual next-gen developers on Beyond3D:

      Sparse textures/tiled resources just provides some native hardware support for certain parts of a virtual texturing pipeline. In particular it lets you avoid a manual indirection in your shader, which makes it easier and cheaper to sample and filter your virtual textures. It’s not going to allow for a dramatic leap in texture quality or anything like that. Most likely it’s just going to give you better performance, and possibly allow for higher-quality anisotropic filtering than what the software path will provide.

      Hardware support for partially resident textures (called Tiled Resources in the DirectX API, or more generally “sparse textures”) does not bring any new possibilities over a software virtual texturing implementation (custom shader code). It mostly shaves off a few ALU instructions (and one cache-friendly memory load), makes anisotropic filtering easier to implement properly and doesn’t need tile borders for filtering (that saves less than one percent of storage space, but makes addressing easier, because tile borders make power-of-two alignment hard).

      Hardware PRT also provides a minor memory saving (~10 megabytes total), as you don’t need your own indirection lookup texture. However if you use a hash instead of a texture with full mip chain (one pixel represents one page, for example 128×128 pixels of texture data), your memory consumption is equal to hardware PRT version. But this requires one extra memory lookup (if cuckoo hash is used) and some extra ALU.
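      To make the “manual indirection” those posts mention concrete, here is a minimal CPU-side sketch of the lookup a software virtual texturing shader performs per sample. All names and sizes are illustrative, not from any particular engine; hardware PRT/Tiled Resources effectively moves this remapping into the texture unit:

          #include <cstdint>
          #include <cstdio>

          constexpr int kPageTexels    = 128;   // texels per page side, as in the quote above
          constexpr int kVirtualTexels = 16384; // virtual texture side (illustrative)
          constexpr int kTableSide = kVirtualTexels / kPageTexels; // 128x128 page-table entries

          struct PageEntry {
              std::uint16_t cacheX, cacheY; // where this page currently sits in the physical cache
          };

          // Map a virtual UV to physical-cache texel coordinates: one page-table
          // lookup plus a little ALU work, which is exactly the cost hardware
          // PRT shaves off.
          void virtualToPhysical(float u, float v,
                                 const PageEntry table[kTableSide][kTableSide],
                                 float& physX, float& physY) {
              int px = static_cast<int>(u * kTableSide); // page column the sample falls in
              int py = static_cast<int>(v * kTableSide); // page row
              const PageEntry e = table[py][px];         // the extra lookup software VT pays for
              float fx = u * kTableSide - px;            // fractional position inside the page
              float fy = v * kTableSide - py;
              physX = (e.cacheX + fx) * kPageTexels;     // texel coords in the resident cache
              physY = (e.cacheY + fy) * kPageTexels;
          }

          int main() {
              static PageEntry table[kTableSide][kTableSide] = {}; // demo: every page mapped to (0,0)
              float x, y;
              virtualToPhysical(0.5f, 0.5f, table, x, y);
              std::printf("virtual (0.5, 0.5) -> physical texel (%.1f, %.1f)\n", x, y);
              return 0;
          }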

    • Mark

      From GraphineSoft:

      Mark Dupree:

      Another question. What would you say, on average, the typical Xbox One or PS4 game shows on screen (on a 40-inch 1080p TV) at a time, or every second? 2 gigabytes, or 3 gigs per second? My question is, can a developer show tons more data/detail on screen per second because of this software? Can you throw me a number in terms of gigs per second, hypothetically, compared to what I’m seeing now in a game like Call of Duty? Or am I missing something? Lol. Thank you.

      GraphineSoft:

      Hi Mark, thanks for your question.
      The exact numbers are very application specific but we estimate that – using the Granite SDK – game developers can use about 4 times the amount of unique texture data per scene compared to a coarse streaming approach.
      Game developers can choose to use these extra resources to upscale all the textures which results in ‘sharper’ final renders. Or they can add more different textures that add more unique details to the world.

    • Mark

      From Microsoft, concerning gigatexels:

      “Jun 28, 2013: Microsoft has announced DirectX 11.2, which will be released for Windows 8.1 only. This update brings a new feature, which will work on the Xbox One too. It’s called Tiled Resources.

      If you ever wanted the flexibility of gigatexel resources but couldn’t enjoy them because of their massive memory cost, or because GPU bandwidth limitations crippled your performance, this new Direct3D technology, called Tiled Resources, allows your GPU to handle massive textures (large atlas textures too) using a minimal memory footprint, increasing bandwidth efficiency.

      In this talk, which took place in San Francisco from June 26th to June 28th, engineers take a deep dive into the API and its intended usage, and show how the technology works using excellent demos that take full advantage of it!

      As you can see in the video, this update allows textures to swap into the graphics card’s memory (and your Microsoft Xbox One’s) much more easily, and places far less strain on that memory, allowing the GPU to render textures much better.”

    • Mark

      Whether I’m right or you are, we’ll have to wait and see next year. I’ll be watching GraphineSoft’s blog for updates on games that use their middleware; I can judge for myself from there. The Glider demo from Build 2013 used 16MB of memory to render a 3GB texture, so as of now I’m leaning toward excitement.

    • GHz

      Hey Mark, I think there is a difference between the two techs, PRT and TR. PRT just does textures, while TR is not limited to textures. It’s almost misleading to say the PS4 will have TR. As is, PRT is available to both because their GPUs are GCN-based. TR, however, will only be available on the XB1. It is an advancement over PRT in that it is not limited to textures, among other things.

      POV: But if you were to ask me whether Linux programmers could help improve upon PRT to equal what TR can do, I can’t see why not. Mind you, I’m not as tech-savvy as you guys; the Linux PRT improvement thing is just my guess. Maybe you can help make sense of what I’m trying to convey.

      EDIT: Also, I was under the impression that PRT is already in use on both platforms, in Wolfenstein: The New Order and Trials Fusion.

    • Mark

      Yes, PRT, or what some call virtual texturing, is built into id Tech’s game engine. John Carmack implemented this tech in the id Tech engine years ago when they released Rage for last gen; however, it is only software-based. GraphineSoft has said that hardware-based TR/PRT streams faster, as the hardware automatically does more of the work for the programmer. They even said DX12 will make better use of it than DX11, which will be faster. For me, the eSRAM in the X1 will use this tech and it will allow for bigger, better-looking worlds with stable frame rates. Actually, Crytek used a custom tiling solution on the eSRAM for lighting in Ryse. They “chopped” up the lighting, which saved them a lot of bandwidth to stream the other assets. This is a form of TR, because assets are chopped up and streamed where needed, instead of textures just being loaded “dumb”. This is what I gather from GraphineSoft and Microsoft. Look, check out Rage, which used software-based PRT/virtual texturing. This was 2011, and it still has tons of detail. Carmack was able to do this with only 512MB of memory! It is because “tiling” the textures saved him huge amounts of memory, and so he added massively more textures.

      This is what Microsoft refers to as huge amounts of gigatexels in my post above (or below). Tiling the textures, shadows, lights, etc. will save them massive amounts of memory, and thus more “unique texture data, or sharper renders, or no loading screens”, per GraphineSoft. If you search GamingBolt for GraphineSoft’s interview about this, and how it will be even better with DX12, you’ll see for yourself.


 
