Targeting Full HD Using Xbox One’s eSRAM Is A Challenge, Constantly Optimizing Usage: Witcher 3 Dev

But they are taking advantage of it according to Lead Engine Programmer Balazs Torok.

Posted on 30th July, 2014 under News


It’s no secret that several developers have faced issues outputting 1080p due to the Xbox One’s eSRAM; one developer went so far as to say that Microsoft cheaped out on the RAM. Still, the eSRAM has potential, and combining it with texture streaming can bring out the best results.

GamingBolt spoke to Balazs Torok, Lead Engine Programmer on the upcoming The Witcher 3: Wild Hunt, and asked whether the development team faced any challenges due to the eSRAM and whether they are taking advantage of it.

“I would say that targeting Full HD with a complex rendering pipeline is a challenge in terms of ESRAM usage,”  Balazs said to GamingBolt. “Especially when this pipeline is built on deferred rendering. We are definitely taking advantage of the ESRAM on Xbox One, and we know that we have to fit into these limits, so we are constantly optimizing the usage.”
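To put rough numbers on the constraint Torok describes: a deferred pipeline keeps several full-screen render targets alive at once, and at 1080p even a modest G-buffer layout overflows the 32 MB of eSRAM. The layout below is a generic example, not CD Projekt RED’s actual (unpublished) setup:

```python
# Back-of-the-envelope G-buffer sizing for a deferred renderer at 1080p.
# The render-target layout here is a common generic example, NOT the
# actual layout used by The Witcher 3.

WIDTH, HEIGHT = 1920, 1080
ESRAM_BYTES = 32 * 1024 * 1024  # Xbox One eSRAM: 32 MB

# Typical deferred targets: albedo, normals, material parameters, light
# accumulation (4 bytes/pixel each) plus a 32-bit depth/stencil buffer.
targets_bytes_per_pixel = [4, 4, 4, 4, 4]

gbuffer_bytes = WIDTH * HEIGHT * sum(targets_bytes_per_pixel)
print(f"G-buffer: {gbuffer_bytes / 2**20:.1f} MiB "
      f"vs eSRAM: {ESRAM_BYTES / 2**20:.0f} MiB")
# The full set does not fit in eSRAM at once, which is why developers
# split targets between eSRAM and DDR3, or tile the frame, and keep
# "constantly optimizing the usage".
```

At roughly 39.5 MiB for this example layout, the full G-buffer cannot live in eSRAM at once, which is the trade-off the quote alludes to.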

As expected of CD Projekt, they will be pushing and optimizing each platform to the max, so we can rest assured that each version of The Witcher 3 will look gorgeous within its respective technical constraints.

The Witcher 3: Wild Hunt is due in February 2015 for the PlayStation 4, Xbox One and PC. For more on the game, check out our hub page here and stay tuned for more on The Witcher 3 in the coming days.


  • trywizard

reaching 1080p on X1 is just a matter of time; if one dev figures out the secret of how to get there, everyone will use it…
sorry for my bad English.

    • Kamille

the only Xbone games that are 1080p are cross-gen games, so you can bet this will never become the standard.

    • It already is becoming more frequent with upcoming games.

    • Emanuel Correia

      yep Battlefield Hardline, Destiny, Forza H2…

    • Michael Norris

Destiny is a cross-gen title. Forza H2 is running at 30fps and I am quite sure it’s running on a forward rendering graphics engine. The thing is, even with a forward rendering engine you have games like Wolfenstein that still run at a higher resolution, even with a dynamic frame buffer: 1760x1080 vs 960x1080.

At the end of the day, unless developers gimp the PS4 version, PS4 titles will look or run better. Xone titles are getting better, but you can be damn certain that developers have to work hard to get that 1080p. Even then you have games like Sniper Elite, which has so much screen tear it will make you sick. MS must be telling developers to push for 1080p no matter how bad the screen tear or graphical cutbacks are.

    • Nintendo Fan 4 Lif3

      actually Forza H2 is on the same engine as Forza 5

    • albatrosMyster

      yup, that’s what he meant, forward rendered, very static lighting… it’s a 360 game with more textures and polygons.

    • Nintendo Fan 4 Lif3

But Playground Games are making the X1 version while Sumo Digital does the 360 one, so technically that’s incorrect

    • albatrosMyster

      You really think it will be more different than say titanfall was between xb1 and ps360?

      I mean, I have no problem with games being “cross gen” – “Cross platform”, etc. but I find some level of absurdity in people who try to point to this game as being “different” because the programmers working on one version have someone else’s name on their pay check…. they’re still porting it, there has to be some common code, otherwise they would call it something else.

      The crew is ported by some 3rd party on the PS4, does it mean it’s a completely different game on this platform? Then is it platform exclusive? obviously no.

    • Nintendo Fan 4 Lif3

well let’s see: Forza 5 is only on Xbox One and Turn 10 Studios said Forza 5 is only possible on Xbox One. So put two and two together; yes, this pair will be drastically different than Titanfall. Plus, Playground already said that the engines would be different between the two versions of the game, that is, they’re entirely different and not that one is an upgraded or downgraded version of the other. Also, by different I meant we won’t have the graphical problems Titanfall had, because that game shared the same engine across both platforms so performance wasn’t as good as it could have been. Besides, you can’t really compare them since they’re different genres, right genius? I’m certain there is common code, obviously, for them to keep the gameplay as similar as possible, but of course the majority of it for each version is new since both are being made for different platforms. Lastly, what does that have to do with them being the same game? Also, the PS4 version is being done by Ubisoft Reflections actually, while Ivory Tower does PC and Xbox One, so no, since it’s a collaboration, it’s not exactly what you’d call a third party outsource in this case. And obviously that wouldn’t be platform exclusive nor is it a different game since the gameplay is kept essentially the same; any casual gamer could nail that, stupid.

    • Guest

BFH isn’t even confirmed 1080p on either system, so don’t go getting ahead of yourself there, skippy. The PS4 beta was only 900p with only 32 players, and it had worse frame rates than BF4 did with 64 players. So I wouldn’t get my hopes up.

    • Guest

And yet Witcher 3, Shadow Warrior and more aren’t. Weak system!

    • Guest

The fact this is even an issue speaks volumes about how pathetic the system really is.

Really? Because the last time I checked, not every game runs at 1080p on the PlayStation either. So is it pathetic too, or are you just singling out a specific console for no reason?

    • Guest

Ooh, touchy, touchy, you gonna cry? And where did I say that all PS4 games are 1080p? You oversensitive, jumpy fanboy. And what are we talking about? 3 games on the PS4 which aren’t 1080p vs like 20 on the X1? Heck, some are even 720p and barely 30fps. So yeah, that’s just as bad… please! The PS4 is weak but the X1 is just straight up pathetic. And even more pathetic than that is the MS fanboys sticking up for such a weak POS. I wasn’t singling out anything, you overprotective twit! This was an article about the X1, so get Sony and the PS4 out of your mind, you obsessed clown!

LOL. You’re the one who’s pathetic. You trolled all this way just to call this console weak. If it’s weak and you don’t like it, why are you here? Maybe because your life is worthless. The funny thing is this console gets more respect than you. You’re the one who’s crying. It’s weak, it’s weak. LOL. Look at you. Head back home, trollboy.

    • Guest

I’m weak for pointing out a fact?! You’re the weak one that is sitting here crying, defending it like you’re offended by the truth. You’re pathetic! Get your facts right, stupid fanboy.

    • I’ve stated the facts about you. As well as the fact of the Xbox. Did I hurt your feelings? LOL

    • Guest

      Nah, kid, I’ve stated the facts about you as well as the fact about the Xbox and it definitely hurt your feelings, LOL!

You can borrow my phrases all you want. I know that it’s hard for an idiot to be creative. LOL. I teach class every day if you want more lessons. No charge.

    • Guest

Wow, how pathetic, you actually think you’re smart and cool. How sad. I’m embarrassed for you. I’ll leave you alone now; you’re one of those “special” kids.

    • Very special! I have the ability to make fanboys cry. LOL

    • albatrosMyster

      to be fair, he is right.

Right about what? If you’re talking about these consoles being weak, then let me remind you that both of these consoles are 8-10x more powerful than the previous ones. That didn’t stop people from gaming then, did it?

    • albatrosMyster

8x for the XB1, 12x for the PS4; that’s what each project targeted…

Now, maybe it did not stop you, but as for me, I only played the console-only titles on consoles; everything I could, I played on PC… the PS4 has made this decision a little harder, since the multi-platform titles are not completely broken by low resolution and frame rate on it… I can’t say the same for the XB1.

LOL. Spoken like a true PlayStation fanboy. Your credentials were not taken seriously by me when you defended that troll earlier. 8 gigs vs 8 gigs plus eSRAM, and yes, contrary to fanboys’ beliefs, eSRAM is RAM. Let me give you a hint: es(RAM). Just because it’s also used as a frame buffer doesn’t make it a frame buffer, lol. Just to correct your previous statement: one has very fast unified RAM that’s mostly used one way. The other has multiple pipelines plus small but ultra fast RAM, and that’s not counting the 6 gigs of stored data which no one is currently using because the Granite SDK kits aren’t ready for release. Now you have proven that you’re a biased Sony troll hiding behind a PC mask, and I have proven you wrong. Again. So stop wasting my time; this will get nowhere for you.

    • HoLeeSchitt

      You must be rubbing your nipples in joy whilst writing all dat. Lawl.

    • Lol.

    • Kino

Forza 5 isn’t cross gen and it’s 1080p.

    • d0x360

Forza 5 looks absolutely gorgeous on my TV. It’s one of the few racing games with an absolutely locked frame rate @ 60fps.

I need to get a new one by Black Friday. You can see the pixels, very low DPI.

    • d0x360

I got a 75 inch Samsung LED LCD at Costco for $900 a few months ago. I was worried it was a cheap panel from a holiday sale or something (door buster Black Friday deal etc) but Costco lets you return anything no questions asked, so I gave it a shot.

Hooked it all up and it definitely wasn’t a cheap panel. It has amazing picture quality, no ghosting; it’s pristine. It’s a true 240Hz display with all the bells and whistles in its software. Turning on smooth motion mode in The Last of Us on PS4 looks absolutely incredible. Since it’s a 60fps game (most of the time) it doesn’t get any of the weird judder other games get with true motion turned on.

My TV budget was going to be up to $1800 but I spent half that thanks to Costco, so if you have one around check them out. They also have a 3 year return policy, no questions asked, full refund, no restocking fees.

    • That sounds good!

    • Guest

Forza does not look “gorgeous”; it’s aliased as heck and blocky and flat and sparse as can be (due to the low poly count). You just have low/bad standards.

    • It looks beautiful on high DPI TVs….

    • Guest

      Right, right.

    • albatrosMyster

Have you seen Wipeout HD on PS3? It looks amazing too! 1080p 60fps! Back in 2008!

Honestly, Forza 5 only has cars that look better than the games we already had on consoles; everything else is PS2 level…

    • d0x360

You are more than welcome to your opinion and I’m more than welcome to mine. I’ve never seen any PS2 game with cars that detailed or tracks that detailed. The tracks have every single tiny little bump, groove, stain, and piece of graffiti that the real world tracks have. Even the berms have scratches on them from cars cutting. I’ve also never seen a PS2 game render tires that deform based on a physics simulation, or treads that wear down and get scuffed up during a race. I’ve never seen a PS2 game where the engines are perfectly modeled down to the last bolt. I’ve never seen a PS2 game with lighting and reflections that are perfect.

Sure there is some aliasing and some of the crowd are made of sprites, but I’ve never played a racing game to stop and look at the crowds. I tend to play them to actually… you know, race. It’s also a launch title. A LAUNCH TITLE.

At the end of the day it’s a fantastic looking game full of crazy detail, with a phenomenal physics engine and sound engine, with some incredible AI and online racing. It’s fun.

    • albatrosMyster

      Well, as I said, the cars are very detailed…

      The reflections are half or quarter resolution, also they are completely clear (reflections in the real world tend to be softer than the original)…. this is not a high point.

Now most of those track details you mention only appear in photo mode. When you race, the texture filtering is so bad that you barely see the road’s texture a few feet in front of you; the sprite crowd and trees mess with your perspective when you take a turn, and the trees + people keep their faces perfectly aligned with you. I would say this is MUCH more obvious when you race at 200 miles per hour than on screenshots, as it passes as OK in screenshots; it’s just jarring when you play this game and you expect a “next gen” experience… then you take a turn and you notice how everything is made of cardboard…

So yeah, the photo mode looks great, but the game, not so much; it should not be brought up as a graphical showcase… if anything it tells you why you don’t want 1080p 60fps games on this console, there are just too many compromises to make to achieve it.

    • d0x360

      Track detail is always there. I’ve put about 100 hours into the game.
      Reflections are rendered at full resolution and 30fps. You can read words in the reflections. You can read the corner info written on the ring which is written in small letters on bumper reflections.

      I don’t know what the purpose of your nonsense is but you aren’t going to convince me I’m wrong when I play the game daily and see these details constantly.

Does trying to discredit the graphics in a game make you feel better about buying your PS4? Does it somehow offend you that a game on another console looks good? Does it in any way affect your daily life?

    • albatrosMyster

No, I was just disappointed when I saw it in action; that game is really not a graphical showcase.

      – It barely has any anti aliasing (this is not to blame turn 10, I am sure they wanted to use AA, and they tried it… see the E3 2013 demos, AMAZING)

      – The details on and off the track, while you race, are truly not present, I stand by this… they only show up when you go in the photo mode (bullshot mode) the car’s motor is a simpler version when racing, there is barely any texture filtering for the track, so you don’t see any details on the track or any object that is almost vertical.

As for the reflections you are right, they are full resolution, but half frame rate…

Obviously, this is still the best looking car simulation out there, as no GT game was released on PS4 and Project CARS is not out yet on either console or PC… so for many people this is the best in class, bar none. I have no problem with that; if I was a car simulation fan, I would probably have an XB1 right now (or it would be itching at me).

As for your last point: it affects my daily life when I read the comments and someone like you brings it up as an example of graphical fidelity prowess… then, just as you took the time to write about it first, I took the time to respond to you.

    • d0x360

You must have played a different Forza 5 than I did, because all the details are there all the time.

The only difference between photo mode and race mode is that reflections run in 60fps mode during photo mode.

I can capture a video right now on a capture device (not Xbox DVR) at full 1080p that shows the detail. It’s always there, and the fact that you are trying to argue that it isn’t absolutely boggles my mind.

As for aliasing, it’s not that bad. Blue cars have the most jaggies, then white, but even at its worst it isn’t offensive to the eye.

    • Eagles83

Forza 5 and Forza H2 are not cross-gen games, as two examples. Sure, Forza H2 is releasing on 360, but it is developed on a different engine by a different team, which really doesn’t count.

    • albatrosMyster

Typical XB1 users’ excuses. Does the fact that the PS4 version of The Crew is handled by a different team make it a PS4 exclusive too?

    • Eagles83

If it is developed as a different game then yes. I’m not really familiar with “The Crew” as it didn’t really look interesting to me. Seems pretty simple to me that if a game is using a different engine and is developed by a different team, then that would make it a different game. It isn’t like having Wolfenstein on both console generations… same game/engine but different dev teams. This is an instance where both are different. BTW I have a PS4 as well, so I’m not making excuses. Just pointing out flawed logic where I see it.

    • Michael Norris

You can get 1080p to fit; the issue is you have to make cutbacks. If MS had put in 64MB of eSRAM this would be a non-issue. Sony made a more complete console, end of story.

    • Guest

You could fit 1080p into PS3/360 games; that doesn’t mean the system can really do it or is good at it. The X1 is not good for 1080p. Sure, it can do it, if it strains with all its might or you make the game simple enough. Neither is a good situation.

    • d0x360

MS didn’t go with GDDR5 because it wasn’t believed yields would be high enough by launch. Sony took a gamble and it paid off. They got very lucky. Had yields not been high enough, Sony would have had to delay the PS4, switch to DDR3, or use only 4 gigs of GDDR5. None of those options would have turned out very well. And while the eSRAM has proved difficult, it does have its benefits.

    • Guest

You’re right about the reasons why MS went with DDR3, but what are these “benefits” you speak of?

    • albatrosMyster

You know what, Sony had a vision for a powerful console; MS had a vision for a set top box that plays games better than the 360… They each stuck to their original vision, each with an expected cost for their machine. Sony had people with enough vision to push for the no compromise approach (within the limits of the project)… they had to argue for the $1 billion cost of the extra memory, because even if yields got better over time, it still cost more to put in twice as much memory. They could have decided to sell a gimped version of the PS4 with 4 or 6GB of GDDR5 and make a profit on each console from the get go (or sell them at $350).

eSRAM is only a benefit because the DDR3 memory is comparatively slow… keep in mind, it’s still slower than the GDDR5 memory.

    • d0x360

      Sony’s vision had nothing to do with a separate company being able to boost their yields of ram. It was just lucky. Sony themselves said they got lucky. End of story. You think vision matters? All that matters to these companies is money.

    • albatrosMyster

So yeah, these guys design a console that uses GDDR5 in quantities that are unheard of and they will need to order millions of parts… yet they don’t talk to their parts suppliers until they place the initial order; you know, they certainly did not need to at least retrofit some plant(s) to accommodate this order. All luck! Sure, this is how it works… Sony also bought the AMD APUs from Newegg, they were on sale! Lucky guys, because originally they were planning on using the same as MS, but because of the sale, they were able to afford a better part!

While MS on the other hand were all brilliant, and despite their obvious failure to deliver a console that performs as it should, they deserve no criticism, because they thought their plan through and went on with it no matter what. This is what we call being brilliant these days: keeping on with your plan, even when it’s clear someone has a better plan… at least admit they could have done better… sold the console cheaper from the get go, etc.

    • Guest

Wow, you people just make up any excuses to make yourselves feel better. He’s telling you right there that it’s a struggle. It’s a crappily built system. MS cheaped out on RAM. If they pushed the PS4 (another weak system) as much as they push the X1, the PS4 would yield better results. MS messed up bad, they suck, and you fanboys suck even worst.

    • Gekko36

      Jesus, you are arguing over a TOY.

      You’re not a developer, you’re not an engineer, you’re an armchair enthusiast at best.


      “MS messed up bad, they suck, and you fanboys suck even worst.”

If you read this comment back, it becomes clear that you didn’t do very well at school. Your grasp of English is very poor, so I suggest you give the forum a rest, go back to school and get a better grasp of English (well, American half-arsed English anyway).

    • Failz

XB1 actually uses more RAM than the PS4: 8 gigs + 32MB of eSRAM compared to 8 gigs in the PS4.
And the PS4 only uses 4.5 of its 8 gigs, where I think the XB1 uses 5 of its 8 gigs. Either way the X1 has 32MB more of RAM.
You console peasants are too funny; 8 gigs is nothing, and these consoles don’t even use all their 8 gigs for their games. Talk about holding PC gamers back.

    • demfax

      PS4 games use 5GB of GDDR5 RAM. Wrong.

      Stop acting like an ignorant fanboy.

    • Jecht_Sin

Troll. The PS4 has 4.5GB of “static” RAM reserved for games, plus the virtual memory, with a pool of at least 512MB which can get larger using the RAM left by the OS. That’s at least 5GB of RAM to you.

    • albatrosMyster

OK, you cannot count a buffer as an extra amount of RAM. Everything that is in your eSRAM has to be copied from the main RAM pool… then when the work is done and the GPU needs to work on another chunk of the amazing 32MB, it has to transfer it back into the DDR3 and copy another chunk of 32MB into the eSRAM… if you do too many of these accesses, your total memory bandwidth becomes equal to the slower bandwidth of the two (even if the memory is managed by DMA)…

Now for memory usage of the different OSes, it’s relevant, but you have to account for the fact that the numbers we got for their respective reserves probably already changed, or will change soon… and like on the PS360 it will only get lower as time goes on + you cannot compare available memory without taking into account how programs are compiled and how they use it.

For example, on the PS4 they use a unified (hUMA-style) architecture: both the CPU and GPU can read and write in the exact same memory space. This saves memory by removing the need to duplicate information for each of them AND it frees up memory bandwidth because, again, you don’t need to move those memory blocks around… There were NO game engines, or any software, made for this setup until recently, since it did not exist in a commercial system yet! Once 3D engines take advantage of this setup there will be a boost there (one that cannot be taken advantage of by the XB1, because of the eSRAM setup).
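The staging cost this commenter describes can be sketched with some illustrative numbers. The per-frame working set and the in-and-out copy pattern below are assumptions for the sketch, not profiled figures from any real game:

```python
# Illustrative cost of staging render data through a small fast buffer.
# All sizes here are assumptions, not measurements from real titles.

ESRAM_MB = 32
frame_working_set_mb = 120   # assumed per-frame render-target traffic
fps = 60

# Each eSRAM-sized chunk must be copied in from DDR3 and, often, back out.
chunks_per_frame = -(-frame_working_set_mb // ESRAM_MB)  # ceiling division
copy_traffic_mb_per_s = frame_working_set_mb * 2 * fps   # in + out

print(chunks_per_frame)               # chunks staged per frame
print(copy_traffic_mb_per_s / 1024)   # GB/s of DDR3 bandwidth spent on copies
```

Under these made-up numbers, the copies alone would eat a double-digit share of the DDR3 bus, which is the bandwidth tax the comment is pointing at.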

    • Roger Larsson

      You are correct, the XBox one has more memory than the PS4.
      PS4: 50 GB BR + 500GB HD + 8 GB RAM = 558 GB
      XBOX One: 50 GB BR + 500GB HD + 8.032 GB RAM = 558.032 GB (and the mighty cloud!)

But you cannot really merge memory like that; you need to take bandwidth and persistence into account.

    • Roger Larsson

LOL, someone correctly notified me that I had forgotten the 8GB of flash in the X1.

      PS4 does also have a flash… Smaller…
      “An additional 32 MB (256 Mbit) flash memory chip is also fitted.”

Guess that is the PS4’s BIOS + boot filesystem.
It also has:

      “PS4 includes a secondary ARM processor (with separate 256 MiB of RAM) to assist with background functions and OS features.”

      Adding these up shows that XBox One wins this round with a good margin – congratulations! 🙂

  • d0x360

It’s going to take time because on-die RAM like this isn’t standard. They need to figure out the best way to make use of the memory, and we won’t start seeing that for about a year. Engines being used today weren’t designed with this hardware in mind, so they are stuck trying to work around it without rewriting their rendering engines.

    It doesn’t help that the SDK doesn’t have any rock solid tips and tricks built in to make use of it. Right now the biggest tech team on Xbox is working on halo 5. Once they get that game content complete they will start rolling some of its engine features into the SDK so all devs can make use of them.

That’s how it’s always been done and how it always will be done. That’s the main reason consoles’ visuals improve as they age. Best practices are developed and then rolled into the development tools so other devs can use them.

The PS4 uses a brute force approach, almost exactly like the original Xbox. It’s easy to get results, but it doesn’t have many hidden tricks up its sleeve, and before people attack me, that isn’t an insult nor is it bad. Much like the PS3, and to a certain extent the Xbox 360, the Xbox One requires thinking outside the box and finesse. The performance is there but you can’t just instantly access it. You need to figure out the best ways. Now I’m not saying the XB1 will ever outperform the PS4. I’m saying it will get much closer to parity than it is today, just like the PS3 did last gen.

    Either way we are set for a fantastic gen of games from both platforms and I’m excited to see what Devs bring to the table

    • Guest

Excuses, excuses. More difficult to develop for consoles are not better; that’s a fallacy perpetrated by fanboys. And the PS3 was more powerful than the 360. That’s why it got better as they figured it out. The X1 is more difficult to develop for AND weaker, so it won’t be like the PS3. The only saving grace is devs gimping the PS4 version. And just so you know, devs still have a lot of figuring out to do when it comes to GCN and GPU compute, two areas where the PS4 is definitely superior. But ultimately, both systems are weak.

    • d0x360

      I’m not going to argue with you. You are free to believe whatever you want. Take care.

    • Guest

But you just did. Take care.

    • demfax

“PS4 is off the shelf, brute force” is a myth. The PS4 has several important customizations for GPGPU compute (8 ACEs, the Onion+ GPU cache-bypass bus, and the volatile bit flag), plus unified GDDR5 RAM.

    • albatrosMyster

What they mean by optimizing their engines for it is that some simpler kind of lighting will be used to fit the frame buffer (see how Wolfenstein runs at 1080p 60fps; it’s even very close on the XB1, as it hits it most of the time)… while I think it looks on par with other XB1 titles, on the PS4 it’s definitely one of the less impressive games, even compared to other cross gen titles like The Last of Us or Tomb Raider: DE.

So, 10% or so more resources available won’t change much in the end; it’s still 50% less powerful than the PS4, but now it’s closer to using all of its potential… which titles like DriveClub/Bloodborne/The Order/Uncharted will show… We still have to see ANY in-engine footage of Halo 5, and even the remake of Halo 2 may end up not being 1080p (or it will look simpler than advertised at first).

So yeah, it will be better than the launch games, but the PS4 will also offer some serious benefits over the early games: drivers and draw call loops will be re-written in pure ASM, GPU compute will be used more, and all the bandwidth saving features used to optimize for the XB1’s weaker hardware will also make the PS4 version of games run faster… as smaller chunks of memory objects move around faster on faster memory too!…

    • d0x360

      That’s not what it means at all

    • Roger Larsson

      PS4 does not take a brute force route. Not in any way!

A brute force route would have had a current GPU with dedicated GDDR (2-4 GB) paired with a 6-8 core x86 CPU with DDR3 (8-16 GB).

      Two chips that over time would have been merged into one (requiring the CPU to be an AMD). But note the merged chip would still have to support both DDR and GDDR memory buses…

Instead they asked: what use do we really have for the DDR3?
Temporary storage of graphical assets, with the need to manage when and what data to move into GDDR. Wait, now the GPU has full speed access to all memory! Let’s use its computing power to help the CPU, and add some extra GPGPU related hardware!

Now take a second look at the Xbox One; what is it really?
Its eSRAM is only slightly faster than the PS4’s GDDR5.
Its GPU has been reduced to fit that amount of eSRAM.

      Microsoft is really trying to sell you a GPU with dedicated 32MB of graphics memory paired with an 8 core CPU with 8 GB of DDR3.

      I would have taken a 4 GB PS4 over the current XBox One…

    • d0x360

      Brute force means they don’t need much optimization to get results. If they have any issues they dedicate more resources.

      So yes it does brute force things currently and no it has zero to do with specs because its relative.

      That won’t be the case further down the line. As the console ages more optimization will take place, more tricks will be found and more efficient code will be written.

As for the Xbox, that 32MB of on-die memory sits on top of a unified pool of DDR3. Both pipelines can be accessed at the same time. It is the only difference in the entire rendering architecture between the 2 consoles. They have the same basic hardware design; Sony just used better components.
      Let’s futilely argue some more over a topic that means absolutely nothing. I play games to play games. Don’t much care what they run on as long as it works and both consoles work.
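For reference, the peak bandwidth figures publicly quoted for the two consoles around launch look like this (these are vendor peak numbers, not sustained real-world rates):

```python
# Peak memory-bandwidth figures for both consoles as publicly reported
# around launch (vendor peak numbers, not sustained real-world rates).

XB1_DDR3_GBPS = 68.3    # 8 GB DDR3-2133 on a 256-bit bus
XB1_ESRAM_GBPS = 204.0  # 32 MB eSRAM, Microsoft's revised read+write peak
PS4_GDDR5_GBPS = 176.0  # 8 GB GDDR5 on a 256-bit bus

# The two Xbox One pools can be accessed simultaneously, but the eSRAM
# figure only applies to data that fits in its 32 MB window.
print(XB1_DDR3_GBPS + XB1_ESRAM_GBPS)  # combined peak, small-buffer case
print(PS4_GDDR5_GBPS)                  # uniform across all 8 GB
```

The combined Xbox One peak looks higher on paper, but only for the 32 MB that fits in eSRAM; the PS4 figure applies uniformly to all 8 GB, which is the crux of the argument above.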

    • demfax

      Both Sony and MS have world class coders that will extract every bit of performance out of their consoles with their drivers/APIs/SDKs.

      The difference is PS4 has more powerful hardware to work with, so it will stay ahead in graphics performance. Naughty Dog and the Sony ICE team will make sure the PS4 SDK is as optimized as possible.

      Once developers start using PS4’s GPGPU customizations heavily, they’ll be able to get even more performance out of it.

    • d0x360

      Did anyone claim otherwise? Anyone who thinks the ps4 doesn’t have more resources at its disposal is foolish. The issue at hand is people have this impression that the xb1 is some crippled box and it isn’t. Its plenty powerful to hold its own as far as consoles go.

  • David

Projekt Red is an amazing developer. They’ve had it at 900p on Xbox for a while now, and I think that’s prior to the freeing up of the GPU reserve. I’m confident they’ll be able to hit 1080p on Xbox. This is the only game I’m anticipating more than Destiny.

    • maybe

      Very high on my list too. Amazing graphics, huge open world, and so much content. What’s not to love.

    • David

      Exactly. Should fill the void skyrim left.

    • Guest

You do realize that the difference between 900p and 1080p is 44% more pixels, right? Yet you think getting back 10% of its initial 1.31TF is going to be enough? You better think again.
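The arithmetic in this comment is easy to check:

```python
# 1600x900 ("900p") versus 1920x1080 ("1080p") pixel counts.
pixels_900p = 1600 * 900    # 1,440,000
pixels_1080p = 1920 * 1080  # 2,073,600

extra = pixels_1080p / pixels_900p - 1
print(f"{extra:.0%} more pixels")  # 44% more pixels

# A reclaimed ~10% of GPU time does not by itself cover a 44% pixel
# increase; any remaining gap has to come from other optimizations.
```

So the 44% figure is correct; whether the gap can be closed depends on optimizations beyond the reclaimed GPU reserve.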

    • David

No, I think I’m thinking right. I just think your fanboyism is leading to denial. Lol. Bungie already did it with Destiny. I think Projekt Red is more capable than Bungie.

    • Guest

My fanboyism? You’re the clown defending this nonsense, instead of accepting the truth. Stupid fanboy. And Projekt Red and Bungie suck.

    • David

      Projekt red and Bungie both suck? Your opinions no longer have any merit with me. You have a nice day. And like I said Bungie has already done it. It’s fact. Not opinion.

    • Guest

I don’t care if my opinion has no merit to you. You mean nothing to me and your opinion has no merit with me; you’re nothing more than just another generic fanboy with a generic name. And I know the res was already upped to 1080p on Destiny. But if you weren’t just a dinklehead, you’d understand that the game was probably already able to run close to 1080p even before the SDK update, so the update just allowed them to do it. Cuz there is no way that a 10% boost to GPU performance is going to get you from 900p to 1080p. Not gonna happen. Nope, sorry, no, no. Now buzz off and go sweat companies some more.

    • d0x360

      I went through The Witcher 2 again a few weeks ago to brush up on the lore. Very excited for The Witcher 3. I haven’t decided which platform to get it on yet. I won’t be building a new gaming PC till roughly March, so it’s gonna be a console. Based on what has been shown, both versions are stunning, so I imagine it’s going to come down to controller preference.

    • David

      Yeah, controllers mean a lot. I never played The Witcher 2, but I heard it was pretty amazing. Really looking forward to seeing what Projekt Red can do.

    • d0x360

      On the 360 the graphics are VERY rough. Things pop in pretty close, and it’s sub-720p, so if you have a capable PC (despite being old, it still takes a decent PC), you should go that route. That being said, the Xbox version is still a fantastic game: tons of content and an excellent, mature story with tons of lore and quests. The combat can be really easy or deadly hard depending on difficulty, but it never feels cheap. Highly recommend it.

    • David

      I’m more of a console person, and I no longer have my 360 :(. I’ve heard about the graphical issues but that the gameplay is great. I’ve watched some gameplay of it; it looked very fun. But sadly, I’ll have to wait until February to experience a Witcher game.

  • Illusive Man

    CD Projekt Red has already achieved 900p without the latest June SDK or the 10% GPU reserve. My gut tells me they will achieve 1080p without sacrifices before launch.

    • Guest

      Oh boy, another hopeful, wishful fanboy. You do realize that the jump from 900p to 1080p is 44% more pixels, right? Not saying it’s impossible, but seriously, it’s simple math.

    • Edonus

      It’s not that simple. Counting pixels by themselves is misleading because resolution scales exponentially. As an example, to double a 1-pixel range you would need 3 pixels; to double that, you would need 12 pixels. So to get 4× what 1 pixel produces, it’s not 4 pixels but 16. The pixel shift needed to upscale 720p to 1080p is 1.25 pixels for each native pixel; that would be one full commercial step. 900p to 1080p is even less. Then take into account the pixel density of the TVs that consoles are built to display on, and how images are created: pixel counting is just a PR stunt that these sites have pushed to miseducate the public.

    • Jecht_Sin

      It’s still 44% more pixels however you want to twist it around. A PR stunt is when people say that the difference between 900p and 1080p is only 180 pixels. That would apply (stretching the definition) when using a frame buffer of 1600×1080 instead of the Full HD frame buffer of 1920×1080.
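For reference, the 44% figure quoted throughout this thread checks out with simple arithmetic, assuming the standard 16:9 frame buffers of 1600×900 (“900p”) and 1920×1080 (“1080p”):

```python
# Pixel counts for the common 16:9 frame buffers
w900, h900 = 1600, 900
w1080, h1080 = 1920, 1080

pixels_900p = w900 * h900      # 1,440,000 pixels
pixels_1080p = w1080 * h1080   # 2,073,600 pixels

# Relative increase in pixels rendered per frame
increase = (pixels_1080p - pixels_900p) / pixels_900p
print(f"1080p renders {increase:.0%} more pixels than 900p")  # 44%
```

Note the distinction the comments are arguing over: 1080p has only 180 more rows of pixels than 900p (a 20% increase per axis), but because both axes scale, the total pixel count grows by 1.2 × 1.2 = 1.44.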

    • I’m ok with 900p and a locked 30fps.

  • I don’t care what res it is, just make sure it’s a locked 30fps. Framerate>Res

    • d0x360

      Just ignore him.

  • Goodacre

    I really hope CDPR is spending the same amount of time on each platform. Nothing is worse than a developer favoring a weak console to get parity.

    • JerkDaNERD7

      I have full faith in these guys. They’re top notch and never come off as biased. They’re tech to the core and fully understand how the XOne architecture functions.

  • JerkDaNERD7

    Once developers get a hold of deferred rendering, games will look amazing, if not better, on XOne. I love the choice Microsoft’s engineers made with the architecture to fully support deferred rendering.

    • demfax

      Will they look amazing? Sure. “Better”? No.

    • JerkDaNERD7

      Of course, but better…? Well, we’ll see. Too early, buddy.

    • demfax

      Nope, not too early.

    • JerkDaNERD7

      That’s what I just stated?

  • TJ

    X1 is weak, but it has the best looking game to date & many devs & reviewers say Ryse is the best looking next-gen game out. LOL

  • Aussiesloth

    Really, who cares? It will still look OK and playable. Some people care more about the resolution than the game itself.

