Crytek Comments On PS4’s GPU Compute: Can Be An ‘Incredible Performance Win’

‘The CPU performance is better than last-gen but not by a huge margin, the GPU on the other hand is a really sizeable improvement.’

Posted on 26th October 2013 under News


Sony has placed a lot of emphasis on the GPU compute capabilities of the PlayStation 4. But the question is, will multiplatform games exploit this added advantage on the PS4? Cevat Yerli, founder, CEO and president of Crytek, was asked for his thoughts on the PS4’s GPU compute, and this is what he said:

“GPU compute is definitely the future. The CPU performance is better than last-gen but not by a huge margin, the GPU on the other hand is a really sizeable improvement. If the task is suited to it, moving to the GPU can be an incredible performance win. However, this is taking away performance from the traditional graphics pipeline so there is a limit to what you can move to the GPU.

As for the broader multi-platform question, supporting GPU compute isn’t really much more difficult than supporting multi-platform rendering, so it’s certainly something we’ll be using more and more on all platforms,” he said in a tech interview with Eurogamer.
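Yerli’s cost/benefit point — offloading only wins if the task is both faster on the GPU and leaves the graphics pipeline enough room — can be sketched as a toy frame-budget model. All timings below are hypothetical, purely to illustrate the trade-off he describes:

```python
# Toy model of the trade-off Yerli describes: moving a task to the GPU only
# wins if the GPU time it costs still fits inside the frame's GPU budget.
# All numbers are hypothetical illustrations, not real profiling data.

FRAME_BUDGET_MS = 1000.0 / 30.0  # ~33.3 ms per frame at 30 fps

def worth_offloading(cpu_cost_ms, gpu_cost_ms, gpu_render_ms,
                     frame_budget_ms=FRAME_BUDGET_MS):
    """Return True if moving the task from CPU to GPU still fits the frame."""
    # The task must run faster on the GPU *and* leave the graphics
    # pipeline enough time to finish rendering within the frame budget.
    fits = gpu_render_ms + gpu_cost_ms <= frame_budget_ms
    faster = gpu_cost_ms < cpu_cost_ms
    return fits and faster

# A parallel-friendly task (e.g. a particle simulation): big CPU cost,
# small GPU cost, rendering leaves headroom -> offloading wins.
print(worth_offloading(cpu_cost_ms=8.0, gpu_cost_ms=1.5, gpu_render_ms=28.0))  # True

# The GPU is already saturated by rendering -> no headroom, keep it on the CPU.
print(worth_offloading(cpu_cost_ms=8.0, gpu_cost_ms=1.5, gpu_render_ms=33.0))  # False
```

This is exactly the “taking away performance from the traditional graphics pipeline” limit he mentions: the second call fails not because the task is slow on the GPU, but because rendering has already spent the budget.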

It seems at this point both systems have a set of unique advantages. The Xbox One’s main memory has a bandwidth of just 68 GB/s, but that is more or less offset by the on-chip ESRAM, which tops out at 204 GB/s, whereas the PS4 has faster GDDR5 RAM and a reportedly more powerful GPU.
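Those headline figures follow directly from bus width × transfer rate; a quick sanity check using the commonly reported specs (256-bit buses, 2133 MT/s DDR3 on Xbox One, 5500 MT/s GDDR5 on PS4):

```python
# Peak bandwidth = bus width (in bytes) * transfer rate (transfers/second).
# Specs below are the commonly reported ones for each console.

def peak_bandwidth_gbs(bus_bits, transfers_per_sec):
    return bus_bits / 8 * transfers_per_sec / 1e9  # GB/s

# Xbox One: 256-bit DDR3 at 2133 MT/s -> the "68 GB/s" figure.
ddr3 = peak_bandwidth_gbs(256, 2133e6)
print(round(ddr3, 1))   # 68.3

# PS4: 256-bit GDDR5 at 5500 MT/s -> 176 GB/s.
gddr5 = peak_bandwidth_gbs(256, 5500e6)
print(round(gddr5, 1))  # 176.0
```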

It will ultimately be up to developers how they use these unique features in their games. The PS4 and Xbox One are both out next month. Let us know your thoughts in the comments section below.



  • RandomUser2yr29387

    The ESRAM isn’t an Xbox One advantage… it’s a huge disadvantage for developers. This is the main reason games like COD: Ghosts are only native 720p.

    • Mike Greenway
    • Megaman

      You’re a developer???

    • Megaman

      You are an idiot for even making a comment like that… like you hate for no reason… you think the games out now determine the future of these systems… get over it… stop spamming these sites with your foolish thoughts… You’re not a developer, and if you are then you need to be fired ASAP… go outside and play hockey, or go watch a movie and do a review score.

    • CultureShock

      Wow, look how many upvotes you get for lying. Absolutely nothing you said was factual, but it’s treated like it is. Insane. One of the huge advantages of having 32 MB of ESRAM is discussed here. Follow the link and get educated.

      And stop with the ignorant comments about tech you have NO clue about.
      Keep in mind also that Crytek’s comments were made at a time when the XB1 was still being optimized via drivers, and it will be until launch.
      Most of the info that people like you get comes from sites like this, whose readers make uneducated comments which are then treated like facts. Incredible. The truth is out there. Just look for it.

    • CultureShock

      And another thing: 1080p doesn’t mean anything when it makes no significant difference in how a game looks. You forget that Ryse is so impressive visually that Sony’s own Guerrilla Games developers thought it was running on A HIGH END PC until Crytek proved to them that it was indeed running on XB1 hardware.

      “Yes everything we showed at E3 and Gamescom was on kits. From my point of view the development is coming along well and with the optimizations we have put in, I am pretty confident the final game will look better than what we showed at E3. I even had to open the cupboard to prove it to some guys from Guerrilla who didn’t believe me. At the demo stations our kits were on top of the cupboard directly connected to the screens, so no way we could have done anything else other than run on kits.”

      Think about that for a minute. THEY THOUGHT IT WAS RUNNING ON A HIGH END PC!! RYSE was never a 1080p game; people just thought so because of its high-fidelity graphics. Point is, there are new technologies in the XB1 that support SMARTER ways for DEVs to manage their resources. When the XB1 is fully optimized, this will become more apparent. That Crytek interview dispels a lot of the myths people make up about the XB1, yet you still have people like you who make it their business to enforce them. No wonder MS says, “we’ll let the games do the talking!”
      When Sony fans accused Ryse of being CGI, it was a compliment! That’s how great the game looks! It burns you to find out that it was actual game engine and gameplay!

    • CulturelessShock

      Yeah… I sort of approve, but not to the full extent. The only thing that needs attention here is the 2X anti-aliasing MS is promising at 720p or 900p for their exclusives. It needs to be mandatory, and that is something yet to be seen. 1080p makes a whole lot of difference with GPU computing per pixel. And the evolution from eDRAM/GDDR3 to ESRAM/DDR3 is definitely worth an acid test next gen. It only accounts for almost the bandwidth attained by the full 8 GB GDDR5 of the PS4, not surpassing it or even matching it. The advantage is clear here.

    • CultureShock

      “only thing that needs attention here is the 2X Antialiasing MS is promising at 720p or 900p for their exclusives”

      No it doesn’t. Why? Because people were going around believing Ryse was CGI, and you had top developers like Guerrilla Games who thought it was RUNNING ON A HIGH END PC!

      FORZA was running on XB1 @ E3 at full 1080p/60fps, and that was THEN!

      Consider this: Crytek achieved something the majority thought impossible on XB1, and the fact IS, it is NOT YET optimized. MS is still sending updates, which makes game development painful for DEVs, who then have to constantly go in and change things to take advantage (IF THEY CAN) of the new drivers as they come.

      During that time, Sony delayed their 1st-party title Driveclub. MAN, if it was the other way around and Forza was delayed, the internet would be going crazy right now. And that’s the truth!

      “It only accounts for almost the bandwidth attained by the full 8 GB GDDR5 of the PS4, not surpassing it or even matching it. The advantage is clear here.”

      Concentrating on ESRAM/DDR3 by itself is not how you’re going to measure the overall potential of the XB1. The very fact that we now know the XB1 supports Tiled Resources via HARDWARE means you should be asking how the ESRAM plays with that. Textures are DEVs’ main concern most of the time when developing games. TR is a solution and is ONLY AVAILABLE ON THE XB1. That is massive!
      And the wild thing is, none of the games coming out for the XB1 next month utilize TR effectively, or at all! That will come with the next wave of games; then you will see TRUE NEXT GEN. RYSE and Forza are only appetizers.

    • Bristow9091

      I love how you’re calling people Sony fanboys, and yet you’re defending the One as if it were your love child… your entire argument seems very biased if I might say so myself… but hey, this is coming from a guy who isn’t cool enough to use caps…

    • CultureShock

      If you’re not considering all the facts, and point me towards tabloid-style articles that have nothing concrete to support your argument, and you start off by hurling curse words, yup, you’re a fanboy. If I link you to an article, it’ll be because the DEVs involved are actually sharing info with us, directly or indirectly, via quotes or interviews. In the case of the DualShock article, that was click bait, and there was nothing concrete about their opinions. In fact that article should’ve started with, “In our opinion.”

      The FACT IS Guerrilla Games devs thought that what they were looking at was a game running on a high-end PC. The fact is that when the latest vid was shown, Sony hardcore fans accused Crytek devs of showing a CGI version of Ryse. No one was arguing about whether it was 720p or 1080p. They started that again after they found out it was all gameplay, so they had to find something wrong with it. Shit is just getting ridiculous.

    • Matt

      Sorry I couldn’t make out your point due to the lack of caps lock.

    • Bristow9091

      Oh snap! 😛

    • Guest

      But those FACTS though. At least you learnt something.

    • howfuckinghighareyou

      How the fuck did you read that article and then not read this?

      That did do the talking.

    • CultureShock

      EXPECTED! Can’t hear the truth w/o rebutting with curse words intact! TYPICAL SONY FANATIC ATTITUDE!

      That article ONLY assumes that the game was degraded somehow and threw numbers around to prove it. It’s not definitive proof of a downgrade. Then there is the fact that it’s a still from a vid that isn’t even a high-def feed! Then there is the fact that there were added elements in the new and IMPROVED VERSION! Like greater draw distance, weather and better lighting/shadows. If you’re biased you’re not going to notice all that.

      AND when the latest build was shown AGAIN, you all thought it was STILL CGI!!

      You PS4 fanatics were still none the wiser, so they addressed it AGAIN! Wow. HOW DID YOU NOT READ THAT!?

      What does that tell you? It tells you that the game is damn impressive visually. We’re talking Crytek after all. Those guys know their stuff! Still pics cannot convey what a direct feed CAN. It was another article intended to get clicks, and nothing that carries weight for people who look for facts in order to be better informed. That is a tabloid-like article, that’s it. What the Guerrilla Games developers experienced, now that’s REAL! That’s FACT!

      The fact still remains that the game was never 1080p; it was always 900p. Nobody would’ve known if Crytek didn’t bring it up. The fact is, RYSE is the MOST impressive console game visually right now. That’s why it gets so much attention. That’s why when people see it in motion they think it’s CGI!

      If you see the game in motion, you won’t know the difference. Stop pretending you do! It’s getting old!
      Does that look like something an underpowered system CAN DO!?

    • wsoutlaw

      Man, you are trying way too hard and shouldn’t worry about it so much. Ryse was actually announced to be 1080p by MS until Crytek announced the downgrade. They also went from 150k polygons for the main character to 85k and changed some other stuff. Did you think that was just made up by everyone to make you mad? A lot of XB1 games were running on dev kits, and dev kits are PCs, not the actual XB1, but yes, Ryse was shown on official hardware. Honestly, Ryse has great character models, but the environments aren’t special and are limited, and the game runs at 900p at 30 fps. Ryse just really isn’t that impressive a game, with unexciting gameplay. If there is a Ryse 2, I’m sure it will be much better, and 1080p.

    • CultureShock

      Not worried at all. Was just trying to point out that a good-looking game is a good-looking game. You don’t need native 1080p for that, nor a massive amount of polygons. Software tech has evolved enough to make games look better while running more efficiently. Hardware/software integration is key going into next gen. But you’re right, I was a little bit too wordy. And we will not know until the game’s out. It looks good, but gameplay? *shrugz* We’ll see. Personally I haven’t invested in either platform. Thinking about getting a graphics card instead.

    • CultureShock

      What Crytek had to say, without tabloid-style conspiracy theories twisting their words around.

    • Looks like the Xbot fanboy got negative downvotes. You gunna cry kid? you seem butthurt

    • CultureShock

      ShiiET! What I’m finding out is that people like you are allergic to the truth. Your heart jumped when you read that Guerrilla Games devs thought Ryse was running on a high-end PC, when in fact it was on actual XB1 hardware. You don’t want to hear that shit. That ruined your day. It wasn’t based on opinion or rumor. It simply happened and was documented. You don’t like those kinds of articles. You know, factual ones. I’ll take your downvote, ’cause you learned something today.

    • Hiya_tiger

      running 900p at 30 fps. lol.

    • Dakan45

      Not next gen enough for ya? Have some QTEs and scripted gameplay.

      Next gen enough for ya? Or do we need to get Ellen Page in the game?

      I used to play games about… I don’t even know anymore.

    • Matt

      Press (X) to slander.

    • Dakan45

      Just finished Zeno Clash 2; how the hell this game got 5s and 6s is beyond me.

      It is a huge improvement: a kind of hub open world, upgrades, secret areas, amazing level design, good story, combat with a lot of options, yet GameSpot gave it a 5.0 while samey, boring, scripted crap gets 9.0s… go figure.

    • Mike Sombi

      You obviously have bad taste in games my friend lol

    • Dakan45

      Says the guy who looks forward to killzone shadowshit.

    • Mike Sombi

      pc = construction worker, fitness trainer with no friends.

      ps4 = elegant but fun executive.
      xbox one = cool but lazy surfer.

    • Dakan45

      PC = innovation, variety, progress, highest exclusive count, most developers, most support, mods, better prices, better community, better control and the superior master race.

      Xbox = annoying kids on Xbox Live, but a decent fanbase.

      Sony = in-denial cuntfucks with extreme cases of fanboyism, shitty consoles and absolute apathy to Sony fucking them in the ass. No life or friends and endless spare time to troll on the internet due to not having any good games to play.

      Like yourself. Good job, trollfuck.

    • Raymond Santiago

      Your comment is the epitome of elegance. It oozes sophistication and grace.

      Sarcastic comment aside, though, way to pigeonhole 3 gamer groups into self-biased categories based off a small sample size. If I were to just go off of my online experiences with communities, PC gamers are arrogant snobs with a superiority complex and complete hatred for consoles and console gamers because they feel they are entitled to have all their favorite games become PC exclusive. Xbox fans are defensive trolls who will jump on the smallest crumb of news that support their brand while trying to downplay anything that might say otherwise. Sony fans are in some ways the same as Xbox but don’t try too hard to defend their console. By far, though, PC gamers tend to be the worst, as they tend to de-rail any thread/conversation about games and consoles so they can talk about how much better they are than console gamers.

      This, though, is only from my experience with fanbases online. Most people who would be either PC/Xbox/PS fans are actually pretty chill.

    • Dakan45

      Yeah yeah yeah, whatever you say. The guy I replied to has posted over 700 messages to me and continues to spam; he got two of his accounts banned, this is his third. I’ve proven he is a troll by showing off his N4G page.

      Google slasher716 and you will see his Disqus comments.

      This has NOTHING to do with any system, the guy is a fucking troll with a grade A rating, no life, and enough time to post HUNDREDS of messages and fill up my inbox. I also found his Facebook account via his slasher716 Disqus account.

      Then he got pissed and deleted his account on FB, and found some poor guy on Facebook who uses “dakan” as his user name and posts it everywhere, talking utter shit about me and saying stuff like “stalking little girls”.

      He has to be banned from the internet. So just report his account on Disqus like I did, and his Facebook account.

    • Raymond Santiago

      My main concern was really the over-arching statements about the different fan bases. Granted, I never took the time to look through his past comments (an error on my part), so what you said about him specifically wasn’t of much concern to me. If he is as bad as you say he is, then he himself may deserve malignant comments, but I was going off your seemingly enormous hatred for Sony fans and endless praise of PC gaming (making you sound like every other PC elitist).

      It’s like saying all Latinos are backwoods idiots because a guy named Juan won’t leave you alone. I do feel for you though; a guy like that must be nothing more than an annoyance (why trolls exist I will never know), but the rest of us are alright.

    • Dakan45

      From my experience and others’, Sony fanboys are the worst; they never STFU, and whenever something bad is said about them, they go crazy and berserk at whoever said it, including reviewers and YouTubers.

      Mike Sombi, aka Slasher716, is a gigantic troll cunt that spams me for days. He stalks me in old articles, and the admins of 2 other sites had to close down the comment section to stop him.

    • Raymond Santiago

      I was going off the impression that you were bashing on Sony fans/supporters in general (all you said was Sony after all). If what we’re talking about is fanboys, then I can’t stand a single one of them, for ANY console or subject really. From personal experience, though, I haven’t seen as many of the Sony fanboys as I have the Xbox One’s. That probably has largely to do with my tendency to look up more information on the PS4, and all I see are raging Xbox/PC fanboys in the comments. I may see the flip-side by visiting X1/PC sites I guess, but that’s not to say I haven’t seen any Sony fanboys. When I do happen to see them though, I question their idiocy and lack of general reasoning like I do any other fanboy.

      From personal experience, though, I still have a general extra bit of disdain for the PC fanboy. Fans are not bad in any way for any and all consoles/devices, but the fanboys should be shunned like the blight on our public image that they are.

    • Dakan45

      There aren’t X1/PC sites; Xbox and PC are far away from each other these days.

      To be fair, PC gamers do have vastly superior graphics, sound, mods, better and customizable controls, better prices and more exclusives. They are not fanboys with no arguments. If you ask a Sony fanboy, he will say they have the best exclusives and that Sony is saving gaming or something, and of course free MP over the Xbox fanboys. I don’t see what Xbox fanboys have. Both can argue that console gaming is easier than PC gaming, but other than that, I would say PC fanboys have the most reasons for why they choose PC.

    • Raymond Santiago

      The main reasons I prefer console gaming are: 1) It’s more physically social (easier to set up local co-op). 2) It’s the quickest and simplest way to game, just plug-and-play. Little to no distractions. 3) All my favorite game series (Final Fantasy, Persona, Tales of -, Tekken, etc) are console exclusive for the most part. I have played games on the computer, like L4D2, Starcraft 2, and various MMO’s, but gaming on a console is just a much better experience for me. I understand it’s not like that for everyone though.

      My main beef with PC fanboys is, although they do have all those arguments, it’s always “PC Master Race” and “Console scum/other derogatory term.” Then all they talk about is graphics, and although I know about mods and all the rest of the stuff, they act like they don’t. “Wut who wuld play Crysis 3 on consool lol go die” is not something I’ve only heard once. The complete graphics obsession, superiority complex, pettiness, and overall arrogance of most PC fanboys is what makes them top my list.

      Sony/Microsoft fanboys are usually going on about how their console is better than the other’s. PC fanboys act as if they themselves are better than others. Mind you, I speak only of fanboys, not general fans/pc gamers.

    • Dakan45

      The “master race” was created by Yahtzee Croshaw on Zero Punctuation, because he couldn’t play The Witcher due to his crappy laptop and the game had no tutorials. So he said that the game is optimized for the master race, so the console peasants won’t ruin it for them.

      It’s a joke; console fanboys took it and turned it into an insult, we embraced the joke, and thus they think we call ourselves the master race.

      But seriously, if after this long, long generation you still can’t afford a PC, it’s just sad.

      Since you brought up Crysis 3: the game on consoles runs UNDER the lowest PC settings, so there is practically no reason to play it on a console, since we established it’s a graphics game and Crytek believes that graphics are 60% of the game.

      “The complete graphics obsession, superiority complex, pettiness, and overall arrogance of most PC fanboys is what makes them top my list.”

      All those are misconceptions, but the Sony fanboys are all the same.

      You don’t need to bother with all of that; just read the first two paragraphs and you will get it.

      Sony fanboys act as if Sony can’t make mistakes and will defend them no matter what, as if their exclusives are the best even though they get low sales; and if something bad happens and someone reports on it, they hate them for being “anti-Sony” when they just reported the news. Now that the Xbox One is in a complete disaster, it has become “socially acceptable” for everyone to bash the Xbox One, and thus the Sony fanboys have turned into the biggest self-proclaimed arrogant superior douchebags of all.

    • Matt

      720p to 1080p is like going from MS Paint to Photoshop.

      There’s a huge difference when comparing those two resolutions.

      And Ryse owes its visuals to the Crytek engine, not the XB1.

    • CultureShock

      “720p to 1080p is like going from MS Paint to Photoshop”

      Depends on art direction. If you don’t need the 1080p, then you don’t need it. Get off the numbers already. I know that’s hard, but really, in the end who cares, when you have a game that looks like Ryse does, good enough to make people believe it’s CGI. No one said, “OOOOH IT’s 1080p!” They said wow, pretty game, must be CGI, running on a high-end PC, all the while it’s sitting comfy @ 900p/30fps on actual hardware. Once those facts are established, convos like this shouldn’t even exist. The fact is that there are new ways and better tech that allow great visuals without worrying about whether a game should be 720p or 1080p. Just wait until they start using that ray tracing more effectively. We need to stop parroting what these game sites tell us a game should be, and let the developers just create without being eaten alive by petty comparisons and, WORSE, having to explain themselves to people who have no clue about the hardware/software tech involved. It’s gotten really bad. Worse when people (RandomUser2yr29387) make it their business to lie about tech they know nothing about.

      If you’re the type to go squinting at your monitor to find little flaws while playing your games, more power to you.

    • Matt

      I care… when I switch resolutions from 900p to 1080p on BF3 I can notice a clear difference without squinting.

      GTA V manages to squash your hardware/software debate… it all depends on the talent present in the dev along with the dev time and budget.

    • CultureShock

      Like I said, Art Direction. I don’t think you understand what that means. Not all games are created equal. Ryse running @ native 900p still looks superior to Killzone running @ native 1080p.

      “GTA V manages to squash your hardware/software debate”

      If you’re going to point to that game, look at others that challenge GTA V’s merits. Only then will you have a broader view of things. Challenge everything.

      “it all depends on the talent present in the dev along with the dev time and budget.”

      You’re right! That there is also a factor.

    • Whiskey Rye Bread

      “Ryse running @ native 900p still looks superior to killzone shadow fall running @ native 1080p.”

      Stopped there, laughed. Continued with day.

    • CultureShock

      The FACT is Killzone SF DEVs thought that Ryse was running on a HIGH END PC. For them to describe it as such meant that they themselves thought they were looking at a game running at 1080p. Then there is the other FACT that hardcore PS4 fans thought the latest build was all CGI. They challenged Crytek, who confirmed on Twitter that it was all game engine and gameplay. My dude, if you’re going around believing that what you’re looking at is CGI, what does that tell you about the power of the XB1? Bias aside, you know the answer to that. You should be asking, how do they do it? No matter what you say, that is impressive. Point is, NO ONE ever looked at Killzone SF, which is 1080p, and thought it was CGI.
      What’s all the talk about higher res when the art doesn’t look right? The PS3 can do 1080p too, but does that mean it’s equal in power to the PS4? Which game will look better: a game running @ 1080p on Wii U, or a game running @ 720p on PS4? Res is nothing more than an option, and 900p on XB1 looks good enough to fool talented devs like the ones from Guerrilla Games. Give credit where it’s due.

      PS4 Killzone vs Xbox 360 Halo 4, still image. Even though we know the PS4 is superior, KZ is still not leaps and bounds better looking than old-ass Halo 4 in SOME scenes.

    • Menda

      Problem is that all videos released have been of a native 1080p build, not 900p.

    • achjjjan


    • CultureShock

      Riiight… It makes you feel better calling me one, because you prefer to be lied to in order to protect your feelings. It’s cool.

    • dirkradke

      When a developer actually states unequivocally that the ESRAM is a huge disadvantage to use, I will take note of it. As for this being THE reason: there are probably others as well. After the game is out I will look at comparison shots and see how much difference there is between the PS4 and XB1 versions. Somehow I doubt the 1080p/720p gap will make that much difference.

    • swinny

      Developers have already spoken, saying the ESRAM is a pain to use and the speed difference between the XB1 and PS4 is “clear and obvious”.
      So, shouldn’t you be taking notes right now? And if you think there’s no difference between 1080p and 720p, then next time you watch a 1080p movie, downgrade it to 720p and see if you’re not tempted to switch it back up to 1080.

  • Jack Slater

    Why didn’t Microsoft use 64 or 128 MB of this famous ESRAM?
    Maybe it’s expensive, but if it is actually this memory that accelerates the whole system, why limit it to 32?
    For the real pros, my question is: would using more ESRAM, for example 64 MB, require a complete change to the current architecture? Like all those buses, those paths, data movers, etc.; everything is calibrated, clocked, etc., according to the 32 MB?

    Or would 64 MB change everything, giving much more power and bandwidth to the entire system, without having to change the motherboard?

    If the system would have been much, much faster with 64 or 128 MB of ESRAM, it’s a pity Microsoft didn’t invest more money in it.

    Just like Nintendo shouldn’t have made that ultra-expensive touchpad, which forces gamers to move their eyes away from the action on the main TV, and should instead have invested all that cash in a faster CPU, double the memory and a much faster GPU, maybe Microsoft shouldn’t have spent all that cash on Kinect, which I’m sure must account for 20-30% of all the manufacturing costs, and maybe should have chosen a much faster GPU, or much more ESRAM.

    I believe ESRAM is expensive because it’s complicated to make, and must require some rare and expensive materials. But damn, if the system could gain a 30-50% boost in performance with 64 MB of ESRAM instead of 32 MB, with all the gazillions of $ Microsoft has, they shouldn’t have saved money on something that crucial.

    With the four 256-bit buses they have between the system and the ESRAM, using 64 MB certainly would have made a difference. Right now the peak bandwidth at 100% reading OR 100% writing to the ESRAM is about 100 GB/s theoretical, but more like 60-70 in the real world; yet if the commands are 50% read AND 50% write, then the system can output around 200 GB/s theoretical peak, or 140-150 GB/s in real life.
    I’m sure with 64 MB, the system would permanently have a REAL 150-200 GB/s of bandwidth crossing those crazy four 256-bit buses, instead of the actual 60-70 GB/s.

    If possible, that would have made a huge difference. And maybe having 64 MB instead of 32 was just a matter of an extra $20-30.
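The read/write arithmetic in the comment above can be sketched with a simple model, using the commenter’s own rough figure of ~100 GB/s per direction (not an official spec): if the memory can read and write simultaneously, the busier direction is the bottleneck, so an ideal 50/50 mix doubles throughput.

```python
# Simple model of simultaneous read/write bandwidth: the busier direction
# is the bottleneck. Uses the commenter's rough ~100 GB/s per-direction
# figure, which is an assumption for illustration, not an official spec.

ONE_WAY_GBS = 100.0  # theoretical peak in a single direction

def combined_bandwidth(read_fraction):
    """Peak combined GB/s for a given fraction of read traffic.

    With a 50/50 mix both directions run flat out, doubling throughput;
    at 100/0 only one direction is used, so the peak is the one-way figure.
    """
    busiest = max(read_fraction, 1.0 - read_fraction)
    return ONE_WAY_GBS / busiest

print(combined_bandwidth(1.0))   # 100.0  (pure reads)
print(combined_bandwidth(0.5))   # 200.0  (ideal 50/50 mix)
```

Real workloads rarely sit at exactly 50/50, which is why the commenter’s “140-150 GB/s in real life” estimate lands well below the combined theoretical peak.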

    • theouteredge

      The ESRAM is actually part of the chip which also contains the CPU and GPU.

      The chip has limited die space, so to add more ESRAM you would have to cut into the transistor budget available for the CPU and/or GPU, making them smaller and less powerful.
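Rough transistor arithmetic shows why that trade-off bites: a standard 6T SRAM cell spends six transistors per bit, and Microsoft has publicly put the whole Xbox One SoC at around five billion transistors. The estimate below ignores decoders, sense amps and redundancy, so treat it as a lower bound:

```python
# Rough cost of ESRAM in transistors: a standard 6T SRAM cell uses six
# transistors per bit (ignoring decoders, sense amps and redundancy).

def sram_transistors(megabytes, transistors_per_bit=6):
    bits = megabytes * 1024 * 1024 * 8
    return bits * transistors_per_bit

esram_32mb = sram_transistors(32)
print(esram_32mb)                        # 1610612736 (~1.6 billion)

# Against the ~5 billion transistors of the whole Xbox One SoC, doubling
# the ESRAM to 64 MB would eat roughly another third of the chip:
soc_total = 5e9
print(round(esram_32mb / soc_total, 2))  # 0.32
```

So 32 MB already accounts for roughly a third of the SoC’s transistor budget; going to 64 MB would have meant sacrificing a comparable slice of CPU/GPU logic, which is exactly theouteredge’s point.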

    • plcn

      Could’ve ditched it in favour of more shader cores/silicon… and spent that budget on higher-bandwidth RAM than DDR3. Not sure if anyone else thought of that though…

  • Trim Dose

    lol, in the end he says nothing XD. I mean, it’s obvious he can’t talk freely while Ryse is around the corner from release. LOL

  • Slay

    An incredible performance, or smoke and mirrors…

  • jonam

    204 GB/s and yet COD Ghosts is confirmed to run at just 720p on the Xbone… am I missing something here? Things don’t add up with those numbers…

    • CerN

      The bandwidth of the RAM does not determine the system’s total performance.

    • YnotNDalton

      Adding those two numbers together is an illusion… that 32 MB of ESRAM is just a framebuffer. It may be lightning fast, but it’s only 32 MB; that will NEVER compare in real life to 8 GB of GDDR5 high-bandwidth graphics RAM with a unified pipeline straight into the GPU that runs faster than the fastest PCI Express buses in PCs these days. Bandwidth is key in graphics programming now that we are moving to larger and larger textures; they need bandwidth, while latency has a minuscule effect on graphics processing.

      And this is on top of the GPU difference: 1.84 TF vs 1.31 TF, 1152 shader units vs 768. Then for the future you have the 64 asynchronous compute queues (8 ACEs) that Sony had AMD custom-build into the GPU for them (which, by the way, AMD is incorporating into its next line of GPUs). This will allow many of what were CPU computations before to be queued for the GPU; it’s called general-purpose computing on graphics processing units. The GPU is much more capable, as it’s not serial, it’s parallel. And Sony even figured out a way to do this in the GPU’s “off” cycles when it’s not busy rendering frames, so it will have NO impact on framerate or graphics processing.

      Mark Cerny built and put a ton of thought into this system, while MS only put a ton of thought into casual aspects such as Kinect and TV and NFL to try and get more money from the masses. They put DDR3 in because, compared to GDDR5, it’s dirt cheap; they never expected Sony to come out swinging with 8 GB of unified GDDR5 in the PS4. Anyone who says MS went with DDR3 because of latency issues doesn’t know what they are talking about. Bandwidth is the key to graphics processing, especially at this moment in time when texture files are becoming so large; I mean, we’ve got 50 GB game downloads, and those large HD texture files need bandwidth. Latency has a minimal effect on graphics processing compared to bandwidth, and the little effect latency might have had was on the CPU, not the GPU; but as stated before, they added the ACEs to offload many of those computations to the GPU, so latency will really be a non-factor for the PS4 in graphics processing. Sony has outgunned MS this time around, and really it was Mark Cerny who did it.
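The 1.84 TF vs 1.31 TF and 1152 vs 768 shader figures quoted above come from the same standard formula: peak single-precision FLOPS = shader count × 2 ops per cycle (one fused multiply-add) × clock speed.

```python
# Peak single-precision FLOPS = shaders * 2 ops/cycle (FMA) * clock (Hz).
# Shader counts and clocks are the widely reported launch specs.

def peak_tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

ps4 = peak_tflops(1152, 800)   # 1152 shaders at 800 MHz
xb1 = peak_tflops(768, 853)    # 768 shaders at 853 MHz (post-upclock)

print(round(ps4, 2))  # 1.84
print(round(xb1, 2))  # 1.31
```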

    • Jack Slater

      Yeah, those 64 (8×8) compute queues, where even PCs only have 2 and the Xbox One 2×8; once the AMD drivers are finished and devs have access to it, it’s gonna hurt. Sony devs will be able to extract even more power from the PS4.

    • Matt

      Don't try to compare the PS4 or XB1 to PC, when dedicated GPUs dwarf their performance levels on their own.

    • plcn

      204GB/s theoretical (it includes both read and write) on a 4 x 8MB "cache". That is an M, not a G. As fast as the pipe to that cache is, it's not a whole lot to work with, and code needs to be written to utilize it in an optimized fashion somehow rather than just living in the standard DDR3 RAM pool.

    • sd

      One possibility is that, from what I understand, MS have moved the goalposts a lot over the past 12 months. I think they initially had an underpowered development kit, which was then improved as the year progressed. On top of this, my understanding is that the software for the X1 was actually very poor until recently. In addition, they then increased the CPU and GPU clocks. All of this would have made the X1 difficult to work with, regardless of any edge the PS4 may have over it. And it's not the only thing holding the X1 back at launch: MS actually reserved 10% of the system's power for Kinect, but they recently announced that this reservation will be unlocked so developers can access it for games.

      So in a nutshell, MS have been pretty poor at giving developers a decent kit to work with, and have changed things so many times that it is obvious to anyone following the news that launch titles might suffer. Whether this changes over time remains to be seen, but all of these questions will be answered eventually.

    • wsoutlaw87

      MS did not remove the 10% from Kinect; they just said it was a future possibility. I have also never heard of an underpowered dev kit, because it's not like MS suddenly changed the CPU or GPU, and dev kits are usually overpowered. Even if it was slightly changed, games are scalable, so it wouldn't have been a big deal, especially if they're on other platforms. Launch games are never the best-looking games of a generation, though, so there will be improvement on all systems.

    • jlcurtis

      That is a rumour, with only one unnamed source.

  • DoUbLeZz (Lydon)

    Like this article! Like he said, it's up to the developers!!! Let the games speak for themselves!!! =)

  • Solid Snake

    So according to most sources the XBO has 47MB of ESRAM at 204GB/s and 8GB of DDR3 at 68GB/s, which would bring it to a total of 272GB/s, so the XBO has 96GB/s more total memory bandwidth than the PS4. However, according to this source, because the XBO has a weaker memory architecture and GPU than the PS4, the PS4 has a significant graphical edge over the XBO.

    Cheers Gamers & Happy Gaming!

    • wsoutlaw87

      Trying to add RAM bandwidths together is ridiculous, and it's 32MB; the 47 figure includes the caches. A game has to be very specifically designed to use that RAM, and it still needs to support the DDR3, which may be a problem for multiplatform games. The ESRAM is an impressive setup in the way it can read and write, but it's only 32MB that can use the 204GB/s, versus 8GB at 176GB/s on the PS4.
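      A quick sketch of why summing the two pools' peak numbers misleads: effective bandwidth is a blend, dominated by whichever pool serves most of the traffic. The 25% ESRAM-hit figure below is a guess for illustration only, not measured data; the peak numbers are the ones quoted in this thread:

```python
# Sketch: why "204 + 68 = 272 GB/s" is misleading. Effective bandwidth
# depends on what fraction of the traffic the 32 MB ESRAM can serve;
# the rest has to go through DDR3. Peak figures are from the thread.

ESRAM_BW = 204.0   # GB/s, peak combined read+write
DDR3_BW = 68.0     # GB/s

def effective_bw(esram_fraction):
    """Blended rate: 1 GB of traffic, esram_fraction of it from ESRAM."""
    time_per_gb = esram_fraction / ESRAM_BW + (1 - esram_fraction) / DDR3_BW
    return 1 / time_per_gb

# If only 25% of traffic fits in 32 MB (an assumed figure):
print(f"{effective_bw(0.25):.0f} GB/s effective")  # well below 272
print(f"{effective_bw(1.00):.0f} GB/s effective")  # peak, only if everything fits
```

      The blend only reaches the 204GB/s peak if the entire working set lives in the 32MB pool, which is exactly the "very specifically designed" case described above.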

    • dirkradke

      Supposedly it is 88% faster than GDDR5.

    • Solid Snake

      Correct me if I'm actually wrong, but Albert Penello, Microsoft's director of product planning, said on the NeoGAF forums that the ESRAM can do read/write cycles simultaneously. So with that being said, and correct me again if I'm actually wrong, the XBO should be able to use parallel transfers to work around this memory bandwidth problem, since most processors can perform this task to varying degrees these days.

      P.S. I'm still on board with the PS4, but if this is true, then it should be recognized and understood by everyone at this point, as both consoles' final specs are pretty much set in stone, if I'm not mistaken.


  • damian

    I seriously wish people would quit being so lame over graphics and whatnot. They'll both be good systems, and it's all gonna come down to exclusives. I for one will own both. For me, Xbox has the better launch games, but Sony has ones coming that are a definite must-play for any gamer.

  • Ahmed Abd El Salam

    I can smell Ryse on PS4 now, with native 1080p resolution 🙂

    • Dakan45

      Didn't he say it wouldn't run at 1080p even if it was on PS4?

    • Mike Sombi

      No troll… just no.

    • Dakan45

      He said the PS4 can't run it at 1080p, take it, fuckass.

  • Matt

    Wut up Consoletards?

  • Pixelsword

    The PS4 and Xbox One (without Kinect) technically retail for the same price.

    Can anyone give a rational reason, a.k.a. without trolling, as to why?

    • dirkradke

      What sort of answer are you looking for? Why are they the same? Why does the XB1 cost more? You're not exactly being clear on what you want to ask. The XB1 includes Kinect and a headset, and that is where the extra $100 supposedly went. Without those, I'm guessing it should be $100 cheaper, but I'm also guessing Microsoft will not go without Kinect for at least a year. By that time developers will have had the opportunity to develop for Kinect, and Microsoft can see if the strategy is working for them.

  • Guest

    Year | Supercomputer | Peak speed (Rmax) | Location
    1993 | Fujitsu Numerical Wind Tunnel | 124.50 GFLOPS | National Aerospace Laboratory, Tokyo, Japan
    1993 | Intel Paragon XP/S 140 | 143.40 GFLOPS | DoE-Sandia National Laboratories, New Mexico, USA
    1994 | Fujitsu Numerical Wind Tunnel | 170.40 GFLOPS | National Aerospace Laboratory, Tokyo, Japan
    1996 | Hitachi SR2201/1024 | 220.4 GFLOPS | University of Tokyo, Japan
    1996 | Hitachi CP-PACS/2048 | 368.2 GFLOPS | University of Tsukuba, Tsukuba, Japan
    1997 | Intel ASCI Red/9152 | 1.338 TFLOPS | DoE-Sandia National Laboratories, New Mexico, USA
    1999 | Intel ASCI Red/9632 | 2.3796 TFLOPS | DoE-Sandia National Laboratories, New Mexico, USA
    2000 | IBM ASCI White | 7.226 TFLOPS | DoE-Lawrence Livermore National Laboratory, California, USA
    2002 | NEC Earth Simulator | 35.86 TFLOPS | Earth Simulator Center, Yokohama, Japan
    2004 | IBM Blue Gene/L | 70.72 TFLOPS | DoE/IBM Rochester, Minnesota, USA
    2005 | IBM Blue Gene/L | 136.8 TFLOPS | DoE/U.S. National Nuclear Security Administration, Lawrence Livermore National Laboratory, California, USA
    2005 | IBM Blue Gene/L | 280.6 TFLOPS | Lawrence Livermore National Laboratory, California, USA
    2007 | IBM Blue Gene/L | 478.2 TFLOPS | Lawrence Livermore National Laboratory, California, USA
    2008 | IBM Roadrunner | 1.026 PFLOPS | DoE-Los Alamos National Laboratory, New Mexico, USA
    2008 | IBM Roadrunner | 1.105 PFLOPS | DoE-Los Alamos National Laboratory, New Mexico, USA
    2009 | Cray Jaguar | 1.759 PFLOPS | DoE-Oak Ridge National Laboratory, Tennessee, USA
    2010 | Tianhe-IA | 2.566 PFLOPS | National Supercomputing Center, Tianjin, China
    2011 | Fujitsu K computer | 10.51 PFLOPS | RIKEN, Kobe, Japan
    2012 | IBM Sequoia | 16.32 PFLOPS | Lawrence Livermore National Laboratory, California, USA
    2012 | Cray Titan | 17.59 PFLOPS | Oak Ridge National Laboratory, Tennessee, USA
    2013 | NUDT Tianhe-2 | 33.86 PFLOPS | Guangzhou, China
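    As a rough sketch of where the new consoles would land on that historical list, using their widely reported peak single-precision figures (PS4 ~1.84 TFLOPS, Xbox One ~1.31 TFLOPS; both are spec-sheet numbers, and comparing peak FLOPS across eras and workloads is very much apples to oranges):

```python
# Which listed year's fastest supercomputer would each console's peak
# single-precision throughput have beaten? Data from the table above.
TOP_SYSTEMS = [
    (1996, 0.3682), (1997, 1.338), (1999, 2.3796), (2000, 7.226),
    (2002, 35.86), (2004, 70.72),
]

def last_year_outclassed(console_tflops):
    """Latest listed year whose top machine the console would exceed."""
    years = [year for year, tflops in TOP_SYSTEMS if console_tflops > tflops]
    return max(years) if years else None

print(last_year_outclassed(1.84))   # PS4: beats the 1997 leader
print(last_year_outclassed(1.31))   # Xbox One: beats the 1996 leader
```

    In other words, by raw peak numbers either console would have topped the mid-1990s list, which says more about how fast GPUs have grown than about either console being a "supercomputer" today.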

  • Solid Snake

    They're going by the FLOPS measurement, saying that if a console meets or exceeds a certain number of FLOPS it's classified as a supercomputer. See for yourselves by going to the page by clicking the link below.


  • Mathew Bridle

    I played Ryse at the recent #xboxonetour. I don't normally play fighters, but Ryse is exceptional. I had no idea what the resolution was, and I don't care. The game is stunning, and the gameplay, which matters most, is awesome. And as for the controller: the best just got better.

