PS4 CPU Was Hard To Develop For – Grand Ages: Medieval Developer

The team at Gaming Minds Studio ran into a few problems while developing Grand Ages: Medieval for the PS4.

Posted on September 2nd, 2015, under News


On paper, the PS4 seems to have decent hardware, with a mid-level GPU and GDDR5 memory. However, its CPU clock speed is on the lower side compared to the Xbox One's. Apparently this is giving developers who are new to PS4 development some trouble.

Speaking with GamingBolt, Daniel Dumont, Creative Director at Gaming Minds Studio, the team behind Grand Ages: Medieval, revealed that they ran into CPU problems despite the console featuring powerful hardware as a whole. When asked whether they were able to translate all the graphical features from the PC version to the PS4, Daniel stated, “The PS4-version has all features of the PC-version including the unlimited viewing distance. As this was our first title for PS4, it wasn’t a piece of cake to get there. However, we are using our own inhouse-engine which we fully understand.”

“This made it possible for us to translate and optimize every feature. And we had not stopped before as we knew that the PS4 offers a powerful hardware. However, we ran into problems with the CPU as we have some very deep simulations running in the background which really pushed the CPU to the limit, especially when using the fast-forward feature. So we had to tweak and optimize here as well until everything ran smoothly.”

On a related note, the PS4 version of Grand Ages: Medieval will run at 1080p. The game launches later this month, so stay tuned for our full review soon.


  • andy

    That’s why Mark Cerny stood up two and a half years ago at the PS4 reveal and said the GPU can be used for CPU tasks too, and that “we cannot wait to see what devs will be able to do with this”.
    Shame it has mostly only been Sony development studios that have put out amazing next-gen-looking games so far, and at full 1080p resolutions too.

    • Psionicinversion

      You can't throw everything on the GPU, you know. It is still a weak GPU at the end of the day. It's only Sony first-party studios with custom-built game engines that get the most out of it; a multiplatform engine can't.

    • angh

      The 2 CUs aren't used that much for GPU calculation because of the bandwidth limit, so they can be used for CPU-related tasks, I guess.

    • TheChosen

      The problem, though, is that the PS4 is bandwidth-starved. It's just 100-120 GByte/s for graphics, and 30 GByte/s for CPU tasks. Much too little compared to a PC, or even the Xbox One, which has 200 GByte/s (or 180 GByte/s) just for graphics and another 30 GByte/s for CPU tasks.

    • angh

      Are you really saying that the Xbox One has 200 GByte/s of transfer? :) The 32 MB is already fully busy with frame buffering. Access to the DDR3 is only 64 GB/s, and that's separate write/read.

      Jokes aside: the guys are saying the PS4 CPU gives them a hard time, but they aren't comparing it to the Xbox One, they're comparing it to the PC. They aren't developing for Xbox, and they are mostly PC guys, so they had to change their approach a bit to use the CPU. They found a way and they seem quite happy with that, because it was something new – it will be easier next time ;)

    • TheChosen

      The Xbox One has about 190 GByte/s for graphics, and an additional 30 GByte/s for the CPU (which the RAM delivers). Just that easy. But its CPU cannot address the eSRAM cache!

      While the PS4 has 100-120 GByte/s for graphics alone, and 30 GByte/s for the CPU. BUT: instead of having separate RAM, it has to share its RAM for everything, which leads to STALLS if the data does not get from A (RAM) to B (graphics card or CPU) fast enough, since that bandwidth is simply too little for complex games.

      Deal with it. It was a stupid design from the beginning. Everyone warned about that.

      For comparison: the WiiU has over 500 GByte/s for graphics (more than double the Xbox One), a guessed 60 GByte/s for its CPU (since it is no lazy SoC design, it doesn't have to share its bandwidth; the CPU has its OWN bandwidth it can use!), and another 10 GByte/s for its OS CPU (a dual-core ARM at 1 GHz, which is why the WHOLE OS of the WiiU runs on that ARM, and you can switch the console off and it still does messaging and all OS things, since the OS keeps running). It also has separate RAM at about 12 GByte/s which delivers the data for the fast caches.

      That is what we know so far from hackers who found out these things about the WiiU. We also know:

      The WiiU does not just have a “normal” 2 GByte of RAM. That RAM was DESIGNED as 2x 1 GByte. IT IS divided in TWO! Which means => 1 GByte of RAM is directly assigned to the 1 GHz ARM dual-core OS CPU. It can also access the eDRAM. However, the PowerPC main CPU cannot address that RAM! That part is called “the OS part”, where the big internet browser (6 tabs), the complex Miiverse messaging system, the OS and other things like the friend list are running.

      While the other RAM is ASSIGNED to the 3 PowerPC cores.

      The main CPU can also address the cache (32 MB of eDRAM). Both CPUs can talk to each other via that eDRAM cache. It's a BIG advantage over BOTH the Xbox One & PS4.

      The advantage of this system is => the OS CPU has its OWN separate bandwidth. It doesn't have to take from the bandwidth which is already limited on both PS4/Xbox One. And that's a big advantage if you ask me.

      The disadvantage is => the OS runs slower, since it's ARM, and ARM is always slower than e.g. x86.

      That's why the PS4 has soooo many problems right now: it simply does not have the advantage of fast caches, like the WiiU or Xbox One, to iron out situations where the CPU/GPU can't get the data in time (to iron out such “drops” you need a cache where you can buffer/store that data; the 4 MB of cache in the CPU doesn't help you, because you can only use 2.5 MByte for games on PS4 – see the Infamous Second Son developers). Dumbed-down graphics (no reflections) in Metal Gear Solid 5, for example: less lighting, etc. You get it.

      Btw: Metal Gear Solid 5 itself is a really bland game. Not much going on. So OF COURSE it can run at 1080p. No heavy AI, not many enemies, just a bland, empty open world. It isn't comparable to Witcher 3 or even Assassin's Creed Unity, where a LOT is going on.

      Just add a bit more of the action seen in Witcher 3… and bam, the fps go down at 1080p, or the game is not 1080p (or both).

    • angh

      You can't simply add the DDR3 bandwidth and the eSRAM bandwidth. And the eSRAM is constantly blocked as a frame buffer. This is the reason for 900p resolution, or a worse frame rate if resolutions are matched.

      And what soooo many problems does the PS4 have now because of no cache? I never heard of one. Quite the opposite: having the whole GPU-GDDR5 connection at 180 GB/s (dunno where you got 100-120 – the Garlic bus is not shared with the Onion bus) is much better than having 32 MB of eSRAM with a one-direction speed of 102 GB/s (204 GB/s both ways, which is just a bit faster than direct access to the whole ~5 GB of GDDR5).

      Witcher 3 is 900p on Xbox One; that's 40% fewer pixels on screen than in the PS4 version. And it still works quite fine.
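
As an aside, the pixel arithmetic behind the 900p-vs-1080p comparisons in this thread is easy to check; a quick sketch (the commonly quoted "40%" figure is roughly the 44% extra pixels 1080p renders over 900p, while 900p has about 31% fewer pixels than 1080p):

```python
# Pixel-count comparison between 1600x900 ("900p") and 1920x1080 ("1080p").
full_hd = 1920 * 1080  # 2,073,600 pixels
hd_900 = 1600 * 900    # 1,440,000 pixels

extra_at_1080p = (full_hd - hd_900) / hd_900 * 100   # 1080p relative to 900p
fewer_at_900p = (full_hd - hd_900) / full_hd * 100   # 900p relative to 1080p

print(f"1080p renders {extra_at_1080p:.0f}% more pixels than 900p")   # 44%
print(f"900p renders {fewer_at_900p:.1f}% fewer pixels than 1080p")   # 30.6%
```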

    • Rhodri

      TBF, the eSRAM problems should clear up with DirectX 12.

    • angh

      eSRAM is not a problem. It's great for frame buffering and post-processing. The problem is, it's too small for full HD, and no DX12 can fix that. But surely, thanks to it, the Xbox keeps up.

    • Rhodri

      That is not true. If QB can run at 1080p, any game can.

    • angh

      Post-processing requires the same frame to be reworked several times, separately for each effect we want to put on top – bloom, AA, light, fog, deferred shading and so on. For that, the eSRAM is too small.

      And in general, in the end this doesn't really matter – games on Xbox One are as enjoyable as those on PS4. The difference is hard to spot; you'd have to have the two systems side by side to see it. But from a technical point of view the Xbox One is simply a bit weaker – the eSRAM surely helps, but at the same time the gap doesn't go anywhere. And that was the only thing I was addressing.
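
The "too small for full HD" point can be made concrete with back-of-the-envelope numbers; a sketch assuming 32-bit (4 bytes per pixel) render targets and a hypothetical five-target deferred setup (the target count is an illustrative assumption, not a documented engine configuration):

```python
def render_target_bytes(width, height, bytes_per_pixel=4):
    # Size of one full-screen render target at the given resolution.
    return width * height * bytes_per_pixel

def gbuffer_mib(width, height, targets=5):
    # Total footprint of a multi-target G-buffer, in MiB.
    return targets * render_target_bytes(width, height) / 2**20

ESRAM_MIB = 32  # Xbox One eSRAM capacity

for w, h in [(1920, 1080), (1600, 900)]:
    total = gbuffer_mib(w, h)
    verdict = "fits" if total <= ESRAM_MIB else "does not fit"
    print(f"{w}x{h}: {total:.1f} MiB -> {verdict} in {ESRAM_MIB} MiB of eSRAM")
```

Under these assumptions the five targets come to about 39.6 MiB at 1080p but only about 27.5 MiB at 900p, which is one way to read the 900p-vs-1080p split.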

    • Rhodri

      But 50% more powerful on paper does not translate into 50% better real-world performance in games. Last gen proved that: the PS3 was more powerful on paper than the X360.

    • TheChosen

      You never heard about one? Are you serious?

      Thief 4 crashed down to 9 frames/s on PS4!

      And many games followed which run pretty much the same. They even fu***d up Tetris – yes, TETRIS – because of this; it ran at ~5 fps (yes, no joke here! 5 fps!). They had to PATCH it to death, because it wasn't working from the start. Just like Street Fighter V, btw, which also came with HORRENDOUS lags/problems.

      Witcher 3 runs worse than on Xbox One, and many more games too.

      And even Zombi – that old 2012 game Zombi – runs WORSE than on Xbox One. Zombi furthermore has missing textures, no birds flying away and lots of other missing things.

      THAT alone should tell you it has enough problems. And it still can't get a stable 25 fps on PS4, not even 25 fps! It fails in every corner. Even Sony's own games can't get a stable 30 fps in more and more cases.


      “Witcher 3 is 900p on xbox one, it’s 40% less pixels on screen than in ps4 version. And it works still quite fine.”

      Doesn't matter, since the game needs CPU power, not GPU power. You don't need a GTX 970 for 1080p and 60 fps in Witcher 3. You need a FAST processor. And a GOOD architecture with a lot of cache. The PS4 has no cache. Therefore it runs like it does. That's why the Xbox One simply runs it faster: a) because the CPU of the Xbox One can use more than 2.5 MByte for its games, and b) because it has the eSRAM to help it out with graphics, and it has more free bandwidth for the CPU.

      The PS4 crashes down to ~17 fps in Zombi.

      The PS4 crashes down to ~15-20 fps in Everybody's Gone to the Rapture when you go from outside to inside… etc. You see? Lots of proof.

      The PS4 HAS problems. And not just a few.

      Oh, and about “dunno where you got 100-120”:

      Sony itself said it. Bam. There are your numbers. It's on Sony's own official slides: since the CPU is dynamically clocked, the GPU does not get 180 GByte/s (176 GByte/s is the real number); those are just theoretical numbers which it cannot reach in real-world scenarios (games).

      The full numbers for the PS4:

      – 30 GByte/s for the CPU (each core only 3.75 GByte/s max)
      – 120 GByte/s for LOW-demanding CPU-dependent games (e.g. indies, or games which only require 2 CPU cores)
      – 100 GByte/s for HIGH-CPU-dependent games such as Witcher 3 which occupy all 6 CPU cores

      That's it. Bam. No magic. Just logic. And it's on Sony's slides, my friend. 176 GByte/s was just theoretical PR blabla. The real number is 100-120 + 30 GByte/s for EVERYTHING. It's not new that Sony gives out theoretical numbers.

      And btw: the PS4 also has the problem that the bandwidth of the GDDR5 is HEAVILY occupied when graphics-intensive games are running. That's how it goes. And it always has big disadvantages if you don't have other parts coming to help when the data cannot be fetched in time.

      And that's the PS4's problem: often enough, data is not fetched in time. Thus the framerate crashes down to 9 fps, or 15, or 17 fps. And you will SEE it when it happens.

    • angh

      You don't know what you're talking about. In the Xbox One the CPU is connected to the DDR3 by a normal bus. The eSRAM doesn't help; the eSRAM is connected to the GPU. In the Xbox One, the bandwidth between the CPU and RAM is limited to 20 GB/s, as this is the max speed of communication with the north bridge.

      And in the end: the CPU doesn't need to consume that amount of bandwidth. The CPU doesn't consume textures; AI scripts and physics are heavy to process but not heavy to transfer. Just arrays of numbers.

      And no, the games you wrote about are not broken on PS4. And no game runs faster on Xbox One once you start thinking about the resolution difference.

      You need a fast CPU in Witcher because DX11 is a single-task API which heavily uses the CPU to coordinate. This is not the case in DX12, as they improved CPU usage and multithreading; it is not the case on Xbox One, which was using a modified version of DX11; and it is not the case on PS4, as they do not use DX at all. AMD created Mantle as the base for DX12, and that knowledge was already used by engineers at both companies.

      As for Witcher 3: patch 1.8 makes it possible to turn off post-processing, which fixes the PS4 problems – and post-processing is only GPU-based; at full HD it's not so weird that it has a bigger impact on performance than at the lower resolution on Xbox. Also, for some reason the PS4 is synced to 20 fps in cutscenes – that's a development decision, not a hardware problem.

      And what zombie game are you talking about? Dying Light? Check the Digital Foundry comparison: “Between the two console versions we definitely have to give the nod to the PS4 version with its improved frame-rate and better texture streaming, its higher resolution, and a near complete lack of screen-tear.”

      And if you're saying that a 40% pixel difference doesn't matter, then I'm wasting my time.

    • TheChosen


      “ESRAM doesn’t help. ESRAM is connected to GPU.”

      I already said that above! Can you read? No you can't, obviously.

      Don't tell me things I said nearly 10 hours before you did!
      Btw, just a PRO tip for you:

      You DON'T need CPU dependency to get BETTER performance out of the eSRAM.

      The Xbox One isn't the one that is bandwidth-starved. The PS4 is.

      So I'll tell you how it works on the Xbox One:

      The Xbox One does its front/graphics buffer via the eSRAM.

      That is fast (190 GByte/s).

      So graphics simply go there. And they go FASTER than on PS4 on average.

      The CPU data, however, goes from the CPU to the DDR3 RAM and back.

      So you see? No magic. Just logic.

      That's why a PC uses the same architecture.

      There is no stupid GDDR5 feeding the CPU.

      The CPU in modern-day PCs wants DDR3/DDR4 RAM, NOT GDDR RAM 😉

      AND: the Xbox One thus has no bandwidth starvation like the PS4 has.

      That's why you don't need to cut graphics effects in Xbox One games like they do for the PS4.

      Btw, idiot:

      “In xbox one bandwidth between CPU and RAM is limited to 20GB/s as this is max speed of communication with north bridge.”

      And why does your image show TWO buses with 20 GB/s each? Are you dumb?

      The PS4 is not the Xbox One.

      And your image above shows that the Xbox One even has 40 GByte/s of bandwidth, idiot.

      Man. Don't talk about things you have not a single clue about. Have a nice day, my friend. And don't forget: more games which crash down like Until Dawn/The Order/Thief 4/Tetris/Witcher 3 will follow.

      Oh, another thing though: AMD Jaguar only supports 30 GByte/s of bandwidth, because it's a tablet CPU, not a PC model. AMD said that. So this means even if the Xbox One has 40 GByte/s, it can only use 30; the rest is not usable.

      “path 1.8 makes possible to turn off post-processing which fixes ps4 problems -”

      That is stupid. So they are downgrading the game even further? Wow… lame… there you see what crazy hardware the PS4 uses… crap.

    • angh

      sorry, should be: is not connected, according to the diagram.

    • AndrewLB

      Good point. People seem to forget that the PS4 has a pretty slow GPU, considering it's only 1.84 TFLOPS. I'm pretty sure they had to do additional “optimization”, aka “downgrading”, to get it running on the PS4, because as of now I have yet to see a single PS4/Xbone game look identical to the PC version. They always reduce LOD, blur backgrounds to hide low-quality textures, etc.

    • jacksjus

      I didn’t see this post but you are correct. I remember them speaking on this.

    • TheChosen

      You cannot do complex AI (like Alien Isolation, Zombi etc.) on a simple thing such as a GPU. It will never work. Since => complex AI can ONLY be done on a single CPU core!

      You can only do things like Assassin's Creed Unity on a GPU: having 100s or 1000s of pretty cheap enemies, all acting together. But that's also why the game ran like crap. Because GPUs are simply not really suitable for such things.

      And you can throw physics on there, as shown with GTA 5. But that's it.

      No magic, since GPUs do not have things like branching units or big caches (besides a few kilobytes) or other things. GPUs are simply => DUMB. They are just simple SIMD units, with some integer units and a little bit of cache to power them. That's it.

      While a CPU is a big, complex circuit, no matter what the CPU type is. It's ALWAYS much more complex than a simple GPU.

  • jacksjus

    Sony's ICE Team stated last year that a lot of these early CPU bottlenecks can be resolved by offloading CPU-related tasks onto the GPU. They will all get there eventually.

    • Guest

      Honestly, that just sounds like excuses – and not from them, but from you. And it still doesn't change the fact that the CPUs in these systems are weak. Sony needs to up the CPU to at least 1.75 GHz or more, and also unlock up to 80% of the 7th core. And while they are at it, also up the GPU to at least 850 MHz and unlock the 2 extra Compute Units that are there for redundancy, if their tests show that there has never been a failed one yet.

    • Psionicinversion

      You can only offload tasks to the GPU IF there's enough power left to make it work.

    • TheChosen

      It was typical PR BS. Don't trust any “developer” today who doesn't know why the PC was designed the way it was. In theory they think a LOT. But at the end of the day, we all SAW how that has turned out so far xD

  • Elitepwnsface

    1080p doesn't work if the game doesn't run smoothly. Hope it does though.

    • Guest

      Assumptions, assumptions, assumptions. Where does it say that it doesn't run smoothly? And there are plenty of 900p and lower-res games on the X1 that don't run smoothly either.

    • Michael W

      Name one… Framerate issues are a Sony problem, and MS has Rez problems.

    • Rock Gnarly

      Tomb Raider: DE, Murdered: Soul Suspect, Project Cars, F1 2015, Amazing Spider-Man. The X1 having a better frame rate than its Sony counterpart is not even close to being a common occurrence. The X1 is usually on the tail end of everything.

    • Shi’a at

      The Witcher 3, Batman, MGSV and Dragon Age all perform better on X1 frame-rate-wise…

    • Rock Gnarly

      The Witcher 3 has had that frame lock problem on PS4, but Batman and MGSV perform better on PS4 according to Digital Foundry. One drop in a cutscene in the PS4 version does not make the X1 version a winner. It’s still problematic in every other area.

    • TheChosen

      I'll tell you what: EVERY drop counts. SORRY for you. ZombiU runs FASTER on WiiU than on PS4. And the Xbox One also runs it FASTER than the PS4.

      YOU FAILED. #Denial

    • Rock Gnarly

      I guess in that case, every pixel counts. 900p with screen tearing AND frame drops? No buy!

    • TheChosen

      For me, a smooth framerate counts more, sorry. No matter what resolution the game has.

      Mostly you can't see 1080p if you don't play on a big screen. However, you can see a game crashing down to single-digit frame rates. Even someone used to 30 fps can see it.

      Btw: the bigger the screen, the SMOOTHER the framerate has to be for a frame drop not to be noticed.

      Which means => normally, for a 4K screen you NEED at least 60 fps. Otherwise it will be laggy.

      If you want to know why it's like that => because your eyes are the problem. The bigger the range your eyes have to cross to “search” for the information they want, the higher the latency.

      The higher the latency => not smooth for the eye anymore.

      And what's even funnier: the PS4 version of Zombi is downgraded everywhere. No dirt effect, no this, no that, no birds flying away, many missing things.

    • Elitepwnsface

      “Hope it does though” – my disclosure statement, right here.

  • Fear Monkey

    The Xbox One's CPU is a whopping 150 MHz faster than the PS4's. That kind of speed difference meant a lot when we had Pentium 2s; now, not so much. The article talks about PC-to-PS4 development; I'm not sure why the Xbox One's CPU was even mentioned.

    • Guest

      It was mentioned because this is the type of thing that GamingBolt loves to do: start all these flame wars to get hits. But it's not working as much, as people are finally starting to catch on.

    • AndrewLB

      Do you enjoy making up your own facts to fit your agenda? Because it was the developer who brought up the CPU performance issue, not GamingBolt. So now that your accusation has been proven to be a lie, how about you go troll elsewhere.

    • Fear Monkey

      But the developer was talking about PC-to-PS4 CPU differences and vice versa; nothing about the Xbox One's CPU was mentioned by the dev. That was thrown in by the site.

    • Gabriel Porto

      150 MHz x 8 cores.
      And after DirectX 12, all the CPU cores will be able to ‘talk’ to the GPU directly at the same time. So it is a very significant difference, in the same way that the PS4's GPU has a very significant advantage over the XOne's GPU.

    • chrisredfield31

      150 MHz x 8 cores is nothing. That's like two clicks on the multiplier lol

    • Gabriel Porto

      Well, when we are talking about a closed box that will remain this way until the end of its life cycle, it means a lot.

      Remember when Sony allowed the PSP to be overclocked from 250 MHz to 333 MHz? That made a huge difference.

    • Psionicinversion

      150 MHz when it's already 3.5 GHz means nothing. 150 MHz when the clocks are so pathetically low is a lot.

    • 2econd gpu unlocking

      That’s the reason we asked for the faster and more powerful CPU.
      You are bang on the money.
      We also wanted the 7th core unlocked.
      They listened.

    • Gabriel Porto

      That's nice; after Phil became the head of Xbox, things have been different. They seem to listen to what fans and devs want/need.

    • 2econd gpu unlocking

      Yes, it has suited what we as fans have wanted.
      We did want and ask for Kinect-only games before, then changed our minds and wanted a more games-focused approach; Phil then made it more games-focused for us at the latest E3.

    • Shi’a at

      The XBOX ONE CPU is significantly more powerful than the PS4's. Lol

    • lagann

      It isn't only about the CPU. Don't forget that the PS4 leans more towards GPU compute with its GDDR5: lots of bandwidth, but lots of latency too. There's a reason PCs still use DDR RAM while video cards use GDDR RAM.

      The Xbox One is the opposite. It uses DDR RAM with a faster CPU, and uses eSRAM to boost data to the GPU. So it leans more towards CPU compute.

      Both systems went different directions to attain the same goal, the PS4 being easier to develop for since there is no need to code for the eSRAM.

      In the end, the PS4 has an easier time reaching 1080p because of its concentration on the GPU, while the Xbox One has an easier time with more stuff going on in the game because of its concentration on the CPU.

      Both have pros and cons.

    • Failz

      The PS4 only uses 6 cores for gaming, which sums to 9.6 GHz.
      The XB1 uses almost 7 cores for gaming, which sums to 12 GHz.

      That's 2.4 GHz of extra CPU performance for the XB1. Sounds like a decent gap to me.
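
The summed-clock arithmetic here can be checked with the commonly cited figures (1.6 GHz for the PS4's Jaguar cores, 1.75 GHz for the Xbox One's), with the caveat that adding clock speeds across cores is only an on-paper comparison, not a real measure of throughput:

```python
PS4_CORE_GHZ = 1.6    # commonly cited PS4 Jaguar clock
XB1_CORE_GHZ = 1.75   # commonly cited Xbox One Jaguar clock

ps4_total = 6 * PS4_CORE_GHZ   # 6 cores available to games -> 9.6 "GHz"
xb1_total = 7 * XB1_CORE_GHZ   # ~7 cores available to games -> 12.25 "GHz"

print(f"PS4: {ps4_total:.2f} summed GHz, XB1: {xb1_total:.2f} summed GHz, "
      f"gap: {xb1_total - ps4_total:.2f} GHz")
```

(With a full 7th core the gap works out to 2.65 GHz rather than 2.4; the 12 GHz figure above rounds 12.25 down.)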

    • Jecht_Sin

      The latest news I've found about it says that the 7th core is shared, which isn't good:

      Secondly, the amount of CPU time available to developers varies at any given moment – system-related voice commands (“Xbox record that”, “Xbox go to friends”) automatically see CPU usage for the seventh core rise to 50 per cent. At the moment, the operating system does not inform the developer how much CPU time is available, so scheduling tasks will be troublesome.

      Then Project CARS used the 7th core, but it didn't help much in comparison with the PS4.

    • bardock5151

      You're forgetting about all the extras, like the move engines and special-purpose processors, that take more load off the CPU.

    • TheChosen

      Not many games can use such features. Deal with it.

      Btw: BOTH the Xbox One and the PS4 DON'T have any more “special purpose” processors to take load off the CPU. There are no other CPUs in there, idiot. The only box which is able to do that is the WiiU. And it does it with the help of its big eDRAM.

      And it is also able to let the OS CPU talk to the main CPU, the gaming CPU.

      The Xbox One's eSRAM cannot talk to the CPU and thus cannot be used as a fast scratchpad.

    • bardock5151

      Idiot, read up. The “move engines” are special-purpose processors; the SHAPE audio chip is more advanced than the standard AMD TrueAudio chip the PS4 has, etc.
      Go check out ExtremeTech too; there's lots more there.

    • lulzprime

      Because you know, this is gamingbolt. A bolt of retardedness in every article, you should be used to it by now.

    • omarcominyo

      To stir up the comments section, gotta get those hits!!

    • TheChosen

      150 MHz x 8, you dumbass! What does that make? Do the math:

      8 x 150 = 1200 MHz!

      That's nearly a WHOLE extra core 😉

      And: since the Xbox One can use 7 cores instead of 6… there you go. You should get a clue as to which box is faster. Nope, it's not the PS4.

    • Fear Monkey

      You don't know how processors work, evidently…

    • TheChosen

      I know EXACTLY how they work. Surely you cannot add them up like that. But you said it's just 150 MHz, and that's wrong.

      What you're talking about is Amdahl's law. I know about that. And it's true, of course. But right now, with 8 cores at 150 MHz each, it's still about 1 GHz more power, even if you count 200 MHz (1/6th) off.

      There you go. You have no clue. Btw: since games only use SIX cores on average, the engine will have no problem with Amdahl's law at all.

      Six-core CPUs don't have that much of a problem with Amdahl's law; 8-core CPUs do. And with 8 cores, that means you run your games on a FULL 8 cores. However, neither the PS4 nor the Xbox One can use all 8 of its cores just for games.
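
Since Amdahl's law keeps coming up in this exchange: it says the overall speedup from N cores is capped by the fraction of the work that remains serial. A minimal illustration (the 90% parallel fraction is an arbitrary example, not a measured game workload):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    # Amdahl's law: speedup = 1 / ((1 - p) + p / N)
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_cores)

# A workload that is 90% parallelizable gains little past 6 cores.
for n in (1, 2, 6, 8):
    print(f"{n} cores: {amdahl_speedup(0.9, n):.2f}x speedup")
```

This is one reason extra cores (or summed clock speeds) translate into much less real-world gain than the raw numbers suggest.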

  • Guest

    “However its CPU clock speed is on the lower side compared to the Xbox One”
    What grossly exaggerated nonsense! Then why do games like PCars and F1, which rely heavily on the CPU (among many more!), run better on the PS4? It really all just comes down to how it's coded and the API differences. The X1 CPU may be better in certain situations and vice versa. And ultimately, both systems have vastly weak CPUs.

    • angh

      On top of that, those guys aren't comparing the PS4 CPU to the Xbox One CPU. The game is developed for PC and PS4, so yeah, in comparison to an i5/i7 the console CPU is surely a bit slower and requires a different approach.

    • Shi’a at

      Actually, the games you mentioned don't run better on PS4. From what I read on DF, Project Cars and F1 have a blurry look in their PS4 versions and had frame rate issues.

    • ImOnaDrugCalledSheen

      You’re wrong.

    • E.J457

      We have an Xbox fanboy here folks.

  • chrisredfield31

    It's called asynchronous compute – learn to use it, devs!

    • Mark

      Boom. You're right. But you know some of them are: Infamous, BF4 on PS4, Rise of the Tomb Raider, probably more too.

    • Psionicinversion

      RotTR will only use it if it's DX12, as Xbox DX11 doesn't have that functionality built in.

    • Mark

      I found the article – actually, you made a comment there too. C'mon, Psionic! Haha.

      Anyway, I think all Xbox exclusives will use a custom SDK with a mix of DX11 and DX12 features going forward. I think Phil said “We're already using some DX12 features, full coming”. He must have meant ASYNC COMPUTE.

      I think Turn 10 is using async compute too, although not full DX12, for F6. I have every confidence that QBreak and all the 2016 exclusives will be using DX12; from what I've seen, those games pull away from the bunch imo.

    • Psionicinversion

      Some DX12 features are present in the current DX11 one, since it's a kind of hybrid. They must have added the functionality to TR, seeing as MS has been coding parts of the game, so it makes sense. We'll have to see what happens, though.

    • karlton d

      Why implement a feature the majority of PC gamers can't use? Most have Nvidia GPUs that suck at async compute, so…

  • Cenk Algu

    The problem is that the CPU is weak and the GDDR5 RAM is killing it with high latency. Sony is trying to save the CPU by using the extra 50% of GPU shaders for GPGPU.

    • Jecht_Sin

      Quit talking out of your a** as usual. It has been stated many times by developers that GDDR5 RAM latency is not an issue, as has been proven by plenty of multiplatform games now.

    • Cenk Algu

      I am talking about facts which the ignorant don't want to hear, as usual. GDDR5 RAM is not CPU-friendly, and that is more than enough to choke even the most powerful CPU. As for the last sentence: devs usually use GPGPU to stabilize performance on the PS4; otherwise that weak CPU and GDDR5 RAM combination is more than enough to brick the system. That weak CPU cannot handle the visuals at high resolution without GPGPU help, my friend ;)

    • Sylmaron

      Nonsense. There are plenty of PS4 titles that already have shown what the PS4 can really do.

    • Cenk Algu

      Where is the nonsense, and what is the purpose of your comment?

    • Shi’a at

      I own a PS4, but there have been countless games that the PS4 has struggled with. The Witcher 3, Batman, even MGSV has some frame rate issues… I'm sure there are more.

    • TheChosen

      Until Dawn… The Order… and even Sony's own games, such as Everybody's Gone to the Rapture, so far…

      And even THIEF 4 crashed down to 9 fps in certain situations.

  • Jecht_Sin

    I read the headline in N4G. I think “that must be Gamingbolt”. I open the link.. TAH-DAH! Gamingbolt indeed!

    I must be a psychic.

  • Balbir Pabla

    It's time to bring back the Cell processor from the PS3.

  • omarcominyo

    I thought Germans were meant to be efficient!!!

    • TheChosen

      Not when it comes to SONY devices. Btw: Sony is not German. It's more Japanese + American-society based. It's made in Japan, but DESIGNED for Americans + Europeans.

    • omarcominyo

      I was talking about the developers, btw!

    • TheChosen

      Well, you can only be efficient if your hardware is DESIGNED to be like that.

      Unfortunately, the only console which is DESIGNED like that this gen is the WiiU. It has lots and lots of things which no one ever heard of and no one ever worked with (e.g. an asynchronous CPU engine; no developer even knows what that is!), which enables => lots of optimizations.

      However, the PS4 just has its ACEs, which a typical today's PC graphics card also has and uses (thus no optimization, since the PC uses the same tech now); otherwise just PC parts.

      No optimization possible. Well, nearly none. The PS4 also lacks bandwidth, since GDDR5 is now considered OLD (HBM is the new tech and GDDR5 is old now).

      And the Xbox One is not much different, since it cannot use its eSRAM for its CPU to further enhance CPU performance. And the GPU itself has fewer ACEs and fewer graphics cores.


      Designing a console “efficiently” just means that there is no bottleneck – like on the PS4, where it's the bandwidth and the slow CPU, or on the Xbox One, where it's the somewhat slow eSRAM and the missing ability to let the CPU talk to the eSRAM.

      So efficient means => you design EACH part to be exactly as powerful as the REST of the parts you put in.

      That also goes for the 8 GByte of RAM. Completely bad design today. And inefficient too, since x64.

    • omarcominyo

      Jesus, I only made a joke, no need to be condescending! HAHAHA. I game on PS4/Xone and have on PC for years, so I'm quite aware of their inner workings!

  • TheChosen

    Ooops. PS4 = Weakbox confirmed 😀

  • ImOnaDrugCalledSheen

    LOL, hilarious propaganda title. Should get lots of revenue clicks, which was the point I’m sure.

  • Pops

    I guess this is the “balance” Microsoft was talking about.

  • arrianadavis

    Thanks for the article. Just want to inform all folks who live outside US that PS4 is a great media Player. If you want to access Netflix and other streaming stations on your PS4 you can use UnoTelly as I do to get around the geo block.
