Microsoft Looking To Improve Xbox One’s Graphics Performance, Working On GPU Optimization

Microsoft are also looking for a UX designer who will develop frameworks for episodic storytelling.

Posted on 16th May, 2015 under News


According to a job listing on the official Microsoft careers page, it seems the company is looking to improve the graphics performance of the Xbox One. The Advanced Technology Group within Microsoft are looking for a Software Engineer who has technical expertise in graphics performance optimization, GPU architectures, and HLSL shaders.

The job listing further states that the candidate should be able to “make the GPU dance like it should be in a music video” and reduce frame hitches, possibly pointing towards frame rate optimization, as well as work directly with game developers to understand and address their technical problems and performance requirements. The Xbox One reportedly has a less powerful GPU than the PlayStation 4, and optimization programmes such as this will go a long way towards delivering better performing games on the console.
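
As a rough illustration of what reducing frame hitches involves in practice, here is a minimal sketch (not taken from the job listing; the names and thresholds are hypothetical) of how an engine might time each frame on the CPU and flag any spike against a running average:

```cpp
#include <chrono>
#include <cstdio>

// Stand-in for whatever per-frame work an engine does.
void renderFrame() { /* draw calls, simulation, etc. */ }

int main() {
    using clock = std::chrono::steady_clock;
    double avgMs = 16.7;                    // running average, seeded at ~60fps
    for (int frame = 0; frame < 1000; ++frame) {
        auto t0 = clock::now();
        renderFrame();
        auto t1 = clock::now();
        double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
        if (ms > 2.0 * avgMs)               // crude hitch test: twice the recent average
            std::printf("hitch on frame %d: %.2f ms\n", frame, ms);
        avgMs = 0.9 * avgMs + 0.1 * ms;     // exponential moving average
    }
    return 0;
}
```

On the console itself this kind of profiling is done with GPU-side timestamps and tools such as PIX rather than a plain CPU timer, but the spike-versus-average idea is the same.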

On a side note, Microsoft are also looking for a Senior UX designer who will develop UI frameworks and wireframes for episodic storytelling features and blend the Modern design language with the visual brand to create experiences. Since the listing mentions high quality video games, TV, or other entertainment products, we are not sure whether this points towards a new game with strong storytelling or a new app that will be made available on the Xbox One in the future. Regardless, the idea of developing beautiful, usable, and delightful experiences for fans the world over sounds intriguing.


  • Starman

    They have the money and resources, so this shouldn’t be that hard for them ….

    • Doll8313

      Resources can mean ppl and if they are hiring they clearly do not have all the ppl they need.

    • Starman

      And your point is …..?????

    • uptownsoul

      Point is…having more money doesn’t mean it will always be used wisely

  • Guest

    Tier the hardware MS. This graph is no lie and because of this performance deficit between the current consoles and the PC we’re already seeing the fallout from it. Tier it and make a new console in tandem with the NX. Doesn’t matter that you leave Sony behind, they’ll sell on hype anyway.
    You’ve always been the kings of software compatibility so you could easily keep the Xbox One in the picture as a budget console.

    • Macfool

      but this likely isn’t about the XB1/PC gap – that’s pretty much uncloseable. It’s the XB1/PS4 gap MS are wanting to plug.

      IMO tiered hardware wouldn’t work for anyone. MS would face ‘out of scale’ manufacturing costs. Third party devs might not write for it because until it sells sufficiently well there’s no user base to make money back from, and how do MS market it so as to a) inform the public directly what the new tier is, without b) consigning the lower tier to the bin by cannibalising their own sales?

    • Guest

      Oh they will, that’s what the unification of their ecosystem is about with Windows 10. You write one program and it’s easily deployable to the Xbox as well. MS is the only one of the big three that can pull this off now; it’s essentially Steambox done right.
      And to answer your question, it’s called compatibility: you simply name the next Xbox the Xbox Two and continue to release software with the Xbox One label, making it clear that the software still runs on the lower tier. Every game that releases will simply run much better on the Xbox Two. Only after a couple of years do you start releasing games with the Xbox Two label, which will require the higher tier to work. It’s easy, very much doable for every developer that has ever developed using MS’ tools, and it will still guarantee a 5+ year lifecycle for the Xbox One.

  • Zyx

    Good Luck Ms

    • ImOnaDrugCalledSheen

      They’re going to need it.

  • Tga215

    Here we go again

  • Tha Truth

    Oh dear…more smoke and mirrors BS from Microsoft desperately trying to save their dying console from flopping at retail. Here they go again making a bunch of crazy promises that they know they’ll never keep just to give their delusional, low-IQ fantards something to cling to as they bawl their eyes out over the worldwide hardware charts.

    Microsoft need to focus on releasing better games for the system. The games so far have been nothing but cheap, overhyped garbage and sales prove that.

    • Terminator

      Here we go again with the Basement Dweller “Engineers” thinking they know what they are saying. The only fantard with a low IQ here is you, talking as if you know better than a company that has made millions and made many of the great things we use today. Well, thank god they don’t listen to idiots like you or they would have been in trouble (money trouble, like Sony).

    • Reign SUPREEM

      This^^^^ a thousand times. I own both consoles by the way! “Tha Truth” is the ultimate FANTARD!

    • Tha Truth

      Ah yes, another desperate, delusional, low-IQ peasant fantard desperately clinging to any scrap of news to make him feel better about his worthless, cheap, dying platform. Thank god not all consumers are as gullible, ignorant, brainwashed and uneducated as you or we’d all be playing games in 900p and not striving for anything better, PEASANT!

      Keep defending Microsoft and their cheap, worthless, poverty console because you’re too poor and too stupid to afford a machine that can do 1080p. Right now you’re displaying the only true “Exclusive” that the Xbox DONE has – buyers remorse.

    • You Are Flat Out Wrong

      You are also a peasant, a console peasant you worthless peasant.

      Your console can’t do 60FPS. 30FPS is a peasant’s framerate, you illiterate, gullible loser. Never mind garbage like DriveDud and The Odor 800P.

      PC remains supreme :^)

    • Guest

      That guy is such a schmuck, truly a Sony fanboy.

    • Icaraeus

      Many PS4 games run at 60fps.

    • Triton

      “The games so far have been nothing but cheap, overhyped garbage and sales prove that.”

      Is that why the PS4 lost last month’s sales even though they had a big release with Bloodborne?

    • You Are Flat Out Wrong

      I smell desperation from Sony peasants deflecting to MS now that their cheap, garbage hardware is holding back the generation and Sony is bribing developers to create parity.

      You are the cancer upon the gaming industry. At least Sony is going out of business so you won’t be around for long as you drown in tears due to yet another failed metacritic disaster.

      PC Gaming is the only way forward

    • OC Guy

      Consoles are not holding any games back… Just how deep do you think the development pockets of the average studio are? To fully develop a game you would want would cost a ton and drive the price of a game to $120 each… before the season pass. That won’t fly. Gaming is right where it should be: progressing slowly and surely and delivering great games.

    • OC Guy

      How is this smoke and mirrors? Explain with all your “knowledge”, in detail, with facts, not fanboy opinion. Oh, you can’t? What a shock! Dying system? The X1 is still selling faster than the 360 at the same point in time… the X1 also outsold the PS4 last month. Maybe you should try getting some facts before you post… Meh, you probably enjoy being ignorant. Oh well, not my problem.

    • Orion Wolf

      You know, I find it funny how people still use “dying console” when, as you said, it’s selling far better than the 360 (when the 360 had no competition), and then there is this “from flopping at retail” –
      clearly the latest NPD numbers show just that /sarcasm.

  • Riggerto

    They probably should just cut their losses and release Xbox Two in a couple of years – don’t cheap out on the GPU and RAM this time.

    • OC Guy

      What? The X1 may not have the exact same power as the PS4 but it is more than capable of producing amazing games. Maybe not spending so much time on your high horse would be a good idea.

    • This guy hasn’t seen Ryse.

    • Orion Wolf

      “don’t cheap out”

      They spent $100 million on just the controller and the x1 deal is worth more than $3 billion according to an AMD employee.

      Sony went with the cheapest path possible, i.e. using existing hardware and making do; the x1 on the other hand is custom tailored, the GPU has seen changes to accommodate dx12 – you know, the API no current gen gpu is able to run effectively.

      By everything we know about the ps4, the console should be miles ahead in terms of visuals compared to the x1, but that’s hardly the case. The ps4 is, according to everyone, at least 50% more powerful (at one point it was even 50% easier to code for) in every aspect and it has an API that’s lower level than dx12; the x1 on the other hand is a lot weaker and it has an API that is, according to a dev:

      “In general – I don’t really get why they choose DX11 as a starting point for the console. It’s a console! Why care about some legacy stuff at all? On PS4, most GPU commands are just a few DWORDs written into the command buffer, let’s say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does.”

      So how was a console with horrible SDKs, a horrible API and weak hardware capable of running Ryse at launch?

      btw do read GHz’s post.

  • dapaintrain

    The major contributor to my decision of buying the ps4 over the xb1 was more to do with my experience of last gen and exclusive games.

    The start of the 360 gen I originally owned the 360 and closer to the end I picked up a ps3.

    Sure we got Halo, Gears and Mass Effect, all games that I loved, but towards the end of the generation, actually after Mass Effect 2, I began to notice the quantity of first party games hitting the PS3 and literally nothing but boring Kinect games hitting the 360.

    Looks like MS is trying to remedy this, but focus on the titles; graphics aren’t the only thing and optimisation will come with time, and the XB1 and the PS4 are pretty much neck and neck anyway.

    With the PS4 seemingly becoming more focused on remakes like GOW 3 instead of new games, and nothing until Uncharted 4, it looks like I’ll be picking up both consoles sooner than I normally would.

    • Joseph Ruiz

      That’s why I have an Xbox One; it’s great with 3rd party and first party games.
      I was gonna get a PS4 but it has bad 1st party games to no first party games.

  • ImOnaDrugCalledSheen

    Pour some more secret sauce on it.

  • Steven OConnor

    To be honest, even if they had a higher specced system compared to PC, it would only be a brief time before they’d most likely be mid-tier in specs. In a few months, even just weeks, PC would have far better specs again. Game consoles just can’t keep up the way they are. The only way is to have gaming on the cloud with very powerful servers running them.

    • Macfool

      It’s probably not about keeping up with PCs, as much as keeping up with the PS4.
      The idea of gaming via the cloud is good, but the practicalities of the infrastructure mean that it’s unworkable for a great many people for the foreseeable future.

    • Guest

      That is the point of this article, but neither console is good enough anymore, no matter how much the PS4 sells on hype.
      We’re talking about a fivefold performance difference between the PC and consoles already, only a year into their lifecycles. Next year will be the big punch: 14nm process APUs, HBM2 memory, etc. That gap will grow to over tenfold less than 2 years into their lives. It’s absolutely pathetic how weak these consoles are.
      People are bickering over console graphics when the PC obliterates them both.
      MS’ only option is to tier their hardware; now that their software environment is unified and completely compatible, it’s the best way to continue the standalone box gaming environment and keep up with the tech.

    • Steven OConnor

      I agree, work needs to be done in order to achieve that vision. Keeping up with the PS4 is relative. Each company made their decisions on the platform architecture they would implement. Even though it’s now a more common factor that the PS4 can keep a 1080p resolution vs 900p, it’s a situation that both companies chose and for a few years have to deal with and optimize. I’m grateful I was around back during the days of Nintendo and Atari. I think back to those days and realize that at the end of the day both systems look good compared to past consoles. The past taught me that in the end games should be about the fun. People in general have forgotten that, and in general conversation on games people usually ask these 2 questions: how was the game? And how are the graphics? For me it’s about the fun. Everything else is relative.

  • Macfool

    Maybe a RAM pack that plugs into the back of it?
    Too old school?

    • Guest

      Old school indeed. The industry has moved away from the upgrade pack because it’s impractical and confuses consumers even more than just releasing a new console. Besides, you can’t put an eSRAM pack on the USB bus, the latency would negate any benefit of it.
      The eSRAM’s small size is the problem of the Xbox One, because you can’t fit proper deferred rendered 1080p buffers with all the bells and whistles into 32MB. It should have been larger, but MS’ problem was that it would have eaten away at the die space even more, likely at the cost of more GPU power.
      eSRAM is great, but at 28nm, their options were limited. A larger die would have made the console much, much more expensive. DX12 will change a few things that will make 1080p more doable with the 32MB, but that 32MB will remain a wall for developers.
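
As a back-of-the-envelope check of the commenter’s 32MB point, here is a small sketch that totals up an illustrative deferred G-buffer at 1080p (three 32-bit colour targets plus a 32-bit depth/stencil buffer; real engines use different layouts):

```cpp
#include <cstdio>

// Illustrative G-buffer: albedo, normals, material and depth, each 4 bytes per pixel.
int main() {
    const int w = 1920, h = 1080;
    const int bytesPerPixel[] = {4, 4, 4, 4};
    long long total = 0;
    for (int b : bytesPerPixel)
        total += 1LL * w * h * b;
    std::printf("1080p G-buffer: %.1f MB (eSRAM holds 32 MB)\n",
                total / (1024.0 * 1024.0));   // prints roughly 31.6 MB
    return 0;
}
```

At roughly 7.9MB per full-resolution target, four targets already brush up against the 32MB figure before any additional buffers are counted, which is the squeeze being described.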

    • Cinnamon267

      More would have been ideal. But ESRAM is far from die efficient.

    • Cinnamon267

      Even via USB 3.0 it’d be useless due to poor bandwidth. And really high latency.

  • GHz

    “we’re based on the Sea Islands family. We’ve made quite a number of changes in different parts of the areas.” – Andrew Goossen 22/09/2013

    And if anyone wants to claim that MSFT didn’t make any changes to that 8000 series APU, here’s what AMD recently said.

    “Xbox One leverages a single-chip, semi-custom AMD APU, with custom components co-developed with Microsoft” – AMD

    Now why all the custom components/hardware in the XB1? To ensure that the XB1 takes full advantage of DX12.

    “We knew what DX12 was doing when we built Xbox One.” – Phil Spencer.

    When the question “To get the full support of DX12, will users need to get a new graphics card?” was asked, Ybarra replied, “To get the full benefits of DX12, the answer is yes.”

    So if you’re rocking an old school 7000 card, which includes the Bonaire or w/e the PS4 is packing, ain’t no way in the world you’re going to have full access to DX12. Some access, not FULL! XB1 has FULL. So what does logic dictate based on the evidence provided by MSFT engineers/staff and their partners?

    This is what NVIDIA had to say about the matter.

    “Developers also want to take direct advantage of ADVANCED GPU HARDWARE FEATURES currently INSULATED to provide FULL PROOF USAGE. DirectX 12 was designed from scratch to provide the infrastructure for these ADVANCED APPLICATIONS.”

    And for the fanatics that want to believe that this doesn’t apply to the XBOX ONE, you need to stop lying to yourself and the public, because AMD already told us that it does!

    So yeah, what this article is saying is 100% accurate.

    • Guest

      It is, DX12 will improve the Xbox One in many ways, but don’t kid yourselves, it will widen the gap with the PC even more as well. The most practical benefit of DX12 is easy low level porting of code between all MS’ platforms in their Windows 10 ecosystem.

    • GHz

      Kid myself about what? Nothing I wrote was my opinion. I cited everything.

      DX12 widening the gap with PC? Maybe, but I don’t care. I’m strictly a console gamer. That’s my sole interest.

      “The most practical benefit of DX12 is easy low level porting of code between all MS’ platforms in their Windows 10 ecosystem.”

      I’ve been saying that since day one, when it wasn’t popular to say such things. The citeable links I provided show it’s not the only thing DX12 will do. It’s about features, features, features & more features. Hardware to take advantage of these features. The “graphics” problem was solved since last gen. Hence the reason why Phil said that DX12 will grant devs access to new features. He also said that DX12 will not improve graphics. But he said that early in the lifespan of this gen. So the question is, in comparison to what? Early games? Because that’s all we had. Point is, we have to use our heads. It’s inevitable that graphics will improve even w/o DX12. Phil shouldn’t have to spell that out, even though he gave us a clue in how games built from the ground up in DX12 will look on XB1.

      @nickomatic20 When you think about start of gen to end of gen teams learn a lot. DX12 will help as well, think PDZ to Halo 4.— Phil Spencer (@XboxP3) April 7, 2014

      No matter what, games are going to look prettier as the years go by. It’s only normal that they do no matter which system you game on.

    • Guest

      I wasn’t talking to you specifically ;), that’s why it was plural.
      It’s more of a statement I was making to people believing DX12 to be the end all thing to make everything 1080p on the Xbox One.

    • GHz

      Still, it’s an odd thing to say when all that info is cited. But w/e.

    • Guest

      Nothing odd about it. Many things make up the image that appears on your screen. Resolution and framerate are only a part of it. Those new features you are talking about are the newer DirectX12 feature levels and will allow different and more efficient ways to produce the 3D geometry that is rendered by the GPU.

    • GHz

      Where I’m from, when you tell someone “not to kid themselves”, you’re giving them a heads up that they might be wrong about something. So what you said is odd, because everything said was citeable. It was not my opinion. In essence you’re saying NVIDIA, AMD, MSFT are kidding themselves.

      Everything else you said is old news and I’ve been saying those things when pple still thought that DX12 didn’t exist for the XB1.

    • Guest

      I apologize, that wasn’t my intention. No, MS and the GPU makers aren’t kidding anyone. With more efficiency you’ll get more performance, but the 1080p thing is dependent on the hardware and will continue to pose a problem even with DX12.

    • GHz

      1080p is a software problem not a hardware problem. Devs have discussed this. Devs have also admitted in some cases that it was a matter of time and resources having to deal with DX11 on the XB1. There are 1080p games on XB1. I do expect the problem to continue as long as there are cross platform 3rd party games still being developed with DX9-DX11 graphic engines.

    • Guest

      It is a hardware problem. You can’t fit 1080p with deferred shading properly in 32MB. Software can only mitigate that by dividing data properly over the DDR3 and the eSRAM, essentially becoming a split frame rendering technique (which leads to tearing if not done right) or you choose to go with another less intensive rendering technique.
      PIX already gives good insights on load balancing at the moment, but developers still have a large chore optimizing for this and graphics engines essentially need to have this in mind as well. So yes, time and resources are part of the problem; especially if the competitor doesn’t have this limitation, it becomes easier to just downgrade for the Xbox One.
      Of course there are 1080p games on the Xbox One, but we all know what software limitations the games have used to achieve it. The best thing so far, a solution that can fit most games, is to simply go with a fixed y axis and a varying x axis to keep everything in the 32MB rendering space.
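
A toy sketch of the “fixed y axis, varying x axis” approach described above: keep 1080 rows and step the width down until the frame’s render targets fit a 32MB budget. The 20 bytes per pixel used here (e.g. five 32-bit targets) is purely an illustrative assumption:

```cpp
#include <cstdio>

int main() {
    const int height = 1080;
    const int bytesPerPixel = 20;                   // assumed combined cost of the render targets
    const long long budget = 32LL * 1024 * 1024;    // the 32 MB of eSRAM
    int width = 1920;
    while (1LL * width * height * bytesPerPixel > budget)
        width -= 16;                                // shrink the x axis in small steps
    std::printf("chosen render resolution: %dx%d\n", width, height);  // 1552x1080 here
    return 0;
}
```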

    • GHz

      Dude, all I know is this. XB1 lifted 1080p 30fps locked, no dips, open world, using 4xMSAA, and when Ready at Dawn tried to do the same, they couldn’t. They settled for 800p, closed-in environments, near perfect 30fps. Couldn’t even lock it. My question is, where is the eSRAM problem if they were able to do that?

      Not all devs are created equal.

    • Guest

      They aren’t, which is exactly what it all comes down to. Like I said, the chore of optimizing for this setup isn’t easy and is time consuming thus adding to development costs which with the Xbox One not being in the lead sales wise will not change how developers use the hardware now. You’re mostly (if not only) going to see the dedication for this in first party projects.

    • GHz

      You need to pay attention to the whole story and the opinions of all devs, not just the ones who have trouble accomplishing certain tasks on the XB1. I understand, though, their input is valuable. Devs were saying from the get go that DX11 was horrendous and locked them out from accomplishing tasks that should be simple.

      Both systems are selling healthily. PS4 sales success is phenomenal, but that shouldn’t blind you from the success of the XB1 either. If the sales figures were grades, PS4 is an A+ and XB1 is an A. They’re both passing.

      “You’re mostly (if not only) going to see the dedication for this in first party projects.”

      That’s always the case and will never change. What the XB1 can do is defined in that space, not 3rd party.

      When DX12 becomes the norm, we’ll see how development goes in the 3rd party space. From what I hear, the pple @ DICE already want DX12 to lead. Have you ever wondered why? 2016 onward will be very interesting, 3rd party development wise.

    • Guest

      Yes, but no developer can ignore the 2:1 advantage (with nothing to show for it) of the PS4 over the Xbox One and more development time will be used to program for the PS4 as a result of it, especially if that gap grows. DX11 wasn’t the only API available for the Xbox One, which already shows that developers aren’t willing to optimize.
      The only thing that can make developers go the extra mile for the Xbox One is if the development tools make it easy enough. DX12 makes things more efficient but doesn’t change the hardware limits and developers still have to look for ways to make their 3D titles 1080p.
      I’m sure many developers want DX to lead, it’s the leaner API. OpenGL was an incompatible extension mess and so far Vulkan will not change that approach with upcoming revisions.

    • Psionicinversion

      Actually the power advantage of the ps4 will go the same way as PC: more powerful means less work in getting things to run, which means less programming time.

      But dx12 should be lead; just because it’s outselling 2:1 doesn’t mean anything to DICE. They get paid a salary, so the game programmers will want to use the easier api that allows them to get a lot more work done faster; they see nothing from the sales figures. Each person won’t get a £100,000 bonus for completion. No, at the most they’ll say hey, well done, you can keep your job for another year.

      EA is the only one that sees the profits from it and as long as the games keep selling they don’t give a sh’t what the lead platform is.

      For devs like CDPR, they may use the ps4 as lead because it’s kind of in the middle. You can pretty much just drop the resolution on the x1 and keep the same settings as the ps4, then on pc you bump up the draw distance, npcs, textures, shadow maps etc etc so the game is more balanced when it comes out.

      As for deferred rendering… many devs want it to die. Forward rendering and object space rendering (which is how CGI studios like Pixar render) are up next. Ashes of the Singularity uses object space rendering.

    • Guest

      The PS4 has other problems, and that hardware advantage isn’t so great in certain areas other than benchmarks. It does have a 30% advantage over the Xbox One, but even then, if you look at the graph below, that difference is negligible compared to how far behind the PC they both are now; both already hold things back hardware wise. Nothing in the middle; they’re both at the low end of the hardware spectrum for newer games.

      DX12 should lead because it’s the better API; no other reason should be necessary. And CDPR basically deceived everybody during the last 2 years, and now it becomes very apparent how weak these consoles truly are, because they are at the level of a PC from 4-5 years ago. So even 2 years ago CDPR knew what they were doing showing that footage when the consoles launched. They purposely started focusing on the consoles and basically continued lying with their PR, making it another Watch Dogs. The Witcher 3 will no doubt be a great game, but the downgrades are very apparent.

      As for deferred rendering, I don’t see that dying at all. It still has clear performance advantages and ease of use. I’ve seen the forward+ demonstrations and they can be a good compromise, but deferred will never go away, ESPECIALLY in weak hardware scenarios. It’s the primary reason why it’s being used so much.

    • GHz

      Deferred is going the way of the dodo bird. The old always gives way to the new. The only pple who will benefit from using deferred as things change will be indie devs. But will they pass up the ability to code once for a wealth of form factors? I’ll bet no. I believe in the 30% advantage on the PS4 number wise. That # just doesn’t translate into final code. We don’t see it at all to an OBVIOUS degree. You literally have to run it through a pixel counter to find out what’s going on, and even so called “experts” like Eurogamer get it wrong sometimes, or admit that a lesser res game looks better than its higher res counterpart. We need to kill that 30% argument for the simple fact that the best looking game on the PS4 is an 800p game. Let the games do the talking.

    • Guest

      Deferred isn’t going anywhere.

    • GHz

      Ok. Shrugz*

    • Jetman082

      I’m not some beasty developer or anything, I’m just an amateur coder, but I just wanted to say 1080p is absolutely a hardware issue with the XB1. You can’t have millions of polygons all over the screen with advanced postprocessing effects and antialiasing and expect 1080p/60, 1080p/30, etc. on a console whose cpu is the equivalent of a low end dualcore i3 and whose gpu is a pseudo 7850. That’d be like asking the xbox 360 to run a game at 720p with advanced post processing effects, realtime lighting, soft shadows at a good resolution, etc., and run it at 60fps too.

    • GHz

      I get that. The fact is there are limits. You’re a coder. Explain Forza Horizon 2. How is that possible on a “meager” console? Why in the world would they choose to render that game using forward rendering just so they can have access to 4xMSAA?

      I have another question. Which requires more heavy lifting overall? All physics assets and features in FM5 minus car renders, or the entire Driveclub game?

    • Jetman082

      What’s impressive about FH2? I’ve seen games with better graphics. I think what you’re trying to get at is that all developers aren’t equal, which I agree with. There’s SOME optimization you can do to help a game’s performance; for example, in FH2 many assets aren’t fully streamed in until you get within a certain distance of the object (which, if it were me coding, I’d use a float for that). Shadows in particular stood out to me in the gameplay: the quality is low until you’re within a certain distance of the object casting the shadow, at which point its resolution is increased. @the whole physics vs entire game comment, I don’t know much about physics seeing as I’ve never coded anything that deals with them, but physics can be quite heavy on a cpu, unless a physics engine such as Nvidia’s PhysX engine is used to offload it onto the gpu. Also, a game like FH2 you can probably assume is DECENTLY multithreaded, it’s not like you’re running the entire game off one core lol, so that can help performance tremendously (of course you run into huge issues with multithreading, mutexes being one). TL;DR, FH2 is possible on a console b/c though the One is a weak system compared to a gaming desktop powerhouse, it’s still a decent system that can run many games at a nice 900/1080p res with post processing effects like FXAA taking the place of MSAA, and with settings like SSAO turned down. It’s also dependent on the skill of a coder. Example: the first calculator I ever coded used roughly 500 lines of code (after adding checks to prevent unwanted behavior). A month later I went in again and did the exact same thing with less than 100.
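
A minimal sketch of the distance-based streaming described in the comment above, using a float threshold; the structure, names and numbers here are hypothetical rather than anything taken from FH2:

```cpp
#include <cmath>
#include <cstdio>

struct Object {
    float x, y, z;
    bool highDetailLoaded = false;
};

// Request high-detail assets when the camera comes within streamDistance,
// and drop them again once it moves away.
void updateStreaming(Object& obj, float camX, float camY, float camZ,
                     float streamDistance = 150.0f) {
    float dx = obj.x - camX, dy = obj.y - camY, dz = obj.z - camZ;
    float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
    if (dist < streamDistance && !obj.highDetailLoaded) {
        obj.highDetailLoaded = true;   // a real engine would kick off an async load here
        std::printf("streaming in high-detail assets at %.1f m\n", dist);
    } else if (dist >= streamDistance && obj.highDetailLoaded) {
        obj.highDetailLoaded = false;  // evict to make room for nearer objects
    }
}

int main() {
    Object tree{300.0f, 0.0f, 0.0f};
    updateStreaming(tree, 0.0f, 0.0f, 0.0f);     // 300 m away: stays low detail
    updateStreaming(tree, 200.0f, 0.0f, 0.0f);   // 100 m away: high detail streams in
    return 0;
}
```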

    • GHz

      “What’s impressive about FH2? I’ve seen games with better graphics.”

      So better graphics automatically make a game more impressive? Forget everything else? Forget the code, forget the features, because you’ve seen better graphics? So it’s better to keep the game closed circuit and up the textures, rather than make it open world and balance the textures? Forget giving the gamers the ability to customize? That’s not important because the graphics aren’t impressive enough for you? You’d rather stunt the features to pour more into textures because you want the game to look as CGI as possible. Do away with heavy features? Your thing is, we can’t afford that because those textures need to look good. That’s pretty much what you’re saying.

      Doesn’t sound like a coder to me. Sounds like a child pretending.

      Point is, TRADEOFFS!! BALANCE! Give n take. If you count what you get with a game like FH2 or FM5 compared to, say, a game like DC, you get a whole lot more in meaty goodness in FH2 and FM5. So we can understand why DC has better rain FX and slightly better textures. Tradeoffs.

      And another thing: you’re babbling and going all over the place in an attempt to reason, and you’re failing.

    • Jetman082

      Whatever you say, mr “I’m so mad b/c someone said the hardware in the One isn’t good enough to output extreme quality graphics at 1080p”. Graphics aside, FH2 doesn’t look like anything special to me. Coding wise I’m sure I’d marvel at it, especially the shader work. I’m still lightyears away from being able to implement shaders into a program. You never said a thing about features, you asked how the game works; I gave you a few examples of how you can get the most out of a machine (code optimizations, don’t run the damn clock cycles sky high by using if-else-if statements when you could use a switch for example, use multithreading to offload certain parts of the game onto different cores *again something I’m still not able to do yet*, waiting to stream in higher quality textures & assets until the character is within a good distance of them, etc). Idk why I even explained clock cycles & if statements vs a switch to you, I’m sure you have no clue what a switch is. You want to call me a child, but you’re the one getting irritated at the statements I made. Customization has been in games since I was a child, don’t act like that’s something impressive lol. I do agree that the nature of some games allows for more impressive visual fidelity. Example: a Mortal Kombat game (not one of the open world ones) could have exponentially better graphics, due to the fact the stages are smaller. I wasn’t saying you shouldn’t have features at all, just that graphically and gameplay wise, Forza’s nothing special to me. Hey, if you’re such a good coder, why don’t you explain to me what a map is? What’s the key in a map used for?

    • GHz

      Nope, I’m not a coder. Never said I was. I just asked a simple question and look how you answered. You toss everything out the window because the game doesn’t look good to you, and that’s fine. But you don’t even entertain the idea of what makes the game as a whole. That’s why I doubt your legitimacy. Aren’t features part of the code? What are FXAA, MSAA, global illumination etc, aren’t those “features”? When you write a graphics engine, isn’t the objective to code to better support the best of features? So I don’t get you. And you’re babbling again. It looks like you have a problem with focusing.

      Other than that, it’s fine to not think of those games as impressive, even though you turn around and admit that the code might impress you. As a coder, isn’t that what matters to evaluate the meat of a game in the end? Overall assets? The code? But what do I know.

      Your lines of code don’t mean you’re smart. You’re trying way too hard.

    • Jetman082

      I’m not babbling at all; why don’t you answer the question about why the code I just gave an example of is completely wrong. FXAA is a post processing effect that has the ability to antialias alphas, so it’s widely used as a replacement for MSAA, or as a second antialiasing option in the graphics options frame (can be called many different things, frame/panel, etc). Features are indeed part of coding; last week I coded the ability for the user to choose their own font. Const chars are nice (although my superior got on my back for not using a string), but fxaa is nothing like adding the ability to customize a car by adding decals & a different paintjob. As a coder, no, you don’t have to appreciate how the game plays in order to be impressed by it. I find the Crysis games to be the most boring games on the planet, yet I can be impressed by how difficult it must have been to code. Also no, I don’t have a problem with focusing; you said I’m a fake coder, so that last paragraph that I gave you is asking you to explain something that I already know the answer to. If you can’t explain it, why judge whether or not I know how to code?

    • GHz

      You’re babbling and tripping over your words. And you’re trying to drag me into a convo that’s got nothing to do with how you approach a question. I don’t care what you did last week. All that unnecessary explaining makes you sound weak! FOCUS!!!

      All this cause you won’t admit that it’s all about the code and not your subjective view on graphics. WOW!

      That being said, if you’re legitimate and ain’t trolling, good luck with your coding skills. Laters.

    • Jetman082

      Did I ever say it’s not about the code? No I didn’t; you asked how the game ran, I gave you examples of how it was able to run on the system, as stated in my last response to you. Yeah I like graphics, but you’re the one that never asked me what I think about the features. When you mentioned features, I told you I don’t find the features to be very impressive (and you weren’t talking about graphical features, you were talking about customization). The customization doesn’t impress me; the code would. Of course it’s about the code, everything’s about the code. No code, no game, but that doesn’t mean you can optimize code to the point that you’re going to make the One a supercomputer; you can’t, and only an idiot would believe you can. There’s only so much you can do with the hardware it has. As I stated before, you’re not going to get 1080p/60fps with advanced postprocessing effects, high resolution textures, etc. on the One, you just can’t do it.

    • GHz

      Let’s look @ your timeline. The 1st thing you said:

      “1080p is absolutely a hardware issue with the XB1. You can’t have millions of polygons all over the screen with advanced postprocessing effects and antialiasing and expect 1080p/60, 1080p/30, etc. on a console whose cpu is the equivalent of a low end dualcore i3 and whose gpu is a pseudo 7850.”

      Mind you, you introduced yourself as a coder & that’s your 1st statement to me.

      So I asked you to explain FH2! Because according to your statement, the XB1 cannot do a game like FH2! But look @ the facts. FH2 is 1080p, employs millions of polys, advanced FX, notably the advanced lighting, and some of the best GPU intensive AA (4xMSAA) @ 1080p 30fps all the time, not some of the time, on a Jaguar APU under house arrest by DX11. So BOOM! Right away FH2 proves you wrong. That’s what XB1 is doing with DX11! How!?

      My question obviously was about the code, the features etc, because that is what makes up a game. It is a direct response to your claim! A rebuttal!

      But your answer to that was, “What’s impressive about FH2? I’ve seen games with better graphics.” DUUH! Like some 1diot!

      You lack focus! And you are a troll and a pretender. Done!

    • Cinnamon267

      “When you write a graphics engine, isn’t the objective to code to better support the best of features?”
      The best features you can manage within your budget and timeframe. Not the actual “best” features, since that becomes a little too subjective for a lot of things.

    • GHz

      Of course within your budget. It’s always the best within your budget.

    • Cinnamon267

      “Why in the world would they choose to render that game using forward rendering just so they can have access to 4xMSAA?”

      They wanted a lot of light sources as well as MSAA. You get both with forward+ rendering. You could go deferred, but support for transparencies and MSAA is a mess. You can do it, as games have, but you can just avoid it by using forward+.

      “How is that possible on a “meager” console?”

      They have pretty fantastic artists. Just like a lot of those PS2 games running at 60 still look amazing. Really good art and art direction.

      Forza Horizon 1 looked really good and that was running on a 7 year old box. And racing games are always a bad-ish example to use.

      “Which requires more heavy lifting overall? All physics assets and features in FM5 minus car renders, or the entire Driveclub game?”
      Driveclub. The features they wanted required them to drop the framerate down to 30. Double the rendering time.

    • GHz

      I know the forward plus 4xMSAA rendering story. I’ve been explaining that for months.

      There is no such thing as a game being a bad example. It’s all about the code, how they go about it to better exploit the hardware. To get the performance they need, they’ll do what they gotta do.

      FM5 has car physics and features (customization etc). DC is a glorified demo. A work in progress. Very basic. That’s why it looks so great. Just strip it down to the bare bones minimum.

    • Psionicinversion

      Well, Forza Horizon 2 and The Order 1886 are forward rendered, so… it does have advantages, but it is harder to use than deferred.

    • Guest

      Correct, it all depends on how complex your game becomes. The more complex things will get, the more we will fall back on deferred rendering and other prebaking techniques, especially when the hardware causes limitations.

    • Psionicinversion

      Don’t know anything about object space rendering apart from the little bits Wardell has said, seeing as his game is actually using it, but you never know, it could be the next great way of rendering stuff.

    • GHz

      1st of all, I think too many pple are jumping to conclusions about where the PS4 will stand in a DX12 world (if it ever comes to that). PS4 development will not be left behind, because it is still a DX11 machine. And the word is that you can do everything DX11 in DX12. What I see is certain features may not cross over to the PS4, and devs will be alright with that. And we’ll all have to realise that it won’t be their fault! Same situation we’re seeing now with the near parity case with 3rd party cross platform games. Some features don’t cross over to the XB1, with res being the dominant one. However the success of the XB1 proves that gamers didn’t care that much, because the difference is trivial. The same will apply to PS4 owners over time if these DX12 differences pop up. Only the fanatics will care and demand blood.

      Every time pple say hardware limits on XB1, I think about FH2. The PS4 couldn’t do that Forward+ rendering on the same level. Keep in mind that Forward+ is one of the rendering techniques that gives you access to features that are normally GPU intensive, but gives better results as far as pixel quality. It is one of the techniques that will eventually replace deferred. You can’t look at 3rd party games as “better” examples if they are NOT using NEWER ways to solve things. You’re not going to find smarter answers (currently) in 3rd party DX9-DX11 based games. It just won’t happen. But in that space, PS4 wins by a smidgen, and it’s usually API/time related, not hardware related.
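
For readers following the forward+ discussion in this thread, here is a minimal CPU-side sketch of the tiled light culling step that forward+ rendering is built around: the screen is divided into tiles and each tile keeps a short list of the lights that can actually reach it, so the shading pass only loops over those. Real implementations run this in a compute shader; the screen-space lights and tile size here are illustrative assumptions:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

struct Light { float x, y, radius; };   // position and extent in screen space

int main() {
    const int width = 1920, height = 1080, tile = 16;
    const int tilesX = width / tile;
    const int tilesY = (height + tile - 1) / tile;
    std::vector<Light> lights = {{400, 300, 120}, {1500, 800, 200}, {960, 540, 50}};
    std::vector<std::vector<int>> tileLights(tilesX * tilesY);

    // Build a per-tile list of the lights whose radius can touch that tile.
    for (int ty = 0; ty < tilesY; ++ty)
        for (int tx = 0; tx < tilesX; ++tx) {
            float cx = (tx + 0.5f) * tile, cy = (ty + 0.5f) * tile;
            for (int i = 0; i < (int)lights.size(); ++i) {
                float dx = lights[i].x - cx, dy = lights[i].y - cy;
                if (std::sqrt(dx * dx + dy * dy) < lights[i].radius + tile)
                    tileLights[ty * tilesX + tx].push_back(i);
            }
        }

    std::printf("tile (25,18) shades %zu of %zu lights\n",
                tileLights[18 * tilesX + 25].size(), lights.size());
    return 0;
}
```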

    • Psionicinversion

      Actually software and hardware go hand in hand. If the software is too demanding for the hardware you’ve got problems, and vice versa. Like why can’t either console render Hardline at 4k? Because the hardware isn’t powerful enough. On the other hand, say it’s at 1080p but you’re throwing in really hi res textures, full real-time raytraced lighting (not that weak version Shadow Fall used, proper full blown raytracing) and tons of particle effects, 30 million polys a scene: the software is too demanding for the hardware.

      This is where 3rd parties struggle, because the code has to remain generalised for multiplatform, which drives up the hardware needed to power it, which is why you’ll probably still see a lot of the more demanding 3rd party titles stay at 900p.

    • GHz

      You’re absolutely right! But that only applies if you’re still coding games 90’s style, where brute force of hardware would win the day. That h/e is changing. High res textures are possible on the XB1, if you’re willing to change how you code and use all the tools available via DX12 + W10 (tiled resources + Graphine’s software).

      Same goes for the PS4 too, but we don’t know to what extent. Tiled resources have already been proven to do much more than partially resident textures. However, I am confident that the ICE team will find hacks to bring the PS4 up to speed. But keep in mind, while games today have already used PRT on both XB1 & PS4, no games have employed TR, which is exclusive to XB1. And if we should go by what MSFT said, we are going to see higher res textures. No brute force needed.

      I’m not saying that consoles (out of the box) will match what a high end PC can do (4k), but I can assure you, exclusive 1st party titles will make it look like they can. This is all about developer approach. I’m 100% certain of this.

      Other than that, you are right!

    • Psionicinversion

      I know all about Graphine’s software. Tiled resources were introduced in DX11.2 with Windows 8, so the Xbox One will already be TR enabled.

      Tiled resources are not exclusive to the X1, hahah, don’t talk crap mate, a whole DX version was made just for TR…

      TR isn’t going to be the quick fix you think it is. They won’t be able to make a whole game solely on tiled resources. They will be able to do parts… tiled resources is just basically PRT for DX, which Infamous SS used for parts of the game.

    • GHz

      “Tiled resources were introduced in DX11.2 with Windows 8”

      I know. Whether it is enabled in the XB1 after they announced it @ Build 2014 is a different story altogether. What I’m saying is, the XB1 has all sorts of features now, but they aren’t yet enabled.

      No crap spoken. MSFT and Nvidia discussed the differences. Just like there are different types of AA with their respective companies backing each, so is the situation with TR and PRT. They have their similarities, but also their differences. Hence the reason why Nvidia backs TR in opposition to AMD’s PRT. Do you know what the differences are? Do your homework please.

    • Psionicinversion

      Yes, there are things not actually in DX11 that GCN GPUs can use, so they will get “activated” in the Xbox; the only difference is the PS4 can already do them all. Well, apart from the DX12.1 feature set, which is what full DX12 is.

    • GHz

      Again you’re right & back on track!

    • Psionicinversion

      I went on AMD’s site and there’s a page with a few things on it, and one thing explains async shaders and multithreaded command buffers really well. It’s like async shaders for a child, it’s that easy. Maybe you should go look at it :d

    • GHz

      Yea, I’ve read it. But honestly all that jargon bores me to death. And you should read what Nvidia has to say too. They make it easy to understand.

    • Psionicinversion

      yeah, but at least then you know what you’re talking about

    • GHz

      Absolutely! That’s the only reason I read up on it. That goes for you too.

    • Psionicinversion

      yeah, but if you think TR is going to solve all the X1’s problems then you’re completely wrong. If Halo 5 and Forza 6 don’t use it then it’s a waste of time

    • GHz

      I never said TR will solve ALL the XB1’s problems. You mentioned high res textures and I gave an example in which MSFT claims that they have an answer to that problem in the form of tiled resources. Was I wrong?

      If Halo 5 or Forza 6 don’t use tiled resources, it will only mean that they had no use for it in those particular games. In the end, it’s up to the devs. The idea is to supply them with a wealth of tools. The bigger the sandbox, the more options in how to approach a problem. That’s all it is. You can’t speak as if you know how MSFT’s 1st party devs need to do their work. They could be coming up with all sorts of hacks for all we know.

    • Psionicinversion

      Yeah, I suppose. I would like the day when the X1 can make a game visually better than the best game on PS4. I’d be rubbing that in their faces like no tomorrow.

    • GHz

      Well, that’s subjective. Plus, how a game looks texture wise isn’t indicative of how powerful the hardware is. Scale does. Asset wealth does. Any system from 2005 onwards can render visually arresting games. What you’re willing to give up to achieve polished textures is the other half of that story. Again, that is all about what the devs want, and how they want to present a game. The Order vs Ryse (which is almost 2 yrs old): both great looking games. I prefer The Order’s art direction, but I also recognize that Ryse is larger in scale with the amount of FX and NPCs on screen. How you define visually rich is subjective in this case.

    • Psionicinversion

      That’s why the only game atm whose PC version won’t be downgraded is Star Citizen; in fact the only thing they are doing is upgrading it, but it’s kind of different, as they thought they would only have a total budget of 27 mill to do the entire thing, but now that they’re at 82+ mill they can constantly upgrade everything from there, and it’s looking glorious so far. Can’t wait to see some real, more or less finalised environment footage, but that won’t be for a fair few months yet.

      Apparently, according to a German website, The Witcher 3 runs ultra at 60fps on a 770 no problem. The released recommended spec was a 770 for med-high. If it is true that a 770 can max the game (no GameWorks turned on, I think) no probs at 60fps… CDPR have been very naughty devs. I’ve got a 280X which is similar… 27 1/2 hours left to find out.

    • XbotMK1

      Take a look at this.

      Mantle and DX12 have very similar performance. There is nothing special about DX12.

      No sh*t, Sherlock. Consoles always feature custom APUs. They’re new consoles.

      And no sh*t Microsoft knew about DX12 when making the Xbox One. Microsoft wouldn’t make an API without it supporting their console. Thanks for the obvious.

      The part where you’re delusional is that you think the Xbox One has custom components specially designed for DX12. DX12 was made for PCs, and the Xbox One and PS4 are both PC architectures, therefore there is nothing special in the Xbox One. The Xbox One doesn’t have a special architecture like the PS3 had.

      It looks like Xbox fan boys still believe in fairies. They get more delusional by the day no matter how many facts you put in their face. This is getting more and more pathetic.

    • GHz

      But when the games drop in the 1st party space, where games take full advantage of their respective hardware, we see the XB1 winning as far as bench pressing assets. How do you explain this? How do you explain the XB1 having the ability to do 4xMSAA @ 1080p while the PS4 couldn’t? And that’s according to Sony 1st party devs RAD. Let the games do the talking! Otherwise it’s all talk.

      “The part where you’re delusional is that you think the Xbox One has custom components specially designed for DX12.”

      What planet you live on? Not only has this fact been verified, but it is a fact that you need new components to take full advantage of DX12. So now you’re trying to say that when MSFT custom designs, it’s not in preparation for their own tech? And you call me delusional? Now you’re being stupid. According to you, it should be obvious! But apparently it’s not for pple like you, hence the reason I share THE OBVIOUS INFORMATION. Because of fanatics like you who’ll try to undermine the facts.

      Good for the PS4 and all of its good tech. But as far as it goes right now, it’s forward+ rendering with 4xMSAA @ 800p, 30fps most of the time on the PS4, vs forward+ rendering at 1080p with 4xMSAA, 30fps locked all the time on the XB1 under horrendous DX11, vs a better API on the PS4. You should be asking how!? What are we missing? Don’t you want to know?

      Don’t let the facts kill you, let them educate you. Ask the right questions! D8mb @ss Sony jihadist!

    • Terminator

      “What planet you live on?”

      Dude, you are talking to one of Sony’s biggest Sony Pony *ick Rider drones that ever walked this Earth. All he says is pure garbage and misinformation. He twists everything to make it sound like he is right. He is just one of those Basement Dweller “Engineers” who think they know what they are saying. I can tell you he always says the same thing but with new words.

    • d0x360

      Don’t pay any attention to him; he doesn’t understand how hardware or software works at all, he just reiterates nonsense he reads in other places to make himself feel good.

    • GHz

      It’s just amazing how all these Sony nutballs sound the same, as if they all graduated from the same Stupid Camp with honors. All they do is babble incoherently like drunk parrots w/o a clue. They stick to one timeline and never update their info as we progress. I think it’s the same guy, signing on under multiple accounts. Sony wack job.

    • d0x360

      It is the same guy. I know him well.

    • Cinnamon267

      “How do you explain the XB1 having the ability to do 4xMSAA @ 1080p while the PS4 couldn’t?”

      Anything, even the 360 with the original Horizon, can do 4xMSAA. Using a racing game isn’t a great way to make a point.

      “on the XB1 under horrendous DX11”
      We’ve known since last year that there is a DX12/GNM-style do-it-yourself API, ever since the Metro dev talked about it.
      Jesus christ, console gamers are completely crazy.

    • GHz

      The 360 did, but not @ 1080p with more cars, superior textures, off road etc.

      No matter how you look at it, that game was built w/o the XB1’s true API. Playground made it look easy despite all the trouble DX11 caused 3rd party devs. They proved that the Xbox One can when most were saying it can’t.

      Yup, I called that fool a Sony jihadist. Some of these gamers act like fanatics and love to evangelize their rhetoric. If you’re going to inform, be honest. Don’t twist the truth like some shady evangelist on a cult mission. We can do w/o that.

    • Cinnamon267

      “No matter how you look at it, that game was built w/o the XB1’s true API”

      Again, the Metro dev already stated there is a DX12/GNM-style do-it-yourself API available, and we already knew a bunch of DX12 features are available in the gutted DX11 toolset they have access to.

      “They proved that the Xbox One can when most were saying it can’t.”
      You can do a lot when you only have to target 1 platform.

    • GHz

      Yes they did, but is the Xbox One running fitted with its true API? No.

      And for the 2nd part, absolutely. I’ve been saying that for quite some time. When you don’t have to factor in PS4 development, you can do some pretty advanced stuff on the XB1. Just look at Quantum Break.

    • Cinnamon267

      We barely know anything about Quantum Break.

    • GHz

      And that’s what makes it so intriguing. The little they’ve shown revealed some amazing code. I don’t care what res that game’s running at. It just looks phenomenal. Remember when everyone was saying that it was all CGI? Come to find out it was running on Xbox One.

    • Cinnamon267

      They showed some pretty looking faces. But the rest is every other video game. Dark with a lot of brown and grey.

    • GHz

      Well, I’ve never seen anything like it. I can’t wait for a dev diary explaining how they accomplished all that physics and lighting. It immediately stands out. Never seen it done at that scale.


    • Tyson225

      The biggest thing about getting DX12 is not just the boosts… it’s using the eSRAM properly.

    • Pops

      The Xbox One is DX12 compatible; that’s what makes it special over Mantle. Vulkan isn’t on PS4; if it was, they would have announced it by now.

  • XbotMK1

    Too bad that it’s too late. The PS4 has a 50% stronger GPU and faster RAM. Nothing will save the Xbox One. The only thing Microsoft can do is try to get developers to lower the PS4 version or pray that devs don’t fully utilize the PS4.

    Anyone who thinks DX12 will close the gap is delusional or a fan boy that doesn’t understand what an API is or how an API works. Xbox One, PS4, and PC are all X86. There is no special hardware inside Xbox One that makes DX12 special. DX12 is made to work with PCs and Xbox One and PS4 are using PC components.

    The reason why DX12 gives such a boost over DX11 is because it offers asynchronous compute, which is what gives a huge boost to frame rate and visuals. Not only does the PS4 have a 50% stronger GPU, it has 4 times the asynchronous compute engines of the Xbox One.

    The PS4 had a low level API from the beginning that is faster than DX12. DX11 has been holding everyone back.

    • Psionicinversion

      It may have 4x the compute engines but that doesn’t mean it can use them. Finding enough idle time on an already overloaded GPU to execute all that stuff will be pretty hard. 2 ACEs should be enough for the X1.

    • Guest

      Man, I can’t believe how stupid Sony fans are.

    • Psionicinversion

      yeah, the 8 ACE queues will be for PC GPUs. That 4,096 SP chip incoming should have 64 compute units (based on 64 SPs per compute unit). That’s what the 8 queues are for, to slot stuff into our amazing GPUs, not console trash GPUs.

    • Guest

      Correct, it’s a completely unbalanced architecture.

    • Psionicinversion

      The Sony fans will spin round and round chasing that Sony tail until they fall flat on their face.

    • Guest

      They already have, they just don’t realize it yet because their brains haven’t caught up with the facts.

    • Psionicinversion

      Well, that’s what running at 24fps will get you; you’re trying to get there but you struggle.

    • Modi Rage

      It’s quite obvious that you don’t understand what an ACE unit is and what they do. ACE units also help with a weak CPU. The PS4 and Xbox One have weak CPUs and much stronger GPUs. The Xbox One only has 2 ACE units, which means the CPU will continue to bottleneck the Xbox One. You’re trying to defend the Xbox One’s weaker hardware. The more ACE units, the better the frame rate.

    • Psionicinversion

      I know what an ACE unit does and how a multithreaded command buffer feeds tasks in between idle time on the graphics pipeline without waiting. AMD has a really good explanation on their site about asynchronous shaders. Fact is, 8 ACE units may really struggle to find enough idle time on an 18 compute unit GPU like the PS4’s to slot 8 tasks in. Each compute unit processes 1 task. When that compute unit is done and the render output is waiting for the rest to finish, you can slot a compute task into the idle unit to process, speeding up performance. For a 12 compute unit GPU like the Xbox’s, 2 ACE queues will be enough. For the PS4, 3-4 would be enough.

      For a 64 compute unit monster you’ll need all 8 going into overtime.

    • Orion Wolf

      Seriously though, I don’t get the overdesigning of the PS4 in some parts and yet creating bottlenecks in others.
      Not only do they have enough ACEs for a 64 compute unit monster, i.e. the 390X, but they have 18 CUs and only 140GB/s of actual bandwidth to feed them – and the more the CPU is used, the less bandwidth is left for the GPU. According to AMD, to feed 32 CUs you would need 700GB/s to eliminate the memory bandwidth bottleneck (that is not for HBM though).

    • Psionicinversion

      Well, AMD would have been developing the 8 ACE units for Hawaii… GCN 1.1, so Sony probably snuck in and had that slapped on to it.

      I think the 8 ACE queues will help more for their server cards anyway, and will also help when their cards are used for cloud streaming services and stuff.

    • GHz

      “There is no special hardware inside Xbox One that makes DX12 special. DX12 is made to work with PCs and Xbox One and PS4 are using PC components.”

      Right, so Sony had access to what MSFT was building even though AMD themselves admitted not to know what custom features MSFT had implemented. And your conclusion is that the PS4 has access to all the features the XB1 has. OK, riiiight. Someone here is lying and I bet it’s you.

      I’ll stick with what AMD said late last year:

      “Every API has unique features that set it apart from the others, and it’s difficult to say what those features will be in DX12 in 2015”


      Read more:,microsofts-redemption-directx-12.aspx#ixzz3aRAYzkR8

      and then more recently….

    • Terminator

      “is delusional or a fan boy that doesn’t understand” You should look in a mirror.

      Are you Tha Truth? John Doe? I’m Flat Out Right? You all hate Microsoft/Xbox One, and talk the same way (100%). Now that’s the Truth.

      Btw, you should stop upvoting yourself; it doesn’t help make your comments any more credible. (I wouldn’t have said anything, but the fact that you get upvoted by the same people in most of your comments is strange, though not surprising, since this isn’t the first time.)

  • jadedcorliss

    How much secret sauce does the Xbox One have?
    Anyways, I think they’ll be able to get a lot out of it, but still it’s funny.

    • Psionicinversion

      So you think Sony hasn’t hired people to optimise the CPU and their GPU too….. get real dude, SN Systems do their CPU work and the ICE team work on the GPU.

      Is that secret sauce too, or does it just mean they’re trying to get the most out of it???

  • d0x360

    Indeed it does have a less powerful GPU… on paper. In the real world the difference in visual quality is pretty damn small. In double blind studies 9/10 people can’t tell the difference between a game running on one or the other.

  • Pops

    Wow, making the GPU dance like it’s in a music video. Interesting. I guess this is where DX12 comes in, using the 8 graphics contexts the GPU has, making it dance with the CPU with no bottleneck.

