Interview With Brad Wardell: PS4/Xbox One Differences, DirectX 12, Ashes of the Singularity and More

Stardock CEO talks upcoming game and how it uses DirectX 12.

Posted on May 1st, 2015 under Article, Interviews | Follow this author @DanteandSpardaX

Brad Wardell has had a fairly illustrious career. On top of developing AI and various mechanics for real-time strategy games, he serves as CEO of Stardock Entertainment. His latest game, Ashes of the Singularity, looks to redefine the RTS genre by depicting a war fought across an entire world – not just hundreds of units, but literally thousands of soldiers scurrying about the screen.

GamingBolt had a chance to speak to Wardell about Ashes of the Singularity, along with various other topics: DirectX 12, how well the RTS benefits from the API, and how it scales across different systems.


"The big thing here is that Nitrous has what you call an Asynchronous Scheduler. In a normal engine, like pretty much every engine that's been released, commands that are going to be sent to your graphics card are serialized."

Kurtis Simpson: At GDC, you guys announced Ashes of the Singularity, which plans to redefine what people think about large strategy games. Can you please share more details about it? 

Brad Wardell: The idea behind Ashes of the Singularity is to do a real-time strategy game that takes place across an entire world. In every RTS you’ve played in the past, you’re fighting with individual units – maybe hundreds of them – on a single map. What we want to do with Ashes of the Singularity is have an RTS in which players clearly see that this is a world war, with thousands and thousands of units fighting it out on screen.

I realise that some people will worry about controlling all those units, and the way we approach that is through this meta-unit concept: individual units can still be controlled if you want to, but that gets a little unmanageable at a certain point. The interface makes it trivial to combine units so that they work together as an army. Now the player is acting as a general, commanding armies to go do their thing, while the units take care of all the grunt work.

Kurtis Simpson: The demo at GDC was extremely impressive, with over 5,500 units, each with its own light source. What can you say about the kind of changes you made to the Nitrous Engine to support this level of rendering?

Brad Wardell: The big thing here is that Nitrous has what you’d call an asynchronous scheduler. In a normal engine – pretty much every engine that’s been released – commands that are going to be sent to your graphics card are serialized. You send them to the scheduler, it serializes them up, and they go out one at a time. What we’ve managed to do with Nitrous is do that in parallel. It’s asynchronous: every single core on your CPU can talk to your graphics card at the same time.
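
To make that parallel-recording idea concrete, here is a minimal sketch in plain C++ threads: each core records its own independent command list, and only the final submission touches a shared queue. The names here (DrawCommand, CommandList) are illustrative stand-ins, not the Nitrous or Direct3D 12 API.

```cpp
// Conceptual sketch only: plain C++ threads standing in for per-core render
// threads. Nitrous and D3D12 use real command lists and queues; the types
// below are illustrative, not a real graphics API.
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

struct DrawCommand { int mesh_id; float x, y, z; };   // one draw call's worth of state
using CommandList = std::vector<DrawCommand>;

int main() {
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const int units_total = 20000;                    // thousands of independent units
    std::vector<CommandList> lists(cores);            // one list per core: no sharing, no locks
    std::vector<std::thread> workers;

    // Each core records its slice of the scene in parallel.
    for (unsigned c = 0; c < cores; ++c) {
        workers.emplace_back([&, c] {
            for (int u = int(c); u < units_total; u += int(cores))
                lists[c].push_back({u, float(u % 100), 0.0f, float(u / 100)});
        });
    }
    for (auto& w : workers) w.join();

    // Only this final step touches the single "GPU queue"; under DX11 the
    // equivalent serialization happens for every command, on one core.
    size_t submitted = 0;
    for (const auto& l : lists) submitted += l.size();
    std::printf("recorded %zu draw commands across %u cores\n", submitted, cores);
}
```

The point of the pattern is that the expensive part – walking thousands of units and recording their draw state – scales with core count, while the serialized portion shrinks to a single submission.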

If we were running on a single-core machine we wouldn’t be any better off – the performance of Ashes… we just couldn’t do it. It would be the same as any of the RTS games that have come out in the past. But nowadays we require at least a four-core CPU, which most people have – unless you’re on a laptop, where some people only have two, but most people have four or more – so you get literally a four-times improvement in performance.

Kurtis Simpson: Is this due to the benefits of DX12?

Brad Wardell: DX12, Mantle and Vulkan make it really practical. Even under DirectX 11 we’re doing a lot of crazy stuff with all the cores, but the problem with DirectX 11 is that even with our scheduler, it still serializes a lot of our commands, so we lose a lot of the benefit. Not all of it, but a substantial amount, so we have to turn down a lot of our cool effects. We’re still able to do thousands of units on-screen at once, we just can’t show them in quite the same glory. On DirectX 12, though, it gets out of our way entirely and we have complete control of the GPU.

Kurtis Simpson: I see. DirectX 12 looks to bring some really important and really cool things to the table.

Brad Wardell: It’s too bad that Microsoft and others have kind of spent their goodwill with past DirectX releases, so now that Microsoft’s releasing something genuinely revolutionary, people are like, “Yeah, yeah, we’ve heard this before.” And one of the reasons I’ve been so vocal about it is this: DirectX 12 really is different from anything we’ve ever seen before.

Kurtis Simpson: Talking more about the game itself, the demo was being shown on the game’s smallest map. Does this mean the larger areas of the game will have even more units on screen?

Brad Wardell: The map we had running at the show had 7,700 units. When you get to bigger maps you’re talking 15,000–20,000 units potentially. The larger maps are designed primarily for people playing with a group of friends; if they’re playing multiplayer, that could take place over days or weeks.

Kurtis Simpson: That’s really impressive. From your own testing what’s the maximum number of units you’ve been able to push on-screen at once?

Brad Wardell: Oh, a lot. I actually don’t know – we haven’t really played with it. We’ve only just got to the point where the art assets are at a level comparable to what you’d want on screen. Even six weeks ago the unit art wasn’t sophisticated enough, so the numbers were meaningless; only in the last few weeks has it started to get pretty sophisticated. But we’ve easily got it to over 20,000 units without a hiccup. So you’re talking a couple of orders of magnitude higher than any other RTS.

Though I should clarify one thing, because I’ve read people talking about the games they like – games like Total War and Supreme Commander, which are outstanding games. But their actual numbers of units are far lower, and they use instancing, meaning they’re really just the same units being mirrored on the screen. Whereas in Ashes of the Singularity, these are actual individual units with their own guns, their own minds, and their own everything! It’s not that it’s better or worse, it’s just different, because the hardware capabilities are so much greater now.


"One of the things is the light sources – we have real light! You know, your typical PC game or even a console game might have four or eight light sources. On DirectX 11 we'll have four, maybe eight, real light sources. But on DirectX 12 we can have thousands of light sources."

Kurtis Simpson: Speaking of hardware, the game clearly scales well across modern CPUs. But how well does it fare on CPUs from, say, three to four years ago?

Brad Wardell: Ironically, at the Microsoft booth at the show [GDC] it was showing the smallest DirectX 11 to DirectX 12 difference you could get, because it was running on the fastest CPU they could get. DirectX 12 actually shows up better on a slower machine that has lots of cores; if you’re running on an Intel Core i5 with, say, four cores, it would destroy a DirectX 11 machine running on the highest-end hardware you can get.

Kurtis Simpson: So the performance on the same hardware is doubling, if not tripling, you would say?

Brad Wardell: Every core makes a huge difference. I think most people realise this, but it hasn’t really been spelled out in black and white, so to speak. Your video cards have been monsters for years – I think most people realise that for a while now video cards have been quite a bit more powerful than the CPUs themselves. That’s why you get this high-end card and it sounds like there’s a jet engine in your machine.

Kurtis Simpson: So it’s all down to the scaling.

Brad Wardell: Exactly. If one core gives me, let’s say, a speed of one, then with four cores I get a speed of four.

Kurtis Simpson: Is Ashes of the Singularity CPU-intensive in any way, in terms of how the Nitrous Engine renders the game? Or is it more GPU-bound?

Brad Wardell: Yes, we’ll max out all your cores if we can. Whether you’re running a slow CPU or a fast CPU, we try to make the most of every core.

Kurtis Simpson: Ashes of the Singularity is going to run on DX12, although not exclusively. Specifically, what kind of benefits will the DX12 version have over the DX11 version?

Brad Wardell: One of the things is the light sources – we have real light! Your typical PC game, or even a console game, might have four or eight light sources. On DirectX 11 we’ll have four, maybe eight, real light sources. But on DirectX 12 we can have thousands of light sources. And that has a very subtle impact on how real the game looks – and I don’t mean real as in real life. Even if you watch a CGI movie like Toy Story, it doesn’t look like a videogame, right? You look at whatever the most recent Pixar movie is and you know it’s not real, because it’s their style, but it definitely doesn’t look like a videogame.

Then you look at a videogame and you go, “There’s something video-gamey about it,” and what that is, mostly, is how things are rendered on the screen – how they’re lit, so to speak, how motion and depth-of-field are handled. Without getting too technical, just about everything in games these days is done through deferred rendering, whereas movies use what’s called object-space rendering, which is what we’re doing. So on DirectX 11 we have to disable some of those effects – some of the true depth-of-field, some of the temporal anti-aliasing – some of the things that make the world feel a little more… again, not realistic as if you’re looking out of a window, but more tangible.
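
To give a feel for why the jump from a handful of lights to thousands matters, here is a tiny, self-contained C++ sketch that accumulates diffuse (Lambertian) contributions from many point lights at a single surface point; the eight-light cap in the comparison mirrors the fixed limits older forward renderers imposed. This is illustrative arithmetic only, not Nitrous shader code.

```cpp
// Illustrative only: accumulating diffuse light from N point lights at one
// surface point. Not Nitrous shader code; just the arithmetic behind
// "8 lights" versus "thousands of lights".
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Lambertian contribution of one point light with a simple distance falloff.
static float diffuse(Vec3 p, Vec3 n, Vec3 lightPos, float intensity) {
    Vec3  toLight = sub(lightPos, p);
    float dist2   = dot(toLight, toLight);
    float invLen  = 1.0f / std::sqrt(dist2);
    Vec3  dir     = {toLight.x * invLen, toLight.y * invLen, toLight.z * invLen};
    float ndotl   = std::fmax(0.0f, dot(n, dir));
    return intensity * ndotl / (1.0f + dist2);   // crude attenuation
}

int main() {
    Vec3 p{0, 0, 0}, n{0, 1, 0};                 // one surface point, facing up
    std::vector<Vec3> lights;                    // e.g. one light per explosion or unit
    for (int i = 0; i < 5000; ++i)
        lights.push_back({float(i % 100) - 50.0f, 5.0f, float(i / 100) - 25.0f});

    float capped = 0.0f, full = 0.0f;
    for (size_t i = 0; i < lights.size(); ++i) {
        float c = diffuse(p, n, lights[i], 1.0f);
        if (i < 8) capped += c;                  // old-style fixed 8-light budget
        full += c;                               // "thousands of lights" accumulation
    }
    std::printf("8-light result: %.3f   5000-light result: %.3f\n", capped, full);
}
```

Doing this honestly for every shaded point is the cost that older light limits were designed to avoid; being able to submit and schedule that much work is where the lower-level APIs come in.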

Kurtis Simpson: More natural-looking would you say?

Brad Wardell: Yeah, I don’t know the right word to describe it – like when you’re watching an animated movie, it looks more real even though it’s clearly not.

Kurtis Simpson: Exaggerated-realism.

Brad Wardell: Yes, there’s something more tangible when I’m watching a movie like say even Big Hero 6.

Kurtis Simpson: Perfect example. As for Ashes of the Singularity, are there any plans to do an open beta for the game? Furthermore, do you have a release window?

Brad Wardell: Yes, we’re planning to go into early access in the summer, probably somewhere around mid-July. One of the things we’ve learned from Offworld Trading Company, which has had a really good response, is that we’re way better off having these betas come out when they’re more mature, while still leaving plenty of time for user feedback. We don’t want people to get into this and find the game doesn’t even play. We want them to sit down, fully play the game, and give us feedback on what they don’t like, what they do like, and the things they’d like to see changed.


"You know when those 5K monitors come out this year and a lot of people aren't even thinking about this yet but here's an easy sell-point right, you're not going to be running high-end games on DirectX11 on 4K or 5K monitors, you just can't do it."

Kurtis Simpson: With DX12 you’ve spoken on the potential of achieving movie-like quality in videogames. How fast do you think DX12 will become the norm when it launches later this year?

Brad Wardell: I think there will be a lot of pressure from consumers to get on it, because you saw the demo of Ashes, and that’s a pre-alpha. Our art team didn’t want it shown yet because they didn’t think it was pretty enough, and even then it was something people hadn’t seen before. So imagine what it’s going to look like when it’s done, or when the first games come out that combine really huge budgets with DirectX 12. It’s just such a night-and-day difference.

Kurtis Simpson: You say the art team wasn’t happy about what was being shown, but what has been shown is really impressive. So when the game comes out and DirectX 12 really takes off, and things are finalised, it’s going to blow things out of the water.

Brad Wardell: Half of the ships weren’t textured, there were no trees or other interesting things on the map, and the lighting and shadows weren’t in yet.

Even so, you’d have people walk in and – although we were running on Mantle, the same applies to DirectX 12 – there it is, a 4K game. No one had seen a 4K game with thousands of units on-screen at once; DirectX 11 couldn’t even attempt that.

You know, when those 5K monitors come out this year – and a lot of people aren’t even thinking about this yet, but here’s an easy selling point – you’re not going to be running high-end games on DirectX 11 on 4K or 5K monitors. You just can’t do it.

Kurtis Simpson: As of right now it seems like brute force with graphics cards is the only plausible way to do 4K.

Brad Wardell: Well, it is brute force, but you can’t even feed the graphics cards fast enough with DirectX 11 or 9. I could take a ten-year-old game, or even a five-year-old game, and run it at 4K; but if you want a modern production game with that sophistication running at 4K, you can’t do it on DirectX 11. There just isn’t the bandwidth between the CPU and the GPU, because you only have one core talking to the graphics card.

Kurtis Simpson: DirectX11 is like a bottleneck that’s overstayed its welcome.

Brad Wardell: Yes, and Microsoft’s pushing hard. Windows 10 is free to everyone who has Windows 7, and based on the Steam hardware survey most people have video cards that will support DirectX 12. The operating system’s free, so it’s not like people have to buy new hardware. There’s already going to be a market. It’s not going to be like when Microsoft foolishly made DirectX 10 Windows Vista-only and you had to get a new video card on top of that. They’ve learned their lesson.

Kurtis Simpson: That’s how it would seem. Staying on the topic of DirectX 12: according to most of our previous interviews with other developers, the Xbox One already has a low-level API that’s similar to DX12. Do you think DirectX 12 will improve the Xbox One’s performance that much, given that its hardware is fixed, unlike the PC?

Brad Wardell: It won’t have the same impact. There are a couple of things in DirectX 12 that are important for Xbox One developers, though. First of all, Xbox One performance is completely a result of the eSRAM, and there isn’t a true-or-false thing with regard to using eSRAM. You can use it well, use it poorly, or anywhere in between, and their current API – the DirectX 11 extension for the Xbox – is really crappy for dealing with the eSRAM. That has resulted in what’s being called “resolution-gate”.

I’ve never heard Microsoft just come out and explain to people why developers are having problems getting games to run at 1080p – I mean, they really should, but maybe they don’t think their users will understand. Basically it comes down to developers not making effective use of the eSRAM API. In DirectX 12 they’ve actually thrown it away – they threw away the crappy DirectX 11 one and they’re replacing it with a new one. That’s pretty huge.

They’ve also released a new optimization tool that algorithmically tries to work out an eSRAM usage scheme for the developer. So instead of the developer hand-configuring what uses eSRAM, there’s a tool that does as much of it for them as it can. Third, DirectX 11 still serializes work going from the developer to the GPU; it’s low-level, but as low-level as it is, it’s still serializing a lot of GPU calls. So you won’t get anywhere near the benefit on Xbox One that you’re getting on the PC.

It’s completely different, but you are going to get a substantial benefit. The part I think users will care about is that it should address the resolution stuff for most people – that’s the most glaring thing people are upset about. But it won’t do anything magically. Developers still have to use it; it’s not like your old games will magically be faster.


"Now maybe in Unity or Unreal, one of the other guys will write their engines in such a way so that they make the most use of it, but that's going to take time. Whereas if they use something like Vulkan, it's not as low-level as their API, but Vulkan has the advantage that it's really easy to write for it."

Kurtis Simpson: So a lot of it is all on the developers and how they use it.

Brad Wardell: That’s the thing I like about making a prediction in a visual medium like this: we’ll be able to revisit this discussion a year from now and it will be pretty obvious. You’ll see the games that run on DirectX 12, compare them with the games that run on DirectX 11 on the Xbox One, and you’ll be like, “Oh yeah, there’s quite a difference.”

Kurtis Simpson: With games such as Ryse: Son of Rome, which looked pretty impressive when the Xbox One first launched over a year ago, it’s difficult to imagine how much more games will improve, given they already look that good.

Brad Wardell: With the PlayStation 4 and the Xbox One, developers aren’t even remotely scratching the surface of what they can do. Even on the PlayStation 4 with its low-level API, the engines are still largely written for last gen and updated for this gen. I wouldn’t say they’re completely native yet – I mean, they are native, but these words all get misused – this generation’s graphics are still very far behind where they’re going to be.

Kurtis Simpson: That’s something for people to look forward to, then.

Brad Wardell: Yeah they’re not even scratching the surface.

Kurtis Simpson: Since the announcement of DirectX 12, Microsoft has seemed pretty silent on how it will affect the Xbox One, whereas developers such as yourself have been quite outspoken about its benefits and the kind of impact it’s going to have. Why do you think they’re so quiet?

Brad Wardell: With the Xbox One we’re being pretty speculative, right? There isn’t a game using DirectX 12 on the console at this point in time, so I can’t even do a side-by-side comparison. Whereas on the PC we have Ashes of the Singularity – a game that’s been optimized for DirectX 11 and updated for DirectX 12 – and you can run the two side by side on the same hardware and get a 70% boost on DirectX 12 over DirectX 11.

So it’s pretty easy for me to say yes, you’ll get a huge impact on PC, but on the console it’s all theory. They have nothing; they don’t even know. I’ve talked to the development team there about this subject for a while, and it basically boils down to: we don’t know how much of an effect it will have, because so much of it is in the hands of the developers.

Kurtis Simpson: Right, I see. One of the interesting features of DX12 showcased at GDC was ExecuteIndirect, which allows multiple draw calls with a single API call. Do you see developers using this functionality to improve performance on the Xbox One?

Brad Wardell: That I couldn’t say. You could argue it comes into the bundles feature, and my guys at Oxide aren’t too keen on bundles themselves. So I don’t know – I don’t feel comfortable saying yes or no on that. I’m not familiar enough with it to speak on it.

Kurtis Simpson: Okay, that’s perfectly fine. Not too long ago you tweeted that PS4 owners will have something to look forward to as well. As far as most people know, Sony is pretty secretive regarding its API technology. What information could you share regarding the PS4’s API and how rapidly it will progress in the coming months?

Brad Wardell: What I was referencing at the time was Vulkan. We’re part of the Khronos Group, and it depends who you talk to at Sony – this gets into a debate. Sony already has a very low-level API for the PlayStation 4. The problem I have with it is that if you want to make use of it, you’re writing very specific code just for the PlayStation 4. And in the real world people don’t do that, right? I generally write code to be as cross-platform as I can.

Now, maybe Unity or Unreal or one of the other guys will write their engines in such a way that they make the most of it, but that’s going to take time. Whereas something like Vulkan isn’t as low-level as Sony’s API, but it has the advantage that it’s really easy to write for. So you’re more likely to get developers coding to that, and more games onto Sony’s platform than you would otherwise.


"Yes, because DirectX12 is such a game changer for everyone. So first of all everyone's going to use that for Xbox One. What will be interesting to see with Vulkan is that every hardware vendor is going to support DirectX12. Now the question will be how quickly Nvidia and Intel will support Vulkan and at what level."

Kurtis Simpson: Right. As everyone more or less knows, the PS4 has better hardware than the Xbox One due to its GDDR5 RAM architecture. Having said that, do you think the DDR3 in the Xbox One is adequately compensated for by its fast eSRAM?

Brad Wardell: That depends on who you talk to. In my personal opinion the eSRAM does not quite make up for it, but it comes really close. The real problem is that the Xbox One only has 12 of what you’d call cores – compute units – on its GPU, and I think the PlayStation 4 has around 18. The hardware in the PlayStation 4 is, in my opinion, better than the hardware in the Xbox One.

So in practical terms you end up with games that are going to be of similar capability. As an example, I’d rather write for Windows than for a microcontroller (laughs), or anything where I have to know things that are a little less standard, a little more arcane. Of course, what counts as arcane always depends on where you’re sitting. As a Windows developer I find the Xbox One more familiar, whereas a Linux developer might find the PlayStation 4 a little more familiar. That’s just an example.

Kurtis Simpson: Speaking more on the eSRAM itself, it’s considered to be the major cause behind the resolution gate on the Xbox One. Do you think DX12 can help remedy the resolution issue for the Xbox One? 

Brad Wardell: Yeah, it should, because in DirectX 11 it’s a real pain to make good use of the eSRAM. Whereas supposedly in DirectX 12 – and this is all theory, I haven’t used it myself – the new API is supposed to make it a lot easier to optimize your use of the eSRAM.

Kurtis Simpson: Okay, because when you think about it now, it sounds like the hardware was built specifically for DX12 and not DX11.

Brad Wardell: The API is there for me to use as a tool for the piece of hardware. The one in DirectX 11 was not easy – it was a very trial-and-error process to make use of the eSRAM. In DirectX 12 they’ve tried to make it easier to work with, and the easier it is to use, the more likely you are to get developers who optimize for it correctly.

Kurtis Simpson: Right. Switching topics briefly to Mantle and Vulkan: many people are claiming that Mantle is now dead in the water because of Vulkan. What are your thoughts on this? Do you think Mantle’s dead and that Vulkan and DX12 are the next big thing, or is there more to Mantle still being adopted?

Brad Wardell: Vulkan is literally a derivative of Mantle. Whether AMD will continue a separately branded Mantle API long-term remains to be seen – I think they’re certainly going to continue it for the near term, and we’re going to continue to support it. The thing is, if I’m writing for Vulkan, I’m effectively also writing for Mantle. I think there’s a lot of confusion about that; people don’t realise where Vulkan came from. They imagine, “Oh, it’s OpenGL,” but it has a different architecture.

Kurtis Simpson: Do you think Sony may drop its own custom API and switch completely to Mantle for the PS4 in response to DirectX 12 on Xbox One? 

Brad Wardell: No, because their low-level API is still lower level than Mantle and Vulkan. So what I’m hoping is that they will support Vulkan.

Kurtis Simpson: So there are no real benefits, would you say, and it’s best they stick with their own custom API?

Brad Wardell: Let’s say I write a game for the Steam Box, and both the PlayStation 4 and the Steam Box support Vulkan – it wouldn’t be that much more work for me to get my game working on the PlayStation 4. Whereas right now, if I want to develop the game for the PlayStation 4, I have to learn their special custom API, which has shader languages different from what I’m used to, and I’m pretty sure I have to send things in text instead of binary form.

I hate OpenGL (laughs). It’s old; the current version is just archaic. I don’t want to have to learn that – my brain is already full of OS/2 and Linux crap, and I don’t want to learn yet another short-term API. If I can just learn Vulkan, I can get to a lot of platforms. I don’t want to have to learn Sony’s special API, even if I would gain a few frames per second by doing so.

Kurtis Simpson: One of the reasons DirectX has been able to stay relevant all these years is that Microsoft has invested in the API’s research and development. With Vulkan around, do you think they’ll still be able to sustain that level of investment in DX12?

Brad Wardell: Yes, because DirectX 12 is such a game-changer for everyone. First of all, everyone’s going to use it for Xbox One. Every hardware vendor is going to support DirectX 12; what will be interesting to see with Vulkan is how quickly Nvidia and Intel support it, and at what level. What we’re going to see over the next few years is a lot of benchmarking wars.

Because you’re going to see a lot of games coming out with both Vulkan and DirectX 12 support, it’s going to be a lot like the old days: you load up Quake and you can play in OpenGL (laughs) or Direct3D, and there will be demos and benchmarks you can look at.


"Every time I see screenshots of Ashes, I wince, because it's harder to take a screenshot of something in movement like that – everything's a little blurred. We are going to have to come up with a screenshot mode. It's easy with a normal game because every frame is a discrete frame."

Kurtis Simpson: Most definitely. Moving back to the Xbox One, Microsoft has made bold claims about using the cloud with the console. Do you think it’s possible that the console’s processing power can be increased, especially with DX12?

Brad Wardell: That is a… yes and a no. I don’t want to weasel out of that, because there are specific cases where, yes, you can. Microsoft just needs to make the case – I don’t want it to be my job to make the case – but let me give you a few examples of where it would come into play, since to my knowledge Microsoft has not actually put out any examples.

Procedurally generated terrain is one of the most expensive things you can do. You don’t need to do it in real time, but it takes a heck of a lot of CPU power. Let me give you an example: say I’m playing a role-playing game and I want something really sophisticated – we’re talking the next Elder Scrolls game. I have no idea what they’re actually doing; it’s just an example of a game that might use something like this. And I want incredibly sophisticated terrain with rivers and streams, forests and mountains, and I want it to be very detailed.

Now, I don’t need to generate all of that on the fly, but you are going to need to procedurally generate it. With that amount of detail you can’t have some map-editor guy with art tools building everything by hand like you used to; you procedurally generate it to get that level of detail. And your machine is not going to be powerful enough – certainly not the Xbox One or the PlayStation 4, or even most PCs.

They’re not powerful enough to generate that sort of thing easily at that level of detail. You could put that in the cloud, and the results of that procedural generation would be sent back to your Xbox One, so you could get these amazing scenes without any loading screens. Remember the old days when we used to have “Loading next area of the game”? This sort of thing could prevent that kind of stuff.
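
As a rough illustration of the kind of workload being described, here is a small, self-contained C++ sketch that builds a fractal heightmap from layered value noise – the sort of computation whose cost grows with map size and detail, and which in principle could be produced off-device and streamed back as data. It is a generic sketch, not Microsoft's cloud pipeline or the Nitrous terrain generator.

```cpp
// Generic sketch of procedural terrain generation: a fractal heightmap built
// from layered (octave) value noise. Not Microsoft's cloud pipeline or the
// Nitrous terrain system; just the shape of the workload being discussed.
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

// Deterministic per-cell pseudo-random value in [0, 1).
static float cellValue(int32_t x, int32_t y, uint32_t seed) {
    uint32_t h = seed;
    h ^= uint32_t(x) * 0x27d4eb2du;
    h ^= uint32_t(y) * 0x165667b1u;
    h ^= h >> 15; h *= 0x85ebca6bu; h ^= h >> 13;
    return float(h & 0xffffffu) / float(0x1000000);
}

// Bilinearly interpolated value noise at a continuous coordinate.
static float valueNoise(float x, float y, uint32_t seed) {
    int32_t xi = int32_t(std::floor(x)), yi = int32_t(std::floor(y));
    float tx = x - float(xi), ty = y - float(yi);
    float a = cellValue(xi,     yi,     seed), b = cellValue(xi + 1, yi,     seed);
    float c = cellValue(xi,     yi + 1, seed), d = cellValue(xi + 1, yi + 1, seed);
    float top = a + (b - a) * tx, bot = c + (d - c) * tx;
    return top + (bot - top) * ty;
}

int main() {
    const int size    = 512;                    // bigger maps mean quadratically more work
    const int octaves = 6;                      // more octaves mean finer detail, more cost
    std::vector<float> height(size_t(size) * size);

    for (int y = 0; y < size; ++y) {
        for (int x = 0; x < size; ++x) {
            float h = 0.0f, amp = 1.0f, freq = 1.0f / 128.0f;
            for (int o = 0; o < octaves; ++o) { // layer coarse-to-fine detail
                h += amp * valueNoise(x * freq, y * freq, 1337u + uint32_t(o));
                amp *= 0.5f; freq *= 2.0f;
            }
            height[size_t(y) * size + x] = h;
        }
    }
    std::printf("generated %dx%d heightmap, sample height: %.3f\n",
                size, size, height[height.size() / 2]);
}
```

The result is just an array of floats, which is why this kind of work is a natural candidate for generating remotely and shipping down as data rather than computing on the console itself.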

Kurtis Simpson: I’m rather sceptical of the Cloud myself. Microsoft doesn’t seem to have shown much since the Xbox One’s launch.

Brad Wardell: Microsoft needs to make the case for it, I mean that’s the thing they blew with the Kinect. I could tell you how it could be used, how I would use it if I were Microsoft and I had the money. But I’m not Microsoft and I don’t have the money to do something that sophisticated.

But I can tell you that in Ashes of the Singularity, our terrain is procedurally generated. We were able to do that by having the GPU generate the terrain; otherwise it would take hours (laughs). It sure would be handy if I had Microsoft’s resources and could toss all that procedural generation into the cloud.

Kurtis Simpson: Most definitely. With Ashes of the Singularity only being announced for the PC, will it be coming to consoles?

Brad Wardell: We don’t know. I don’t want to make any sacrifices to the game itself in order to support other platforms – we’ve seen strategy games try to do that in the past and they end up kind of gimped. We are looking at porting the Nitrous Engine to the PlayStation 4 and the Xbox One in the future.

Kurtis Simpson: Just to wrap things up on DirectX 12: do you think the graphics difference between the PS4 and Xbox One will become trivial as DX12 is adopted by more developers over time?

Brad Wardell: Yeah. The people who are really hardcore are always going to find a difference; the average person isn’t going to notice one. That’s why, when I see these people battling on Twitter about it – even as things stand, I understand the differences, but I can’t really tell much difference between the two.

Kurtis Simpson: Earlier we were talking about movie-like qualities, the differences between games and movies and so on. I recall an interview you did with the TIC podcast (The Inner Circle) where you spoke about rendering a scene from Star Wars Episode I: The Phantom Menace.

Brad Wardell: Right, and we’re doing that right now. Ashes of the Singularity as it is, in its alpha, is running more sophisticated scenes than you would have seen in The Phantom Menace. Let me give you an example: did you watch that final battle of the Gungans versus the battle droids? Those explosions don’t even cast light – they’re basically glorified cartoons.

Even at the time you knew it was CGI, you just couldn’t explain why – and part of it is that it didn’t have real light sources. We’re able to do that right now on today’s hardware, on a much smaller budget, in a pre-alpha. What will have to happen is that these 3D engines stop doing this deferred-rendering stuff and move to an object-space setup – basically the way CGI does things.

Kurtis Simpson: Yeah, watching that scene now with the points you’ve just mentioned in mind, all of that stands out a lot more. But back when you first saw it, it really was impressive – and you’re saying you can do something more sophisticated in Ashes of the Singularity. That’s a massive step up from something made all those years ago.

Brad Wardell: And you know what’s ironic? Every time I see screenshots of Ashes, I wince, because it’s harder to take a screenshot of something in movement like that – everything’s a little blurred. We’re going to have to come up with a screenshot mode. It’s easy with a normal game because every frame is a discrete frame, whereas in Ashes, because things move with temporal anti-aliasing, it’s always a little blurred, since it’s actually moving. It’s like taking a snapshot of a movie.


"I don't know my skeptical side says it's a fad but we have an Oculus Rift, we've played around with it and if they can just nail down some of the visual experience issues, we should be able to be done in time. I think there's going to be types of games for it, I don't want to play a current style game with it."

Kurtis Simpson: Yeah, it would appear it’s best seen in motion. Just to end on an off-topic question: what are your thoughts on virtual reality and Valve’s VR headset? Do you think it’s going to be widely adopted?

Brad Wardell: I’m pretty excited about it! It doesn’t really apply to our type of game, so as a developer I’m not sure how I would use it yet, but as a player I’m very excited for it. I’m hoping… I would love to see them do Half-Life 3 for it; that’ll get me to buy it.

Kurtis Simpson: Is it confirmed? Is that Half-Life 3 confirmed?

Brad Wardell: Oh no (laughs), I would have no idea. I mean, that’s the dream, right? If we could make it true just by saying it, then I would absolutely say yes.

Kurtis Simpson: Well, I’m taking that as a confirmation and it’s going on Twitter very soon. So VR – it’s not going to be a fad, would you say? It’s not going to be like 3D? Do you think it could replace people’s TV screens?

Brad Wardell: I don’t know – my skeptical side says it’s a fad, but we have an Oculus Rift and we’ve played around with it, and if they can just nail down some of the visual-experience issues it should get there in time. I think there are going to be particular types of games for it; I don’t want to play a current-style game with it. I can see someone making a new type of game that we can’t currently imagine, but someone will make it. It just needs a killer app – the question is whether someone will make one.

Kurtis Simpson: That’s what it seems to boil down to. People have been looking primarily at first-person horror and first-person shooters, but first-person shooters specifically don’t seem an ideal match due to the control scheme and the headset – you’re not actually moving anywhere, so it seems contradictory.

Brad Wardell: I’m not sure. I’m not creative enough to think of how they’ll do that. The thing that makes Valve’s VR very interesting to me is that it tracks real-world space. So you could be walking around, and if you come up to a real-world object it will kind of show up in your virtual world.

So it’s potentially interacting with your real world as well as with the actual game itself. I can’t think of examples yet. It’s kind of like Kinect – the question is whether VR will end up like that. Not so much like 3D, but like Kinect, where you go, “Oh, it just needs a killer app, but what is it? I don’t know!”

Kurtis Simpson: Hopefully not. Kinect seemed like one big lie; although it had potential it died.

Brad Wardell: Yeah, they sacrificed a lot for Kinect. You’d think they’d have had a killer app ship with the Xbox One.

Kurtis Simpson: Thanks so much Brad it’s been great talking to you on DirectX12 and Ashes of the Singularity. I’m really looking forward to it.

Brad Wardell: It’s great talking to you also and thank you.


Comments

  • Cenk Algu

The PS4 does not have 50% more raw GPU power. 4 CUs are separated out for GPGPU commands to prevent the death of the weak CPU.

The CPU is very weak, plus the horrible write speed and horrible latency of GDDR5 are killing it. Another thing is the API. GNM is an OGL-based API which requires a quite powerful CPU. Please, someone tell me how such a weak CPU can drive more than 14 CUs while there are encumbrances like GDDR5 and the API?

    • Michael Norris

GDDR5 isn’t the issue, it’s the CPU. You act like the Xbone has a better CPU, when in fact it is only about 4% better. Look at GTA 5: the PS4 version used to drop frame rate in heavy traffic areas, but that has been fixed and the firefights run at a locked 30fps compared to 25fps on the Xbox One. Dying Light got an update on PS4 that fixed lighting and pop-in and added 8x AF, all while running at a better frame rate and resolution.

    • Psionicinversion

The last patch removed effects from GTA 5 on PS4, that’s why it performs better. I can get more performance if I drop settings too. omg ps4 amazings hahah

    • MPTheGreek

They already added it back and kept the performance improvement. It was done to the Xbone as well, but they didn’t get a performance improvement lol.

    • Psionicinversion

That massive improvement to run it at 30, lmao, meanwhile 60fps is amazing. Guess you’ll have to wait for the PS5 to experience that.

    • You Are Flat Out Wrong

In the special HD Remastered edition. 60 extra frames at High settings, still using FXAA. Pay another $60.

      I’m convinced consoles still exist simply because it’s so easy to milk their consumer base.

    • MPTheGreek

No, PC users are the ones paying 60 for a 3-year-old game. Lolololol you’ve been played.

    • You Are Flat Out Wrong

      You bought the inferior version. You are the one being played :^)

    • MPTheGreek

      No I’ll use my pc if I want to play online with hackers ruining the experience. Thanks though.

    • Psionicinversion

What hackers??? BF, yeah sure, but not a lot of other games have them. Yesterday was the first time I went online in GTA 5 and had a blast killing other players. This one dude was awesome, he baited me and used proximity mines to kill me hahah.

I did the same though, I saw some guy go into Los Santos Customs so I proximity-mined the entrance hahahah, blew him to smithereens, it was amazing.

Driving around laughing my head off… lvl 4 hahah

    • You Are Flat Out Wrong


      (How’s 60FPS GTA V treating you. I love it. 60FPS is love, 60FPS is life)

    • Psionicinversion

60fps with nearly maxed-out graphics = awesome, well worth the wait. Shame about the last-gen version on X1/PS4.

    • Asmodai

The PS4 does have 50% more raw GPU power. The fact that 4 CUs are separated doesn’t mean they’re not part of the GPU and doesn’t make them unavailable. They’re still part of the GPU, and GPGPU still counts as part of GPU power. Now, it doesn’t have 50% more GRAPHICS power.

The CPU is very weak on both the Xbox One and PS4. They are virtually identical, with the Xbox One just getting a last-minute clock boost. Brad spends a good chunk of this interview talking about how multiple cores make a huge difference now though, so sure, while each of the 8 cores is weak, there are 8 of them, so their combined power isn’t bad.

GNM is NOT OGL-based. GNM is a thin wrapper on the specific hardware in the PS4. OGL is high level, like DX11 and lower. Vulkan is lower level, like DX12, but even they are higher level than GNM. GNM doesn’t require a powerful CPU; it just exposes the hardware capabilities of the PS4 to developers. Brad’s complaint about it is that it’s so low-level and so tuned to the PS4 that it doesn’t work for cross-platform development (which is true, but Sony doesn’t care about cross-platform development). He’d rather learn something you can use on PS4 AND other things. Now, for people who don’t want to go that low, there is a different high-level API from Sony called GNMX, which is similar to DX11/OpenGL, but that’s not mentioned at all in this interview (likewise there is a high-level DX11.3 coming out with DX12).

    • JerkDaNERD7

      Point is we will NEVER see that 50% difference in game.

    • Asmodai

Of course not, but that’s not what the OP said. More goes into game performance than just raw GPU power – the CPU, the memory, etc. The OP claimed it doesn’t have 50% more raw GPU power, though, and that’s false. He did NOT say it doesn’t equate to a full 50% more gaming performance (which is true, but again not what he said).

    • Lestat87

Thank you for the unbiased response. I hate fanboys. We are all in it for the games. They are what is important.

    • Guest

So you think that an old 2012 VGLeaks slide, debunked by Mark Cerny himself, proves anything? All those old slides were from before the system was even finalized. And it’s already been said that the system can use any CU configuration the dev would like. But I love how you fanboys just can’t accept the truth and continue to spread these lies, because you can’t accept that the X1 is inferior to the PS4. And what do you think? The X1 still only has 12 CUs vs the PS4’s 18, only 2 ACEs vs the PS4’s 8, which can only do 16 queues vs the PS4’s 64. Along with lacking the PS4’s volatile bit and direct bus to the RAM that bypasses the GPU’s cache.
And “horrible writing speeds and horrible latency of GDDR5”? So tell me, genius, then why are there so many more PS4 games with higher res than on the X1? And what’s its latency, smart guy? What’s the X1’s? Bet you can’t tell me. The GDDR5 RAM in the PS4 runs at more than twice the speed of the X1’s RAM (5500MHz vs 2133MHz), so even if the GDDR5 latency were, say, 20CL vs 13 on the X1, since it’s running at more than twice the speed it would be lower than DDR3. And you get way more bandwidth than you would with DDR3. And bandwidth is way more important than latency. And why don’t you talk about the bubble [dead cycle] in the eSRAM that only allows it to reach peak bandwidth in 7 out of 8 cycles?
You know what, I’m not even going to bother going on, since it’s obvious what your agenda is. You are just trying to downplay the PS4’s technical advantages because you’re a hater. And the X1 has the same CPU and performs even worse. Hence why so many PS4 games have a better framerate than on the X1.

    • Cenk Algu

So tell me, genius, do you have any idea what you are talking about? What is an ACE? How, and for what, can you use it? If GDDR5 brought that much of an advantage, why don’t we use it instead of DDR3 or DDR4 on PC? DDR3 has almost zero latency on the CPU side, but it isn’t efficient for the GPU due to its bandwidth and data transfer speed. The PS4 can reach 1080p more easily than the XO because there is plenty of RAM, compared with 32MB of eSRAM, and the eSRAM needs to be optimized with its own API. That is the issue the XO is struggling with in reaching 1080p.

And eSRAM is faster and has more bandwidth than GDDR5 can offer, plus DDR3 brings an extra amount of bandwidth. The PS4’s real peak bandwidth is 135GB/s while the XO’s is 205GB/s. That’s a fact.

PS4 games have a better frame rate? DAI runs better on XO, FC4 better on XO, GTA V better on XO, BFH better on XO, etc. Plus the XO was designed for DX12; DX11 is not a proper API for it and is even locking down what the hardware can do.

Debunked by Mark Cerny? He admits that it is true. Saying it’s from 2012 is stupid, because do you think console hardware is created one month before release? The PS4 was announced on 20.02.2013 and the hardware was already finalized, with devs developing games on it. Both consoles’ hardware came out the same as what VGLeaks leaked.

You Sony fans cry like little girls when you see the facts, and it’s so funny :)

      “Digital Foundry: Going back to GPU compute for a moment, I wouldn’t call it a rumour – it was more than that. There was a recommendation – a suggestion? – for 14 cores [GPU compute units] allocated to visuals and four to GPU compute…

      Mark Cerny: That comes from a leak and is not any form of formal evangelisation. The point is the hardware is intentionally not 100 per cent round. It has a little bit more ALU in it than it would if you were thinking strictly about graphics. As a result of that you have an opportunity, you could say an incentivisation, to use that ALU for GPGPU.”


    • JerkDaNERD7

You also forgot about the single pool of memory PS4 fanboys like to toss around, as if the XOne doesn’t have HSA, lol! It’s an APU, why wouldn’t it? And the XOne has a larger “pool” at 30GB/s of bandwidth, which they call “CPU cache coherency”, while the PS4 only has a 20GB/s bus.

    • You Are Flat Out Wrong

That single pool and added architecture is still tied to weak and underpowered hardware. It’s like attaching a yacht to an AMC Gremlin and expecting the Gremlin to tow it without problems.

An AMD R270 has the same architecture minus some ACEs, 1.5x the teraflops, and can pull 60FPS in games the PS4 struggles to hit 30FPS in. Turns out the ACEs on the PS4 can’t do jack if the hardware is already so taxed it can’t properly utilize them, or they have to use those ACEs for basic tasks because the load is so great everything is spread thin. Meanwhile the R270 comfortably spreads the workload with fewer ACE units, because it’s backed by a significantly more powerful overall architecture that can distribute work evenly. When DX12 and Vulkan come in, they take load off the GPU and distribute it to the CPU, so the GPU can spend more time on graphics instead of having to do everything. That is where the real magic will come in.

Every time one of these articles crops up, they just go into MisterPlaystationMedia rants about Cerny’s secret sauce. It’s hilarious and pathetic.

    • JerkDaNERD7

      Had NOTHING to do with my argument, lol! xD

    • You Are Flat Out Wrong

Xbone is weak. PS4 is weak, and Cerny lied about GPGPU, HSA and hUMA compatibility (all possible on PC with good configurations, not on weak consoles with garbage CPUs) and stuffed in redundant architecture that can’t even be utilized by such a garbage hardware setup.

Why can’t you just accept it, loser?

    • Guest

      Sony is lies, overhype and underdeliver.

    • Guest

      PS4: 18CU +50% more
      Xbone: 12CU

      PS4: 1.843TFLOPS +40%
      Xbone: 1.31TFLOPS

      PS4: 176.0GB/s 8GB GDDR5 5500MHz +258%
      Xbone: 68.3GB/s 8GB DDR3 2133MHz
      ~150.0GB/s to 204GB/s peak 32MB eSRAM

      PS4: 1152 Shaders (cores) +50.5%
      Xbone: 768 Shaders (cores)

      PS4: 72 Texture units +50%
      Xbone: 48 Texture units

      PS4: 32 ROPS +100% more
      Xbone: 16 ROPS

      PS4: 8 ACE/64 queues +400%
      Xbone: 2 ACE/16 queues

      PS4: 25.60GPixels/s +88%
      Xbone: 13.65GPixels/s

      PS4: 57.60GTexels/s +40%
      Xbone: 40.90GTexels/s

    • You Are Flat Out Wrong

      Both still pale in comparison to a $130 R270X which has 2.70TF and all the units utilized properly instead of wasted by the PS4’s garbage bandwidth. A $130 GPU is 50% better than the PS4. Why would you want to buy a console :^)

    • Guest

      Ok Mr.X keep spreading more lies.

    • You Are Flat Out Wrong

      MisterPlaystationMedia here telling people to stop lying. My sides hurt laughing so much at you

  • GHz

    @ Kurtis Simpson

Great interview dude! I enjoyed every bit of it. Wardell, as usual, is honest as well as diplomatic with respect to what his peers are doing, separating his opinion from his factual experience. I love his honesty, insights, opinions and enthusiasm. I hope my fellow gamers know the difference between them.

Now to address some of the confusion about MSFT making a case for cloud-powered games on the Xbox One and how they will look: they did that at E3 2014 with the Crackdown reveal. Phil confirmed it was early Crackdown work, originating from the destruction demo they revealed earlier in 2014.

    @Xone_br33 Yes, build demo was early crackdown work.— Phil Spencer (@XboxP3) June 10, 2014

That was their example to us of what we should expect in real time. No one wants to accept that it’s in-game, even if it was scripted, because it’s happening on the XB1. But we know it most likely is, because we’ve seen similar examples of destruction on XB1, without use of the cloud, running in real time in Quantum Break. So it’s all coming together. The games are doing the talking.

MSFT has already addressed what power we should expect. They have told us that for every physical XB1, there will be the equivalent of three in the cloud to help power it via cloud compute. And of course no one believes them. The engineers also told us that the XB1 was designed to take better advantage of the cloud. It’s cloud-enabled and will get more powerful over time. It is the reason MSFT says a 10-year lifespan for the XB1. And of course no one wants to believe them.

Since then, other companies who know the potential of cloud-based game streaming have put out their own demos. We can look at their examples of what they’ve accomplished to calibrate our expectations. Keep in mind these companies don’t even have the immense resources MSFT has.

    Xbox One is ready.

    Your interview is stellar work. Good job! 😀

    • 😉 Thanks Dude.

    • Exposure Levels

      So basically Xbox fan boys really just want a PS4 because Sony is ahead of Microsoft in cloud computing.

      Xbox One isn’t ready. They’re behind and Microsoft lied about the cloud and 3x performance.

    • GHz

Why are you so insecure? People who are backward thinkers usually are. 200 years ago, you’d be the type of dude to believe that cameras can steal your soul.

I’ll quote something very important that Kaz Hirai said just LAST YEAR! And you’re going to consider the fact that the PS4 was already selling like hotcakes. The PS4 was already a runaway train “sales” wise. I put the word sales in quotes because you still believe in the backward thinking that sales win the day. This is what Kaz said…

      “If we cannot execute all the things we need to get done this fiscal year and generate REAL RESULTS, it will be all but impossible to envision a growth strategy for the mid- to long term,”

Why would he say real results and not acknowledge the success of the PS4? I’ll tell you why. Because this ain’t the 90s, and how you make money in this business has changed since 2005. Why do you think Sony as a company has been struggling over the years? Because they have been investing in the wrong areas! That’s why Kaz also said this at the same meeting, and I quote,

      “For PlayStation Plus and the PlayStation Network Business in General, but also the video and music businesses and the networking in those areas, the market there is going to grow hugely, but at the same time as we see the growth of the business, we have to be prepared with a solid enough network infrastructure. We have to be able to accommodate the increase in subscribers by having a solid enough network system and infrastructure and therefore we believe that now is the time of investment”

So in 2014, Sony realizes that they MUST INVEST IN A NETWORK INFRASTRUCTURE. I had to capitalize because you all don’t get it. These guys are over a decade late! And don’t mention Gaikai, because he said this after they bought that company and there were STILL no real results where it mattered: NETWORK INFRASTRUCTURE! You need it! And who is the biggest in that area today, spanning the globe? MICROSOFT! That’s how you’re going to win. Kaz said it, and you’d better believe it! That’s the real world, not these clickbait articles about PS4 sales dominance.

      Get your thumb out your mouth and grow up. 😉

    • Psionicinversion

They’ve been saying they’re going to invest in PSN for years, yet they never do. Instead they buy up OnLive and Gaikai to rob you of even more money.

    • Psionicinversion

How is Sony ahead in cloud computing… they aren’t. Cloud streaming is different.

    • You Are Flat Out Wrong

You are arguing with a troll (actually John Derp replying to himself, since the same accounts upvote him all the time) who believes Sony has its own cloud, when the reality is they are renting from Rackspace Hosting – the same company Nintendo uses. Sony is too cheap (or more likely doesn’t have the money) for Amazon AWS any more.

Yes, they are THAT deluded; grasping for anything, they claim Sony has its own cloud when it’s been renting and never will have one.

The poor dears are going to be in pieces when MS switches that cloud to PC gaming full time.

    • Exposure Levels

      Keep trying PC fanatic.

      All three companies Sony, Nintendo, and Microsoft rent servers. Microsoft has data centers. Microsoft doesn’t have actual servers that distribute the data.

      PlayStation Now runs in Sony data centers. Sony is cheap? Sony has been spending a lot on cloud and data centers which is part of the reason why they’ve posted losses the past few years. Sony also leases servers.

      Microsoft has been hurting PC gamers for years and you’re happy about Microsoft having a monopoly on PC? Do you look forward to Microsoft restricting your games to their operating system, and their cloud, and their paywalls? Have you not learned anything from Microsoft’s actions?

      You’re naive.

    • You Are Flat Out Wrong

      A) I dual boot with Linux, loser
      B) You are an idiot since that’s an application to host MS products like MS Word and Outlook and not Microsoft’s Servers
      C) Sony rents Data centres from Rackspace in North America and British Telecom in Europe which are basically hired rooms with a lot of PS3’s. They don’t own their own data centres.
      D) Sony are even worse for DRM and actually spied on users.
      E) Microsoft never had a monopoly on gaming thanks to Linux
      F) You are hilariously deluded if you think any PC gamer is paying for anything from MS when we can just go to Steam.

      You are a pathetic loser as usual, John Derp, no matter how many accounts you make.

    • Exposure Levels

      Sony is actually using dedicated servers for cloud rendering while Microsoft can’t.

    • Psionicinversion

You mean streaming games through PS Now? Something you have to pay to use. You won’t be paying to offload calculations to Sony’s servers, so nah, they won’t offer it for free.

    • Orion Wolf

      What he doesn’t mention is that Sony is using ps3s as their “dedicated” servers … yeah lets talk about weak and old HW
      – but what can you do when you have no money to invest into actual server solutions and you want to keep the cost as low as possible (especially in the case of Sony).

      He also doesn’t mention that MS is constantly upgrading their HW – he found an article from the beginning of 2014,
      but didn’t mention the deal with Dell at the end of 2014.

      He’s clearly not a ps fanboy /sarcasm

      What he also doesn’t tell you (but we all know) is the difference between MS and Sony when it comes to server /cloud tech and the investments into said tech, nor the expertise of MS when it comes to cloud services. MS has a dedicate OS (starting back in 2003) for the server market and will be releasing the new iteration in early 2016 – what does Sony have in that regard?

      Sony has decided on game streaming that has a (besides the opurtunistic pricing scheme) substantial latency issue. That’s because the whole game is being rendered on some remote ps3 “server”. MS on the other hand has been proposing to offload only latency insensitive data, you know, things that won’t impact gameplay, something you can’t say for ps now and yet for some reason ps now is better.

      Then there’s project DeLorean where the main point is to reduce the latency.

      Microsoft has invested $15 billion already and are going to invest even more as Satya Nadella (side note: before taking
      the position of CEO was in the servers and tools division) is pushing for cloud services much more than any of his predecessors.

      What’s “funny” is a Sony/ps fanboys saying that cloud was BS not that long ago – that is until the media started covering Sony’s forays into said field. Now? Well now Sony is far better than MS when it comes to the server/cloud market, even though they are still reporting losses and considering all the debts, they’ll be doing that for a while.

      Reading this persons next reply is priceless (to red his full reply just scroll down):

      "All three companies Sony, Nintendo, and Microsoft rent servers. Microsoft has data centers. Microsoft doesn't have actual servers that distribute the data."

      So MS is building multibillion-dollar datacenters that house 50k to 100k servers just so that they can rent servers from Rackspace? Sure, why not.

      Not saying that they don't have agreements with Rackspace and that they might rent servers from them (that's beside the fact that they're actually competitors), but to say that they have datacenters without actual servers is like me saying that I have a car, but I'm using a cab.

    • Exposure Levels

      It’s obvious that you don’t know anything about Azure or Sony’s cloud capabilities. You’re just screaming, “BUT BUT BUT MICROSOFT MONEY AND AZURE MICROSOFT THE BESTEST HAS MOST MONEY”

      The PS3 Cell processor and GPU are stronger than the servers that Microsoft has. Go look up what Azure is instead of reading Microsoft fanboy articles. Each server Microsoft has uses a dual-core CPU. Xbox Live only uses a fraction of Microsoft's Azure cloud. Azure is made up of 1,000,000 servers but only 300,000 are for Xbox Live. The rest is for business web applications for Windows such as Office 365.

      Sony has been investing billions into the cloud, which is part of the reason why they posted losses the past couple of years. How much do you think it costs Sony to build 1,000,000 servers housing 1,000,000 Cell processors and GPUs so that people can use PS Now?

      You have been owned. You know nothing about Azure or Sony's cloud infrastructure.

      Most of the money Microsoft spends on building Azure is for the space and labor. Just because Microsoft spent $10 billion on Azure doesn't mean that represents the processing power of Azure. Microsoft has to spend billions each year to maintain Azure, pay for data transfers, pay for electricity, and pay the employees who work at the data centers.

      You misinterpreted my other comment. Microsoft doesn't have the servers to distribute. Microsoft has data centers which store data and crunch data. Data centers alone aren't what actually distributes the data worldwide. You need more than data centers to distribute data, which is why Microsoft rents servers from Rackspace.

      Keep drooling over Microsoft's money while you're waiting for the cloud to help your console that struggles to output 900p, D*MB*SS! lol

    • Orion Wolf

      “It’s obvious that you don’t know anything about Azure or Sony’s cloud capabilities.”

      Yeah, clearly you do, especially considering this:

      “Microsoft has 1,000,000 servers EACH made up of dual core CPUs.”

      Have you actually looked at the picture in the article you have posted?


      “Sony has been investing billions in to the cloud which is part of the reason why they posted losses the past couple of years.”

      Where are you getting this info?

      "How much do you think it costs Sony to build 1,000,000 servers housing 1,000,000 Cell processors and GPUs so that people can use PS Now?"

      I have no idea, but I am sure you do.

      Btw I must have missed the announcement about Sony having 1 million servers.

      “Most of the money Microsoft spends on building Azure is for the space and labor.”

      Woah. Not only do you know what Sony is doing, but you're in touch with MS too?! My god, mate, can I get an autograph?

      “Microsoft doesn’t have the servers to distribute.”

      Where … ah right, Nadella.

      Drooling over Microsoft's money? Nah, mate. However, 1 million phantom servers would be a good reason to drool.

      Yeah, mate, you owned me. Btw, could you say hi to Hirai and Nadella for me?

    • Exposure Levels

      No, I’m referring to dedicated servers.

    • Psionicinversion

      hahahaha. MS will be doing a cloud streaming service, seeing as they're researching network path prediction. Or they could be using it for offloading cloud processing: the servers predict where you're going and pre-calculate it ahead of time, which reduces the time for calculations significantly.
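
      (As a toy illustration of the "predict ahead and pre-compute" idea described above – a hedged sketch only, not Microsoft's actual DeLorean code, and every name in it is hypothetical – the pattern boils down to: speculate on likely future inputs, do the expensive work early on the server, and serve the result from a cache when the client actually asks.)

```python
# Hypothetical sketch of speculative pre-computation, NOT any real service's code.
# The server guesses likely future positions, computes results for them early,
# and answers instantly on a correct guess instead of paying full compute latency.
import time

def expensive_result(position_key):
    """Stand-in for a costly server-side calculation tied to a game-world position."""
    time.sleep(0.05)  # pretend this takes 50 ms
    return sum(ord(c) for c in position_key)

def predict_candidates(current, velocity):
    """Guess a few positions the player is likely to reach next (very naive)."""
    x, y = current
    dx, dy = velocity
    return [(x + dx, y + dy), (x + 2 * dx, y + 2 * dy), (x + dx, y)]

class SpeculativeServer:
    def __init__(self):
        self.cache = {}

    def precompute(self, current, velocity):
        # Runs while the client is still moving toward these positions.
        for pos in predict_candidates(current, velocity):
            key = f"{pos[0]},{pos[1]}"
            self.cache[key] = expensive_result(key)

    def request(self, pos):
        key = f"{pos[0]},{pos[1]}"
        if key in self.cache:
            # Prediction was right: the answer is already waiting.
            return self.cache[key]
        # Miss: fall back to computing on demand, paying the full delay.
        return expensive_result(key)

if __name__ == "__main__":
    server = SpeculativeServer()
    server.precompute(current=(10, 10), velocity=(1, 0))
    start = time.perf_counter()
    server.request((11, 10))  # predicted position, served from cache
    print(f"hit answered in {(time.perf_counter() - start) * 1000:.1f} ms")
```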

    • XbotMK1

      So basically you just want Microsoft to copy Sony, which is essentially what he stated.

    • Psionicinversion

      MS copies Sony, which copies someone else, who copies someone else, etc. etc. Sony isn't first with anything.

    • XbotMK1

      Psionicinversion is a Microsoft fanboy. He contradicts himself all of the time.

    • Psionicinversion

      Actually, that procedurally generated terrain is EASILY doable on X1. He says procedural terrain is extremely expensive on the CPU, but he's doing it on the GPU, and that is still wasted cycles. The X1, for example, could have the server generate the terrain whilst the X1 is "loading" the game; that right there will save a ton of time.
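
      (To make the offloading argument above concrete, here is a minimal, hypothetical sketch of procedurally generated terrain via fractal value noise – not the Nitrous engine's or any console SDK's actual code. The point is that the heightmap is pure math over a seed, so it could in principle be computed on a GPU, or on a server during loading, and shipped to the client as data.)

```python
# Illustrative sketch only: a toy procedurally generated heightmap using
# fractal value noise. Because each sample depends only on coordinates and a
# seed, the work parallelizes trivially and could run anywhere (CPU, GPU, or
# a remote server) and be transferred as plain data.
import math
import random

def _lattice(ix, iy, seed=1337):
    """Deterministic pseudo-random value in [0, 1) for an integer lattice point."""
    rng = random.Random((ix * 73856093) ^ (iy * 19349663) ^ seed)
    return rng.random()

def _smoothstep(t):
    return t * t * (3.0 - 2.0 * t)

def value_noise(x, y):
    """Bilinear interpolation of lattice values, giving smooth 2D noise."""
    x0, y0 = math.floor(x), math.floor(y)
    tx, ty = _smoothstep(x - x0), _smoothstep(y - y0)
    v00, v10 = _lattice(x0, y0), _lattice(x0 + 1, y0)
    v01, v11 = _lattice(x0, y0 + 1), _lattice(x0 + 1, y0 + 1)
    top = v00 + (v10 - v00) * tx
    bottom = v01 + (v11 - v01) * tx
    return top + (bottom - top) * ty

def heightmap(width, height, octaves=4, base_freq=1 / 32):
    """Build a width x height heightmap by summing octaves of value noise."""
    grid = [[0.0] * width for _ in range(height)]
    for j in range(height):
        for i in range(width):
            amp, freq, h = 1.0, base_freq, 0.0
            for _ in range(octaves):
                h += amp * value_noise(i * freq, j * freq)
                amp *= 0.5
                freq *= 2.0
            grid[j][i] = h
    return grid

if __name__ == "__main__":
    hm = heightmap(64, 64)
    print("sample height:", round(hm[32][32], 3))
```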

    • You Are Flat Out Wrong


      This is the delusion Sony fanboys have started to get to. Sony’s TVTVTV cloud services.

      Which aren't even owned by Sony. They rent them from Rackspace, the same company Nintendo rents from. And Nintendo has better and more stable online play. And it's free.

      Someone is being ripped off here and it’s not the Nintendo fans but then Sony fans bought a PS4. They are too stupid to ever realise they are being ripped off.

    • Psionicinversion

      MS wins, PS4 fails. Sony refuse to even do dedicated servers for multiplayer… there's no way they're going to allow dedicated servers specifically for cloud processing.

    • artistofwar

      So you are a Microsoft fanboy. Why does MCC not have dedicated hosting servers? What about Killer Instinct? lol

      Sony uses dedicated servers for Driveclub, Planetside 2, and many other games.

      You obviously don’t know what dedicated servers are and what they’re used for. Dedicated hosting is mostly useless for smaller player counts.

  • Psionicinversion

    So to summarise the article… get a PC, it's better for your health!!!

    • You Are Flat Out Wrong


    • Psionicinversion

      … always tell it D:

  • MPTheGreek

    This guy says the PS4 is more powerful. If you have a problem with that, take it up with him, not me.

    • Orion Wolf

      And yet when asked which was superior, the XB1 with DX12 or the PS4 with Vulkan, he still went with the XB1.

      “If you have a problem with that take it up with him not me.”

    • MPTheGreek

      Please list where he uses superior.

    • Triton

      In another article.

    • MPTheGreek

      Right. In another dimension.

    • Orion Wolf

      It’s from twitter.

      I've put the Twitter response up, but he's damage-controlling it to the max.

    • Triton

      Which was posted in another article =)

    • MPTheGreek

      No. If you read this interview, which is more recent, he says the PS4 hardware is superior and Sony's API is lower overhead. Lol, such denial.

    • Orion Wolf

      First let me play your game

      “Please list where he uses superior.”

      The PS4 HW is "superior" and they have a tremendous API, so why did he go with the XBO + DX12 then? Yeah, denial, mate.

    • MPTheGreek

      He tells you why if you read the interview. He thinks it will be less of a learning curve compared to Sony's API. He also says you could squeeze out a few more fps with Sony's API, but it may require more work. Read the article.

    • Orion Wolf
    • MPTheGreek

      Wrong, he never said it was superior, and this interview is more recent. He thinks it will be easier to program for. That is all.

    • MPTheGreek

      You said he uses the word superior. Still waiting. He does say the PS4 has more powerful hardware and a lower-level API.

    • Orion Wolf

      Where did I say HE uses the word superior?

      He was asked which of the two consoles will be superior – the XBO + DX12 or the PS4 + glNext – and he went with the XBO.

  • Xbot&PhilSpencer69

    We've already been through this countless times before. You're spamming these articles and repeating the same stuff ad nauseam.

    Brad Wardell has been lying and hyping up DX12 and the Xbox One over the past year. He's the same id*ot who said DX12 will double the GPU in the Xbox One, and then in a recent interview he said DX12 won't help GPU-bound games. Brad Wardell keeps speaking out his *sshole and contradicting himself.

    Brad Wardell: "I want the PS4 to adopt Vulkan so I don't have to work and learn the PS4's lower-level API, even if it gives me a performance boost."

    Kurtis: "Do you think the PS4 will adopt Mantle?"

    Brad Wardell: "No, but I hope it gets Vulkan."

    He said that he doesn't want to learn the PS4's API, yet he's willing to learn Vulkan and DX12 over DX11. Then he says DX12 is revolutionary, which isn't true because, once again, Mantle and the PS4 API came before it.

    Also, he started receiving attention because he started hyping up DX12 and its effects on the Xbox One specifically. Now he's saying that he doesn't even know what it will do for the Xbox One. And once again, why did he not show the same enthusiasm for the low-level API that the PS4 has had since the beginning? The PS4 API is not hard to learn or develop for. Devs have been using Sony's APIs for decades. The PS4 API also supports asynchronous compute. You can port between all three.

    It’s clear he’s a Microsoft tool speaking on Microsoft’s behalf because Microsoft can’t do it themselves because they would look like *sshats.

    He says that he wants the PS4 to use Vulkan. However, the Xbox One doesn't use Vulkan either, so why is he willing to use Microsoft's low-level "proprietary" API but not Sony's? Also, if the PS4, Xbox One, and PC all used Vulkan, would that not achieve the same effect that he wants? But we all know why DX12 exists. It exists so Microsoft can force you to buy Windows devices. How interesting.

    Funny how excited he is for improved performance with DirectX 12 when DirectX is the reason performance has been held back in the first place, because it is a proprietary API controlled by Microsoft. Mantle is based on the PS4's API, so why haven't devs been using the PS4's low-level API? Because DirectX needs to die.


    • Modi Rage

      Brad Wardell is a liar. He even admitted that he doesn't know anything about the PS4 API. The PS4 and Xbox One are the same architecture. Whatever software improvements run on the Xbox One can also run on the PS4.

    • Exposure Levels

      Well, Brad Wardell is defending DirectX 12 because his company thrives off Windows. It's quite easy to see why he's been speaking out his b*tt in regards to DX12. If PlayStation becomes dominant, it negatively affects his company.

    • You Are Flat Out Wrong

      Why would it? GNMX is just a forked version of OpenGL. The PC industry would tell Sony to go screw themselves.

      Doesn't matter, as PlayStation will never, ever become dominant. 1 billion PC gamers will see to that :^)

    • R Valencia

      GNMX is not based on OpenGL.

    • Xbot&PhilSpencer69

      Yeah, that is exactly what I’m thinking. He’s been full of sh*t this whole time.

    • Psionicinversion

      What??? He said the PS4 API is lower level than Vulkan, which also means DX12.

      Software improvements on Xbox can only run on PS4 IF it has the hardware, which means DX12 tier 3. The PS4 should not be able to do that. If the PS4 has tier 3 DX12 compatibility, MS is going to lose it, as they will have come up with the solution alongside Nvidia/AMD.

    • Asmodai

      All AMD GCN-based GPUs (including the PS4's) are DX12 resource binding tier 3 capable. Now, the PS4 won't actually run DX12 because it's proprietary Microsoft software, but the PS4 hardware does have the necessary capabilities. Plus, DX12 is higher level than GNM, so it doesn't offer anything performance-wise that GNM doesn't already do.

    • You Are Flat Out Wrong

      He doesn't need to bother. GNMX = OpenGL 4.5.

      No secret sauces. All feature sets will be available to PC with Vulkan. Your hardware is still garbage. Try not to be obvious replying to yourself and upvoting yourself, John Derp :^)

    • R Valencia

      GNMX doesn't equal OpenGL 4.5, e.g. GLSL != PSSL.

    • You Are Flat Out Wrong

      John Derp, you really are a moron.

      Devs will use both to get the most out of PC games. Devs have already been using the PS4's low-level API GNMX (which can't help its garbage hardware), which is basically OpenGL plus some low-level extras, and Vulkan will do NOTHING for the POS4 and its garbage tablet-grade hardware.

      And you are a massive Sony shill, so you are a rather big pot accusing the kettle of being black :^)

    • Asmodai

      GNMX is NOT Sony's low-level API and it's NOT based on OpenGL.
      The PS4 has TWO APIs, not one. One is low level and one is high level (just like there was for the PS3). The high-level API is made to be similar to DirectX 11 (but it's not a fork) for ease of porting. That HIGH-LEVEL API is called GNMX.
      The low-level API isn't like anything on PC. It's lower level than DX12, Mantle, Vulkan, OpenGL, etc., and it's called GNM (I know it can be confusing because there is only one letter of difference). It's an evolution of the super-low-level API called libGCM on the PS3 (the PS3's high-level API was PSGL and it was based on OpenGL, but the PS4 replaced it with GNMX).

  • GHz

    Some fanatic on here with multiple accounts is having conversations with himself, all for the love of Sony and the PS4, spreading FUD and hate. -____-

  • mato

    This is a very good interview. It's great to talk to someone who is not afraid to voice their opinion. Some other interviews I've read here recently were full of empty phrases and "can't comment"s. This one's different, and I thank you both.

