The Big Interview: AMD On PS4/Xbox One, Graphics Technologies, PC Gaming And More

We talk to AMD about the future of gaming, including the immediate future that is the PS4 and Xbox One.

The next generation of gaming is going to be huge. Whether it’s on PC, PS4 or Xbox One – heck, even on the Wii U – there is no denying the sheer range and diversity of the platforms on display. However, they all have one thing in common: each is built on technology from Advanced Micro Devices (AMD). All three consoles of this generation use AMD graphics, the PS4 and Xbox One pair that with AMD CPUs, and games like Battlefield 4 are optimized to work best on AMD hardware. And that’s before taking into account the strides the company has made in the mobile market.

We recently spoke to various individuals over at AMD – with answers compiled by Robert Hallock, PR Lead for Gaming and Enthusiast Graphics – about the PlayStation 4 and Xbox One, including the company’s role in designing the consoles, about what the future holds for PC gaming, and about what the company is doing to innovate in graphics.

Rashid Sayed: The next generation seems to be firmly in AMD’s grasp, since all three platform holders – Sony, Microsoft and Nintendo – are using AMD tech. What benefits will this bring to the company as a whole in, say, the next 5-6 years?

Robert Hallock: It’s tremendously difficult to predict where things will go in 5-6 years, but we can talk about the near-term with more confidence: the game development industry now uses AMD Radeon graphics for six shipping platforms: Nintendo Wii, Nintendo Wii U, Sony PlayStation 4, Microsoft Xbox 360, Microsoft Xbox One and the PC.

For five of those six platforms, Radeon is the only choice in the development process. And coming into this latest generation, our flagship Graphics Core Next architecture is the common fabric for any game developer looking to publish. It would be a bit oblivious to believe or assert that this situation won’t have a positive and obvious effect on the overall level of optimization games demonstrate for AMD architectures.

Rashid Sayed: It has recently been revealed that DICE will be optimizing Battlefield 4 for AMD hardware. Which, given that AMD powers the PS4 and Xbox One, seems to make sense. However, as more next generation titles release for the Xbox One and PS4, we could see more games being optimized for AMD hardware. How will this trend affect the company going forward in the PC market, which is still dominated by Nvidia?

Robert Hallock: The vast majority of the AAA PC titles released in 2012 and 2013 were already optimized for AMD Radeon, including Tomb Raider, Crysis 3, Hitman: Absolution, Far Cry 3, Battlefield 4 and more. That was through our ongoing AMD Gaming Evolved program.

By this measure, I think it’s difficult to characterize a market dominated by NVIDIA; the proof is not in the pudding. And when you also take into account that AMD’s GCN Architecture is now the singular architecture of focus for any developer looking to do business in consoles, then it’s perfectly logical to conclude that Radeon is the force to be reckoned with.

Rashid Sayed: Since Sony and Microsoft have [different] philosophies towards the videogames market, how difficult was it creating a custom solution that catered to their needs?

Robert Hallock: I can’t personally speculate on the difficulty of creating such solutions, and I’m sure Sony, Nintendo and Microsoft would like to tell their own stories. But what I can say is that AMD’s semi-custom business is an excellent example of our engineering prowess, our world-class IP portfolio, and our dedication to our customers.

We were able to collaboratively design and bring up several unique solutions that, as you say, catered to their needs. Nobody else in our space is offering this manner of flexibility, and our full house of design wins is proof that this strategy is working.

Ravi Sinha: Much has been made about the power of the PlayStation 4 and Xbox One, and how one is more powerful than the other. Given that the AMD technology powering them is the same, how far does technology actually influence the success of either console in the initial period? The Wii U, for instance, is technologically inferior to both, but has still had a fairly decent initial period, however understated, in the market.

Robert Hallock: Success is an awfully nebulous recipe, and I would argue that it’s literally impossible to isolate how much “technology” weighs on the final product. Other ingredients are just as important to many people: price, game library, content partnerships, accessories and the like.

However, I think the gaming community would do itself an injustice by rushing to crown a winner or a loser. I don’t see it as that kind of race, because we all win – users and, yes, AMD alike – when the market provides a big matrix of choices that can appeal to gamers of every stripe. We’re thrilled to be the beating heart of these amazing CE devices.

Ravi Sinha: Following up on the previous question, will there ever be a time where the architecture is no longer a factor in the improvement of one console generation? Microsoft, for instance, announced that it would be able to use Cloud Computing to continuously improve one’s gaming experience, effectively adding on to the base experience (Forza Motorsport 5’s Drivatar is one such example). It could become a matter of “why get a new console when your current games are continually updated?”

Robert Hallock: I think such a development is best answered by the console makers. It’s only natural that they would know best the future of their respective businesses. Whatever that future might be, however, we are in the business of creating solutions to meet the needs of our customers. In the arena of cloud gaming, for example, we have the AMD Radeon Sky Series of GPUs, which are server-grade graphics processors specifically engineered to render, compress and stream games in the cloud.

Rashid Sayed: How deeply was AMD involved in building/suggesting the PlayStation 4 and Xbox One’s architecture? Also, how did Sony and Microsoft go about selecting AMD over Nvidia?

Robert Hallock: Choosing AMD over NVIDIA is an obvious choice for a consumer electronic device. We offer x86 and powerful state-of-the-art GPU solutions in a single chip with our Accelerated Processing Units (APUs). This is what the console makers demanded, and only AMD has the ability to deliver on that demand. With respect to our level of involvement, the design of the APUs in these consoles was collaborative.

Rashid Sayed: Are there any differences in terms of GPU design and raw horsepower on the PS4 and Xbox One?

Robert Hallock: Sony and Microsoft would be in a better position to comment on the relative merits of their hardware.

Rashid Sayed: Nvidia recently said that the profit margins were low in the console business, and they didn’t think it was something they should aggressively pursue. What are your thoughts on this?

Robert Hallock: The position seems a bit like sour grapes to me. The reality, according to industry legends like John Carmack, is that the standardization of console hardware will, in his words, ‘make it cheaper and easier to develop games for multiple platforms.’ And, he continues, that will improve the quality of games as devs spend time polishing them, rather than juggling architectural particulars.

We are very proud to help enable this sort of ecosystem for game developers, and excited that such an ecosystem runs almost exclusively on our hardware. I can’t imagine why anyone would willingly cede such a favorable situation.

Rashid Sayed: The PS4 and Xbox One have exotic architectures, but do you think they will be able to stand the test of time and give stiff competition to the ever evolving PC platform in the future?

Robert Hallock: I think Sony and Microsoft would be better suited to answer how they intend to keep their platforms healthy over the long term.

Ravi Sinha: Tell us a bit about the PC platform. We know you guys are heavily invested in it, so how do you see it shaping up in the next few years after the PS4 and Xbox One are released?

Robert Hallock: Jon Peddie Research (JPR) just released some excellent data on this topic, illustrating a continuous rise in PC gaming hardware – $20.7 billion by 2016. A similar upward trend was predicted by JPR for 2012, and it bore out positively. Obviously none of us have a crystal ball, but analysts seem quite bullish on the market.

Ravi Sinha: One of the major bastions for Nvidia is the mobile market, with its upcoming Tegra 4. Intel has also been making waves in the market with its Atom series and the upcoming Haswell. AMD has recently announced its next generation processors for tablets and laptops, but are there any plans for implementing these APUs in smartphones? Also, how far will the mobile APUs go towards combating the likes of not only Nvidia but also Qualcomm, which will soon release the Snapdragon 800 with support for 2K resolutions on smartphones?

Robert Hallock: We are tremendously proud of our APUs and the form factors we enable, such as that VIZIO tablet I mentioned. Ditto our sweep of the next-gen consoles. But Lisa Su, our SVP and GM of AMD’s Global Business Units, recently noted in a call with Gulf News that we have no plans to enter the smartphone market.

Ravi Sinha: AMD introduced TressFX with Tomb Raider earlier this year, enabling realistically modeled and shaded hair on characters. Will we be seeing any other games taking advantage of this technology in the future, or maybe an advanced version of the same? Also, what other areas of graphical design is AMD working on to create visual realism?

Robert Hallock: Technologies like TressFX Hair enter into our portfolio of in-game effects that we can offer developers when we collaborate with them as part of the AMD Gaming Evolved program. Other effects in that portfolio include High-Definition Ambient Occlusion (HDAO), Forward+ rendering, and sparse voxel octree global illumination (SVOGI). We don’t force game developers to take any or all of these technologies; rather, we open the buffet table to enable them with the tools to meet their vision of the game.

It is absolutely conceivable that TressFX Hair will appear in future games, but it’s too soon to comment on when or what those games might be. With respect to what we’re working on going forward, I’m continually fascinated by the ways you can creatively reinterpret graphics APIs (like DirectX) to come up with cool effects like TressFX Hair. Who knows what AMD’s game developers are cooking up? (Yes, AMD has game developers on staff!)

Ravi Sinha: With the release of the Richland APU for PCs, AMD is beginning to move towards the next generation of PC technology. What does this entail for PCs, and what will be the focus areas for AMD in the next few years?

Robert Hallock: I’m not in a position to forecast the future for our business, but here and now we are intensely focused on gaming technologies. Our workstation team has the AMD Radeon Sky Series of GPUs, designed to stream games from the datacenter to a thin or light client. Our CPU business continues to push the envelope with chips like the 5GHz AMD FX 9590.

Our APU and semi-custom businesses are firing on all cylinders with the “Jaguar” core for consoles, and the Richland, Temash, and Kabini chips for a diverse range of PC form factors. And of course our graphics IP is intimately woven throughout most of these products, demonstrating a real harmony amongst our teams and within our technology portfolio.

Ravi Sinha: Bottlenecks for PCs have always been around, with significant hurdles still existing in utilizing the full power of multi-core CPUs and connections between the RAM and CPU in a system. What is currently being done to reduce those bottlenecks and take full advantage of the power that AMD provides?

Robert Hallock: Programming for multi-threaded platforms is an inherently challenging task, as many PC enthusiasts have probably heard. Therefore it seems unwise to offer products that explicitly depend on such optimizations. Coming from that perspective, we offer a portfolio that enables excellent performance regardless of the developer’s approach to threading. If the developer happens to be particularly good at it, then our architectures are scalable to accommodate as well.

Rashid Sayed: Last question. Chris Doran, the founder of Geomerics, recently told us that CGI-level graphics are still an active area of research in video games, and that at the moment artists do not have the freedom to place their work at will as they move around the world. Now, we have seen some amazing tech demos from AMD, as well as the recent Infiltrator demo from Epic Games, which seem to indicate that the power to integrate such high-end graphics is there, yet we still don’t see retail games taking advantage of it. What is your take on this?

Robert Hallock: Tech demos are a very strange and often misunderstood beast. While some demos, like our “Leo” demo, were designed to demonstrate technologies we’re making available in games today (Partially Resident Textures and Forward+ rendering), other demos are very aspirational, painting a picture of where the industry could go in a few years’ time.

It is the latter kind of demo that often causes people to wonder why a technology can’t be in a game if it can be played in a demo. The truth is that these aspirational demos are the ideal, pristinely perfect environment: they’re designed for a fixed hardware target, using the best possible implementation of a technology, allocating virtually all GPU resources to rendering that technology.

But spending your GPU resources on rendering just one effect is not how games work. To be blunt, you cannot blow your entire performance budget on one effect. So the industry must advance a few years, to a time when the effect and quality of yesteryear’s demo consume a manageable portion of the GPU’s performance, rather than the whole.

TressFX Hair is a powerful example, because it wasn’t until we combined sufficient compute and triangle rendering capabilities in a single chip with the GCN Architecture that real-time hair physics could be a reality. Many companies have done aspirational tech demos for hair, but we alone were able to advance the industry by making it a reality.

I don’t envision a time when aspirational demos like this will ever fall out of favor, because it is fascinating to not just imagine, but see the possibilities the future holds.

A big thank you to Roy Taylor, Robert Hallock, Christine Brown and the entire team at AMD for making this interview happen.


  • Kamille

    so now they don’t know which is more powerful or they just can’t say because then someone will smack them?

    • Saryk

      They would get a serious smack!

    • unclesam

      Don’t listen to the horrible little PC troll “DAKAN45” below. I’ve played Crysis 3 on a high-end PC and saw Killzone running in the flesh on a PS4 at PAX. Killzone looked BETTER, and the game played fantastically with the new PS4 controller.

      NO $350 PC IS GOING TO COME EVEN CLOSE TO THE POWER OF A $350 PS4 (I have taken out the $50 for the DualShock 4) – FACT. IN FACT, YOU ARE GOING TO HAVE TO SPEND CLOSER TO $1000 TO BUY AN EQUIVALENT GAMING PC WITH THE POWER OF THE PS4.

      PC GAMING IS AN EXPENSIVE HASSLE: the constant hassle of hardware upgrades ($400 for the latest graphics cards), configuration updates, Windows driver updates, and, worst of all, the cheaters, hackers, mods and aimbots that kill all the enjoyment in any multiplayer PC game.

    • Dakan45

      Why would they tell you? You don’t bash someone you work with.

      They are not gonna say “hey, the PS4 is more powerful, it has a better GPU”, nor dispute the GDDR5 claim, since DDR3 is better for CPU calculations due to lower latency. Nor are they going to admit they went to consoles because they were doing badly financially and, unlike Nvidia and Intel, were going to make hardware and sell it for scrap.

      The result does show, however, with the PS4 running BF4 on medium at 720p, and the Xbox One version not being shown at all – instead a controller was attached to a weak PC to show 8v8 matches.

      Welcome to next gen. The gen of cheapness and average PCs packaged as “consoles” with browsers and social features.

    • GettCouped

      They can sell it cheaper because they are the only company that can unify x86 and a quality GPU architecture on the same die. Be more informed and read the article.

    • Dakan45

      The only reason it’s sold for so little is because consoles are always sold at a loss and AMD hardware is always weaker.

      Also, the previous consoles did not have x86, so why you think they needed x86 on a console is beyond me.

      That is the reason it’s cheap, so get your facts straight.

    • GettCouped

      You are basing your facts on what you heard about the previous generation.
      You also do not see the obvious benefits of using x86.
      Saying AMD hardware is always weaker is incorrect. AMD is better in multithreaded applications on the CPU side, and on the GPU side AMD and Nvidia have been trading blows for years.
      Again, the reason AMD was chosen was because only AMD can do what was done.

    • Dakan45

      Again, you clueless, uninformed moron: why was x86 needed?

      Sony went to Nvidia first; they turned it down because it wasn’t worth the price Sony was offering.

      AMD made the Wii and Wii U.

      The only reason they went to AMD was because they are cheap, not because they are powerful.

      AMD CPUs are ALWAYS weaker than Intel’s, and there’s a huge price difference between them.

      So AMD is better at multithreaded applications? Of course they are, since AMD doesn’t have actual cores; it has threaded CPUs.

      Intel has ACTUAL cores; that’s why Intel doesn’t have 8-core CPUs yet. Those “inferior” CPUs are actually better than AMD’s 8-core CPUs. Most programs are written on Intel CPUs and made to utilize Intel CPUs.

      Where exactly does AMD have better performance in multithreaded applications with the GPU? You’d better not be talking about SLI; that’s Nvidia’s front.

      AMD claimed they were the performance kings on Far Cry 3, yet Nvidia offered the same performance for the same price.

    • NotAScrub

      AMD CPUs are always weaker than Intel? How about back in 2005, when AMD’s weakest Sempron was absolutely dominating Intel’s fastest Pentium Extreme? Or how AMD’s legendary FX-5x and FX-6x had no competition whatsoever? There was a point in time during which AMD had a much larger relative performance gap over Intel than Intel has over AMD today, and the only reason Intel survived was through dirty business tactics and the money it made in the years before AMD’s dominance. Know your facts.

      In regards to cores, you are absolutely mistaken as well; it is Intel that is enamored with their “hyperthreading” technology. Current AMD processors do not offer true multicore, as each integer processor shares certain components with a sister core, but it is the better part of a physical core. If you want to degrade AMD in this way, you have to compare it to what its competitor is doing.

      Historically, AMD GPUs are much more powerful than their closest Nvidia competitor. You are completely wrong with regards to power. However, processing power isn’t everything. Nvidia’s programmers make fantastic drivers that optimize and compensate for their weaker hardware, and the result is a very solid match between the two spanning from the turn of the century until now.

      There have only been two significant upsets in the history between AMD and Nvidia. In 2002, ATI released its legendary r300 architecture Radeon 9700. This chip was not only faster than Nvidia’s fastest chip by easily 80% in many games, but it single-handedly allowed ATI to rest on its laurels for 3 generations without introducing a new architecture. The second upset went in favor of Nvidia, with its introduction of the Geforce 8800 GTX. This chip did to AMD what AMD did to Nvidia half a decade before; AMD has since closed the performance gap, and today is again offering a higher price/performance ratio than Nvidia. You can see that the debate between AMD and Nvidia ultimately depends on whether one enjoys the color red or the color green more.

      As for Sli/Crossfire, both companies have major and similar problems with their technology; AMD is, however, ahead with regards to multi-GPU solutions for single-monitor setups.

      Know your stuff, scrub.

    • Dakan45

      Yes, I remember a time when AMD CPUs were more powerful.

      However, AMD has fake cores, not actual cores like Intel. Intel doesn’t have 8 cores to beat the FX, but their CPUs are actually more powerful than AMD’s fake cores.

      I buy AMD CPUs because they are cheap and games don’t need ’em, but try emulators and you will see how weak they are compared to Intel.

      I did not claim ATI was weak; I said they are on par. Of course Nvidia has better drivers, but ATI has better backwards compatibility with old games.

      Nvidia is still beating AMD; AMD does not offer a better price/performance ratio, which is sad. On SLI, Nvidia is better, but both are crap.

      That’s the stuff I know.

    • nitro_feen

      His point is that Nvidia does not make CPUs or APUs.

    • Dakan45

      And my point is that Sony went to Nvidia first, but they turned Sony down.

    • Krakn3dfx

      They already said it at Gamescom, then got in trouble and had to retract it, lol.

  • HelterSkelter

    I have an i7 2600, but now that AMD is in every next-gen console, I’m starting to worry whether I’ll have performance issues when next gen kicks in at full throttle.

    • taz

      Doubtful, as long as you have a decent GPU in your PC. Either a GTX 770 or above for Nvidia, or an AMD 7950 onwards, would likely see you through this next console cycle. An i7 2600 is still only about 20% slower than a 4770K for gaming, or thereabouts, so it’s definitely no slouch.

    • Dakan45

      what taz said.
