When it comes to facial animation, we naturally assume it takes the highest grade processors and days on a render farm just to make a little CG character smile. Advances in technology have made it much simpler over the years, to the point where even next gen consoles like the PS4 and Xbox One are capable of realistic facial animation without overly taxing the system. Of course, there are other ways to achieve these results, such as OC3 Entertainment’s FaceFX, which sounds almost too good to be true.
GamingBolt spoke to OC3’s chief executive officer Doug Perkowski about the benefits of FaceFX, how it came into being, facial animation on consoles vs. PC and much more.
Rashid Sayed: To begin with can you please tell us a bit about FaceFX and the initial inspiration with which the company was formed?
Doug Perkowski: FaceFX is a facial animation solution that creates a realistic talking character from an audio file. It has been used in hundreds of video games to help create some of the past decade’s most memorable video game characters and experiences.
10 years ago, games were getting more realistic with faster processors and more memory, but Moore’s law doesn’t apply to characters. You can model a high polygon face, but without realistic movement, it is lifeless. The ability to generate convincing facial animation performances without breaking the budget became a real bottleneck for video games. It was (and still is) one of the hardest problems in 3D because humans are so good at reading into the details of the human face.
Like any creative medium, games are ultimately judged by the characters they create, the stories they tell, and the emotions they inspire. You need realistic facial animation for all of those things, so facial animation is an important problem, not just a hard one. We founded OC3 Entertainment to help solve it.
Rashid Sayed: FaceFX has announced a new Facial Animation Runtime for the Xbox One and PS4. What kind of changes have you made to the SDK to accommodate the new consoles?
Doug Perkowski: We were able to add multi-threading support, reduce memory fragmentation, and boost the overall performance by a wide margin. Our file loading code was completely re-written to achieve an order of magnitude increase in performance. In general, everything was redesigned from the ground up to be more efficient and use less system resources.
"The bar for facial animation has definitely gotten higher on the new consoles. Characters are more realistic than ever and they require higher fidelity facial animation to match. So the main thing developers want is quality animation without a lot of cleanup."
Rashid Sayed: Killzone Shadow Fall is one of the early next gen benchmarks in terms of facial animation. What can you tell us about your collaboration with Guerrilla Games on creating realistic facial animation?
Doug Perkowski: We worked closely with Guerrilla Games. They were one of the first studios we visited after introducing FaceFX Studio back in 2004, and we saw their new offices and met with some of their team more recently in 2012. They have put together an incredible game and an amazing universe. We are thrilled that FaceFX technology contributed to it.
Rashid Sayed: Some developers are now creating their own proprietary sound technology. A good example is the Madder technology used in Killzone Shadow Fall. How easy or difficult is it to integrate FaceFX with such custom engines/tech?
Doug Perkowski: There is a pretty loose coupling between facial animation and sound engines. You just need to keep them in sync. So as long as the sound engine can tell you how far along the sound is, that’s really all you need.
Rashid Sayed: Talking about lip syncing, does the new tool set make localization easier? [Example: a person speaking in Japanese will have different lip syncing compared to the same person speaking in English]
Doug Perkowski: FaceFX has always handled localization very well. Even a modest amount of dialogue can require a very robust facial animation pipeline once you multiply it by all the languages that need to be supported. FaceFX supports English, French, German, Italian, Spanish, Korean, Japanese, Mandarin Chinese, and Czech. We are working on adding Russian, European Portuguese, and Brazilian Portuguese. Other languages can be supported by using transliterated text, or by providing no transcription of the audio at all.
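A common way audio-driven lip sync handles localization (a general technique, not necessarily FaceFX's exact implementation) is to map each language's phonemes onto a shared set of visemes, i.e. mouth shapes, so the same character rig works across every localized recording. A minimal sketch, with a deliberately tiny and hypothetical mapping table:

```cpp
#include <map>
#include <string>

// Illustrative sketch: language-specific phonemes map onto a shared
// set of visemes (mouth shapes). The table below is a tiny excerpt;
// a real table covers the full phoneme inventory of each language.
enum class Viseme { Rest, Open, WideClosed, LipsTogether, Rounded };

Viseme PhonemeToViseme(const std::string& phoneme) {
    static const std::map<std::string, Viseme> table = {
        {"aa", Viseme::Open},          // as in "father"
        {"iy", Viseme::WideClosed},    // as in "see"
        {"m",  Viseme::LipsTogether},  // as in "mat"
        {"b",  Viseme::LipsTogether},  // as in "bat"
        {"uw", Viseme::Rounded},       // as in "boot"
    };
    auto it = table.find(phoneme);
    return it != table.end() ? it->second : Viseme::Rest;
}
```

Under this scheme, adding a new language mostly means adding phoneme analysis for it; the viseme set, and therefore the rig, stays the same.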
Rashid Sayed: Given that both new consoles have a pretty high memory footprint, are developers demanding more features/tools from you guys? If so, what is the most wanted next gen feature that devs are demanding from FaceFX?
Doug Perkowski: The bar for facial animation has definitely gotten higher on the new consoles. Characters are more realistic than ever and they require higher fidelity facial animation to match. So the main thing developers want is quality animation without a lot of cleanup.
"FaceFX is very efficient with memory. Our animation data is highly compressed and is only expanded fully when a particular animation is playing. As a result, the underlying memory architecture of the platform does not impose any constraints for us."
Rashid Sayed: Can you please explain what you mean by ‘a lot of clean-up’?
Doug Perkowski: With any high volume facial animation pipeline, you need to generate lots of animations in a limited amount of time. You can use audio to generate the animation (like FaceFX does), or you can try to capture an actor’s face with mocap or video. Once you have the audio-generated data or the raw mocap or video-generated data, clean-up entails any steps required to process that data into a form suitable for driving a character in your game. As a general rule, FaceFX does not require a lot of clean-up, because the data is generated as opposed to captured. But artists may still want to add expressions or other emotions that are difficult to generate from the audio file, so even audio-based solutions can have “clean-up”.
Rashid Sayed: Both the Xbox One and PS4 have similar architectures, but the former lacks unified memory. Many developers are apparently facing issues with the eSRAM, resulting in lower resolutions in certain games. As someone with extensive hands-on experience with the Xbox One and knowledge of its internal tidbits, what is your take on this issue, and does it affect FaceFX tools development in any way?
Doug Perkowski: FaceFX is very efficient with memory. Our animation data is highly compressed and is only expanded fully when a particular animation is playing. As a result, the underlying memory architecture of the platform does not impose any constraints for us.
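The memory pattern Perkowski describes, highly compressed animation data that is only expanded when an animation actually plays, can be illustrated with a simple quantization scheme. This is a hypothetical sketch of the general technique, not FaceFX's actual compression format:

```cpp
#include <cstdint>
#include <vector>

// Illustrative sketch (not FaceFX's actual format): curve samples are
// stored quantized to 16 bits and only expanded back to floats for
// the animation that is currently playing.
struct CompressedCurve {
    float minValue;            // range of the original samples
    float maxValue;
    std::vector<uint16_t> q;   // quantized samples, 2 bytes each
};

CompressedCurve Compress(const std::vector<float>& samples,
                         float minValue, float maxValue) {
    CompressedCurve c{minValue, maxValue, {}};
    float range = maxValue - minValue;
    for (float s : samples) {
        float u = (s - minValue) / range;  // normalize into [0, 1]
        c.q.push_back(static_cast<uint16_t>(u * 65535.0f + 0.5f));
    }
    return c;
}

// Called only while the animation is playing, so the resident
// footprint of idle animations stays at the compressed size.
std::vector<float> Expand(const CompressedCurve& c) {
    std::vector<float> out;
    out.reserve(c.q.size());
    float range = c.maxValue - c.minValue;
    for (uint16_t v : c.q)
        out.push_back(c.minValue + (v / 65535.0f) * range);
    return out;
}
```

Since only the currently playing animations are ever expanded, the scheme is largely indifferent to whether the platform's memory is unified or split, which matches the point being made.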
Rashid Sayed: When compared to a PC with a high end GPU/CPU combination, what kind of challenges does the development team behind FaceFX face in developing the right tools for next gen consoles, given that their respective CPUs haven’t kept pace?
Doug Perkowski: A high-end PC with the latest GPU/CPU gives you the most graphics horsepower, but ultimately I think iPads and tablets are a greater competitive threat/opportunity for consoles. High end PC games will always be a niche market that appeals to the graphics enthusiast, but the average consumer will be choosing between a console game and a new gaming app on their tablet, or perhaps some hybrid.
Rashid Sayed: Actually, I was asking the question in terms of the technology you guys are building [in terms of CPU clock speed]. Since high end PCs have a higher clock rate, with next gen consoles still lagging behind a high end CPU, does that hold your technology back in any way? You guys could probably have done more if these next gen machines had boasted a higher clock rate.
Doug Perkowski: Are you talking about the speed of the CPU or the GPU? It doesn’t really matter, because facial animation technology is not constrained by either. With a faster graphics card you could potentially have higher polygon characters, but even in that case you would probably use the same skeleton rig. FaceFX calculates transforms for the skeleton rig, and the engine then uses the graphics card to drive the polygons with the bones, so FaceFX’s calculations would be the same. At the end of the day, you just don’t get better facial animation with faster CPUs. You can get better physics, particles, explosions, and simulations, but facial animation can only get better with more animator hours or better technology. That’s what makes it such a hard problem.
OK. And just an additional clarification: there are many facial animation methods that are definitely limited by the CPU and GPU, like skin solvers and light simulations for the skin. FaceFX could even drive those systems, but most games (regardless of whether they are targeting high-end PCs or consoles) will use more traditional animation techniques that are not bound by the CPU or GPU.
"The trends in the casual game development market are the ones to watch. There is a democratization that is happening as game development tools and services get more powerful and less expensive. The cost of developing a casual 3D application is dropping rapidly and more and more developers are getting involved."
Rashid Sayed: The PS4 and Xbox One consoles are here to stay. There are no doubts about that. We are probably going to see a long road map for both of these consoles. Where do you see FaceFX in the next few years as the next gen console cycle unfolds?
Doug Perkowski: The trends in the casual game development market are the ones to watch. There is a democratization happening as game development tools and services get more powerful and less expensive. The cost of developing a casual 3D application is dropping rapidly, and more and more developers are getting involved. It’s similar to what happened to the World Wide Web 15 years ago. To make a web page, you used to have to hire a “web developer”. I think you can argue that we are all web developers today. There are still lots of professional web developers, and there will always be professional game developers, but I’m looking forward to seeing game development technology become more ubiquitous. We want to make sure that FaceFX enables the next wave of 3D applications and games.
Rashid Sayed: The next big thing for the new consoles is that developers will offload some of the CPU’s tasks to the GPU. Is the team at FaceFX already doing that, or is it something that game developers have to look out for themselves?
Doug Perkowski: There is a limit to how many characters you can fit on screen at once, so unlike a particle simulation or physics calculation, there is only so much you can do with the massively parallel computational power of the GPU. Complex skin solvers could use that horsepower of course, and FaceFX can drive skin solvers just as easily as it drives bones or normal maps.
Rashid Sayed: One last question before we wrap up. Can you tell us what next gen games you are working on and the challenges you have faced with each of them so far?
Doug Perkowski: We can’t really talk about the specifics, but I think it is safe to say that the new runtime does an excellent job addressing the issues we have identified with next-gen facial animation pipelines.