Yebis Engine Interview: Developing Physically Accurate Optical Post-Effects

Lead architect of Yebis, Masaki Kawase, on how Silicon Studio is taking the engine to the next level.

Posted on 16th September 2014 under Article, Interviews


Silicon Studio’s Yebis engine has been used in several games, such as 3D Dot Game Heroes, Fighter Within and MotoGP 14, and in the E3 2013 trailer for Final Fantasy XV. The engine specializes in post-processing: depth of field and lens effects, motion blur, and physically accurate optical effects.

GamingBolt got in touch with Masaki Kawase, lead architect of Yebis 2 at Silicon Studio (whose answers were transcribed by GM and R&D engineer Colin Magne), to learn all about Yebis 2, its implementation on the new consoles, and the next iteration of Yebis 2, i.e. Yebis 3. Check out his responses below.

Rashid Sayed: Tell us a bit about yourself and Silicon Studio?

Silicon Studio is a globally minded company located in Tokyo, Japan. It was established in 2000 with the mission to deliver innovation in the digital entertainment technology and contents industry. We aim to provide the highest standard in areas including rendering technology, research and development methods, game content development and online game solutions. To achieve the highest productivity, we created four studios working in synergy: MiddlewareStudio, ContentsStudio, SolutionStudio and AgentStudio.

"YEBIS can be applied to any type of rendering engine. We talked about ray tracing before. Again, in the case of cloud computing, where the power of the cloud can render very realistic worlds, it does make sense to use YEBIS. The closer you get to realism, the more you need to see that world through a physically simulated camera."

Rashid Sayed: Tell us about the different features in Yebis 2 that have been specifically implemented to cater to the new consoles [PS4, Xbox One].

With the PS4 and Xbox One came great power: Yebis’ performance on those platforms is close to 10 times what it was on their respective predecessors. Users can now choose quality settings that were not possible before for many of our effects; the end result is a truly new world of visual expression.

Rashid Sayed: I was going through your preview of the rendering capabilities of Yebis 2, and I must say it has some of the best bokeh effects I have seen. Can you tell us about it?

We have spent a lot of time studying the physics of real camera lenses, and we built models that reproduce those bokeh effects as physically plausibly as we can while keeping high performance.
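Kawase’s actual bokeh model is proprietary, but the lens physics behind any physically plausible depth of field starts from the standard thin-lens equations. As a rough illustration (a hypothetical Python sketch, not Yebis code), the blur size of an out-of-focus point is its circle of confusion:

```python
def coc_diameter(focal_length, f_number, focus_dist, subject_dist):
    """Circle-of-confusion diameter (same units as the inputs) from the
    thin-lens model: c = A * |S2 - S1| / S2 * f / (S1 - f),
    where A = f / N is the aperture diameter, S1 the focus distance
    and S2 the subject distance."""
    aperture = focal_length / f_number
    return (aperture * abs(subject_dist - focus_dist) / subject_dist
            * focal_length / (focus_dist - focal_length))

# A 50 mm f/1.8 lens focused at 2 m: a subject at 5 m blurs to ~0.43 mm
# on the sensor, and the blur keeps growing with distance.
print(coc_diameter(0.05, 1.8, 2.0, 5.0))
```

In a renderer this diameter, projected to pixels, would drive the size of the bokeh kernel per pixel; the aperture shape and diaphragm model then determine what that kernel looks like.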

Rashid Sayed: What kind of improvements have you made to motion blur in order to avoid choppy frames?

We have two types of motion blur that are optimized for different situations: one specialized for fast camera rotation, and one for translation and moving scene objects. We can also use both together for fast, good-looking motion blur.
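The second type he describes is commonly implemented by averaging several taps along each pixel’s screen-space velocity. A minimal one-dimensional sketch of that idea (hypothetical Python, not the Yebis implementation):

```python
def velocity_blur(scanline, velocity, samples=5):
    """Object-motion blur sketch: for each pixel, average `samples` taps
    spread across the shutter interval along that pixel's velocity,
    clamping taps to the scanline bounds."""
    n = len(scanline)
    out = []
    for x in range(n):
        acc = 0.0
        for s in range(samples):
            t = s / (samples - 1) - 0.5        # -0.5 .. +0.5 over the shutter
            tap = min(max(int(round(x + velocity[x] * t)), 0), n - 1)
            acc += scanline[tap]
        out.append(acc / samples)
    return out

# A bright pixel moving 4 pixels per frame smears across its neighbours;
# a static scene (zero velocity) is returned unchanged.
print(velocity_blur([0, 0, 1, 0, 0], [4] * 5))
```

A real engine does this in 2D with a velocity buffer written by the renderer; the camera-rotation variant can instead derive velocities analytically from the camera’s angular motion, which is why the two paths can be specialized separately.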

Rashid Sayed: Tell us about how Yebis 2 handles anti-aliasing. Are you implementing a custom AA solution for the engine? I read that you are using post-process anti-aliasing as well?

We use a technique based on the now-famous FXAA. We actually have new techniques under development right now, but we can’t share details at this moment.
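FXAA itself is public: its first step computes a perceptual luma per pixel and only runs the expensive edge blend where local contrast exceeds a threshold. A minimal sketch of that selection step (hypothetical Python, not the Yebis implementation):

```python
def luma(rgb):
    # FXAA operates on perceptual luma; this is a common approximation.
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def needs_aa(img, x, y, edge_threshold=0.125, edge_threshold_min=0.0312):
    """FXAA-style early-out: compare the local luma contrast (centre plus
    the four direct neighbours) against a relative/absolute threshold.
    Only pixels that pass get the costly edge search and blend."""
    lumas = [luma(img[y][x]),
             luma(img[y - 1][x]), luma(img[y + 1][x]),
             luma(img[y][x - 1]), luma(img[y][x + 1])]
    contrast = max(lumas) - min(lumas)
    return contrast >= max(edge_threshold_min, edge_threshold * max(lumas))
```

This early rejection is what makes FXAA cheap: flat regions are skipped entirely, and only contrasty edges pay for the directional blend.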

Rashid Sayed: YEBIS 2 supports three types of HDR rendering architecture. Which one suits best: a. PS4, b. Xbox One, and c. high-end PC?

Definitely true HDR for all of them, for maximum realism. The other modes were designed for platforms with lower performance and are good at imitating true HDR.

Rashid Sayed: Do you think there is potential for Ray Tracing on the PS4 and Xbox One using Yebis 2?

You mean, if a rendering engine were a ray tracer, could we apply YEBIS to it? The answer is yes. Actually, OTOY has been using YEBIS in their real-time path-tracing engine on PC. Moreover, YEBIS includes algorithms based on ray tracing when it comes to close simulation of camera lenses, but they are optimized so that ray tracing is not needed for every pixel.

"We have two types of motion blur that are optimized for different situations: one specialized for fast camera rotation, and one for translation and moving scene objects. We can also use both together for fast, good-looking motion blur."

Rashid Sayed: Yebis 2 supports DirectX 11, which presumably means it will support DirectX 12 next year. What kind of benefits will it bring, especially to post-process anti-aliasing and HDR, when DX12 is used in conjunction with Yebis 2?

DX12 helps gain CPU performance when sending commands to the GPU. YEBIS is actually very fast on the CPU even with DX11; DX12 will accelerate that even more.

Rashid Sayed: I am sure you must have seen Microsoft demonstrating its cloud technology at the recent Build 2014 event. What are your thoughts on this, and do you think there is potential for improving visuals and other parameters through the cloud? Furthermore, do you think Yebis 2 could greatly benefit from something like this?

YEBIS can be applied to any type of rendering engine. We talked about ray tracing before. Again, in the case of cloud computing, where the power of the cloud can render very realistic worlds, it does make sense to use YEBIS. The closer you get to realism, the more you need to see that world through a physically simulated camera.

Rashid Sayed: Tell us how Yebis 2 will scale for each of the new consoles. Two sub-questions related to this: a. How does it use the Xbox One eSRAM’s high bandwidth of 204 GB/s, and b. how does it use the PS4’s unified memory architecture?

We started developing YEBIS 10 years ago. For each new platform we have ported it to (18 as of today), we have always been keen on extracting the highest performance by taking advantage of the specifics of that platform. So again, for Xbox One, of course we use eSRAM for internal calculations and internal buffers. The unified memory architecture is less of a need for us, since everything happens on the GPU.

Rashid Sayed: I am intrigued to know how Yebis 2 scales down to mobiles and tablets. What kind of advancements have you made there?

Yebis runs on OpenGL ES 2.0; we have a benchmark application available here if you want to see it in action. More recently we made an OpenGL ES 3.0 port that you can try here. The OpenGL ES 3.0 version adds a lot of quality by taking advantage of half-float buffers.
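The quality gain from half-float buffers is easy to demonstrate: unlike 8-bit buffers, which clamp at 1.0, a 16-bit half float stores HDR values up to 65504 at reduced precision, which is what bloom and exposure effects need. A quick illustration using Python’s `struct` support for the IEEE half-precision format (not Yebis code):

```python
import struct

def to_half(x):
    """Round-trip a float through a 16-bit IEEE half float, the per-channel
    storage of a typical RGBA16F render target."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# An 8-bit buffer would clamp a bright highlight of 16.0 to 1.0;
# a half-float buffer keeps it exactly, so later tonemapping and
# bloom still see the true intensity.
print(to_half(16.0), to_half(65504.0))   # 65504 is the largest finite half
print(to_half(0.1))                      # precision, not range, is reduced
```

This is why the step from OpenGL ES 2.0 (where half-float render targets are only an extension) to ES 3.0 matters for HDR post-processing quality on mobile.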

Rashid Sayed: Yebis 2 is being used in Final Fantasy 15. Can you talk a bit about how Yebis 2 is being used in the title?

You must have made a mistake: there has been no announcement that YEBIS will be used in FFXV. Yebis was, however, used in the Final Fantasy XV E3 2013 trailer from Square Enix. There is nothing more we can say for now 🙂

"For Xbox One, of course we use eSRAM for internal calculations and internal buffers. The unified memory architecture is less of a need for us, since everything happens on the GPU."

Rashid Sayed: What kind of requirements do game studios come up with when they approach you for Yebis 2? What is their number one demand?

Actually, there are two big types of demands, both of which are important:

  • Studios with a very realistic rendering engine (with, for example, physically based material shaders and such), for whom the power of the optics simulation in YEBIS puts the final touch, the emotion in the picture that is necessary to have a dramatic impact on the audience.
  • Studios with an “old style” or less advanced game engine who want to give their rendering a fresh boost in terms of looks. YEBIS can transform a very bleak picture into a very organic or emotional one, not only toward realism, but also toward non-realistic artistic renderings.

Rashid Sayed: What is the potential of Yebis 2 when used with AMD’s Mantle?

Very similar thoughts as with DX12.

Rashid Sayed: Last question: what kind of new features are you working on for the next update to Yebis 2?

We are developing very attractive features and will release several of them this year. Don’t miss our future announcements. 🙂

Rashid Sayed: What are the features that developers will only find in Yebis 3 but not in Yebis 2? 

  • Depth of field.
  • Greatly improved quality.
  • More precise physical simulation of the internal lens optical components and diaphragm, taking into account bokeh chromatic aberration, color fringing and aperture shape.
  • Camera animations and focus-point animations are now much smoother, and also take diaphragm movements into account.
  • In addition, we can now simulate the full range of bokeh sizes, from extremely small to basically fullscreen.
  • SSAO is a new addition to our feature set.
  • Added color spaces such as Adobe RGB, to be able to display the right colors on specialized displays.

