<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Celtoys &#8211; Video Game News, Reviews, Walkthroughs And Guides | GamingBolt</title>
	<atom:link href="https://gamingbolt.com/tag/celtoys/feed" rel="self" type="application/rss+xml" />
	<link>https://gamingbolt.com</link>
	<description>Get a Bolt of Gaming Now!</description>
	<lastBuildDate>Mon, 29 Jan 2018 17:32:22 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.8.3</generator>
<site xmlns="com-wordpress:feed-additions:1">185493399</site>	<item>
		<title>Consoles Can Potentially Use The Power of The Cloud, But It&#8217;s A Very Grey Area: Ex-Lionhead Dev</title>
		<link>https://gamingbolt.com/consoles-can-potentially-use-the-power-of-the-cloud-but-its-a-very-grey-area-ex-lionhead-dev</link>
					<comments>https://gamingbolt.com/consoles-can-potentially-use-the-power-of-the-cloud-but-its-a-very-grey-area-ex-lionhead-dev#comments</comments>
		
		<dc:creator><![CDATA[Pramath]]></dc:creator>
		<pubDate>Fri, 26 Jan 2018 19:00:25 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[Celtoys]]></category>
		<category><![CDATA[Microsoft]]></category>
		<guid isPermaLink="false">https://gamingbolt.com/?p=321550</guid>

					<description><![CDATA["In the long term, the idea of air-gapped gameplay is unfortunately losing out."]]></description>
										<content:encoded><![CDATA[<p><a href="https://gamingbolt.com/wp-content/uploads/2017/07/crackdown-3-sdcc-screenshot.jpg"><img fetchpriority="high" decoding="async" class="aligncenter wp-image-301696" src="https://gamingbolt.com/wp-content/uploads/2017/07/crackdown-3-sdcc-screenshot.jpg" alt="" width="620" height="349" srcset="https://gamingbolt.com/wp-content/uploads/2017/07/crackdown-3-sdcc-screenshot.jpg 1010w, https://gamingbolt.com/wp-content/uploads/2017/07/crackdown-3-sdcc-screenshot-300x169.jpg 300w, https://gamingbolt.com/wp-content/uploads/2017/07/crackdown-3-sdcc-screenshot-768x432.jpg 768w" sizes="(max-width: 620px) 100vw, 620px" /></a></p>
<p>If you remember Microsoft&#8217;s original pitch for the always-online Xbox One, which had weaker hardware than the PS4, one of the things Microsoft proposed was &#8220;the power of the cloud&#8221;: server farms would supplement the console with additional processing power. It was an interesting but, especially at the time, entirely unrealistic idea, and to this day we haven&#8217;t seen it implemented to the extent that Microsoft promised, the way they promised (though apparently, <em>Crackdown 3</em> might change that).</p>
<p>That is because it might not be the easiest thing in the world to do. Speaking to GamingBolt in an exclusive interview, Don Williamson, the founder of Celtoys and an ex-Lionhead developer who previously worked on <em>Fable</em>, noted that &#8220;the power of the cloud&#8221; for video games is a very &#8220;grey area&#8221;.</p>
<p>&#8220;It’s a very grey area,&#8221; Williamson said. &#8220;We’ve been “using the cloud” for decades to build multiplayer games and some games have used remote servers to chew through expensive calculations for single player games that can be shared. I’ve recently spoken to a few startups built around the idea of moving more and more onto remote servers. I think some of them will have issues selling the implementation to developers while the others haven’t shared enough information with me to make that call. In the long term, the idea of air-gapped gameplay is unfortunately losing out so sharing more calculations remotely is logical.&#8221;</p>
<p>One day in the far future, there may be a time when very little computing is done locally and cloud-supplemented computing is the norm. In the present day, however, given the state of internet infrastructure worldwide and the continued advancement of processors, I don&#8217;t think cloud-powered gaming will be viable any time soon.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://gamingbolt.com/consoles-can-potentially-use-the-power-of-the-cloud-but-its-a-very-grey-area-ex-lionhead-dev/feed</wfw:commentRss>
			<slash:comments>23</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">321550</post-id>	</item>
		<item>
		<title>Xbox One X And Xbox One Cross Compatibility Will Definitely Result In Compromises &#8211; Ex-Lionhead Dev</title>
		<link>https://gamingbolt.com/xbox-one-x-and-xbox-one-cross-compatability-will-definitely-result-into-compromises-ex-lionhead-dev</link>
					<comments>https://gamingbolt.com/xbox-one-x-and-xbox-one-cross-compatability-will-definitely-result-into-compromises-ex-lionhead-dev#comments</comments>
		
		<dc:creator><![CDATA[Pramath]]></dc:creator>
		<pubDate>Thu, 25 Jan 2018 18:47:03 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[Celtoys]]></category>
		<category><![CDATA[Xbox One]]></category>
		<category><![CDATA[xbox one x]]></category>
		<guid isPermaLink="false">https://gamingbolt.com/?p=321365</guid>

					<description><![CDATA[Which makes complete sense, from a development perspective.]]></description>
										<content:encoded><![CDATA[<p><a href="https://gamingbolt.com/wp-content/uploads/2017/09/Xbox-One-X.jpg"><img decoding="async" class="aligncenter wp-image-307059" src="https://gamingbolt.com/wp-content/uploads/2017/09/Xbox-One-X.jpg" alt="Xbox One X" width="620" height="349" srcset="https://gamingbolt.com/wp-content/uploads/2017/09/Xbox-One-X.jpg 1620w, https://gamingbolt.com/wp-content/uploads/2017/09/Xbox-One-X-300x169.jpg 300w, https://gamingbolt.com/wp-content/uploads/2017/09/Xbox-One-X-768x432.jpg 768w, https://gamingbolt.com/wp-content/uploads/2017/09/Xbox-One-X-1024x576.jpg 1024w" sizes="(max-width: 620px) 100vw, 620px" /></a></p>
<p>While the idea of iterative consoles is great in theory, the fact of the matter is that Microsoft and Sony have both hamstrung the potential of these new machines by tying them inextricably to their respective base systems. The Xbox One X in particular, which represents the larger leap over its base console, will never realize its full potential while it remains tied to the base Xbox One. At least, that&#8217;s what Don Williamson, the founder of Celtoys and an ex-Lionhead developer who previously worked on <em>Fable</em>, thinks.</p>
<p>Speaking to GamingBolt in an exclusive interview, Williamson said, &#8220;If you spend engineering time on two platforms rather than one then there will always be compromises. This has been the same since the dawn of computer games.&#8221;</p>
<p>Williamson also confirmed that the Xbox One X <em>should</em> be able to render native 4K in at least <em>some</em> cases, though he added that he would rather see that power go towards HDR implementation instead. &#8220;For certain classes of renderer, most definitely. I would much rather see a focus on HDR with higher quality pixels.&#8221;</p>
<p>Additionally, according to him, the Xbox One X doesn&#8217;t match up to gaming PCs, though, on the other hand, he does note that as a console with fixed hardware, it has an optimization benefit that PC games do not enjoy.</p>
<p>&#8220;Desktop PCs have always come out ahead of consoles by an observable margin,&#8221; he said. &#8220;However, a console’s fixed hardware with custom modifications and matched APIs makes it easier to get predictable performance from. PC engines are a mess of compatibilities that make focusing on performance much harder. The attribution of “yet another sloppy PC port” from some gaming circles is a visible reflection of this.&#8221;</p>
<p>All of which sounds fairly reasonable and sensible. Nothing that he has said is particularly controversial. In the future, maybe a couple of years from now, I hope Microsoft rescinds the diktat that all Xbox games need to target the base system in addition to the One X.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://gamingbolt.com/xbox-one-x-and-xbox-one-cross-compatability-will-definitely-result-into-compromises-ex-lionhead-dev/feed</wfw:commentRss>
			<slash:comments>34</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">321365</post-id>	</item>
		<item>
		<title>Interview With Celtoys Founder: 4K Scaling and Xbox One X Thoughts</title>
		<link>https://gamingbolt.com/interview-with-celtoys-founder-4k-scaling-and-xbox-one-x-thoughts</link>
					<comments>https://gamingbolt.com/interview-with-celtoys-founder-4k-scaling-and-xbox-one-x-thoughts#respond</comments>
		
		<dc:creator><![CDATA[Ravi Sinha]]></dc:creator>
		<pubDate>Thu, 25 Jan 2018 05:30:29 +0000</pubDate>
				<category><![CDATA[Article]]></category>
		<category><![CDATA[Interviews]]></category>
		<category><![CDATA[Celtoys]]></category>
		<category><![CDATA[dauntless]]></category>
		<category><![CDATA[Google DeepMind]]></category>
		<category><![CDATA[WonderWorlds]]></category>
		<guid isPermaLink="false">https://gamingbolt.com/?p=321272</guid>

					<description><![CDATA[Former Fable engine lead Don Williamson talks about the Xbox One among other trends.]]></description>
										<content:encoded><![CDATA[<p><span class="bigchar">C</span>eltoys isn&#8217;t a company that&#8217;s brought up very often but it has a lot to its credit. Founder Don Williamson has worked on engine and pipeline optimization throughout his career, including being the lead for Fable&#8217;s engine. The last time <a href="https://gamingbolt.com/celtoys-interview-with-don-williamson-former-fable-engine-lead-looks-ahead">we spoke to Williamson</a> was about DirectX and the impact of cloud computing. With the Xbox One X now available worldwide and VR gaming picking up, GamingBolt chatted with him once more to learn his take on the industry at present.</p>
<p><a href="https://gamingbolt.com/wp-content/uploads/2016/12/embermane-island-vista-screenshots-dauntless.jpg"><img decoding="async" class="aligncenter wp-image-284209" src="https://gamingbolt.com/wp-content/uploads/2016/12/embermane-island-vista-screenshots-dauntless.jpg" alt="" width="620" height="349" srcset="https://gamingbolt.com/wp-content/uploads/2016/12/embermane-island-vista-screenshots-dauntless.jpg 1920w, https://gamingbolt.com/wp-content/uploads/2016/12/embermane-island-vista-screenshots-dauntless-300x169.jpg 300w, https://gamingbolt.com/wp-content/uploads/2016/12/embermane-island-vista-screenshots-dauntless-768x432.jpg 768w, https://gamingbolt.com/wp-content/uploads/2016/12/embermane-island-vista-screenshots-dauntless-1024x576.jpg 1024w" sizes="(max-width: 620px) 100vw, 620px" /></a></p>
<p class="review-highlite" >"Scaling to 4K can be a difficult problem and each team likes to explore or avoid it differently."</p>
<p><b>It’s been close to two years since we last interviewed you. How far has Celtoys come along in that period?</b></p>
<p>Business is going really well; we’ve helped ship some games, including <i>No Man&#8217;s Sky, Dauntless and WonderWorlds</i>, helped Google DeepMind with some of their AI efforts and completed various other smaller consulting projects to get new devs off the ground.</p>
<p><b>Can you tell us what happened with Celtoys’ PC Project?</b></p>
<p>Development was stalled for 9 months to bring in a bit of extra cash. It will start back up again in January with an eye to release a few months later.</p>
<p><b>With the VR industry growing more and more, how has Celtoys’ growth kept up in that industry in terms of support and features?</b></p>
<p>VR hasn’t been a priority as I’m genuinely terrified of the distraction. The nature of our own game makes it perfectly suited for VR and I’d get carried away implementing loads of features without bringing in any money. Nevertheless, I make it a point to stay on top of the state of the art.</p>
<p><b>The advent of iterative consoles has put more burden on developers, especially given how 4K adoption is still on the lower side. What are your thoughts on this and how is Celtoys dealing with it?</b></p>
<p>Scaling to 4K can be a difficult problem and each team likes to explore or avoid it differently. We’ve prototyped some 4K techniques and implemented scanline rendering for an iOS title that needed PBR with realtime AO/GI performance on Retina displays. The technique goes all the way back to what I learned on <em>Fable 3</em>, when we were one of the first to ship a 360 title with temporal AA. I managed to stumble upon some history buffer rejection sampling patterns suited to the bandwidth constraints of iOS 9-era devices that I haven’t had the time to share widely.</p>
<p><a href="https://gamingbolt.com/wp-content/uploads/2015/11/DirectX-12.jpg"><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-249251" src="https://gamingbolt.com/wp-content/uploads/2015/11/DirectX-12.jpg" alt="" width="620" height="349" srcset="https://gamingbolt.com/wp-content/uploads/2015/11/DirectX-12.jpg 620w, https://gamingbolt.com/wp-content/uploads/2015/11/DirectX-12-300x169.jpg 300w" sizes="auto, (max-width: 620px) 100vw, 620px" /></a></p>
<p class="review-highlite" >"The goal of DX12 was not to be as ubiquitous as DX11; the target audiences are different."</p>
<p><b>You have worked with Microsoft before. Do you have any thoughts on why DX12 hasn’t become a huge success as initially planned?</b></p>
<p>I’m not sure how you would measure success here. The goal of DX12 was not to be as ubiquitous as DX11; the target audiences are different. After working on iOS I actually prefer Metal’s balance of API, but Apple have it a lot easier with far fewer devices to support. Writing D3D12 code can be a bit like writing graphics driver code and requires a lot more programmer attention than D3D11.</p>
<p>The step from D3D9 to D3D11 was difficult for many developers because of a few paradigm changes; there was a whole bunch of engineering that needed to be costed to make the change that wouldn’t necessarily bring in equivalent sales. The step from D3D11 to D3D12 is larger. With legacy engines, smaller developers will be hard pressed to make the change at all. But legacy graphics drivers were getting horrendously complicated and the effort to simplify this and make a better experience for gamers is appreciated.</p>
<p><b>Another aspect was cloud gaming and the so called “power of the cloud”. Do you think consoles will ever see a future where they can use the power of the cloud to improve their processing capabilities?</b></p>
<p>Potentially but it’s a very grey area. We’ve been “using the cloud” for decades to build multiplayer games and some games have used remote servers to chew through expensive calculations for single player games that can be shared. I’ve recently spoken to a few startups built around the idea of moving more and more onto remote servers. I think some of them will have issues selling the implementation to developers while the others haven’t shared enough information with me to make that call. In the long term, the idea of air-gapped gameplay is unfortunately losing out so sharing more calculations remotely is logical.</p>
<p><b>Microsoft have made it compulsory for developers to make their Xbox One games compatible with Xbox One X and vice versa. Do you think this will hold back Xbox One X’s true performance in any way?</b></p>
<p>If you spend engineering time on two platforms rather than one then there will always be compromises. This has been the same since the dawn of computer games.</p>
<p><a href="https://gamingbolt.com/wp-content/uploads/2017/09/Xbox-One-X.jpg"><img loading="lazy" decoding="async" class="aligncenter wp-image-307059" src="https://gamingbolt.com/wp-content/uploads/2017/09/Xbox-One-X.jpg" alt="Xbox One X" width="620" height="349" srcset="https://gamingbolt.com/wp-content/uploads/2017/09/Xbox-One-X.jpg 1620w, https://gamingbolt.com/wp-content/uploads/2017/09/Xbox-One-X-300x169.jpg 300w, https://gamingbolt.com/wp-content/uploads/2017/09/Xbox-One-X-768x432.jpg 768w, https://gamingbolt.com/wp-content/uploads/2017/09/Xbox-One-X-1024x576.jpg 1024w" sizes="auto, (max-width: 620px) 100vw, 620px" /></a></p>
<p class="review-highlite" >"A console’s fixed hardware with custom modifications and matched APIs makes it easier to get predictable performance from."</p>
<p><b>Do you think that as a whole, the Xbox One X is powerful enough to render AAA, graphically demanding games at native 4K without resorting to techniques such as dynamic scaling or checkerboarding?</b></p>
<p>For certain classes of renderer, most definitely. I would much rather see a focus on HDR with higher quality pixels.</p>
<p><b>In your opinion, how does the Xbox One X compare to a high end modern gaming PC?</b></p>
<p>Desktop PCs have always come out ahead of consoles by an observable margin. However, a console’s fixed hardware with custom modifications and matched APIs makes it easier to get predictable performance from. PC engines are a mess of compatibilities that make focusing on performance much harder. The attribution of “yet another sloppy PC port” from some gaming circles is a visible reflection of this.</p>
<p><b>Have you gone hands-on with the Switch? What are your thoughts on it? Many developers and publishers have been taken aback by its success. What do you attribute that success to?</b></p>
<p>I think it’s great that Nintendo are still relevant with the younger generation, even though I was a SEGA fan at heart. For large corporations this is a non-trivial task and they should be commended.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://gamingbolt.com/interview-with-celtoys-founder-4k-scaling-and-xbox-one-x-thoughts/feed</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">321272</post-id>	</item>
		<item>
		<title>PS4 Pro&#8217;s Color Compression Technology Is Definitely A Net Gain, Says Former Lionhead Developer</title>
		<link>https://gamingbolt.com/ps4-pros-color-compression-technology-is-definitely-a-net-gain-says-former-lionhead-developer</link>
					<comments>https://gamingbolt.com/ps4-pros-color-compression-technology-is-definitely-a-net-gain-says-former-lionhead-developer#comments</comments>
		
		<dc:creator><![CDATA[Pramath]]></dc:creator>
		<pubDate>Mon, 14 Nov 2016 17:22:00 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[Celtoys]]></category>
		<category><![CDATA[LionHead]]></category>
		<category><![CDATA[ps4]]></category>
		<category><![CDATA[ps4 pro]]></category>
		<guid isPermaLink="false">http://gamingbolt.com/?p=282631</guid>

					<description><![CDATA[Greater than the sum of its parts.]]></description>
										<content:encoded><![CDATA[<p><a href="https://gamingbolt.com/wp-content/uploads/2016/09/PS4-Pro-1.jpg"><img loading="lazy" decoding="async" class="size-full wp-image-276970 aligncenter" src="https://gamingbolt.com/wp-content/uploads/2016/09/PS4-Pro-1.jpg" alt="ps4-pro" width="620" height="349" srcset="https://gamingbolt.com/wp-content/uploads/2016/09/PS4-Pro-1.jpg 620w, https://gamingbolt.com/wp-content/uploads/2016/09/PS4-Pro-1-300x169.jpg 300w" sizes="auto, (max-width: 620px) 100vw, 620px" /></a></p>
<p>The PS4 Pro has a variety of new hardware and software tricks, all of which work to make the console punch above its weight and deliver games that look substantially better than they do on the standard PS4. One of these tricks is delta color compression, which helps maximize effective memory bandwidth at runtime.</p>
<p>In a recent interview with Don Williamson, a former developer at <em>Fable</em> studio Lionhead and currently the owner and consultant engine programmer at Celtoys, we asked about delta color compression technology and how it might impact the performance of games running on the PS4 Pro.</p>
<p>&#8220;This is a very old compression technology made possible for runtime bandwidth compression through sheer horsepower available today,&#8221; he said. &#8220;It&#8217;s difficult to give exact numbers on how much better it is as it&#8217;s based on matching tiles to dictionaries of patterns that may not show up often in your game. Some games will get lucky and get a nice boost, others not so much. The increased throughput of GPU cores means this extra bandwidth can also be eaten up pretty quickly. It&#8217;s definitely a net gain but one that feels like an inevitable side-effect of the increasing power of GPUs.&#8221;</p>
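<p>To give a rough sense of why the gains vary from game to game, here is a toy, scalar sketch of delta encoding, the idea underlying hardware DCC. The real GCN implementation works on pixel tiles matched against fixed pattern dictionaries and is far more involved than this; the function names below are purely illustrative.</p>

```python
def delta_encode_tile(tile):
    """Toy delta encoding of a 1D 'tile' of pixel values.

    Stores the first value plus the differences between neighbours.
    Smooth image regions produce deltas clustered near zero, which
    need far fewer bits to store - that is the bandwidth win. Noisy
    regions produce large deltas and compress poorly, which is why
    some games benefit more than others.
    """
    base = tile[0]
    deltas = [b - a for a, b in zip(tile, tile[1:])]
    return base, deltas


def delta_decode_tile(base, deltas):
    """Reverse the toy encoding by accumulating deltas."""
    out = [base]
    for d in deltas:
        out.append(out[-1] + d)
    return out


# A smooth gradient: deltas are tiny and cheap to store.
flat = [200, 201, 201, 202]
base, deltas = delta_encode_tile(flat)
assert deltas == [1, 0, 1]
assert delta_decode_tile(base, deltas) == flat
```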
<p>We also asked him about the other features that are making their way to the PS4 Pro from the AMD Roadmap, such as the ability to run two FP16 operations concurrently, and the integration of a work scheduler for increased efficiency. According to Williamson, these are fairly low level improvements that shouldn&#8217;t have much of an impact on the performance of games.</p>
<p>&#8220;These are just a bunch of low-level technical details that will allow us to squeeze more out of the hardware. Not too much; but enough to gain maybe half a millisecond here and there,&#8221; he said.</p>
<p>On the whole, then, it sounds like the cumulative impact of smaller improvements like these is what makes the PS4 Pro tick, and makes it a far more impressive console than you would expect it to be.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://gamingbolt.com/ps4-pros-color-compression-technology-is-definitely-a-net-gain-says-former-lionhead-developer/feed</wfw:commentRss>
			<slash:comments>31</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">282631</post-id>	</item>
		<item>
		<title>Celtoys Interview With Don Williamson: Former Fable Engine Lead Looks Ahead</title>
		<link>https://gamingbolt.com/celtoys-interview-with-don-williamson-former-fable-engine-lead-looks-ahead</link>
					<comments>https://gamingbolt.com/celtoys-interview-with-don-williamson-former-fable-engine-lead-looks-ahead#comments</comments>
		
		<dc:creator><![CDATA[Ravi Sinha]]></dc:creator>
		<pubDate>Wed, 06 Jan 2016 17:41:58 +0000</pubDate>
				<category><![CDATA[Article]]></category>
		<category><![CDATA[Interviews]]></category>
		<category><![CDATA[Celtoys]]></category>
		<category><![CDATA[fable]]></category>
		<category><![CDATA[LionHead Studios]]></category>
		<category><![CDATA[ps4]]></category>
		<category><![CDATA[Splinter Cell Conviction]]></category>
		<category><![CDATA[Xbox One]]></category>
		<guid isPermaLink="false">http://gamingbolt.com/?p=252975</guid>

					<description><![CDATA[Celtoys owner and engine/performance specialist discussed DX12 and cloud computing among other things.]]></description>
										<content:encoded><![CDATA[<p><span class="bigchar">T</span>he industry has been full of professionals who talked to us about the present state of gaming and what&#8217;s to come in 2016. Recently, GamingBolt had a chance to hear another expert&#8217;s views on the innovations and technologies to come &#8211; Celtoys owner Don Williamson, who not only worked on the renderer for Splint Cell: Conviction but was also an engine lead for the Fable engine. As such, Williamson has extensive experience with engine and pipeline optimization having worked on a number of different games in the past. What is his take on the current and emerging challenges of technologies like DirectX 12 and Cloud computing? Let&#8217;s find out.</p>
<p><a href="https://gamingbolt.com/wp-content/uploads/2010/02/Splinter_Cell_Conviction_4_X10.jpg" rel="attachment wp-att-6870"><img loading="lazy" decoding="async" class="aligncenter wp-image-6870" src="https://gamingbolt.com/wp-content/uploads/2010/02/Splinter_Cell_Conviction_4_X10.jpg" alt="Splinter_Cell_Conviction_4_X10" width="620" height="349" srcset="https://gamingbolt.com/wp-content/uploads/2010/02/Splinter_Cell_Conviction_4_X10.jpg 1024w, https://gamingbolt.com/wp-content/uploads/2010/02/Splinter_Cell_Conviction_4_X10-300x168.jpg 300w" sizes="auto, (max-width: 620px) 100vw, 620px" /></a></p>
<p class="review-highlite" >"I started off extending the Splinter Cell engine and ended up writing a new one from scratch to work on Xbox 360. I got something like 45fps on a high-stress level where the old one was hovering around 3fps."</p>
<p><strong>Could you tell us a bit about yourself and what you do?</strong></p>
<p>I’m an engine and pipeline developer of some 20 years now. I sold my first game as shareware at the age of 16, creating many games and engines since then as Programmer, Lead Programmer, R&amp;D Head and Project Lead. I created the Splinter Cell: Conviction engine, small parts of which are still in use today, led the Fable engine and co-created Fable Heroes, bending the Xbox 360 in some very unusual ways! I now run my own little company, building a crazy game and helping other developers make theirs better.</p>
<p><strong>You have a great pedigree in graphics programming, having worked at Lionhead and Ubisoft. What was the inspiration behind starting Celtoys, your own company?</strong></p>
<p>I wanted to create my own game and explore technology ideas that nobody in their right mind would pay me to explore. Getting to spread my wings and go back to helping other developers improve their games was an exciting way to help fund this.</p>
<p><strong>What can you tell us about the work you did at Ubisoft and Lionhead? What were your day to day activities like and what were your biggest challenges during those years?</strong></p>
<p>My roles at Ubisoft and Lionhead were very different. I’d spent some 4 years as a project lead before joining Ubisoft and wanted to break away and dedicate as much time as possible to pure code; it’s what I loved most.</p>
<p>I started off extending the Splinter Cell engine and ended up writing a new one from scratch to work on Xbox 360. I got something like 45fps on a high-stress level where the old one was hovering around 3fps. You could also iterate on code within a couple of seconds, as opposed to 4 or 5 minutes, so it was given the green light. Myself and Stephen Hill then spent 8 months porting features over, improving it, collaborating with technical artists and integrating with UE 2.5 (aka “making it work”). It’s, without doubt, the best engine of my career, although not without its faults. Such great memories!</p>
<p>Fable 2 was different. I came on-board about a year and a half before we shipped and directed the performance efforts, using help from Microsoft internally and external contractors. We had some amazing talent making the game faster so we got from sub-1fps on Xbox to a consistent 30fps, all while content creators were making the game bigger and bigger.</p>
<p>Fable 3 was different again as I was given 18 months to improve the engine and build a team for the Fables beyond. This was 50/50 management/code and by then I knew the engine enough to bend it to my needs. There was a lot of scheduling, Art/LD/Game co-ordination, career planning and feature evaluation, along with the usual deep-dive into code.</p>
<p>Although I didn’t realize at the time, my biggest challenge was swapping out the old renderer with the new one for SC5, while keeping all content authors productive. We were under intense scrutiny because it was such a risky undertaking, but we managed to do it while some 60 people used the engine/editor on a daily basis.</p>
<p><a href="https://gamingbolt.com/wp-content/uploads/2011/05/fable-3-visuals-pc.jpg" rel="attachment wp-att-31610"><img loading="lazy" decoding="async" class="aligncenter wp-image-31610" src="https://gamingbolt.com/wp-content/uploads/2011/05/fable-3-visuals-pc.jpg" alt="The superior draw distance and textures on PC bring the world of Fable to life" width="620" height="364" srcset="https://gamingbolt.com/wp-content/uploads/2011/05/fable-3-visuals-pc.jpg 655w, https://gamingbolt.com/wp-content/uploads/2011/05/fable-3-visuals-pc-300x176.jpg 300w" sizes="auto, (max-width: 620px) 100vw, 620px" /></a></p>
<p class="review-highlite" >"Since starting the company I’ve built my own engine. I’ve had many licensing enquiries about that as it is sufficiently different to what’s currently available, but you can safely say that the amount of work required to build an engine rises exponentially with the number of clients."</p>
<p><strong>Speaking about Fable 2 and 3, we read that a large amount of your code was used for the engine. Given the middle-of-the-road acclaim the franchise has received, what did you feel were the most distinguishing features of Fable&#8217;s engine? Though Fable Legends is fairly different from what Lionhead has done in the past, what excites you the most about its direction?</strong></p>
<p>There were so many talented people that helped with the engine it’s impossible to list all its distinguishing features without leaving lots out.</p>
<p>Visually and in motion it was quite beautiful. A large part of this was down to the work of Mark Zarb-Adami (now at Media Molecule) who did all the tree animation and weather effects. His work combined effectively with Francesco Carucci’s post-processing, who had a great eye for colour.</p>
<p>We had 3 amazing low-level engine programmers in Martin Bell, Kaspar Daugaard and David Bryson who ensured stability, designed a very sane overall architecture and added many features like character customization/damage, texture streaming and fur.</p>
<p>Later on the engine team acquired new blood with Patrick Connor and Paul New, who improved the water, created some incredible environment customization tools and built a new offline lighting engine to accelerate the work of our artists.</p>
<p>Some interesting distinguishing features include:</p>
<p>At the time I think Gears of War had something like 3 or 4 textures per character on-screen. Due to the way we randomized and allowed customization of our characters, some would have as many as 90 textures. Getting this to perform well for over 50 characters on-screen was no small undertaking.</p>
<p>By the time Fable 3 shipped, we could have up to 25,000 procedurally animating trees in view on Xbox 360. Unfortunately some of the most impressive levels we had to showcase this were cut down slightly before some of the bigger optimizations came online. Still, I wasn’t aware of any game that could get close to that at the time or since, while having an entire city and people rendering in the background.</p>
<p>We were one of the first games to take temporal anti-aliasing seriously and make it our primary filtering method. It looked better than MSAA, recovered more detail, required less memory, gained 3ms of performance, and reduced latency and overall engine complexity. It was a massive win everywhere, although it suffered from ghosting that haunted us to the very end. We had it mostly solved but weren’t allowed to check the final fixes in due to some minor problems they introduced elsewhere. Needless to say, we had a few hundred ideas on how to make it much better for Fables beyond.</p>
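<p>The ghosting Williamson describes is the classic failure mode of temporal AA: stale history samples bleeding into the resolved image. A minimal single-pixel sketch of the blend-plus-rejection idea is below; it uses scalar luminance values, and the threshold and rejection heuristic are illustrative assumptions, not Fable 3&#8217;s actual implementation.</p>

```python
def taa_resolve(current, history, alpha=0.1, reject_threshold=0.2):
    """Toy temporal AA resolve for a single pixel (scalar luminance).

    Blends the new sample into an exponential history buffer so edges
    are smoothed over many frames. If the history diverges too far
    from the current frame (disocclusion, fast motion), it is thrown
    away to limit ghosting - at the cost of momentary aliasing.
    """
    if abs(history - current) > reject_threshold:
        return current  # history rejected: fall back to the new sample
    # Exponential moving average: slow convergence hides aliasing.
    return history + alpha * (current - history)


# Stable pixel: history dominates, noise is averaged away.
resolved = taa_resolve(current=0.55, history=0.5)   # -> ~0.505
# Disoccluded pixel: the large mismatch triggers rejection.
ghost_free = taa_resolve(current=1.0, history=0.0)  # -> 1.0
```

A too-lenient threshold produces the ghosting trails mentioned above; a too-aggressive one reintroduces shimmer, which is why tuning this rejection is where most of the effort goes.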
<p>There was also our offline light map baker. Fable 2 took up to 7 days to light the vertices of a single level, causing massive trouble during the final months. On Fable 3 we created a new GPU Global Illumination light baker that baked 100x as much data to textures, at 100x the performance, getting similar levels from 7 days down to a few minutes. By the time Fable 3 shipped, levels got much bigger so they ended up taking around an hour. That’s when we added GPU network distribution to the engine, bringing compile times down to around 5 minutes. It was orders of magnitude faster than any of the commercial options available at the time.</p>
<p>What interests me about Fable Legends is that it’s a completely new team and I’m excited to see what new directions they take the game in.</p>
<p><strong>Celtoys deals with improving the performance of games, essentially optimizing engines and their underlying frameworks. However, was there ever a thought of making your own engine and licensing it to others? Furthermore, what are your thoughts on the various challenges in developing a custom engine?</strong></p>
<p>Since starting the company I’ve built my own engine. I’ve had many licensing enquiries about that as it is sufficiently different to what’s currently available, but you can safely say that the amount of work required to build an engine rises exponentially with the number of clients. At the moment you can still beat established engines for performance, visuals and iteration time if you lock your feature set to one game only. It won’t be long before this also will be impossible so I’m enjoying it while I can.</p>
<p>This also gives me the benefit of being able to come on-site to existing devs and use the tools I’ve developed these last few years to immediately be effective.</p>
<p>Developing a new engine is a huge topic as its potential clients can vary from tiny teams to massive teams; from those with money to those watching every penny; and from those who want to innovate on gameplay vs. engine features.</p>
<p>One broad observation relevant today is that if you don’t have expertise in building one, right from the low-level programmer to the high-level manager, you’re in trouble before you start. You used to be able to bungle your way through, making lots of mistakes and still innovating, but that time is long gone.</p>
<p>However, one thing I can’t stress enough is that building new game engines is easy. Easy, that is, compared with building the tools required to put the power in the hands of content authors. These are the guys who will sell it and break it, demonstrating to the world what it’s capable of and forcing you to improve the feature set in the process. If you don’t make their lives easy, you will languish. I don’t believe I have seen any custom engine pipeline get this even close to right.</p>
<p><a href="https://gamingbolt.com/wp-content/uploads/2013/12/total_war_rome_2_caesar_in_gaul_4.jpg" rel="attachment wp-att-180472"><img loading="lazy" decoding="async" class="aligncenter wp-image-180472" src="https://gamingbolt.com/wp-content/uploads/2013/12/total_war_rome_2_caesar_in_gaul_4.jpg" alt="Total War Rome 2 Caesar in Gaul" width="620" height="349" srcset="https://gamingbolt.com/wp-content/uploads/2013/12/total_war_rome_2_caesar_in_gaul_4.jpg 1280w, https://gamingbolt.com/wp-content/uploads/2013/12/total_war_rome_2_caesar_in_gaul_4-300x168.jpg 300w, https://gamingbolt.com/wp-content/uploads/2013/12/total_war_rome_2_caesar_in_gaul_4-1024x576.jpg 1024w" sizes="auto, (max-width: 620px) 100vw, 620px" /></a></p>
<p class="review-highlite" >"What’s interesting is that even the simplest of techniques, like basic shadow mapping and a single light with ambient background, can look very convincing when viewed in stereo. Aliasing is massively distracting and there is no doubt that resolutions need to increase and the quality of the pixel needs to improve."</p>
<p><strong>Can you please give us a few examples where Celtoys has helped game developers optimize the game’s performance?</strong></p>
<p>The first example is Fable 2; I was contracting for a while before Lionhead convinced me to come aboard full time.</p>
<p>After Fable 3, I helped Creative Assembly/SEGA optimize Total War: Rome 2’s performance across many PCs. A great example is the terrain system, where within 3 months we had 100x the performance with 10x the draw distance, blending hundreds of thousands of procedural/streamed terrain layers in real-time. I’ve been told the basic renderer is pretty much untouched in the games beyond Rome 2, such as Total War: Warhammer, having been extended by the engine team to increase resolution in the near field.</p>
<p>Most of our work is currently under NDA and I’d really love to shout about some of it, but it’ll be at least a year before that. However, as a result we have submitted some UE4 modifications that could potentially help all games that use the engine in a big way. On the game I was working on, the shader instruction count for one particularly complicated effect went from ~1800 to ~450.</p>
<p><strong>One thing we&#8217;ve been curious about for a while is how engine pipelines and rendering work with regards to virtual reality titles. Is there any fundamental difference between a VR game&#8217;s pipelines and a traditional game&#8217;s? What do you believe are some of the hurdles that VR games will face when it comes to graphics?</strong></p>
<p>Latency is everything with VR and this makes it really exciting for low-level engine programming. In the past there has been debate about whether players can feel the difference between 30fps and 60fps, despite the real problem being latency, not fps. This time round there is no denying that if you try to skimp on latency with VR, you are not going to attract the audience &#8211; they’ll all be too busy feeling sick and trying to regain the ability to stand up.</p>
<p>What’s interesting is that even the simplest of techniques, like basic shadow mapping and a single light with ambient background, can look very convincing when viewed in stereo. Aliasing is massively distracting and there is no doubt that resolutions need to increase and the quality of the pixel needs to improve.</p>
<p>So there is growing literature on how you can identify results that can be shared between each eye: e.g. how can you do approximate visibility rather than running the query twice? All-in-all, we need an almost 4x increase in engine performance and a simpler environment to render.</p>
<p><strong>Given your experience in the industry, Celtoys&#8217; PC project sounds interesting, especially since it has a &#8220;scope never seen before.&#8221; Could you tell us more about it or enlighten us to the overall goals you currently have for it?</strong></p>
<p>I’m sitting on the reveal for now but I’ve been dropping hints on my Twitter feed for a while. I’m one man who’s also feeding his family by contracting for others, so I need to be sure I’m in a stable position to support the game’s players and manage expectations when I bring it out of hiding.</p>
<p>The gameplay loop is small and simple with many chances for emergent behaviour. However, the scope of the playing field is beyond anything that’s been seen before so I’m hoping to keep the punch line under cover until I can take advantage of it.</p>
<p><a href="https://gamingbolt.com/wp-content/uploads/2013/06/MGSV_e3_02.jpg" rel="attachment wp-att-160304"><img loading="lazy" decoding="async" class="aligncenter wp-image-160304" src="https://gamingbolt.com/wp-content/uploads/2013/06/MGSV_e3_02.jpg" alt="MGSV_e3_02" width="620" height="349" srcset="https://gamingbolt.com/wp-content/uploads/2013/06/MGSV_e3_02.jpg 800w, https://gamingbolt.com/wp-content/uploads/2013/06/MGSV_e3_02-300x168.jpg 300w" sizes="auto, (max-width: 620px) 100vw, 620px" /></a></p>
<p class="review-highlite" >"Modern-day consoles now come with all the complexity of a PC, their own operating systems that pale in comparison and seem to be scaring the consumer away for a not-insignificant cash price."</p>
<p><strong>We&#8217;re seeing more games these days reliably scaling towards lower-end PC hardware. Some examples include Metal Gear Solid 5: The Phantom Pain, Mad Max, etc. Given that the common complaint for PC development is optimizing for a wide range of hardware, how do you feel development is changing these days to accommodate a wider variety of configurations?</strong></p>
<p>I think it’s market driven and potentially a cause for concern for smaller companies. 15 years ago I was optimizing for and creating fallback paths for games that had to run on 50 or so distinct video cards with several different APIs. It was quite literally: video card A has a z-buffer, video card B doesn’t! We had extensive publisher-driven test regimes and they rarely let us off the hook with any bugs.</p>
<p>With the rise of mass-market console gaming and the odd belief that the PC was doomed, games became merely last-minute PC ports; not helped by the increase in software complexity required to build them.</p>
<p>It used to be consoles provided a consumer-oriented benefit to playing games: buy the game, stick it in the slot, play immediately! The PC at the time was wallowing with hardware, software and installation problems and wasn’t as convenient. Modern-day consoles now come with all the complexity of a PC, their own operating systems that pale in comparison and seem to be scaring the consumer away for a not-insignificant cash price.</p>
<p>On top of that you have mobile markets being saturated and the rise of freely-available commercial quality engines that everybody uses. It seems that the next big market is PC and everybody is chasing it, to the point where we might get saturation again.</p>
<p>At the moment I think you need to get in and get out while you can because there’s not long left.</p>
<p><strong>Similarly with Unreal Engine 4, it feels like it will take a while before we see games on PS4 and Xbox One whose visual quality approaches that of the Infiltrator demo. When do you feel we&#8217;ll see visuals from the engine which can rival big-budget CGI in movies?</strong></p>
<p>Not for a while. Engine choice notwithstanding, we have many years before we’ll catch CGI, if at all. One look down the rabbit-hole of aliasing with an understanding of why movies look so much better at low resolutions than games at higher resolutions will show a little insight.</p>
<p>In terms of UE4, it’s massively flexible and gives power to many content authors who have been restricted in the past. While you will see much amazing creative output with limited playing fields in the next couple of years, a lot of that flexibility is being used to simply speed up game creation, rather than add more visual fidelity.</p>
<p><strong>The Xbox One and PS4 have now been on the market for two years and there are still more graphically impressive games releasing. What can you tell us about optimizing for the Xbox One and PS4 and do you feel the consoles will outlast the relatively short shelf-life of their components?</strong></p>
<p>These are powerful machines with fixed architectures and some astonishing debug tools. On one of the games I worked on we created some amazing effects for Xbox One that have given us ideas for how it could be visually much better and more performant. The main variable in all this is how many skilled people you have and how much you’re willing to spend.</p>
<p>These days the investment in great engine tech is no longer necessarily rewarding you with better sales. You have fewer developers going with their own solutions, instead investing money in programmers for the core game and those who can maintain a bought 3rd-party engine.</p>
<p>You have some established pioneers like Naughty Dog thankfully making leaps and bounds in this area, but I’m unsure right now whether we’ll see as much progress as we did with the previous generation.</p>
<p><a href="https://gamingbolt.com/wp-content/uploads/2015/11/DirectX-12.jpg" rel="attachment wp-att-249251"><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-249251" src="https://gamingbolt.com/wp-content/uploads/2015/11/DirectX-12.jpg" alt="DirectX 12" width="620" height="349" srcset="https://gamingbolt.com/wp-content/uploads/2015/11/DirectX-12.jpg 620w, https://gamingbolt.com/wp-content/uploads/2015/11/DirectX-12-300x169.jpg 300w" sizes="auto, (max-width: 620px) 100vw, 620px" /></a></p>
<p class="review-highlite" >"I’m entirely convinced that some of the routes we are investing money in right now will prove to be financially and environmentally unable to scale. However, the cloud is here to stay and we will fail many times before we find ways to exploit it in the long run."</p>
<p><strong>There&#8217;s been lots of footage surrounding DirectX 12, which recently released for PCs alongside Windows 10, and how it&#8217;s possible to achieve a new standard for graphics with it. However, it still feels a ways off despite the strong popularity of Windows 10. What do you feel are some of the factors that will determine whether we see more DirectX 12-geared games in the coming years?</strong></p>
<p>Nobody really cared about DirectX 11 because there was assumed to be no money building your engine for it. With Xbox 360, you already had a lot of the features of DX11 and more, with lower level access and a DX9-like API. Porting this to DX9 for PC was simple enough, with the lion&#8217;s share of work going into compatibility and performance &#8211; adding a new API was just a recipe for trouble.</p>
<p>The Xbox One and PC seem to have more financial impact on sales than the previous generation and if Microsoft manage the Xbox One DirectX 12 API effectively, the up-take should be good. It’s more complicated to develop for than DirectX 11 but more and more people are using engines like UE4 and Unity so the impact should be lessened.</p>
<p><strong>DirectX 12 will be heading to the Xbox One and there is the belief that it can be used for better looking graphics. What are your thoughts on this and how could DirectX 12 be used to benefit the console hardware overall?</strong></p>
<p>The innovation in DirectX 12 is its ability to unify a large number of GPUs with a single, non-intrusive API; it’s such a great move from Microsoft. I think the Xbox One software team will already have made their version of DirectX 11 very close to the metal so the main benefit will be from an engineering point of view, where you can realistically drop your DirectX 11 version altogether. The simpler platform will allow more time to be invested in optimizing the game.</p>
<p><strong>What are your thoughts on the PS4 API? Do you think it will hold its own when DX12 launches?</strong></p>
<p>I have no experience with the PS4 API but based on many factors it should at least hold its own. The PS4 is a fixed console platform and as far back as the original PlayStation, Sony has been writing the rule book on how the most efficient console APIs should be implemented.</p>
<p><strong>Another area of interest these days is cloud computing, especially given Microsoft&#8217;s demonstration of its capabilities with Crackdown 3. Several other areas of gaming like PlayStation Now and Frontier&#8217;s Elite Dangerous use cloud computing in their own right. However, what do you think about it being used to fuel graphics and newer scenarios in the coming years? Will there be more reliance on the cloud and less on the actual hardware of the console?</strong></p>
<p>I’m entirely convinced that some of the routes we are investing money in right now will prove to be financially and environmentally unable to scale. However, the cloud is here to stay and we will fail many times before we find ways to exploit it in the long run.</p>
<p>If we break the typical game engine pipeline into pieces, there are many parts that offer unique opportunities for distribution. Imagine the creation of this new cloud infrastructure the way you would the evolution of 3D engines, from renderers written in software to their modern-day GPU-accelerated counterparts: we’re trying to create a big GPU in the sky.</p>
<p>Even if players aren’t part of the same session, they’ll be exploring the same worlds. Work such as global illumination, visibility, spatial reasoning and complex geometry optimization can be factored into their low frequency contributors, clustered, computed and cached on machines thousands of miles away, to be shared between many players. Whatever client hardware is used to retrieve this data will augment that in ways that don’t make sense to distribute.</p>
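<p>The sharing described above can be illustrated with a toy cache. All names here are hypothetical; a real system would key on world region and bake inputs, and would invalidate entries as the low-frequency contributors change.</p>

```python
# Toy sketch of server-side result sharing: an expensive, low-frequency
# computation (say, a region's baked global illumination) runs once,
# then every later player visiting the same region is served from cache.
bake_count = 0
gi_cache = {}

def bake_region(region_id):
    global bake_count
    bake_count += 1              # stands in for minutes of GPU work
    return f"lighting-for-{region_id}"

def get_region_lighting(region_id):
    if region_id not in gi_cache:
        gi_cache[region_id] = bake_region(region_id)
    return gi_cache[region_id]

# Three players visit the same region; the bake happens only once.
results = [get_region_lighting("city-west") for _ in range(3)]
```

<p>The design choice is the one described in the interview: the heavy, slowly-changing work lives on machines far away and is amortized across many players, while each client layers on whatever work doesn’t make sense to distribute.</p>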
<p>Of course the other element to cloud rendering is the ability to write an engine once for the target platform that can be distributed live to many varying devices on your wrist, in your pocket or on a screen in your lounge. We already have examples of this working really well on your local network (e.g. the Wii U) but I think there’s a long way to go before the issues of latency and cost are solved.</p>
<p><a href="https://gamingbolt.com/wp-content/uploads/2015/12/Xbox-One-PS4.jpg" rel="attachment wp-att-251784"><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-251784" src="https://gamingbolt.com/wp-content/uploads/2015/12/Xbox-One-PS4.jpg" alt="Xbox One PS4" width="620" height="357" srcset="https://gamingbolt.com/wp-content/uploads/2015/12/Xbox-One-PS4.jpg 620w, https://gamingbolt.com/wp-content/uploads/2015/12/Xbox-One-PS4-300x173.jpg 300w" sizes="auto, (max-width: 620px) 100vw, 620px" /></a></p>
<p class="review-highlite" >"The life of an engine developer is constantly mired in compromise that the average player will never see. Particles can be rendered at a lower resolution, volumes can be traced with fewer steps, calculations can be blended over multiple frames and textures can be streamed at a lower resolution."</p>
<p><strong>Do you think that with the emergence of cloud-based graphics processing, as in the case of Microsoft’s Xbox One, hardware-based consoles will be a thing of the past? Do you think the next-gen consoles, say the PS5 and the next Xbox, will be services rather than actual consoles?</strong></p>
<p>The clients will certainly get thinner to a degree. As discussed above, it’s a case of figuring out which bits of the hardware will get placed where.</p>
<p><strong>I wanted to talk a bit about the differences between the PS4 and Xbox One GPUs and their ROP counts. Obviously one has a higher ROP count than the other, but do you think these are just numbers on paper, and that the difference does not matter in practical scenarios?</strong></p>
<p>Data such as this really doesn’t matter as different hardware achieves the same effects in different ways. Coupled with different engines being built to take advantage of different hardware peculiarities, the numbers really only make sense if you want to build graphs of random data and strenuously imply one is better than the other. The obvious caveat here is that both are modified GCN architectures (Liverpool and Durango) so they’re closer to each other and easier to compare than previous generations.</p>
<p>A great example of this was the old PS2: you had effectively 2MB of VRAM left after you stored your frame buffer. The Dreamcast had 8MB total and the Xbox had 64MB of shared RAM. I remember everybody at the time going crazy about this, trying to demonstrate the inferiority of the PS2 in comparisons. What was hard to explain at the time was that the PS2’s DMA system was so fast that if you were clever enough, you could swap this out 16 times a frame and get an effective 32MB of VRAM.</p>
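<p>The back-of-envelope arithmetic behind that trick, using only the figures quoted above, is simple enough:</p>

```python
# Figures from the interview: roughly 2MB of PS2 VRAM left after the
# frame buffer, re-streamed via DMA up to 16 times per frame.
vram_left_mb = 2
swaps_per_frame = 16
effective_texture_mb = vram_left_mb * swaps_per_frame
# 2MB swapped 16 times a frame behaves like 32MB of texture memory,
# which is the "effective 32MB of VRAM" described above.
```
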
<p><strong>As a developer yourself, I am sure you are aware of the trade-off between sacrificing frame rate for resolution and vice versa. This seems to be happening a lot this gen. Why are resolution and frame rate connected to each other, and why do you think developers are struggling to balance the combination?</strong></p>
<p>Resolution is only connected to frame rate insofar as it’s one of the easiest variables to change when you’re looking to gain performance quickly. It’s also one of the most visible and easily measured compromises. If you go from 1280&#215;720 to 1280&#215;640 you can reduce the amount of work done by around 10%. If your pixels are one of the bottlenecks, that immediately transforms into performance, leaving the rest to a cheap up-scaler.</p>
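<p>The pixel arithmetic behind that example is straightforward (assuming a fixed 1280-pixel width and a height drop from 720 to 640 for this illustration):</p>

```python
# Fraction of per-pixel work saved by dropping the back-buffer height
# from 720 to 640 (width assumed fixed at 1280 for this illustration).
def pixel_saving(w1, h1, w2, h2):
    return 1.0 - (w2 * h2) / (w1 * h1)

saving = pixel_saving(1280, 720, 1280, 640)
# saving is ~0.11: roughly the 10% quoted, which converts almost
# directly into frame time when pixel shading is the bottleneck.
```
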
<p>The life of an engine developer is constantly mired in compromise that the average player will never see. Particles can be rendered at a lower resolution, volumes can be traced with fewer steps, calculations can be blended over multiple frames and textures can be streamed at a lower resolution. It’s a hard road: making compromises that the player will never see is difficult and rife with potential side-effects that can take days or months to emerge. As such, making these deep changes late in a project is riskier and less likely to be accepted.</p>
<p>Changing resolution is easier with the most visible side-effect of more aliasing. Late in the game, if you’re going to be criticized anyway, frame rate with minimal tearing is king and most would rather take the punches on resolution.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://gamingbolt.com/celtoys-interview-with-don-williamson-former-fable-engine-lead-looks-ahead/feed</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">252975</post-id>	</item>
		<item>
		<title>Dev On Whether PS4/Xbox One Will Outlast Relatively Short Shelf-Life of Their Components</title>
		<link>https://gamingbolt.com/dev-on-whether-ps4xbox-one-will-outlast-relatively-short-shelf-life-of-their-components</link>
					<comments>https://gamingbolt.com/dev-on-whether-ps4xbox-one-will-outlast-relatively-short-shelf-life-of-their-components#comments</comments>
		
		<dc:creator><![CDATA[Ravi Sinha]]></dc:creator>
		<pubDate>Wed, 30 Dec 2015 17:01:08 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[Celtoys]]></category>
		<category><![CDATA[Microsoft]]></category>
		<category><![CDATA[ps4]]></category>
		<category><![CDATA[sony]]></category>
		<category><![CDATA[Xbox One]]></category>
		<guid isPermaLink="false">http://gamingbolt.com/?p=253249</guid>

					<description><![CDATA[Celtoys founder talks about optimizing hardware for the coming years.]]></description>
										<content:encoded><![CDATA[<p><a href="https://gamingbolt.com/wp-content/uploads/2015/12/Xbox-One-PS4.jpg" rel="attachment wp-att-251784"><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-251784" src="https://gamingbolt.com/wp-content/uploads/2015/12/Xbox-One-PS4.jpg" alt="Xbox One PS4" width="620" height="357" srcset="https://gamingbolt.com/wp-content/uploads/2015/12/Xbox-One-PS4.jpg 620w, https://gamingbolt.com/wp-content/uploads/2015/12/Xbox-One-PS4-300x173.jpg 300w" sizes="auto, (max-width: 620px) 100vw, 620px" /></a></p>
<p>For all the credit that the PS4 and Xbox One receive, there&#8217;s no denying that their components are already quite old. Such is the problem with consoles, which are stuck with the same hardware for years on end. The games still keep looking better thanks to improved optimization, but can this current generation of consoles outlast the relatively short shelf-life of their respective components?</p>
<p>We spoke to Celtoys founder Don Williamson, who has experience with engine and pipeline optimization (besides developing the renderer for Splinter Cell: Conviction, he was an engine lead for Fable). He stated, &#8220;These are powerful machines with fixed architectures and some astonishing debug tools. On one of the games I worked on we created some amazing effects for Xbox One that have given us ideas for how it could be visually much better and more performant. The main variable in all this is how many skilled people you have and how much you’re willing to spend.</p>
<p>&#8220;These days the investment in great engine tech is no longer necessarily rewarding you with better sales. You have fewer developers going with their own solutions, instead investing money in programmers for the core game and those who can maintain a bought 3rd-party engine.</p>
<p>&#8220;You have some established pioneers like Naughty Dog thankfully making leaps and bounds in this area, but I’m unsure right now whether we’ll see as much progress as we did with the previous generation.&#8221;</p>
<p>The last statement from Don is interesting. Given the rate at which graphics are improving on current gen consoles, will developers eventually hit the <em>graphics wall</em>? Several games like The Order: 1886 and Star Wars: Battlefront are already closing the gap between in-game visuals and real life. However, that gap is still far from closed.</p>
<p>Thoughts on the same? Let us know in the comments below.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://gamingbolt.com/dev-on-whether-ps4xbox-one-will-outlast-relatively-short-shelf-life-of-their-components/feed</wfw:commentRss>
			<slash:comments>72</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">253249</post-id>	</item>
		<item>
		<title>Will PS4 /Xbox One Graphics Quality Ever Approach Epic Games&#8217; Infiltrator Demo?</title>
		<link>https://gamingbolt.com/will-ps4-xbox-one-graphics-quality-ever-approach-epic-games-infiltrator-demo</link>
					<comments>https://gamingbolt.com/will-ps4-xbox-one-graphics-quality-ever-approach-epic-games-infiltrator-demo#comments</comments>
		
		<dc:creator><![CDATA[Ravi Sinha]]></dc:creator>
		<pubDate>Mon, 28 Dec 2015 15:23:09 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[Celtoys]]></category>
		<category><![CDATA[ps4]]></category>
		<category><![CDATA[unreal engine 4]]></category>
		<category><![CDATA[Xbox One]]></category>
		<guid isPermaLink="false">http://gamingbolt.com/?p=253154</guid>

					<description><![CDATA[Celtoys founder weighs in on the future of video game visuals.]]></description>
										<content:encoded><![CDATA[<p><a href="https://gamingbolt.com/wp-content/uploads/2013/08/unreal-engine-4-infiltrator-demo-amd.jpg" rel="attachment wp-att-170704"><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-170704" src="https://gamingbolt.com/wp-content/uploads/2013/08/unreal-engine-4-infiltrator-demo-amd.jpg" alt="unreal engine 4 infiltrator demo amd" width="620" height="349" srcset="https://gamingbolt.com/wp-content/uploads/2013/08/unreal-engine-4-infiltrator-demo-amd.jpg 620w, https://gamingbolt.com/wp-content/uploads/2013/08/unreal-engine-4-infiltrator-demo-amd-300x168.jpg 300w" sizes="auto, (max-width: 620px) 100vw, 620px" /></a></p>
<p>For all the great games we&#8217;ve seen on the Xbox One and PS4, it still feels like ages ago <a href="https://gamingbolt.com/ue-4-engine-demo-infiltrator-featured-tons-of-pre-computed-lighting">that Epic Games&#8217; Infiltrator demo</a>, developed using Unreal Engine 4, promised us something that would rival CGI-quality visuals. Is the day too far off that we&#8217;ll see such quality visuals on current gen consoles, the PS4 and Xbox One?</p>
<p>GamingBolt spoke to Celtoys owner Don Williamson, who is an engine/performance specialist, about when we&#8217;ll see visuals from the engine on consoles that could possibly rival big-budget CGI seen in movies.</p>
<p>&#8220;Not for a while,&#8221; Williamson said to GamingBolt. &#8220;Engine choice notwithstanding, we have many years before we’ll catch CGI, if at all. One look down the rabbit-hole of aliasing with an understanding of why movies look so much better at low resolutions than games at higher resolutions will show a little insight.&#8221;</p>
<p>&#8220;In terms of UE4, it’s massively flexible and gives power to many content authors who have been restricted in the past. While you will see much amazing creative output with limited playing fields in the next couple of years, a lot of that flexibility is being used to simply speed up game creation, rather than add more visual fidelity.&#8221;</p>
<p>That being said, there will be a number of titles which will still impress us with their visuals (like Star Wars: Battlefront, The Order: 1886, Until Dawn or even Forza Motorsport 6) as the years roll on. The current gen consoles have a lot of potential, but developers will need to find <em>out-of-the-box</em> methods to achieve optimization and improve performance.</p>
<p>What are your thoughts on the matter? Let us know in the comments and stay tuned for our full interview with Don in the coming days.</p>
<p><iframe loading="lazy" src="https://www.youtube.com/embed/3EJC1edU3Y4" width="620" height="349" frameborder="0" allowfullscreen="allowfullscreen"></iframe></p>
]]></content:encoded>
					
					<wfw:commentRss>https://gamingbolt.com/will-ps4-xbox-one-graphics-quality-ever-approach-epic-games-infiltrator-demo/feed</wfw:commentRss>
			<slash:comments>81</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">253154</post-id>	</item>
		<item>
		<title>DX12&#8217;s Ability To Unify GPUs Is &#8220;A Great Move&#8221;, Xbox One Will Benefit From An Engineering Point of View</title>
		<link>https://gamingbolt.com/dx12s-ability-to-unify-gpus-is-a-great-move-xbox-one-will-benefit-from-an-engineering-point-of-view</link>
					<comments>https://gamingbolt.com/dx12s-ability-to-unify-gpus-is-a-great-move-xbox-one-will-benefit-from-an-engineering-point-of-view#comments</comments>
		
		<dc:creator><![CDATA[Ravi Sinha]]></dc:creator>
		<pubDate>Wed, 18 Nov 2015 15:26:19 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[Celtoys]]></category>
		<category><![CDATA[DirectX 12]]></category>
		<category><![CDATA[Microsoft]]></category>
		<category><![CDATA[Xbox One]]></category>
		<guid isPermaLink="false">http://gamingbolt.com/?p=249143</guid>

					<description><![CDATA[Celtoys founder Don Williamson talks about the potential of DirectX 12.]]></description>
										<content:encoded><![CDATA[<p><a href="https://gamingbolt.com/wp-content/uploads/2015/11/DirectX-12.jpg"><img loading="lazy" decoding="async" class="aligncenter size-full wp-image-249251" src="https://gamingbolt.com/wp-content/uploads/2015/11/DirectX-12.jpg" alt="DirectX 12" width="620" height="349" srcset="https://gamingbolt.com/wp-content/uploads/2015/11/DirectX-12.jpg 620w, https://gamingbolt.com/wp-content/uploads/2015/11/DirectX-12-300x169.jpg 300w" sizes="auto, (max-width: 620px) 100vw, 620px" /></a></p>
<p>Even though DirectX 12 is available now with Windows 10 and the operating system itself has begun to integrate with the Xbox One with the New Xbox One Experience, it will still take time before all games are developed with the API in mind. That being said, DirectX 12 has a ton of potential in what it can achieve, especially when it comes to Microsoft&#8217;s gaming push.</p>
<p>GamingBolt spoke to Celtoys founder Don Williamson, an ex-Lionhead member who headed up the Fable engine and who currently specializes in performance and engine optimization, about some of the factors that will determine whether we see more DirectX 12-built games in the coming years. Williamson stated, &#8220;Nobody really cared about DirectX 11 because there was assumed to be no money building your engine for it. With Xbox 360, you already had a lot of the features of DX11 and more, with lower level access and a DX9-like API. Porting this to DX9 for PC was simple enough, with the lion&#8217;s share of work going into compatibility and performance &#8211; adding a new API was just a recipe for trouble.</p>
<p>&#8220;The Xbox One and PC seem to have more financial impact on sales than the previous generation and if Microsoft manage the Xbox One DirectX 12 API effectively, the up-take should be good. It’s more complicated to develop for than DirectX 11 but more and more people are using engines like UE4 and Unity so the impact should be lessened.&#8221;</p>
<p>As for the belief that DirectX 12 could be used for better visuals on the Xbox One and how it could benefit the hardware overall, Williamson said, &#8220;The innovation in DirectX 12 is its ability to unify a large number of GPUs with a single, non-intrusive API; it’s such a great move from Microsoft. I think the Xbox One software team will already have made their version of DirectX 11 very close to the metal so the main benefit will be from an engineering point of view, where you can realistically drop your DirectX 11 version altogether. The simpler platform will allow more time to be invested in optimizing the game.&#8221;</p>
<p>What are your thoughts on all this? Let us know in the comments below, and stay tuned for our full interview with Don in the coming days.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://gamingbolt.com/dx12s-ability-to-unify-gpus-is-a-great-move-xbox-one-will-benefit-from-an-engineering-point-of-view/feed</wfw:commentRss>
			<slash:comments>133</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">249143</post-id>	</item>
		<item>
		<title>PS4 &#8211; Xbox One GPU ROP Differences Don&#8217;t Matter, Numbers Only Make Sense for Building Random Data</title>
		<link>https://gamingbolt.com/ps4-xbox-one-gpu-rop-differences-dont-matter-numbers-only-make-sense-for-building-random-data</link>
					<comments>https://gamingbolt.com/ps4-xbox-one-gpu-rop-differences-dont-matter-numbers-only-make-sense-for-building-random-data#comments</comments>
		
		<dc:creator><![CDATA[Ravi Sinha]]></dc:creator>
		<pubDate>Sun, 08 Nov 2015 19:10:06 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[Celtoys]]></category>
		<category><![CDATA[ps4]]></category>
		<category><![CDATA[Xbox One]]></category>
		<guid isPermaLink="false">http://gamingbolt.com/?p=248256</guid>

					<description><![CDATA[Celtoys' Don Williamson says "different hardware achieves the same effects in different ways".]]></description>
										<content:encoded><![CDATA[<p><a href="https://gamingbolt.com/wp-content/uploads/2014/03/PS4-Xbox-one.jpg"><img loading="lazy" decoding="async" src="https://gamingbolt.com/wp-content/uploads/2014/03/PS4-Xbox-one.jpg" alt="PS4 Xbox one" width="620" height="349" class="aligncenter size-full wp-image-191539" srcset="https://gamingbolt.com/wp-content/uploads/2014/03/PS4-Xbox-one.jpg 620w, https://gamingbolt.com/wp-content/uploads/2014/03/PS4-Xbox-one-300x168.jpg 300w" sizes="auto, (max-width: 620px) 100vw, 620px" /></a></p>
<p>When it comes to the PS4 and Xbox One, much is made of the differences in the RAM and GPUs the two consoles are touting. Of course, the exact nature of these GPUs hasn&#8217;t been fully revealed, but judging by the quality of some third-party titles on PS4 versus Xbox One, it&#8217;s easy to conclude that one is better than the other. But in which respects?</p>
<p>GamingBolt spoke to Celtoys founder Don Williamson, an engine and pipeline developer. Not only did he create the engine for Ubisoft&#8217;s Splinter Cell: Conviction, but he also led development of the Fable engine. Nowadays, Williamson is working on his own game, which promises a never-before-seen scope, while helping other developers better optimize their games. GamingBolt asked Williamson about the differences between the PS4 and Xbox One GPUs, along with their ROP counts. How do these numbers figure into practical scenarios?</p>
<p>As Williamson states, &#8220;Data such as this really doesn’t matter as different hardware achieves the same effects in different ways. Coupled with different engines being built to take advantage of different hardware peculiarities, the numbers really only make sense if you want to build graphs of random data and strenuously imply one is better than the other. The obvious caveat here is that both are modified GCN architectures (Liverpool and Durango) so they’re closer to each other and easier to compare than previous generations. </p>
<p>&#8220;A great example of this was the old PS2: you had effectively 2MB of VRAM left after you stored your frame buffer. The Dreamcast had 8MB total and the Xbox had 64MB shared RAM. I remember at the time everybody going crazy about this, trying to demonstrate the inferiority of the PS2 in comparisons. What was hard to explain at the time was that the PS2&#8217;s DMA system was so fast that, if you were clever enough, you could swap this out 16 times a frame and get an effective 32MB of VRAM.&#8221;</p>
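For scale, here is a quick back-of-the-envelope sketch of the numbers behind that "effective 32MB" claim. The 2MB and 16-swaps-per-frame figures come from the quote above; the 60 fps target is our assumption, as the interview doesn't specify a frame rate:

```python
# Back-of-the-envelope check of the "effective 32MB" claim.
usable_vram_mb = 2    # VRAM left after the frame buffer, per the quote
swaps_per_frame = 16  # DMA texture swaps per frame, per the quote
fps = 60              # assumed frame rate (not stated in the interview)

# Total unique texture data streamed through VRAM each frame:
effective_vram_mb = usable_vram_mb * swaps_per_frame
print(effective_vram_mb)  # 32 -> the "effective 32MB" in the quote

# Sustained DMA traffic needed to keep that up, every second:
mb_per_second = effective_vram_mb * fps
print(mb_per_second)      # 1920 MB/s, i.e. roughly 1.9 GB/s
```

In other words, the trick trades the PS2's tiny VRAM for sustained DMA bandwidth on the order of 2 GB/s, which is why it only worked "if you were clever enough" about scheduling the transfers.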
<p>There is more yet to learn about each current-gen console and the power it boasts, so stay tuned. In the meantime, let us know what you think of Williamson&#8217;s thoughts in the comments.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://gamingbolt.com/ps4-xbox-one-gpu-rop-differences-dont-matter-numbers-only-make-sense-for-building-random-data/feed</wfw:commentRss>
			<slash:comments>142</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">248256</post-id>	</item>
	</channel>
</rss>
