<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>
	Comments on: Do PS4/Xbox One GPUs Feature Enough Compute Units to Utilize Full Potential of Tiled Streaming?	</title>
	<atom:link href="https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming/feed" rel="self" type="application/rss+xml" />
	<link>https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming</link>
	<description>Get a Bolt of Gaming Now!</description>
	<lastBuildDate>Tue, 29 Apr 2014 20:16:00 +0000</lastBuildDate>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>
	<item>
		<title>
		By: Cigi		</title>
		<link>https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-227493</link>

		<dc:creator><![CDATA[Cigi]]></dc:creator>
		<pubDate>Tue, 29 Apr 2014 20:16:00 +0000</pubDate>
		<guid isPermaLink="false">http://gamingbolt.com/?p=192178#comment-227493</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-227415&quot;&gt;justerthought&lt;/a&gt;.

Sorry, but you are very wrong. You have to take the Move engines into account.

Also, as mentioned in this post, there is a difference between PRT and TR. Yes, they are both hardware implementations, but the Xbox One has gone a step further and created the ENTIRE system to be low latency and capable of &#039;MOVE&#039;ing textures in a very low-latency fashion.

You see, anyone can go have a look at a very OLD game called RAGE. RAGE had one MASSIVE texture which was the entire game; they used &#039;megatextures&#039; to &#039;STREAM&#039; in the textures as they were needed.

It was slow, and due to two factors it was very limiting.

1) All textures were extremely compressed. Because the game used one large image for the entire world instead of reusing the same image over and over (like normal games, which have boring sceneries), all the texture data was unique; no part of the game was the same as any other. What this meant was hundreds of GBs of texture data, and they had to compress the &#039;crap&#039; out of it, so much so that the Xbox 360 version of the game needed three DVDs to install.

*We have Blu-ray discs now, which allow for much higher quality textures.

2) Higher quality textures would then mean longer &#039;pop-in&#039; times for images on the screen, because with megatextures it was all done in software.

*This is where the Xbox One&#039;s Move engines, low-latency DDR3 RAM and overall low-latency system come into consideration, as they mean very high quality textures can be streamed in without taxing the GPU (whereas on the PS4 the GPU itself will be used to stream data in).

*Secondly, because they are now only streaming the data they need (only what you can see, instead of entire textures, which is what is done currently), you no longer need GDDR5-level bandwidth and will only need a tiny fraction of it. That means not only is DDR3 better suited to the CPU, it will also be better for Tiled Resources, as its latency is much better than GDDR5&#039;s.

The other thing to note is that, like with RAGE, you could have 1080p graphics at a &#039;constant&#039; 60 fps with massive unique worlds while using next to nothing of the GPU.

This is just one of the many reasons why I also said that when MS gets devs to use TR, things are going to shine (and guess what, we are now seeing that Sony will be using the likes of Granite... funny that, because they know it is the future, and the Xbox One is already built to exploit this).

http://gamingbolt.com/granite-sdk-interview-delivering-next-gen-texture-streaming-and-compression-middleware#sF6TsVGHWtbtzM9Q.99

Perhaps the biggest benefit of using the Granite SDK is less taxation on the GPU side of things. This is especially relevant in the case of the PS4 and Xbox One, which have far fewer compute units available [18 and 12 respectively] compared to a high-end PC GPU. So the question of whether those GPUs have enough compute units to fully utilize the potential of the Granite SDK is irrelevant, according to Aljosha.


So there goes your point of 18 vs 12 CUs.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-227415">justerthought</a>.</p>
<p>Sorry, but you are very wrong. You have to take the Move engines into account.</p>
<p>Also, as mentioned in this post, there is a difference between PRT and TR. Yes, they are both hardware implementations, but the Xbox One has gone a step further and created the ENTIRE system to be low latency and capable of &#8216;MOVE&#8217;ing textures in a very low-latency fashion.</p>
<p>You see, anyone can go have a look at a very OLD game called RAGE. RAGE had one MASSIVE texture which was the entire game; they used &#8216;megatextures&#8217; to &#8216;STREAM&#8217; in the textures as they were needed.</p>
<p>It was slow, and due to two factors it was very limiting.</p>
<p>1) All textures were extremely compressed. Because the game used one large image for the entire world instead of reusing the same image over and over (like normal games, which have boring sceneries), all the texture data was unique; no part of the game was the same as any other. What this meant was hundreds of GBs of texture data, and they had to compress the &#8216;crap&#8217; out of it, so much so that the Xbox 360 version of the game needed three DVDs to install.</p>
<p>*We have Blu-ray discs now, which allow for much higher quality textures.</p>
<p>2) Higher quality textures would then mean longer &#8216;pop-in&#8217; times for images on the screen, because with megatextures it was all done in software.</p>
<p>*This is where the Xbox One&#8217;s Move engines, low-latency DDR3 RAM and overall low-latency system come into consideration, as they mean very high quality textures can be streamed in without taxing the GPU (whereas on the PS4 the GPU itself will be used to stream data in).</p>
<p>*Secondly, because they are now only streaming the data they need (only what you can see, instead of entire textures, which is what is done currently), you no longer need GDDR5-level bandwidth and will only need a tiny fraction of it. That means not only is DDR3 better suited to the CPU, it will also be better for Tiled Resources, as its latency is much better than GDDR5&#8217;s.</p>
<p>The other thing to note is that, like with RAGE, you could have 1080p graphics at a &#8216;constant&#8217; 60 fps with massive unique worlds while using next to nothing of the GPU.</p>
<p>This is just one of the many reasons why I also said that when MS gets devs to use TR, things are going to shine (and guess what, we are now seeing that Sony will be using the likes of Granite&#8230; funny that, because they know it is the future, and the Xbox One is already built to exploit this).</p>
<p><a href="http://gamingbolt.com/granite-sdk-interview-delivering-next-gen-texture-streaming-and-compression-middleware#sF6TsVGHWtbtzM9Q.99" rel="ugc">http://gamingbolt.com/granite-sdk-interview-delivering-next-gen-texture-streaming-and-compression-middleware#sF6TsVGHWtbtzM9Q.99</a></p>
<p>Perhaps the biggest benefit of using the Granite SDK is less taxation on the GPU side of things. This is especially relevant in the case of the PS4 and Xbox One, which have far fewer compute units available [18 and 12 respectively] compared to a high-end PC GPU. So the question of whether those GPUs have enough compute units to fully utilize the potential of the Granite SDK is irrelevant, according to Aljosha.</p>
<p>So there goes your point of 18 vs 12 CUs.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: justerthought		</title>
		<link>https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-227415</link>

		<dc:creator><![CDATA[justerthought]]></dc:creator>
		<pubDate>Tue, 29 Apr 2014 07:51:00 +0000</pubDate>
		<guid isPermaLink="false">http://gamingbolt.com/?p=192178#comment-227415</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223270&quot;&gt;Cigi&lt;/a&gt;.

What you fail to understand is that a tiled ESRAM stream is strictly linear. Any change in direction requires a restarted stream. Half-built textures are scrapped to build the new one, or new ones cannot start until the previous one is complete. Processing is wasted, resulting in latency and pop-in.


When a game requires multiple textures at the same time in response to where the player goes in an open world, it is clear that an ESRAM linear tiled stream is not going to be powerful enough, no matter how fast it runs. It might be OK for a linear tiled graphical tunnel racer like Forza, where what textures are needed around the next bend can be predicted in advance, but not in an open world, where the game cannot predict where the player will look next.

The PS4 runs all its data in parallel almost as fast as the 32MB ESRAM linear stream maxed out. And don&#039;t forget, all these textures you&#039;re building up with linear streams: where are they going to go while you&#039;re rendering the frame? They cannot go back into the ESRAM; it&#039;s only a short-term cache. On a PC they go into the GDDR5 RAM. They cannot float in thin air, so on XB1 they have to go into the slow DDR3 RAM, with the GPU still waiting for data while drawing frames into the frame buffer.

ESRAM on the die, as you put it, is a major flaw. In order to fit it on, the GPU had to be cut in size. The PS4 GPU has 50% more compute units than the XB1, 18 vs 12. That performance drop can never be reclaimed. Latency is not an issue for graphics. Graphics deals with large data sizes, not lots of small transfers in quick succession like you get with general computing on a PC, where DDR3 is put to best use. For gaming, speed and parallel processing are more important than latency, so that is why every gaming PC worth any merit has at least 2GB of GDDR5 RAM attached to the GPU.

Dream on, buddy. You&#039;ll eventually get the message. I&#039;m going to call you the fanboy, because your sensitivity to the term is very telling.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223270">Cigi</a>.</p>
<p>What you fail to understand is that a tiled ESRAM stream is strictly linear. Any change in direction requires a restarted stream. Half-built textures are scrapped to build the new one, or new ones cannot start until the previous one is complete. Processing is wasted, resulting in latency and pop-in.</p>
<p>When a game requires multiple textures at the same time in response to where the player goes in an open world, it is clear that an ESRAM linear tiled stream is not going to be powerful enough, no matter how fast it runs. It might be OK for a linear tiled graphical tunnel racer like Forza, where what textures are needed around the next bend can be predicted in advance, but not in an open world, where the game cannot predict where the player will look next.</p>
<p>The PS4 runs all its data in parallel almost as fast as the 32MB ESRAM linear stream maxed out. And don&#8217;t forget, all these textures you&#8217;re building up with linear streams: where are they going to go while you&#8217;re rendering the frame? They cannot go back into the ESRAM; it&#8217;s only a short-term cache. On a PC they go into the GDDR5 RAM. They cannot float in thin air, so on XB1 they have to go into the slow DDR3 RAM, with the GPU still waiting for data while drawing frames into the frame buffer.</p>
<p>ESRAM on the die, as you put it, is a major flaw. In order to fit it on, the GPU had to be cut in size. The PS4 GPU has 50% more compute units than the XB1, 18 vs 12. That performance drop can never be reclaimed. Latency is not an issue for graphics. Graphics deals with large data sizes, not lots of small transfers in quick succession like you get with general computing on a PC, where DDR3 is put to best use. For gaming, speed and parallel processing are more important than latency, so that is why every gaming PC worth any merit has at least 2GB of GDDR5 RAM attached to the GPU.</p>
<p>Dream on, buddy. You&#8217;ll eventually get the message. I&#8217;m going to call you the fanboy, because your sensitivity to the term is very telling.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Xtreme Derp		</title>
		<link>https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223350</link>

		<dc:creator><![CDATA[Xtreme Derp]]></dc:creator>
		<pubDate>Mon, 07 Apr 2014 16:53:00 +0000</pubDate>
		<guid isPermaLink="false">http://gamingbolt.com/?p=192178#comment-223350</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223348&quot;&gt;GHz&lt;/a&gt;.

I don&#039;t think I said TR and PRT are the same thing, they&#039;re different implementations between DirectX and OpenGL. Didn&#039;t mean to give that impression.

There&#039;s a bit of confusion in that AMD calls its hardware function &quot;PRT&quot;, but &quot;PRT&quot; is also sometimes used to refer to OpenGL&#039;s implementation. I think the &quot;official&quot; name in OpenGL is the &quot;sparse texture&quot; function or something like that.

Tiled Resources could be more advanced than PRT; I&#039;m not exactly sure. The PS4&#039;s API is a custom derivative of OpenGL, so Sony engineers could have written their own way of accessing the GCN feature set.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223348">GHz</a>.</p>
<p>I don&#8217;t think I said TR and PRT are the same thing, they&#8217;re different implementations between DirectX and OpenGL. Didn&#8217;t mean to give that impression.</p>
<p>There&#8217;s a bit of confusion in that AMD calls its hardware function &#8220;PRT&#8221;, but &#8220;PRT&#8221; is also sometimes used to refer to OpenGL&#8217;s implementation. I think the &#8220;official&#8221; name in OpenGL is the &#8220;sparse texture&#8221; function or something like that.</p>
<p>Tiled Resources could be more advanced than PRT; I&#8217;m not exactly sure. The PS4&#8217;s API is a custom derivative of OpenGL, so Sony engineers could have written their own way of accessing the GCN feature set.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: GHz		</title>
		<link>https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223348</link>

		<dc:creator><![CDATA[GHz]]></dc:creator>
		<pubDate>Mon, 07 Apr 2014 16:47:00 +0000</pubDate>
		<guid isPermaLink="false">http://gamingbolt.com/?p=192178#comment-223348</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223300&quot;&gt;Xtreme Derp&lt;/a&gt;.

You&#039;re absolutely right about the PRT/GCN connection, and that PRT is supported in both consoles, but you&#039;re wrong in believing that TR and PRT are exactly the same. They share similar philosophies, but TR is more advanced at the moment. PRT is limited to textures, while TR can handle other resources besides textures, hence the reason they refer to it as &quot;resources&quot; and not &quot;textures&quot;. Implementing shadows through shadow mapping is one example.


But OpenGL can probably add extensions in due time to bring it up to speed with TR. In the meantime, from what Graphine Software and MS had to say, TR is still the better deal for now.


Thanks for the correction though! Much appreciated! :)]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223300">Xtreme Derp</a>.</p>
<p>You&#8217;re absolutely right about the PRT/GCN connection, and that PRT is supported in both consoles, but you&#8217;re wrong in believing that TR and PRT are exactly the same. They share similar philosophies, but TR is more advanced at the moment. PRT is limited to textures, while TR can handle other resources besides textures, hence the reason they refer to it as &#8220;resources&#8221; and not &#8220;textures&#8221;. Implementing shadows through shadow mapping is one example.</p>
<p>But OpenGL can probably add extensions in due time to bring it up to speed with TR. In the meantime, from what Graphine Software and MS had to say, TR is still the better deal for now.</p>
<p>Thanks for the correction though! Much appreciated! 🙂</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Guest		</title>
		<link>https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223333</link>

		<dc:creator><![CDATA[Guest]]></dc:creator>
		<pubDate>Mon, 07 Apr 2014 14:38:00 +0000</pubDate>
		<guid isPermaLink="false">http://gamingbolt.com/?p=192178#comment-223333</guid>

					<description><![CDATA[Microsoft has bet the farm on DX12, and to ensure a win they need to give it a HUGE install base. X1 sales alone are not going to achieve that, we can all agree, so by also releasing Windows Threshold and making D3D12 universal they exponentially increased the potential install base and relevance of DX, to the point where the optimizations (e.g. tiled resources) will be something devs will take time to program. The X1 was designed specifically for DX12 and those optimizations, and we&#039;ll hopefully see that at E3 this year with announced games.


The PS4 has the brute force, and it will remain in the lead for the foreseeable future, but I do believe parity is on the horizon... so long as MS can prove to devs that optimizing through DX12 is worth their time.]]></description>
			<content:encoded><![CDATA[<p>Microsoft has bet the farm on DX12, and to ensure a win they need to give it a HUGE install base. X1 sales alone are not going to achieve that, we can all agree, so by also releasing Windows Threshold and making D3D12 universal they exponentially increased the potential install base and relevance of DX, to the point where the optimizations (e.g. tiled resources) will be something devs will take time to program. The X1 was designed specifically for DX12 and those optimizations, and we&#8217;ll hopefully see that at E3 this year with announced games.</p>
<p>The PS4 has the brute force, and it will remain in the lead for the foreseeable future, but I do believe parity is on the horizon&#8230; so long as MS can prove to devs that optimizing through DX12 is worth their time.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Johnny		</title>
		<link>https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223319</link>

		<dc:creator><![CDATA[Johnny]]></dc:creator>
		<pubDate>Mon, 07 Apr 2014 12:44:00 +0000</pubDate>
		<guid isPermaLink="false">http://gamingbolt.com/?p=192178#comment-223319</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223277&quot;&gt;GHz&lt;/a&gt;.

A lot of people don&#039;t understand what you are saying, sadly enough. Too many keyboard devs these days.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223277">GHz</a>.</p>
<p>A lot of people don&#8217;t understand what you are saying, sadly enough. Too many keyboard devs these days.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: You are flat out wrong		</title>
		<link>https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223305</link>

		<dc:creator><![CDATA[You are flat out wrong]]></dc:creator>
		<pubDate>Mon, 07 Apr 2014 05:00:00 +0000</pubDate>
		<guid isPermaLink="false">http://gamingbolt.com/?p=192178#comment-223305</guid>

					<description><![CDATA[This told us absolutely nothing. Thanks!]]></description>
			<content:encoded><![CDATA[<p>This told us absolutely nothing. Thanks!</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Xtreme Derp		</title>
		<link>https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223301</link>

		<dc:creator><![CDATA[Xtreme Derp]]></dc:creator>
		<pubDate>Mon, 07 Apr 2014 03:42:00 +0000</pubDate>
		<guid isPermaLink="false">http://gamingbolt.com/?p=192178#comment-223301</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223270&quot;&gt;Cigi&lt;/a&gt;.

Latency is a non-issue, PS4 has hardware level support for PRT like all AMD GCN GPUs.

This is just typical ignorant fanboys grasping at straws over things they don&#039;t understand, which will have little to no impact on the graphics performance gap.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223270">Cigi</a>.</p>
<p>Latency is a non-issue, PS4 has hardware level support for PRT like all AMD GCN GPUs.</p>
<p>This is just typical ignorant fanboys grasping at straws over things they don&#8217;t understand, which will have little to no impact on the graphics performance gap.</p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Xtreme Derp		</title>
		<link>https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223300</link>

		<dc:creator><![CDATA[Xtreme Derp]]></dc:creator>
		<pubDate>Mon, 07 Apr 2014 03:40:00 +0000</pubDate>
		<guid isPermaLink="false">http://gamingbolt.com/?p=192178#comment-223300</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223277&quot;&gt;GHz&lt;/a&gt;.

Completely wrong. All AMD GPUs based on their GCN architecture have hardware support for PRT. That includes both consoles.

http://www.anandtech.com/show/4455/amds-graphics-core-next-preview-amd-architects-for-compute/5

http://www.amd.com/en-us/innovations/software-technologies/gcn]]></description>
			<content:encoded><![CDATA[<p>In reply to <a href="https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223277">GHz</a>.</p>
<p>Completely wrong. All AMD GPUs based on their GCN architecture have hardware support for PRT. That includes both consoles.</p>
<p><a href="http://www.anandtech.com/show/4455/amds-graphics-core-next-preview-amd-architects-for-compute/5" rel="nofollow ugc">http://www.anandtech.com/show/4455/amds-graphics-core-next-preview-amd-architects-for-compute/5</a></p>
<p><a href="http://www.amd.com/en-us/innovations/software-technologies/gcn" rel="nofollow ugc">http://www.amd.com/en-us/innovations/software-technologies/gcn</a></p>
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Rick Rambo		</title>
		<link>https://gamingbolt.com/do-ps4xbox-one-gpus-feature-enough-compute-units-to-utilize-full-potential-of-tiled-streaming#comment-223293</link>

		<dc:creator><![CDATA[Rick Rambo]]></dc:creator>
		<pubDate>Mon, 07 Apr 2014 00:56:00 +0000</pubDate>
		<guid isPermaLink="false">http://gamingbolt.com/?p=192178#comment-223293</guid>

					<description><![CDATA[Shut up..... everyone. I have never read or heard so much bullish!t in my life. My advice: live and let live. If you have a PS4, love it. If you have an XB1, love it. Bunch of dumba$$ haters on here!!!]]></description>
			<content:encoded><![CDATA[<p>Shut up&#8230;.. everyone. I have never read or heard so much bullish!t in my life. My advice: live and let live. If you have a PS4, love it. If you have an XB1, love it. Bunch of dumba$$ haters on here!!!</p>
]]></content:encoded>
		
			</item>
	</channel>
</rss>
