PNY Reveal The GTX 970 & GTX 980

Can I have one? Please?

Posted By | On 22nd, Sep. 2014 Under News | Follow This Author @Martintoney2012


PNY GeForce

Sometimes PC gamers get some love too! I mean, outside of huge sales, better visuals and sound, more options… yeah yeah, satire. Seriously though, PNY Technologies, the Goliath tech manufacturer, has revealed its latest entries into the top tier graphics card market.

PNY have announced the release of the GeForce GTX 970 & GTX 980 graphics cards, featuring all the latest and greatest from that amazing company that all PC gamers love, Nvidia. The cards are set to offer excellent power efficiency. The GTX 980 comes stacked with 4GB of GDDR5 running at a memory speed of 7000MHz, with a base clock of 1126MHz and a boost clock of 1216MHz.

The GTX 970 is a little bit behind, but even it packs a considerable wallop: 1664 CUDA cores, a base clock of 1051MHz and a boost clock of 1178MHz, and like its big brother it rocks 4GB of GDDR5.

Some of you may think that AMD is the way to go for cheap gaming on a PC, but you can get the GTX 970 for £259 and the GTX 980 for £429, recommended retail price. And both come with a three-year guarantee. That’s practically giving them away!

Apparently, the GTX 980 is the fastest graphics card currently available on the market. Isn’t that a fancy claim!



  • Ricoh123

    The burden of upgrading makes PC gaming not worth it.

    My console is set to play all the newest games for the next 10 years.

    No thanks to this waste of money that will never even be able to play the very best exclusives the gaming industry has to offer.

    • Steve

      “My console is set to play the newest games for the next 10 years”.

      If you enjoy playing games at 720-900p @ 30-50Hz, all the more power to you. My current system cost a fortune compared to any console. However, I can game comfortably at 1440p @ 120Hz (max refresh rate). I also have dual 1080p monitors on the side that I can use for reference material, voice chat UIs, music, editors, etc., which could never be done on a second monitor with a console.

      I’m not sure what burden you’re talking about as far as upgrading PCs is concerned. Most desktops have 7 parts in them, if you include the case: motherboard, CPU, video card, power supply, hard drive, memory. None of them are tremendously complicated or beyond comprehension. There are over 1000 tools to choose from to find out where your system is lacking in performance and how to upgrade it.

      Not only that, but the one upgrade that would have given consoles an incredible performance boost would have been solid state drives. However, the SATA controllers they put inside them are so poor in specification that any solid state drive will only peak at HDD speeds, resulting in no considerable increase in loading times or performance.

      To be fair, it could certainly be argued that the equivalent of the PS4 or Xbox One’s video card is the AMD Radeon 7770. Although the 970 costs about the same as the average console, it certainly earns its value in performance, putting out 3x the graphical workload in the same amount of time. Arguing that this card is a “waste of money”, however, doesn’t make me feel confident in your ability to reason.

      Worse yet is the simple claim that these cards “will never even be able to play the very best exclusives the gaming industry has to offer”. In 2 more years, some of the games that come out on PC will be available on console. I’m sorry to tell you this, but most of those games are going to be rendered at what amounts to the “Low” quality preset of their PC equivalents.

      The way I see it, console gamers are paying more money. PC gamers pay more for hardware, and console gamers pay much more for games. I have a Steam library with over 200 games that I can still play to this day, and I haven’t spent more than 1200 dollars on those 200 games. Worse still is the fact that companies like Microsoft and Sony see that preventing cross-compatibility between Xbox 360 > Xbox One and PS3 > PS4 is an effective method to keep sales for the Xbox 360 and PS3 alive and healthy. This is unbelievable to me, but an industry standard.

      If you asked an individual which he would prefer: the same thing over and over for the next 10 years for cheap, or something he can improve whenever it needs to be improved but with additional cost over time, which do you think he is more likely to take?

    • Ricoh123

      Did you enjoy the Forza 2 demo?

      Exactly.

    • Steve

      You’re really trying to say that exclusive games make consoles superior now? If we want to play to that tune, there are about 20,000 different titles I can pull out of my hat. You didn’t read a word I said. You can’t even enjoy Grid 2 or Need for Speed Shift 1 or 2 because the bar is set too low for consoles, and you can never play an experimental soft-body physics driving game such as BeamNG.Drive on any current generation console. There are about 15 upcoming racing games on the Steam marketplace alone for PC. Enjoy Forza 2, my friend. It’s the least I can say for your sake.

    • Ricoh123

      I’ll contemplate your 20,000 PC exclusives and raise you my 500,000 Android exclusives.

      Quantity is clearly greater than quality in your books, therefore my phone beats your pc. 😉

    • Steve

      How can someone who believes consoles are superior to everything say anyone who understands PC gaming hardware doesn’t care about quality? You want to talk about quality? Space Engineers. BeamNG.Drive. Elite Dangerous. 7 Days to Die. Star Citizen. These games aren’t even released, and neither is yours.

      PC has greater mod support than any console can ever have. You’re quick to change your argument, first stating that these cards are a waste of money, then trying to put it as a personal attack against PCs in general.

      You can’t even get your game of choice’s name right. “Did you get the Forza 2 demo?” For god’s sake, get your games right if you’re going to talk about them. Forza Horizon 2. That way I’m not staring at a game from 2007 trying to figure out what you’re talking about. Forza Horizon 2 will run at 1080p max @ 30 frames per second. The vehicle collision deformation brings me utter disappointment, and you think this game is the pinnacle of console quality. Looks like it’s just going to be another thumb-smashing, wall-grinding console racer.

      I just don’t understand how low resolution games on consoles are superior to higher resolution games on PCs. Not only that, but even at higher resolutions PCs are capable of higher framerates, even with the cards mentioned in the article. That’s like arguing a fire hydrant is bigger than a skyscraper. Explain it to me and I’ll concede to you on this discussion.

      You never answered my implied question: what burden of upgrading PCs is there? And please don’t let that be the only question you answer. I’m tired of replying to you and getting something as ridiculous as “FORZA 2 I WIN” and “ANDROID HAS MORE I WIN QUANTITY VS QUALITY”. I didn’t stop here to argue with you. The PS4 has more graphical power than any console on the market right now, yet it only has 1.8 teraflops of graphical computing power. The Xbox One has 1.3 teraflops of graphical computing power. The GTX 980 has 4.6 teraflops of graphical computing power, and the 970 can pull up to 4.0. I stopped here to keep you informed, and to hopefully prevent you from making such misinformed, ignorant statements in the future. If you’re a troll, please, keep going. I won’t be here.

    • Ricoh123

      All the teraflops in the world, and you’ll still never be able to play the best games being released.

      Enjoy your burden.

    • Steve

      Reading through your other posts I can tell you’re an Xbox One fanatic. Let me just pick one of your posts at random and discuss it with you, since you seem to love discussion because, let’s face it, you keep responding to my posts. You are completely ignorant of everything in the field of hardware architecture and have no idea what you are talking about. You trash the PS4 and don’t understand any of the hardware. You believe that textures are stored on the hard drive on the Xbox, and therefore the Xbox is superior to the PS4 because the PS4 stores them in RAM. News flash: it’s stored in the RAM. Every texture ever, when loading a game, gets loaded into the RAM. They BOTH have 8GB of RAM.

      The differences between software are matters of opinion. People with your know-it-all, I-don’t-need-to-know-more attitude utterly disgust me. I’m perfectly open to being corrected and learning more. Enjoy your hardware that’s obsolete for the next 10 years. I’m going to go enjoy Skyrim with 300 mods and high-resolution texture packs. You can enjoy your DLC and the constant stream of games you love so much by EA; after all, they take up half your marketplace.

      I’m the one with the burden? I pity the troll.

    • Ricoh123

      Wrong.

      With tiled resources, the textures are streamed from the hard drive.

      32MB of ESRAM can stream 6GB of textures in chunks.

      Can’t be bothered to explain more, but your knowledge of architecture is stuck in 2010.

    • Steve

      Are you kidding me? 32MB can stream 6GB of textures in chunks? ESRAM has a bandwidth of 109GB/s (source: Microsoft). But there’s only 32MB of it. What are you talking about? How can you- What are you talking about!? The maximum bandwidth of the Xbox One’s mechanical hard drive is 130MB/s. The bandwidth of the system according to Microsoft is 192GB/s. I would have to assume most of this is video bandwidth, because no system memory moves data this fast. You can’t take 130MB/s and put it through pure magic to get it to the 109GB/s limit of ESRAM. That’s called a bottleneck. I feel so bad for you. I really do. You have no idea.
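
      To put rough numbers on that bottleneck argument, here is a minimal back-of-the-envelope sketch in Python; the constants are the bandwidth figures quoted in this thread, not independently verified measurements:

      ```python
      # Rough bottleneck check using the figures quoted in the thread
      # (assumed values, not verified measurements).
      HDD_MB_PER_S = 130      # Xbox One mechanical HDD throughput (quoted)
      ESRAM_GB_PER_S = 109    # ESRAM bandwidth per Microsoft (quoted)
      TEXTURE_POOL_GB = 6     # texture pool claimed to be streamable

      hdd_seconds = TEXTURE_POOL_GB * 1024 / HDD_MB_PER_S
      esram_seconds = TEXTURE_POOL_GB / ESRAM_GB_PER_S

      print(f"6GB over the HDD link:   {hdd_seconds:.0f} s")    # ~47 s
      print(f"6GB over the ESRAM link: {esram_seconds:.3f} s")  # ~0.055 s
      # However fast ESRAM is, data that lives on the HDD arrives at HDD
      # speed; the slowest link in the chain sets the pace.
      ```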

    • Ricoh123

      My young, young padawan……

      The ESRAM doesn’t store the textures, it stores an index to the textures. Because it is super fast, textures can be referenced from the hard drive at breakneck speeds.

      32MB of index equates to 6GB of accessible textures.

      That’s not even the clever part.

      The clever part is the GPU never processes this. It’s all done by the CPU.

      How’s that 1.3 teraflops looking now, huh, knowing not one flop of it will ever be processing textures?

      The net result is a far more efficient and technically capable system than the PS4, and an architecture not yet found in any PC.

      Of course, when a developer uses the 32MB of ESRAM to process actual textures in 32MB chunks, the result will be a lower resolution game.

      But when used properly it is a system more efficient than any current PC.

      I can’t name one PC driving game that runs with the draw distance of FH2 with zero frame loss. Every PC game I’ve ever encountered has some form of microstutter.

      Doesn’t happen once on FH2. Not once. Superior smoothness.

    • Steve

      First off, thank you for leaving me a reply longer than two lines.

      So let me get this straight. You think that RAM, regular GPU or system RAM, has slower indexing than if it were indexed on an ESRAM chip? Not only that, but the CPU does the texture handling using its compute power? In what universe does anything combined with a traditional hard drive produce breakneck speeds?

      You would be right in saying that ESRAM is much faster than RAM; going by the cycle times below, it’s around 200 times faster. So let’s see how it plays out below if we use it for indexing.

      Below I have written out a simulation. What I state below is not 100% accurate to reality, but 100% accurate to the specifications.

      There are 1,000,000,000 nanoseconds in a second. We’ll work with a base 60FPS to show what I’m talking about, with a small, 500-texture scene.

      There are 16,666,666.66ns per frame.

      You will have to load these files into the system’s hardware first. This will take several seconds, up to a minute. Given the massive bandwidth of the architectures, it is basically negligible either way.

      DDR3 RAM takes about 10ns to cycle.
      Standard architecture: laptops use this setup; they have the system RAM shared with the GPU RAM. You need to cycle this twice, once for the index and once for the data. @ 500 textures, this will take 10,000ns.

      ESRAM takes about 0.05ns to cycle.
      With ESRAM you need to cycle this once. @ 500 textures, this will take 25ns, but then you must pull the data from RAM because your system has no video RAM: 5,000ns. A total of 5,025ns.

      GTX 970 GDDR5 @ 7000MHz takes 0.28ns to cycle. It will need to do this twice, once for the index and once for the data. @ 500 textures, it will take 280ns.

      The setup utilizing GDDR5 @ 7000MHz reduces frame variance the most, and that is the architecture behind the video card on this page. Regardless of which architecture you choose, though, the net result is a wash if we’re solely basing them on access times: there is not a big enough difference between the three architectures to produce any discernible performance penalty or benefit. Please tell me again how your console is vastly superior. You can scale this as much as you want; the delays caused by memory access times are going to be irrelevant compared to the amount of calculation required to handle the data.
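
      The three scenarios above can be re-run directly. Here is a minimal Python sketch of the same arithmetic, assuming the cycle times and the 500-texture scene stated in the comment (illustrative figures, not measured hardware values):

      ```python
      # Re-running the access-time comparison with the comment's assumed
      # cycle times: a 500-texture scene against a 60FPS frame budget.
      NS_PER_FRAME = 1_000_000_000 / 60  # ~16,666,667 ns per frame
      TEXTURES = 500

      scenarios = {
          # name: (cycle time in ns, cycles per texture, data pulled from DDR3?)
          "DDR3 shared (index + data)":     (10.0, 2, False),
          "ESRAM index, DDR3 data":         (0.05, 1, True),
          "GDDR5 @ 7000MHz (index + data)": (0.28, 2, False),
      }

      for name, (cycle_ns, cycles, ddr3_data) in scenarios.items():
          total_ns = TEXTURES * cycles * cycle_ns
          if ddr3_data:
              total_ns += TEXTURES * 10.0  # texture data still comes from DDR3
          share = 100 * total_ns / NS_PER_FRAME
          print(f"{name}: {total_ns:,.0f} ns ({share:.4f}% of one frame)")

      # Even the slowest case (10,000 ns) is ~0.06% of a single 60FPS frame
      # budget: access latency is not where frames are lost.
      ```
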
      The part of the game that causes the most frame variance is rendering. Handling vertices, textures, shaders, filters, pixels, etc. is all done by the video card’s GPU.

      Your CPU can only perform calculations at 8% of the efficiency of a GPU. Do you know why? Because a CPU is not designed for parallelism like a GPU is. It’s designed to do a general function, very quickly. The PlayStation 4’s CPU has a total compute power of 102 gigaflops, and the Xbox One’s CPU has 112 gigaflops of compute power. The way you describe your console is brutally inefficient. Even if the programmers had tools good enough to perfectly balance the load between the CPU and GPU, your console would only have 1.42 teraflops of maximum compute performance.

      If your system’s architecture is so advanced, why can’t it yield 60FPS on Forza Horizon 2? My desktop CPU, which I don’t take a lot of pride in, performs at a total of 375 gigaflops. For you to sit there and tell me that texture handling offloaded to a CPU will give a dramatic increase in performance is very disappointing to me. Consoles have all this bandwidth, but no hardware to utilize it.

    • Jon

      It’s funny that you’re trying to justify a “next-gen” console that isn’t even vaguely close to today’s gaming PC specifications.

      Very well. You can have it your way; I’ll have it my way. I’ll have my superior 144Hz monitor running at a LightBoosted 120 FPS for zero motion blur, and with similar or even higher graphics presets, thanks.

    • Ricoh123

      I had a BenQ LightBoost monitor and my TV has light boost on it too.

      The picture flickers and is too dark.

      It’s nothing to boast about and in most ways offers worse quality.

      My Sony TV has very little motion blur, offers richer colours than any monitor I’ve ever seen and hides screen tear very, very well.

      PCs microstutter a lot and you’ll never game at a consistent 144 fps.

      A smooth 30 fps is better than a microstuttering 144 fps.

      I’ve never seen a single PC game run anything like as smooth as Forza Horizon 2 on the XB1. There are zero frame drops and zero stutter.

      Smooth fps offers a better experience than high fps.

    • Coovargo

      Going to put this in as simple a manner as I know how. Your understanding of microstuttering is very 2010. It’s not like we have video cards today dropping 2-3 frames anymore. The worst I’ve ever seen microstuttering is between 20-80ms. That’s absolutely awful, and those times are completely gone.

      Some would argue that the following information is a matter of opinion. I would argue it’s not, as I’ve tried to base it on human perception as best as I could.

      At 144FPS, the frame time is 7ms per frame. At framerates of 60 or above, your brain can’t tell the difference. A gap of 16ms is required at this point to cause any noticeable issue. That’s two ENTIRE skipped frames at this speed. A total of 23ms (a short dip to about 45FPS) would be required to be fairly obvious to some; for most, not a concern.

      At 60FPS the frame time is 16ms per frame. At this frame rate, a gap of 12ms would be required for microstuttering to be noticeable; that’s less than one full skipped frame. At a total of 28ms per frame, not even an entire extra frame, we’ve dropped down to nearly 30FPS and it’s fully noticeable.

      At 30FPS the frame time is 33ms. At 33ms per frame, there is utterly no room left for human perception to notice microstuttering. You could have frame times between 33ms and 40ms and you’d be fine with that, but if it’s much worse your gameplay experience will be beyond ruined. This is why consoles pre-render at least one frame: if something goes wrong, the console at least hands something to the player, even if it’s out of sync, and has time to smooth itself out.
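
      The arithmetic behind those three cases condenses into a small Python sketch; the “noticeable gap” thresholds are the figures asserted in the comment, not established perception research:

      ```python
      # Frame times versus the comment's claimed stutter thresholds.
      def frame_ms(fps: float) -> float:
          """Frame time in milliseconds at a given framerate."""
          return 1000 / fps

      # 144FPS: a 16ms gap is claimed to be the noticeable threshold.
      print(f"144FPS: {frame_ms(144):.1f} ms/frame; a 16ms gap is "
            f"{16 / frame_ms(144):.1f} skipped frames, a momentary dip to "
            f"{1000 / (frame_ms(144) + 16):.0f} FPS")

      # 60FPS: a 12ms gap is claimed to be the threshold.
      print(f"60FPS: {frame_ms(60):.1f} ms/frame; a 12ms gap means a dip to "
            f"{1000 / (frame_ms(60) + 12):.0f} FPS")

      # 30FPS: the frame time alone already sits at the edge of the quoted
      # 33-40ms tolerance, leaving essentially no stutter headroom.
      print(f"30FPS: {frame_ms(30):.1f} ms/frame")
      ```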

      Worse yet is the input delay. Due to the pre-rendered frame, consoles have an average of 70ms input delay from the controller to the screen. It’s even worse depending on your television. If I don’t set mine to “Game Mode” I have a 310-330ms delay. I took the time to film it with a camera one day and step through it frame by frame. What I can’t believe is that people actually play like this and never notice it.

      On my PC the input delay is GREATLY reduced: in a worst case scenario, between 10-30ms from my input on the keyboard or mouse to the monitor.

      My personal preference is a keyboard and mouse for first person shooters. With racing-style games, I prefer a controller because you can be precise with your steering. You can’t put a keyboard and mouse on a console; you can buy an adapter for it, but it usually adds 50ms, which completely ruins the point of using it in the first place. I can attach an Xbox 360 controller (or any other USB/Bluetooth controller for that matter) to my PC without a second thought.

      Nvidia has performed extensive research into frame skipping/microstuttering since the 600 series, and as a result nearly all frame variance/microstuttering has been removed. In one gameplay test, Grid Autosport on PC showed a maximum frame variance of 2.6ms on the GTX 970 and 2.5ms on the GTX 980 @ 1080p. At 4K the frame variances are even lower, showing the true power of these cards, landing respectively at 1.9ms for the 980 and 2.1ms for the 970.

      In summary, I’ll take a 144Hz monitor with a 3ms microstutter and 10-30ms input lag any day over your 30Hz with 70ms+ input lag.

    • Ricoh123

      My TV has an input lag of 12ms.

      I bought the fastest on the market at the time of purchase. Sure, my BenQ monitor had even less input lag, but at that speed it’s not possible to tell the difference.

      PC games suffer from inconsistent frame rates a lot more than consoles and show more noticeable microstutter.

      Yes, PCs are more powerful, but consoles are more efficient.

      A consistent 30fps is much preferable to a microstuttering 144fps.

      That’s the point.

    • Steve

      I can’t even call it microstuttering, because it’s not dropping full frames or even half frames. With G-Sync you’re not rendering more frames than the display can show, so no incomplete frames get displayed. You cannot, EVER, notice 3ms of frame variance.

      To give an example, here is a list of what a simulation would look like if the frame variance maxed out at any point during gameplay. You only notice micro-stuttering if it takes longer than 16ms for a frame to be returned to the monitor. If your framerate is very low (30Hz, 33ms between frames), you won’t notice microstuttering or frame variance, because your brain adapts to the low framerate.

      Say a game renders 4 frames, in this order, at 144fps.

      7ms, 10ms, 8ms, 7ms. Total variance: 4ms from 144Hz. This experience would range between 100FPS and 144FPS, well faster than human perception. I personally can notice when a framerate drops below 50FPS. If it’s higher? I certainly can’t tell.

      Now we’re going to add frame pre-rendering, 2 frames. Exact same situation. This time you will have an additional 14ms of input delay to compensate for the additional 2 frames.

      8ms, 8ms, 8ms, 8ms. Total variance: 4ms from 144Hz. A smooth 125FPS experience.

      At the maximum frame variance of 3ms, you’re getting 100FPS minimum. You will not notice micro-stuttering in any of the above situations. Pre-rendering frames exists for this specific reason: to be a buffer for slowly rendered frames.

      Now we’ll do the same situation at 60Hz with a video card that stuttered back in 2010, say a Radeon 5850. You could have up to a 20ms micro-stutter operating at 60Hz. Worst case scenario, a 4-frame situation would look like this:

      16ms, 30ms, 24ms, 16ms – a total frame variance of 24ms across 4 frames.

      This would be VERY noticeable. Two frames took longer than 20ms to draw, and the rest were smooth, resulting in what we would perceive as microstutter. At this point it would be a jittery 33-60fps experience at best. Unacceptable.

      You could turn pre-rendering on, and at 2 frames of pre-rendering you would have an additional 32ms of input delay on average. The frames would be displayed at the following times:

      22ms, 23ms, 22ms, 23ms. The frame times are well outside the 20ms tolerance, so you will notice a difference from 60fps gameplay. At this rate it would be about 44FPS, but it would be perfectly smooth: acceptable. You would, however, have an additional 32ms of input lag, a delay that would be intolerable to me.

      Fortunately, you can play most games at 60FPS with one of the cards listed on the website regardless, with a maximum frame variance of 3ms. Most of them turn out 200FPS+ in most real world situations. Good luck needing pre-rendering then. You might want G-Sync, or at the very least V-Sync.
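
      Those frame sequences can be condensed into a small simulation. Here is a Python sketch using the illustrative frame times above (not captured benchmark data), approximating input lag as one refresh interval per pre-rendered frame:

      ```python
      # Summarize each example frame sequence: variance, FPS range, added lag.
      def describe(label, times_ms, refresh_hz, prerendered=0):
          variance = max(times_ms) - min(times_ms)
          avg_ms = sum(times_ms) / len(times_ms)
          lag_ms = prerendered * 1000 / refresh_hz  # ~1 refresh per buffered frame
          print(f"{label}: {times_ms} ms -> variance {variance} ms, "
                f"avg {1000 / avg_ms:.0f} FPS "
                f"(range {1000 / max(times_ms):.0f}-{1000 / min(times_ms):.0f}), "
                f"+{lag_ms:.0f} ms input lag")

      describe("144Hz, no buffering",   [7, 10, 8, 7],    144)
      describe("144Hz, 2 pre-rendered", [8, 8, 8, 8],     144, prerendered=2)
      describe("60Hz, 2010-era card",   [16, 30, 24, 16], 60)
      describe("60Hz, 2 pre-rendered",  [22, 23, 22, 23], 60,  prerendered=2)
      ```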

      The Xbox One consumes 150W at full load and produces a total compute power of 1,420 gigaflops. A GTX 970 consumes a total of 150W at full load and produces a total compute power of 3,692 gigaflops at stock settings. Good thing they scale 1:1. That’s 160% MORE power efficient per calculation.

      Not only that, but the GTX 970 has lower read/write/access times even without ESRAM, by a factor of 50, as well as an internal bandwidth 30% higher than the entire console’s.

      The Xbox One costs $399.99 US at current market prices. The GTX 970 costs $329.99 US at current market prices.

      GTX 970 – 11.19 gigaflops per dollar.

      Xbox one – 3.55 gigaflops per dollar.

      I could be fair and say the GPU is worth half the expense of the console, but that would only put you at 7 gigaflops per dollar. Worse yet, I haven’t even talked about the other components, and this card is something that can be put in nearly ANY PC. Please tell me where this console’s greater efficiency stems from. And please don’t say it’s coded better: Forza 5 has the worst netcode I have ever experienced. The physics are terrible; they had to be dumbed down so much for the Xbox One’s CPU that most of the vehicles drive like they’re glued to the ground. The graphics are OK for one of my laptops, and the damage LOD is entirely pre-scripted and pre-rendered.
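
      The efficiency arithmetic from the last few paragraphs, collected into one Python sketch; the prices and flop counts are the 2014 figures quoted in this thread, not verified specifications:

      ```python
      # Performance per watt and per dollar from the quoted 2014 figures.
      hardware = {
          "GTX 970":  {"gflops": 3692, "watts": 150, "usd": 329.99},
          "Xbox One": {"gflops": 1420, "watts": 150, "usd": 399.99},
      }

      for name, hw in hardware.items():
          print(f"{name}: {hw['gflops'] / hw['watts']:.1f} gigaflops/watt, "
                f"{hw['gflops'] / hw['usd']:.2f} gigaflops/dollar")

      ratio = hardware["GTX 970"]["gflops"] / hardware["Xbox One"]["gflops"]
      print(f"Same 150W budget, {ratio:.1f}x the compute "
            f"(~{(ratio - 1) * 100:.0f}% more per watt)")
      # GTX 970:  24.6 gigaflops/watt, 11.19 gigaflops/dollar
      # Xbox One:  9.5 gigaflops/watt,  3.55 gigaflops/dollar
      ```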

      144FPS with one of these cards is infinitely superior to the 30FPS consoles drown users in. In today’s games, with one of the cards listed on the page, you will never notice the framerate dropping below the limit of human perception, about 50FPS. A PC gamer can still react 30ms faster than a console gamer in nearly any scenario. When milliseconds are all that count, that’s a very big difference. There’s a reason console players aren’t allowed to play against PC players.

    • Ricoh123

      Microstuttering happens on PCs for more reasons than you mention:

      Inefficiencies in the operating system.
      Random hard drive access.
      Spyware.
      Malware.
      Viruses.
      Anti-spyware.
      Anti-malware.
      Anti-virus.
      Other automated schedules.
      Indexing.
      Automatic updates.
      Random processes.
      Etc, etc, etc.

      These things simply don’t happen on consoles.

    • Jon

      Are you dense? You bought a gaming monitor for the sole purpose of playing console games, knowing that it’d never run at 144 FPS, much less 120? Please look up what LightBoost does, please.

      Additionally, to cure microstuttering: disable core parking. huehuehue. At least for Intel.

      Also, explain how Sony and Microsoft got essentially DDoS’d last month, then. Is that a benefit to your argument? Nope.

      The architectures of the PS4 and Xbox One are so similar to a PC’s that people can code malware potent enough to affect consoles.

      The only thing efficient about “next-gen” consoles is size. And that’s just the PS4, not the XBrick One.

    • Ricoh123

      I bought a gaming monitor for my PC. Herp. Derp.

    • Steve

      You bought a gaming monitor for your PC, but say that the gaming hardware specified on this page is a waste of money? If I were in your position I would have just said, “That’s right. I use a gaming monitor for my console. I like the lower input delay.”

      The GTX 970 is currently, hands down, the highest real-world performance per dollar among high end video cards right now. The R9 295X2 has more gigaflops per dollar, but the inefficiencies in its architecture mean two 970s in SLI outperform it in many situations.

    • Carl

      You continually bash PC gaming for being “inefficient,” yet you buy a gaming monitor for a non-gaming PC.

      Huh. I wonder who’s the inefficient one here.

    • Ricoh123

      You’re very good at misjudgement.

      You should try to be more presumptuous.

      I bought a gaming monitor for my gaming pc.

      (Later on I sold them both and never looked back, but that’s another story.)

    • Steve

      I’m sure you get malware, spyware and viruses all the time. I’ve run my PC without anti-virus or anti-malware software for the last 4 years and have never had an issue. I don’t click random advertisements or fly to the dark corners of the internet looking for false information.

      I use solid state drives for my OS and games, so I don’t have ANY issue with “random hard drive access”. Windows 7 is programmed by the same company that made the Xbox One, so I don’t want to hear about inefficiencies. The background processes on my PC consume 2 of 16GB of RAM and about 0.5% of my total CPU time at peak. So inefficient.

    • Jon

      Easiest way to prevent malware and spyware other than anti-virus programs? None other than AdBlock. 😀

    • solomonshv

      forza is garbage. stupid example.

    • JMAN91

      First things first here: these new consoles will not be able to keep up with PC gaming for the next 10 years. They are so underpowered it’s almost a joke. Second… burden? Take out a screw, open the side, pull the card out, put the new one in, put the side on, screw the screw back in and… ta-da. Steam alone makes PC gaming better and cheaper, and I seriously mean cheaper. Even though I paid a lot for my PC, I saved a ton of money on games through Steam.

    • Ricoh123

      Come back to me when PC gaming consists of titles other than F2P, shovelware and MOBAs.

      The three cancers of gaming.

      Consoles have exclusives the PC gaming scene doesn’t even get to sniff.

    • Steve

      Ever play Planetside 2? I’ve put countless hours into this game. They don’t have it on consoles, unfortunately. It’s actually pretty fun when you get some friends together on voice chat. It’s even free to play. Of course there are in-game sales for items, paints, etc., but there’s no reason to buy them when you can earn them.

      By the definition of shovelware, your console is completely filled with it. In all seriousness, the only decent game I can play co-op is Diablo 3, and not only that, it’s a very poor port, filled with stuttering and loading issues. One player can’t open their inventory without the entire game being interrupted. Let’s see what other games I can play with my friends on here. Oh, Lego The Hobbit! Fun for the whole family. A triple-A title. There’s also Minecraft. Wait, that released on PC nearly 4 years ago.

      If you recall correctly, a game recently released on both Xbox One and PC: Shadow of Mordor. Too bad the PC version was a complete flop. Oh wait, sorry, no it wasn’t. The PC version even featured 6GB of Ultra-HD textures that consoles can’t even think about touching without dropping to 10FPS. Come back to me when you have a decent co-op game I can play with a friend or family member. I don’t want to hear this crap that I have to buy a whole extra system and a second copy of the same game just to sit down at the TV and have fun with guests. If it comes down to being forced to buy an extra console just to play with a friend, I am genuinely afraid for the future of consoles. This completely defeats the purpose of owning a console with 4 controllers. The SDKs for Xbox One and PS4 certainly aren’t helping any. My friends can come over and we can play LAN games on my Steam account without any issue.

      MOBAs? There are very few MOBA games on PC. 0.5% of the market is not all-encompassing.

      The only cancer of gaming is console game developers. It doesn’t matter which console you choose: they push awful games onto the market and everyone just loves it because there is nothing better on it. Your system doesn’t have many exclusives. I really do wish it had more.

      I’m sorry, I don’t mean to bash console gaming, but you can’t seriously expect me to sit here while you continue with this holier-than-thou attitude, can you? I do support consoles, but it’s hard when they keep releasing utter garbage. It’s honestly getting worse. Forza Horizon 2 isn’t even going to have split screen, based on information from an interview on IGN back in June. I’m really sorry about that. I guess it wouldn’t be so bad if people only played alone.

    • JMAN91

      You have absolutely no idea what you are talking about. Sure, consoles do have a “few” good exclusives, but have you taken a look at Steam lately?

    • Lee Neighoff

      It’s been two years since your comment (and 3 since the Xbone launched), and there’s already talk of a new Xbox coming out soon. I think it’s clear the current consoles weren’t meant to last 10 years.


