Improving Graphics Performance Using Cloud Is Going To Be Really Hard: Ex-Naughty Dog Dev

The idea is tempting, but simply offloading existing work to the cloud won’t work, says Filmic Worlds founder and ex-Naughty Dog developer John Hable.

Posted on 31st May, 2015, under News


Ever since the advent of the current generation of consoles, there has been a lot of speculation about how cloud computing could be a game changer. Microsoft has been touting the so-called power of the cloud for quite some time, but apart from a few impressive tech demonstrations, it has little to show for it.

Sony, on the other hand, has been using the cloud to stream certain PS3 games on the PS4 via its PlayStation Now technology. But the big question remains unanswered: can the cloud really be used to enhance the performance of consoles? Filmic Worlds founder John Hable believes it’s going to be really hard given the current state of internet infrastructure.

“I can’t comment on any specific applications or timelines. But I can give you an overview of the challenges involved.  In short, it is really, really hard. The idea is certainly tempting. We have all these servers in the cloud. Intuitively it makes sense that we should offload some of our work to these servers but the devil is in the details,” Hable told GamingBolt.

The biggest challenge lies in combating latency. The console sends data to a remote server, the server processes it, and the results are sent back to the console. Every step takes time, and that time shows up as lag.

“The main problem is latency.  How long does it take to go from your computer to the cloud and back? It takes time to send data up to the cloud, let the cloud crunch some numbers and send the data back.  In video games we generally need the results of computations right away.  The round-trip time of sending a megabyte up and a megabyte down can easily be several seconds even if you have a broadband connection.”
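Hable’s “several seconds” figure is easy to sanity-check. The sketch below is a back-of-the-envelope estimate with illustrative numbers of our own choosing (not from the interview), totaling the time to push one megabyte each way over an asymmetric broadband line:

```python
def round_trip_seconds(payload_mib, up_mbps, down_mbps, rtt_s, server_s=0.0):
    """Estimate time to ship a payload to a server and get one back.

    payload_mib: size sent each way, in mebibytes
    up_mbps / down_mbps: link throughput in megabits per second
    rtt_s: base network round-trip latency in seconds
    server_s: time the server spends crunching numbers
    """
    bits = payload_mib * 8 * 1024 * 1024
    upload = bits / (up_mbps * 1_000_000)
    download = bits / (down_mbps * 1_000_000)
    return upload + download + rtt_s + server_s

# Hypothetical broadband line: 5 Mbps up, 25 Mbps down, 70 ms base latency.
total = round_trip_seconds(1, up_mbps=5, down_mbps=25, rtt_s=0.07)
# total comes out to roughly 2 seconds, while a 60 fps game has a
# per-frame budget of about 0.0167 s, more than 100x less.
```

Even with generous link speeds, the result sits two orders of magnitude outside a 60 fps frame budget, which is exactly Hable’s point.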

Latency, however, is not the only issue. Lost data packets and unreliable connections can also get in the way of a consistent gameplay experience.

“Then there are all the things that could go wrong. Wifi signals can have spikes. Packets get lost. How do you recover if the server has a hardware failure?  What if a person on your wifi network starts downloading a big file? What do you do for people with slow internet connections? And of course, how much do those servers cost to use? Just because a computer is in the cloud doesn’t make it free! It’s a very difficult problem.  Not impossible, but very difficult.”
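One standard way to survive the failure modes Hable lists (wifi spikes, lost packets, server faults) is to treat every cloud request as strictly optional: race it against a frame deadline and fall back to a cheaper local computation when the answer is late. A toy sketch of that pattern (our own illustration, not anything Hable described):

```python
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError

FRAME_BUDGET_S = 1 / 30  # we need *some* answer before the next frame

pool = ThreadPoolExecutor(max_workers=1)

def cloud_compute(x):
    time.sleep(0.2)   # simulated network round trip plus server work
    return x * 2      # the "nice" result computed remotely

def local_fallback(x):
    return x * 2      # cheaper approximation the console can afford itself

def compute_with_fallback(x):
    future = pool.submit(cloud_compute, x)
    try:
        # Wait at most one frame budget for the cloud result.
        return future.result(timeout=FRAME_BUDGET_S)
    except TimeoutError:
        # Network too slow or down: don't stall the frame, degrade instead.
        return local_fallback(x)
```

The design choice here is that the cloud can only ever add quality; the game never depends on it for correctness, which is what makes slow connections and hardware failures survivable.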

That said, Hable believes cloud rendering could enable creative gameplay features; simply offloading existing work to a remote server, however, won’t work.

“What graphics features are in a typical game where you could live with getting the results a few seconds after starting the calculation? Not many. I think that a compelling cloud rendering technique would have to be an amazing new feature that no one has really done before. If we think outside the box there might be some really cool things that we could do. But simply offloading existing work into the cloud is hard to justify because of the roundtrip latency and all the things that can go wrong on a network.”

Stay tuned for our full interview with John Hable next week.



  • Pazz

    Crackdown?!? Ex-Naughty Dog devs are real experts.
    Maybe it’s really hard for Sony, but Microsoft is specialized in that sector.

    • such.wow

      Show me where Crackdown is improving graphics by using cloud.

    • red2k

      They will improve the physics using the cloud, but sometimes more physics means more objects or particles floating on screen, and that improves the quality of the game. The graphics can be better because the local hardware would be freed of that kind of task.

  • Terminator

    It’s hard, as he says, but not impossible, unlike what some PS drones say.

  • Ricoh123

    Aaaaand this is the reason he is an EX-Naughty Dog dev.

    Microsoft are masters of software.

    Sony…..not so much.

    • Vodkapls

      Wow, masters of software. I thought that kind of fanatical fanboyism was limited to Apple, but I was obviously wrong lol.

    • Ricoh123

      Name one commercial software company better than Microsoft.

      I dare you.

    • Vodkapls

      Less buggy software? Plenty.
      Higher rated software? Plenty.
      Higher rated games? Plenty.
      More successful hardware? There are a few as well.
      The only thing MS is best at is the most installed OS.

  • Lennox

    The cloud isn’t meant to take on any heavy tasks. Taking on something as big as graphics would cause a lot of latency. This is not what Microsoft is trying to do. MS is trying to throw all the background tasks, like AI code and destruction coordinates, to the cloud, giving the CPU more room to improve in other areas, like graphics. So yes, the cloud CAN improve graphics. Just indirectly.

    • Anon

      As you’re already aware, the key thing is identifying what is tolerant to latency and what isn’t. Where there are frame-to-frame changes that occur at a low enough frequency (spatially and temporally, relative to the simulation state) as a function of user input, those become possible targets for offloading. This applies to graphics just as it does to AI, physics, and a select few other fields within games. The more side effects something has on the simulation, the harder the task becomes.

      The best example on the graphics side is indirect illumination, which is an expensive global illumination operation to carry out in real-time rendering, and also a current area of cloud computing research showing promising results. As these bounce-light results represent only low-frequency changes, the bandwidth requirements are inherently quite relaxed, to the point where lossily compressed video can be used to stream in the result. I would point out that this applies only to diffuse bounce lighting; specular bounces over glossy or shiny (mirror-like) surfaces have too much high-frequency variation to pass unnoticed with even the slightest bump in latency (though probably not completely ruled out, mind).

      Distributing tasks remotely like this isn’t a trivial topic by any means, but there are certainly some areas to look at and experiment on.
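The staleness tolerance described in the comment above can be sketched in a few lines. This is a toy illustration under assumed names (no real engine exposes a `CloudGIBuffer`): the client keeps the last diffuse-bounce lighting received from a server and eases toward each new update, so a late or dropped packet merely leaves slightly stale, slowly varying light on screen.

```python
import numpy as np

class CloudGIBuffer:
    """Last diffuse-bounce lighting received from a hypothetical GI server."""

    def __init__(self, shape, blend=0.25):
        self.irradiance = np.zeros(shape, dtype=np.float32)
        self.blend = blend  # how quickly we ease toward a fresh server result

    def on_server_update(self, packet):
        # Lerp toward the new lighting instead of snapping to it, hiding
        # jitter when updates arrive late or out of order.
        self.irradiance += self.blend * (packet - self.irradiance)

    def shade(self, albedo, direct):
        # Final color = direct light + albedo * (possibly stale) bounce light.
        # Diffuse GI varies slowly, so staleness degrades gracefully.
        return direct + albedo * self.irradiance
```

The key property is that `shade` never waits on the network: the worst case is lighting that is a few updates old, not a stalled frame.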

    • Starman

      Watch it … you’re making sense , you might get accused of trolling…

  • d0x360

    Damn right it’s going to be hard, but Microsoft has some of the best engineers and programmers in the world, and they spend more on R&D every quarter than some nations spend on everything in a decade.

    • HaakonKL

      Google has problems with cloud servers simply due to the speed of light.
      Their R&D spending is comparable to Microsoft’s. (It’s almost as high as Volkswagen’s. Which is why Google’s self-driving cars aren’t as good as the German ones yet.)

      How do you suppose MS researched a higher light speed?

  • Fayz the Lad

    Yes, it will be difficult… on PSN, because it’s run by monkeys. XBL infrastructure can handle it, since they use cloud tech at a business level with great success.

  • Psionicinversion

    Naughty Dog can only dream of cloud rendering; there’s no chance they can do it like MS… why, I hear you ask?? Sony wouldn’t even pay for enough Driveclub servers to match demand, and you think they’re going to pay for servers on standby to offload some physics computations so the hardware doesn’t have to do it? Hahaha, no chance. Even then you’ve got to hope the service is up and running.

    I think the real cloud processing is going to happen in online games, where the servers can process the physics and such and then send the results to your client, so physics are done with less stress on the hardware and more resources are free to display what you’re doing. Because the client and server are synced up, it can use network path prediction, which is something MS is working on, so the server can pre-calculate stuff based on where it thinks you’re going; if you go the right way, it can send the data to the console just in time to process. MS masters of software, Sony masters of hype.

    • JerkDaNERD7

      Totally agree, imagine MOBA games with higher graphics fidelity and physics?! MOBA is a sleeping beast; it just needs a developer to create a MOBA with a worthwhile campaign mode, in my opinion.

    • Cinnamon267

      You mean Warcraft 3? Already exists.

    • JerkDaNERD7

      That’s single player real-time strategy, I’m talking Dota with “high graphics fidelity”.

  • Tha Truth

    More smoke and mirrors BS marketing hype, lies and underdeliver from Microsoft. With SO MANY broken promises it’s really no wonder that their weak, pathetic, underpowered, poverty console is rotting at the bottom of the sales charts where it belongs.

    The non – gaming brainwashed sheep who defend Microsoft and their pathetic, shovelware garbage truly do live a sad life. All this talk about “Powah Of Teh Cloudz” and yet every single multiplatform game STILL runs at higher resolutions on PS4, ROFL.

    Sony masters of gaming, Microsoft masters of lies, overhype, underdeliver, trash games and no sales figures.

    • Starman

      Oh, so you know and we should take your word for it… calm down, sounds like you’ve been having nightmares about the XB1 exposing your precious PS4…

    • Counterproductive

      …what?

    • disqus_a54470QfBE

      You’re a loser. Do people really take you seriously in real life lol doubt it

    • Tha Truth

      And you’re an ignorant gimp who doesn’t have the brains to argue with any of the points I made, so instead you try to change the subject. Do people take YOU seriously in real life, LOL, doubt it.

    • Modi Rage

      Someone named disqus_a544 is calling you a loser. Quite funny.

    • You Are Flat Out Wrong

      Turns out an ignorant gimp has more brains than a slobbering loser who shills for a company that has lied consistently for the last few years

    • Tha Truth

      The only slobbering loser is your cheap ho of a mum on a saturday night after she’s finished performing her mating rituals for every man in town to make enough money to buy you your shoddy, worthless, overpriced graphics cards upgrades you keep begging her for.

      Keep shilling for a peasant, dead platform without any exclusives, loser.

    • You Are Flat Out Wrong

      PC Announced Exclusives this week: XCOM 2, sequel to the 2012 game of the year

      Peasantstation 4 exclusives: Nothing, nada, zilch, zip

      Cry some more, loser :^)

    • Cenk Algu

      Oh yeahhh, clueless ignorant fanboy. If it was Sony, then it would be a revolution. I know, Pony. I know it.

    • You Are Flat Out Wrong

      And you don’t realise your backwards, last generation Paperweight Station Poor will still be a weak sauce 30FPS gimp box holding back gaming.

      You console gimps are all the same :^)

    • Tha Truth

      Your last – generation peasant spastic PC poor machine doesn’t get any exclusive games for you to play so you have nothing better to do with your pathetic, worthless, sub – human existence but to troll the internet all day trying to apologise on your trashy gimp platform’s behalf.

      You really are a pathetic piece of vermin scum. You don’t spend your time playing games (Because your dead platform for spastic gimps doesn’t have any games), instead you spend your whole life on the internet crying about games you don’t have like an uneducated, illiterate loser. How’s that petition for Bloodborne coming along? Got enough signatures to get the dev’s attention yet? ROFL, keep trying, vermin loser.

    • You Are Flat Out Wrong

      When’s GTA V Going to hit 60FPS on your cheap last gen peasantstation poor?

      Oh wait NEVER EVER :^)

      When’s Bloodborne gonna go above 24FPS on your pile of garbage?

      NEVER EVER

      When’s XCOM 2, sequel to GOTY 2012 coming to console

      NEVER EVER :^)

  • XbotMK1

    It must be really embarrassing and upsetting for these Microsoft fanboys to see an ex-Naughty Dog dev tell them the exact same thing everyone else has been telling them since the beginning.

    Microsoft lied about the cloud in the beginning, which is proven by the simple fact that not a single Xbox One game has improved graphics by using the cloud. They falsely advertised the Xbox One. Microsoft knew damn well that the cloud doesn’t work the way they were making it seem. They didn’t tell you the problems with it or explain what the exact benefits would be. They sold you a marketing slogan and an unfinished theory.

    If the cloud could simply improve graphics, companies would already be doing it. Once this becomes a reality and companies figure out more ways to use the cloud, many companies will jump on the bandwagon. The theory of using the cloud to improve graphics isn’t anything new. Games like Battlefield and MMOs use the cloud to control more objects on screen. The PS4 already does this in some ways. PlayStation Now could be used to do it, but Sony doesn’t make the false, misleading promises Microsoft did with Kinect and the cloud.

    What Microsoft was trying to sell people is currently impossible or unreliable due to many factors such as internet speeds, bandwidth, architecture, and the flaws of relying on the internet itself. Another problem I think John Hable forgot is developer support. Developers would have to code their games a certain way, for a certain architecture, to make use of any benefits from the cloud.

    • Starman

      OH, because it’s Naughty Dog it’s written in stone… all of you FANBOYS are worried what DX12 will do for PC and XB1 gaming… give it a rest, fanboy…

    • Agent_Blade

      How about you actually grow some brains up there. Listen to yourself. It’s embarrassing…

    • GHz

      “Microsoft lied about the cloud in the beginning.What Microsoft was trying to sell people is currently impossible or unreliable due to many factors such as internet speeds, bandwidth, architecture, and flaws of relying on the internet itself.”

      Microsoft Research teams up with Duke University

      “Using Doom 3 as an example, the Kahawai tool managed to cut the bandwidth required to stream the game over a 1MB line by over 80%, without any cuts in visual quality.”

      http://www.windowscentral.com/microsoft-research-teams-duke-university-cut-required-cloud-gaming-bandwidth

      “If the cloud can simply improve graphics, companies would all ready be doing it.”

      It’s already here. It’s ready. It just needs to be deployed. Companies like MSFT, Cloudgine and Shinra are working out the kinks. Square has their tech too, as well as Nvidia.

      https://youtu.be/4x3JnbxdngA

      And it’s not only MSFT who is talking up the cloud to improve graphics now. Others have joined in, like Shinra Technologies for example.

      https://youtu.be/HLjZlbWBHF4

      Square and Nvidia will be showing off their own tech as well.
      Why are you so afraid of advancements? Is it because Sony aren’t the ones implementing gaming tech that matters on a large scale? Is that why you’re so pissed off? One problem at a time. We’re 1 1/2 years into our consoles’ lifespan, and look at what we have achieved. We should be rooting for this, not tearing it down. Everyone benefits.

    • XbotMK1

      Nice try, but that Kahawai tool is a “research project” that was only just recently created by Duke University, and it was tested in a “controlled environment”. It’s not here and it’s not ready. Do you know what “theoretical research” means? You would see the problems with Kahawai if you actually read the research instead of cherry-picking your links based on your usual Microsoft fanboyism:

      Problem #1: This technique only works when the game is running on your local machine and a full version of the game is running on a remote machine. That means developers will have to develop two versions of the same game, one to run on the local machine AND one on the remote machine.

      Problem #2: It doesn’t actually reduce bandwidth by 80%. That is a misleading claim. It streams a partial frame (or a lower-resolution image with fewer pixels, thus reducing image quality), which is how it reduces bandwidth. To increase image quality you need to stream more pixels. This means it would still require 100% of the bandwidth if you want to keep the image quality of a full 720p frame. You might as well just stream a 720p image.

      Problem #3: Severe latency.

      Problem #4: Syncing the streamed frame with the local frame. The delay will cause problems, probably in the visual form of torn frames, overlapping frames, interpolation, and overall worse image quality compared to traditional streaming and, obviously, local compute.

      Problem #5: It’s limited to specific local hardware architectures, which means you can’t just stop running it on one device and then pick up and play on a different device.

      Problem #6: Patent infringement, which means every company will have to develop their own unique architectures to avoid infringement, thus creating more problems for devs and hardware manufacturers.

      Problem #7: This technique doesn’t exactly offer benefits over traditional methods, and it doesn’t fix the problems of traditional methods either.

      I don’t understand how you think I would be upset. Sony is actually the one that is already implementing gaming tech that matters on a large scale, while Microsoft and Square Enix are not, but I don’t understand how you think that matters or is relevant to my point. You seem upset. My comment is directed at the industry as a whole. Problems with the cloud apply to everyone, including Sony.

      I’m simply pointing out the fact that Microsoft lied and misled people. New research projects aren’t a valid excuse.

    • You Are Flat Out Wrong

      You seem upset that people are laughing at you for shilling backwards video on demand high latency garbage John Derp

      Streaming will never replace local hardware and your POS4 will still be garbage :^)

    • Agent_Blade

      Grow some common sense will ya.

    • You Are Flat Out Wrong

      I already have common sense in that I didn’t buy a garbage, backwards, peasantstation poor that only runs at 30FPS and support a company like Sony who is turning into an online DRM Cloud company :^)

    • Agent_Blade

      Obviously not because this is the most obvious troll comment I have seen in a long time but since im bored ill indulge you…..As if your poor Trollbox fail can’t even run games at 1080p. Where’s that power of da cloud eh?

    • You Are Flat Out Wrong

      There’s that Sony shill logic thinking it’s xboners.

      PC is the master race. Implying I’d ever let a garbage console in my home. How’s Battlefield Hardline in 900P, loser. It’s great for me in 1080P :^)

    • Tha Truth

      How’s Bloodborne on your worthless, cheap, last – gen, PC loser gimp machine? Oh right, Bloodborne never came to the PC masterloser race because the dev’s wanted to make a game for human beings and not sub – human vermin scum on the PC.

      Keep those petition’s going, peasant. Bloodborne will never come to your worthless dead gaming platform for mongrels.

    • You Are Flat Out Wrong

      When’s GTA V Going to hit 60FPS on your cheap last gen peasantstation poor?

      Oh wait NEVER EVER :^)

      When’s XCOM 2, sequel to GOTY 2012 coming to console

      NEVER EVER :^)

    • GHz

      LMAO 😀

      Look at you, pretending to know what you’re talking about. Going off the reservation babbling all that nonsense. You probably believe the world is still flat too. How is Sony implementing game tech that matters when they are using old PS3s as servers? They say that they are “upgrading” them now so we can “finally” get our free version of Driveclub. WOW! Leaders indeed. LOOOOOOOOL 😀

      You’re impossibly stupid. Keep being the best that you can be 😉

    • Agent_Blade

      Wow, this just proves you’re petty. It’s plain as day that you don’t even understand half of anything he said. Stay out of the kitchen, dude.

    • GHz

      XbotMK1, I know this is you. Stop creating fake accounts to vote yourself up and to defend the garbage you spew. This article is about Hable saying the task is hard, not impossible. Links were provided to prove that solutions to those concerns are in progress and to show that the tech is doable.

      You’re here just wanting to tear down all the legitimate progress being made so you can preach your rhetoric. Sicko!

    • Agent_Blade

      Actually, I’m not XbotMK1. I honestly don’t know how you came to that conclusion, but I assure you I’m not an Xbox fanboy. I own both consoles but am a PS fan all the way.

    • Cinnamon267

      He’s not pretending anything. The collaborative rendering does have its latency-syncing issues. Closed test environment data is worthless.

      “The task of quickly generating fine-grained details — such as subtle changes in texture and shading at speeds of 60 frames per second — is still left to the remote server. But collaborative rendering lets the mobile device generate a rough sketch of each frame, or a few high-detail sketches of select frames, while the remote server fills in the gaps.”

      That test was done with mobile devices, where wireless is your only option, with your hands on the device. They’re designed to be less sensitive to that, but no one has a magic-bullet system. “Fill in the gaps” in a real-world internet scenario doesn’t sound great.

      Reduced bandwidth requirements and the ability to play offline with degraded visuals are the only benefits of that “Kahawai” system.
      Every point he made is pretty solid.

    • GHz

      What’s up, XbotMK1 aka Cinnamon267

      “That test was done with mobile devices where wireless is your only option.”

      The logic here is that if you can do something like this effectively over a wireless connection without a massive amount of data, then you’ve pretty much solved the problem over a wired connection even more so. It was wiser to solve it using wireless and to take data caps into account, because mobile gaming today exceeds wired gaming and is more prone to connection issues. You kill two birds with one stone.

      “With your hands on the device. They’re designed to be less sensitive to that, but no one has a magic bullet system.”

      ????????? What the h*ll are you blabbering about? You’re not making any sense.

      “‘Fill in the gaps’ in a real world Internet scenario doesn’t sound great.”

      My dude, when you are researching something, your objective is to verify, refute, and establish validity. In this case, MSFT and Duke have to research in a real-world scenario. That means remote servers via WAN! Nothing controlled about that. And I’m curious, what makes you think that the experiment didn’t involve remote servers? Why would they do something like this locally (LAN), in the confines of one building? That makes absolutely no sense whatsoever, and it’s wishful thinking on your part. Remote is the only way you can effectively do this experiment.

      You, as your alter ego XbotMK1, made no clear points. Just pointless babbling, confusing yourself into believing that what you say makes sense. If you’re so smart, why aren’t you sitting on their board advising them? Between you and all your split personalities, you’ve got things pretty much solved. At least until you’ve taken your meds and you’re back in the real world. Me, I’ll stick to the official story and cite what was said, then wait for the tech to be fully realized. Nothing wrong with that. I can’t wait to get my hands on Crackdown! While you camp out in your cave pretending the world has stood still.

    • Cinnamon267

      “Whats up XbotMK1 aka Cinnamon267”

      … Jesus christ, dude. You just went full re-, well, you know.

      “Why would they do something like this locally(LAN) in the confinement of one building?”

      To test if it actually works, first. Which is pretty obvious.

      “Me, I’ll stick the official story and cite what was said”

      Except you’re citing some information to suit your argument and assuming the rest. Which is stupid.

      The study was performed between December 1st and 3rd, 2014 in an office inside Microsoft’s building 112 in Redmond, Washington.

      The machine used was a Surface Pro 3 connected, via LAN(!), to a Dell XPS 8700 PC with a quad-core 3.2GHz Intel i7-4770 CPU, 24GB of RAM, and an Nvidia GTX 780 Ti GPU.

      Resolution was set to 720p; 1080p, for instance, was not tested. So the “1Mbps” number is a bit… useless.

      “For low-latency configurations, we used an unmodified LAN(again, !!!!) with approximately 2 ms of round-trip latency between the tablet and server, which corresponds to a typical round-trip time (RTT) over a home LAN. For high-latency configurations, we introduced a 70 ms RTT which corresponds real-world observations over an LTE connection.”

      The report itself answers, almost, all your questions. You should have read it instead of… Doing what you did. And assumed pretty much everything.

    • GHz

      “To test if it actually works, first. Which is pretty obvious.”

      Don’t act as if I’m asking that question in reference to the early stages of the experiment. Of course it will be in a confined environment. But they would have to eventually deploy it publicly to establish validity. I’m using common sense that if they went public with their claims, they have done so already. In an experiment like this, there is no other way.

      And thanks for that additional info. It is very helpful in that it gives a clearer picture of how they went about doing it. And if you understood what you just quoted, you’d realize it goes against your argument and supports mine. The above describes pre-public deployment. You can’t reach a conclusion with that, but it’s a good start! In fact, they went as far as adding a 70ms RTT for the high-latency experiment! Do you even know what that means? No, you don’t. Hence the reason you used it to support your argument.

      OMG, the stupidity you display is unreal. Keep being the best that you can possibly be. Keep repping Sony hard. 😉

    • Cinnamon267

      Just starting off saying, you’re not as intelligent as you think you are. Get over yourself.

      “Im using common sense that if they went public with their claims, they have done so already.”

      Common sense dictates that if they did that, they would mention it. The test only had 50 people on MS’s campus, hooked up to a LAN. Common sense would also dictate that you read the paper instead of cherry-picking info from an article and assuming the rest because… common sense.

      It supports some of your argument. But, you’re ignoring where it doesn’t or just contradicting yourself.

      And you also said this: “In this case, MSFT & Duke have to research in a real world scenario.”

      Real world = MS offices and a controlled LAN. Such real world. Almost certainly on a fibre network if it’s at their offices, too.

      “The above describes pre public deployment. You cant reach a full conclusion with that”

      Hasn’t stopped you from assuming a conclusion just because.

      “My dude, when you are researching something, your objective is to verify, refute, & establishing validity . In this case, MSFT & Duke have to research in a real world scenario. That means remote servers via WAN! Nothing controlled about that.”

      Why haven’t you been objective and tried to verify your claims? You just continue to assume everything with no basis in reality.

      I don’t know why I’m expecting an intelligent argument from someone who labelled another human being, who likes something different than they do, a Jihadist.
      Then you talk about the other person displaying “stupidity”. Some people, man. Interesting to engage with.

    • GHz

      “Common sense dictates if they did that, they would mention it.”

      Based on the info you shared, I’m pretty sure it led to a public deployment afterwards. Send me the link to where you got that quote; I’m 100% sure the story didn’t stop at the info you shared. You missed the part where I said that quote describes a part of the experiment. The wording makes it obvious, but you don’t see that. You saw the word LAN and jumped to conclusions. It was the initial stage, important nevertheless. Public deployment would be the next step; otherwise they cannot conclude the experiment to be a success. You probably don’t realize that because you don’t understand what’s being said. Nothing wrong with that. What’s wrong is that you act like you know, then follow up with stupidity in an attempt to make it truth. Left unchecked, you misinform because you lack understanding. Public deployment would most definitely use servers distributed geographically on a larger scale, not confined within a building. Servers nearby or far away, it doesn’t matter. The language they use to explain this in the final results may have been different, but it still equals the experiment being done via WAN in the end.

      And say they didn’t. It’s got nothing to do with me not understanding how the experiment normally should’ve been done, but rather with not realizing there are other methods that can simulate certain conditions. Like the mention of introducing a higher RTT at 70 ms, which CORRESPONDS TO REAL-WORLD OBSERVATION OVER LTE! That’s what your quote said. That’s a critical clue. Something I didn’t know they could do. Which supports my argument in full!

      I called the dude a jihadist because he comes on here like he’s declaring a holy war on MSFT! All he does is lie and corrupt the truth, all for the sake of his love for Sony. It’s sickening. It’s one thing to question if you have doubts and help stimulate healthy convo, but it’s a whole other thing to lie in hopes of gaining support for your rhetoric. My name for him has nothing to do with his console preference, ’cause I own all 3! It’s about his methods in the forums. And then you turn around and say “He’s not pretending anything.” What!? You must be him. If not, you are just as dumb.

    • Cinnamon267

      I can find no information on a public deployment afterwards. The only info on something done in public was a conference back in June of last year where they demoed it, and their server was the same Dell XPS PC with a 780, with a Surface Pro 3 connected via LAN over an Ethernet cable. Their research paper makes no other mentions, and their talk at a conference back in May makes no mention of any further work with the research or deployments.

      “You saw the word LAN & jumped to conclusions”

      I took the conclusion they provided, which was that everything was done over LAN. The only mentions of “WAN” are “wanted” and “Z. Wang”. That’s it.

      “Public deployment would be the next step.”

      There’s no information on the matter and, once again, you just assume so.

      “Like the mention of introducing higher RTT @ 70 ms which CORRESPONDS REAL WORLD OBSERVATION OVER LTE”

      Via LAN. It’d be more interesting if it was done over an actual LTE network, congestion and all, or any normal consumer connection. But it wasn’t. Or if the game wasn’t Doom 3, with assistance from a freaking 780 to render with.

      The rest of your statement is just more meaningless nonsense mixed in with your usual assumptions regarding something you didn’t even read fully.

      “What!? You must be him”
      Clearly. I, for no reason, am conspiring against the infinite brilliance that is you.

    • GHz

      “Can find no information on a public deployment afterwards.”

      Dude, you think I’m stupid? I bet you’re lying! Send me your link! Why won’t you send it?

      I know it’s in reference to only a part of the experiment, and you’re most likely holding out. Now you’re telling me that’s all they said. I can smell your BS a mile away. Send the link. Let me read the rest, and I bet that I’m right.

      “Via LAN. It’d be more interesting if it was done over an actual LTE network.”

      They compensated by adding RTT @ 70 ms. What do you think that is? You’re missing the point and going in circles cause you don’t know what it means.

      W/e though. If you came about that info, I can too. Like I said before, there is no shame in not knowing. Asking is the best way to learn. But you’re on that ego trip, pretending to know better than those who are trying to make our gaming experience better. Might as well you be him, because you think the same.

      You yourself admitted, “It supports some of your argument. But, you’re ignoring where it doesn’t or just contradicting yourself.” What contradiction exactly? Based on that quote you shared, it’s either yay or nay. Cannot be both. That quote is clear and can only be interpreted one way. So which is it? In support or not? Unless I encouraged you to read on, and it led you to a truth, and you found evidence that what I said was right. So please send the link. I want to read the whole thing. You scared I catch you in a lie? You no different from XbotMK1. Might as well you be him. I can’t stand you liars. Always trying to muddy up the truth. Why? Don’t answer back unless you got a link. In the meantime I’ll search as well. Can’t thank you enough for that quote! 😉
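
[Editor’s note: the “RTT @ 70 ms” figure argued over above can be made concrete. This is a hypothetical toy model, not code from the paper; the payload size, link speed, and function names are illustrative assumptions.]

```python
import time

def offload(payload, compute, rtt_s=0.070, bandwidth_bps=12e6):
    """Toy model of offloading work to a remote server: a synchronous
    call pays the round-trip time plus the time to push the payload
    over the link before any result comes back."""
    transfer_s = (len(payload) * 8) / bandwidth_bps  # serialization delay
    time.sleep(rtt_s + transfer_s)                   # emulated network cost
    return compute(payload)

# A 64 KB payload over a 12 Mbps link with a 70 ms RTT:
start = time.monotonic()
result = offload(b"\x00" * 65536, len)
elapsed_ms = (time.monotonic() - start) * 1000
print(f"offload took ~{elapsed_ms:.0f} ms")  # ~70 ms RTT + ~44 ms transfer
```

Even before real-world congestion is added, the emulated 70 ms alone spans several frames at 60 fps, which is why how large the injected delay is matters more than whether it was injected on a LAN or measured on LTE.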

    • Cinnamon267

      Didn’t know which specific thing you were asking a link for.

      Here’s one from their website: http://today.duke.edu/2015/05/cloudgaming.

      No mention of future public deployments.

      And here’s the research paper itself: http://tinyurl.com/pk2p2wk

      Only mention of a public deployment was for a demo at a conference back in June of last year. Which I already mentioned.

      “They compensated by adding RTT @ 70 ms”

      Emulating it via software within a LAN is not the same as a real world LTE connection bogged down with congestion. Don’t even try and act as if it is the same.

      “Pretending to know better than those who are trying to make our gaming experience better.”

      “meanwhile you take what you read, and make egotistical comments like, ” Resolution was set to 720p. 1080p, for instance, not tested. So the “1Mbps” number is a bit…. Useless.” That is you saying that you couldve done better.”

      Pointing out their test was run at 720p when 1080p is the default people expect is not me saying I can do better. In no way, shape or form does it mean that. You’ve just dreamt that up for your own amusement.

      More than 720p is expected in the year 2015 and beyond.

    • GHz

      “Can find no information on a public deployment afterwards.”

      You’re a terrible liar. It’s obvious you got that quote from the report. Your attack on me was based on it. Remember you said that I didn’t bother to read that report in full, giving the impression that you did. You snatched that quote up from it because you saw the word LAN. It’s funny because you still don’t understand what the quote is saying. You thought it supported your claim and went against mine. In the end, not only was I right, but it’s proof that you in fact didn’t understand what it said.

      Your ego is so BIG that you’d undermine research supported by the National Science Foundation. I called you out on that because you said the research was USELESS. Reason? Because they couldn’t do it in 1080p. These people are solving problems one step @ a time and all you see is 720p. And you’re still arguing that the experiment produced inaccurate results because the 1st stage was LAN based. You undermine the techniques they used to determine how the game reacts to heavy traffic, and regard it as a sham. That’s EGO bruh!

      So in the end they did a public deployment. It’s in the same research you grabbed that quote out of. They mentioned a server within a nearby CDN. CDNs are especially useful for services that have a global reach. That has WAN written all over it, because it’s used especially to reduce the distance content travels across a large geographical distance. You just didn’t understand. But w/e, you can keep supporting dumb statements because they appeal to you. Or keep pretending you’re too smart to actually learn something. Don’t care. Done!

    • Cinnamon267

      “Your attack on me was based on it”

      Correcting you isn’t attacking you.

      “You snatched that quote up from it because you saw the word LAN.”

      LAN is mentioned all throughout. Because it was their test environment. You still haven’t read the report they put out.

      “I called you out on that because you said the research was USELESS”
      I said the 1 Mbps number was useless due to it being 720p. Never said their research was useless.

    • GHz

      You didn’t correct me. You pointed out the config of the initial test, implying that it was final. I told you it wasn’t! It was your rebuttal to me asking why you would think they wouldn’t use remote servers via WAN. All I had to go by was the report’s conclusions, so I was responding to the conclusion. Understand. When you shared that quote, I knew that it was only the initial stage. That’s what I told you. I said that kind of experiment usually ends in a public deployment and that it’s there you’ll find reference to WAN. I implied that public deployment is the real test. Because you didn’t understand, I advised you that they may not use the same language (WAN) but they would make reference to it one way or the other. That’s CDN! How are you correcting me when you are agreeing with XbMK, and he’s way off base? His conclusion is it’s all a lie. Your conclusion is that they did it all wrong. LOL. OMG LMAO SMH.. Your ego is off the charts. End discussion. I’m done. Move on! E3 is right around the corner. Fun times ahead! 🙂

    • Cinnamon267

      “You didnt correct me.”

      You “bet” I was lying, I proved otherwise, therefore I corrected you. Done so numerous
      times. But, for whatever reason, you just skip over it.

      “You pointed out the config of the initial test, implying that it was final.”

      Nope. Never implied that at all. You just dreamt that up. You did it again.

      “I said that kind of experiment usually end in a public deployment and that its there you’ll find reference to WAN. That’s CDN!”

      And I corrected you by stating the fact that there was no public deployment after they released the paper. Which there hasn’t been. It’s fine if you think it “usually” happens, but, in this instance, you were wrong. Get over it. The world still spins.

      Your definitions of WAN and CDN are as broad as they come. I would love to know your definition of those things, since those statements prove you don’t know much.

      Once again, CDN is mentioned once in the research paper and it’s stating the obvious “CDN nearby will be better.” Which is how all of the Internet works.

      You’re done? Alright.
      Bye.

    • GHz

      Dude, what the h*ll you talking about!? You said the experiment was a waste based on you thinking the 1st config was the final stage! I told you no! Page 9 of that doc you sent clearly states that they did a public deployment. That led to the final conclusion. Go back and read it!

      “Your definition of WAN and CDN are as broad as they come”

      WHAT!? LOL. Damn you’re thick. If you say I’m defining these terms, it means you don’t understand them, because I did not define!

      Now I’ll define.

      quick lesson. WAN is the entire internet, and within it, you find CDNs, which make sending and receiving of very far away data more manageable/accessible to the end user. When they say “nearby CDN” it’s within the context of WAN (Wide Area Network), the internet as we know it. WAN is NOT local (LAN). Those servers (CDN) are WAN dependent! LAN (Local Area Network) is not! You can have the LAN without the WAN, but you cannot have the CDN w/o the WAN! To access CDN is to access WAN. CDN makes WAN run better for you & I. That is the relationship they have, in regards to this experiment, making all that big data more manageable! That was my point from the beginning, while you want to argue with a quote that is NOT referring to THE FINAL STAGE OF THE EXPERIMENT!

      LOOOOOL I’m done.
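
[Editor’s note: one concrete reason “nearby CDN” matters is that propagation delay alone puts a floor under RTT. A rough back-of-the-envelope sketch; the distances are illustrative, and light in fiber covers roughly 200,000 km/s.]

```python
# Rough lower bound on round-trip time from propagation delay alone.
# Light in fiber travels at roughly 200,000 km/s (about 2/3 of c).
FIBER_KM_PER_S = 200_000

def min_rtt_ms(distance_km):
    """Best-case RTT over fiber, ignoring routing, queuing, and congestion."""
    return 2 * distance_km / FIBER_KM_PER_S * 1000

print(min_rtt_ms(50))    # nearby CDN edge: 0.5 ms floor
print(min_rtt_ms(4000))  # distant origin server: 40.0 ms floor
```

Real RTTs sit well above these physical floors once routing and congestion are added, which is exactly why serving from a nearby CDN node beats a distant origin.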

    • Cinnamon267

      “Dude what the h*ll you talking about!? You said the experiment was a waste based on you thinking the 1st config was the final stage! I told you no! On page 9 of that doc you sent clearly stated that they did a public deployment! That led to the final conclusion. Go back and read it!”

      Again, you’re just making stuff up. I never, ever said waste. Stop twisting what I’m saying.

      Page 9 details a demo they had at a mobile conference: “MobiSys’14, June 16–19, 2014, Bretton Woods, New Hampshire, USA”

      Their research paper was published from a study they did on “December 1st and 3rd, 2014 in an office inside Microsoft’s building 112 in Redmond, Washington.”

      I never denied a public deployment. I told you about the conference back in June of last year. You, as per usual, just ignored it. I denied a public deployment AFTER the paper. Which I am still correct on as of this date.

      “Your definition of WAN and CDN are as broad as they come”

      “WHAT!? LOL. Damn you’re thick. If you say im defining these terms, it means you dont understand them, because I did not define!”

      “quick lesson. WAN is the entire internet, and within it, you find CDN which makes sending and receiving of very far away data more manageable/accessible to the end user. When they say “nearby CDN” its within the context of WAN(Wide Area Network) the internet as we know it. WAN is NOT local(LAN). Those servers(CDN) are WAN dependent! LAN (local Area network) is not! You can have the LAN without the WAN. But you cannot have the CDN w/o the WAN! To access CDN is to access WAN!”

      Fantastic! You actually know something. Which doesn’t change the fact that their study took place over LAN, within MS’s offices, and the only mention of a CDN is this direct quote: “Kahawai will perform best when offloading over a LAN or to a server within a nearby CDN.”

      Their paper makes no mention of testing it in any other capacity outside of a confined LAN. Once again, no remote servers were used according to their research paper. Even though you still keep banging on about it.

      Your reading skills and comprehension are quite poor.

    • You Are Flat Out Wrong

      John Derp, you are hilarious shilling for PlayStation Now, the pay-for-backwards-compatibility service. It’s never going to help your backwards, last gen, 30FPS paperweight.

  • Starman

      It’s Ex-Naughty Dog, they worked for Sony, so we should all listen …. GTFO..LOL So let me get this straight … MS is investing all this cash into the cloud for XB1 and PC… for it not to do anything … REALLY? …lmao…

  • GHz

    “Microsoft has been touting the so called power of the cloud for quite some time but except a few impressive tech demonstrations, they basically have nothing.”

      We are not exactly talking small implementations of the cloud like streaming games. That’s old school and limited. We are talking about a much more ambitious deployment of cloud services aimed @ solving problems across many disciplines. Rashid, am I wrong?

      Starting with making bandwidth more reliable for games, adding extra CPU power, and eliminating latency-based host advantage and hacked-host cheating. This has already been done in Titanfall. That was an experiment that worked. You don’t classify that as nothing. These are very important contributions thanks to MSFT’s cloud. XBLCloud is not a “One Trick Pony.” It is another tool that ambitious devs will use to solve problems that normally they couldn’t have. Just ask Respawn Ent. They tested one aspect of its use with success, and now others can follow suit. Because of their ambition and OPEN MINDEDNESS, they were able to prove an impossible task was indeed possible. Thanks to MSFT + Azure.

    “But it’s not just for dedicated servers – Microsoft thought about our problem in a bigger way. Developers aren’t going to just want dedicated servers – they’ll have all kinds of features that need a server to do some kind of work to make games better. it’s not accurate to say that the Xbox Live Cloud is simply a system for running dedicated servers – IT CAN DO A LOT MORE THAN THAT” – Jon Shiring Lead engineer of Respawn Ent.

    http://www.respawn.com/news/lets-talk-about-the-xbox-live-cloud/

      That’s coming from a game engineer who’d had access to MSFT’s cloud and used it to better his game design. And there are more examples of this. We are early into this tech and these consoles are only 1 1/2 yrs old. Laws are being passed to make a 25 Mbps download speed the minimum in the U.S., meanwhile MSFT is cutting cloud gaming bandwidth requirements. This is all real & legitimate news.

    http://www.windowscentral.com/microsoft-research-teams-duke-university-cut-required-cloud-gaming-bandwidth

      Cloudgine + MSFT showed off their collab with cloud tech, Crackdown. Why would they lie? This tech is coming and there is nothing those who want to see MSFT fail can do about it. They are the leaders in this regard and, with their partners, are at the forefront of it. There’s a real world outside these forums and y’all need to keep up.

    • Orion Wolf

      Well put.

      It’s still, IMO, one of the best possible ways for MS to keep the X1 competitive in a world where technological innovations are increasing exponentially every year.

      Every year there are more and more powerful GPUs, CPUs, APUs, etc. New hardware is being made which will replace old standards that are not adequate anymore (HBM replacing GDDR being the main example of that), so how would a console released in 2013 compete with a low-budget PC in 2018/2019 and the constant demand for visually better games? To put it simply, the “cloud”. The “cloud” is a relatively perfect answer to that.

      In addition, MS has been researching and investing in the field for quite a long time, so dismissing something like that wouldn’t be the smartest thing to do.

      Btw, I have to wonder why it is that these questions are being asked of someone like an ex-Naughty Dog dev instead of someone who has actual experience with the topic at hand (a Respawn dev)?

    • HaakonKL

      And none of that actually addresses the problem of latency.

      There’s a real world and nobody has done anything great with it yet because of signal roundtrip times being what they are. Also, “This single player game requires at least 20 Mbps of uninterrupted bandwidth” is not something anyone would like to ever print on their game boxes.

      You people who continually and shamelessly salivate over empty promises and refuse to consider the counterpoints to it infuriate me.
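
[Editor’s note: the latency objection above can be put in numbers. A hypothetical sketch; the RTT values are illustrative, not measurements.]

```python
# Per-frame time budget: ~16.7 ms at 60 fps, ~33.3 ms at 30 fps.
FRAME_BUDGET_MS = {60: 1000 / 60, 30: 1000 / 30}

def fits_in_frame(rtt_ms, fps):
    """Can a synchronous round trip to a server finish inside one frame?"""
    return rtt_ms <= FRAME_BUDGET_MS[fps]

print(fits_in_frame(70, 60))  # False: a typical 70 ms broadband RTT overshoots
print(fits_in_frame(70, 30))  # False: even the 30 fps budget is blown
print(fits_in_frame(10, 60))  # True: only LAN-class latency fits
```

Anything a game needs back within the same frame therefore can’t leave the box; only work that tolerates 100+ ms of delay (AI ticks, persistent world simulation, dedicated-server logic) is a realistic offload target.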

    • GHz

      Little by little, we can walk far. 😉

    • XbotMK1

      All I see are excuses from a Microsoft fanboy. It must be really embarrassing and upsetting for these Microsoft fanboys to see a dev tell them the exact same thing everyone else has been telling them since the beginning.

      Microsoft lied about the cloud which is proven by the simple fact that not a single Xbox One game has used this. They falsely advertised Xbox One. Microsoft knew damn well that the cloud doesn’t work in the simple way they were making it seem. They didn’t tell you the problems with it or explain the exact benefits. They sold you a marketing slogan and an unfinished theory.

      If the cloud could simply do this, companies would already be doing it. Once cloud is viable, many companies will jump on the bandwagon. The theory of using the cloud isn’t anything new. Games like Battlefield and MMOs use the cloud to control more objects on screen. PlayStation Now could be used to do it, but Sony doesn’t make these false, misleading promises like Microsoft did with Kinect and the cloud.

      What Microsoft was trying to sell people is currently impossible and/or unreliable due to many factors such as internet speeds, bandwidth, architecture, and the flaws of relying on the internet itself. Another problem I think John Hable forgot is developer support. Developers would have to code their games a certain way for a certain architecture to make use of any benefits from the cloud.

  • JerkDaNERD7

      Obviously he’s speaking in general terms; he has no clue what Microsoft has up their sleeve, which we shouldn’t underestimate. Microsoft has an R&D budget worth more than Sony as a whole, lol! So when they make claims and are going to bring it, we all should listen in anticipation.

  • Cenk Algu

      Definitive proof of Sony’s fear. The only devs downplaying DX12 + Cloud are ex or current Sony devs. Poor Sony. They don’t have money to invest in these technologies. Now I see that DX12 and Cloud are a real deal that has Sony wound up hard.

  • Lex

      The cloud will change gaming as we know it, but it will not be like many think it will be. Yes, there will be some technical advances making things a bit easier for the consoles. It’s really going to bring more ideas to gaming; it’s going to make games more personal to the gamer.

    LOL it seems like all people think about is FPS, Frame rates, and graphics.


 

Copyright © 2009-2015 GamingBolt.com. All Rights Reserved.