Cloudgine Has No Dependency On Xbox One’s eSRAM, Microsoft’s 2014 Build Demo Took A Lot of Work

The technology is completely hardware agnostic.

Posted By | On 14th, Oct. 2015 Under News | Follow This Author @Pramath1605



Given our understanding of how Cloudgine’s cloud tech works in Crackdown 3, it didn’t surprise us to find out that the technology is in no way dependent on, or limited by, local Xbox hardware, such as its eSRAM. Maurizio Sciglio, the CEO of Cloudgine, explained that cloud processing completely circumvents these kinds of problems with local hardware.

“I think [the question of eSRAM] is more a question for game teams,” he said in an exclusive interview with GamingBolt. “Cloudgine runs almost entirely in the cloud so we haven’t been dealing with specific Xbox One features such as the eSRAM.” This makes sense: Cloudgine is, after all, completely hardware agnostic, so it is neither helped nor hindered by platform-specific features like eSRAM.

He also explained how Microsoft’s BUILD 2014 showing for Crackdown 3 influenced the scope and direction of the project.

“I believe you are referring to a demo that was shown a few months earlier at Microsoft BUILD 2014. That was an initial prototype that we developed to prove the model. A lot of work went into creating it, and it’s paved the way for what we’re doing in the game.”
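Sciglio's quoted model (run the expensive simulation remotely, return only the results to the console for rendering) can be sketched in a few lines. This is a hypothetical illustration, not Cloudgine's actual API: every name below is invented, the physics is a toy, and the "cloud" is a local stand-in function rather than a real network call.

```python
# Hypothetical sketch of the offload pattern described in the article:
# the console packages expensive simulation work, a remote service
# computes it, and only the results come back for local rendering.

def cloud_compute(job):
    """Stand-in for the remote service: one tick of toy projectile
    physics. In the real model this would run on cloud servers."""
    dt = job["dt"]
    results = []
    for body in job["bodies"]:
        x, y = body["pos"]
        vx, vy = body["vel"]
        vy -= 9.81 * dt  # apply gravity to vertical velocity
        results.append({"pos": (x + vx * dt, y + vy * dt), "vel": (vx, vy)})
    return results

def client_tick(bodies, dt=1.0 / 30):
    """Console side: ship the job out, receive transforms for rendering."""
    job = {"dt": dt, "bodies": bodies}
    return cloud_compute(job)  # a network round-trip in the real setup

updated = client_tick([{"pos": (0.0, 100.0), "vel": (5.0, 0.0)}])
```

The point of the split is that the console only sees a list of updated transforms each tick; the simulation cost lives on the server side.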

Keep an eye on GamingBolt for our full interview with Mr. Sciglio, which should be up soon.



  • GHz

    Yeah we need clarification as to exactly what type of service Cloudgine is, and how it uses the MSFT cloud to enhance the gaming experience in Crackdown. Is Cloudgine some sort of new cloud-based graphics engine?

    Cloudgine CEO Maurizio Sciglio said….

    “We offload the expensive computations to Microsoft Cloud through our platform and send the results back to the Xbox One for rendering.”

    So Cloudgine doesn’t do the calculations, XBLCloud does, and the XB1 does the rendering. So what kind of platform is Cloudgine exactly? Is the XB1’s design really instrumental in how it speaks to MSFT Cloud (XBLCloud)? XB1 does the rendering, the MSFT cloud does the computations, and Cloudgine does_______????

    He then said:

    “The platform itself doesn’t impose any hard limit on the number of servers. This number is determined by game design and cost considerations. Different games will strike a different balance between cost and compute resources, depending on their requirements and business models.”

    So again, Crackdown just shows us another way to use the cloud. So what everybody wants to know is: if Crackdown was multiplatform, would the PS4 be able to run the MP cloud-dependent version with Gaikai? Or better yet with the MSFT cloud? Or is this type of use design-specific? Can the PS4 speak to servers the same way the XB1 can? Or is the PS4’s design for cloud use relegated to the specific functions Yoshida and Mark Cerny described in their past interviews, which are basically matchmaking and streaming?

    • ShowanW

      Cloudgine, I believe, is an API written for communication between Live and Xbox. I believe it’s a subset of Azure.

      Could the PS4 do this? I believe it can, but just not overnight. The instruction set written for Crackdown (or any game that has an extensive instruction set like it) has to be years in the planning.

      Remember, Cloudgine is being created alongside Crackdown, so maybe this technology can be licensed out to other games, like Unreal and CryEngine are.

      So yeah, Gaikai can probably do this, but I think Sony would probably do better to team with Rackspace or Amazon, as they have WAY more servers worldwide than Sony does.

    • GHz

      When they asked Mark Cerny if the cloud can be used to make a console more powerful, he said that won’t work well in the cloud. Why would he say that then? And why would Yoshida say that the way MSFT is using the cloud is confusing to him? If Sony’s chief architect says no and their president, Yoshida, says the MSFT cloud is confusing, what does that mean for the design of the PS4 and the capabilities of Gaikai?

    • Philip Tam

      I think it can still be done but, as ShowanW says, it requires some years of planning and R&D; it is not impossible with the PS4. The hardware inside Xbox One and PS4 is neck and neck; basically they are using the same APUs from AMD, except the clock rates for CPU and GPU are higher in the Xbox One. Otherwise, there would be no difference between the APUs. There is just no way Microsoft has a secret sauce inside it that would make it work with the cloud without Sony being able to with the PS4. If Witcher 3 after patch 1.10 runs better on the PS4 than on Xbox One, that says the hardware is comparable if not better than Xbox One. I think Sony will jump onboard soon if the technology for Xbox One with Crackdown 3 proves to be successful. I think they are just waiting to see if that works out for Microsoft at this point.

    • GHz

      The only secret MSFT and their partners agreed upon was DX12. It was so top secret that AMD reps who didn’t have the clearance were going around saying that there was going to be no DX12 and that MSFT would discontinue the development of their famed API. This was the belief well into late 2012. Then when it was finally announced, to the shock of many in the biz, it was difficult for them to imagine that XB1 would also use it. Then MSFT announced it for the XB1. Then they said, well, XB1 is not a true DX12 device because of its old GPU. Then MSFT said FULL DX12 is coming to the XB1. Phil later said that the graphical leap you’ll get from DX12 on XB1 will be like the first Halo to Halo 4 on 360, and to expect something like that as far as graphical fidelity goes. Nobody wanted to believe it. No matter how hard MSFT tried to express this, all they got was grief. So they stepped back and just downplayed everything, and told their partners to do the same. No more talk about advanced graphics unless they have data, code, and demonstrations to back it up. All Phil will say now is that DX12 will grant devs access to more advanced features on XB1, so that the console may do the things it’s great at. Features. What features? That was a secret until this year’s GDC. Not even AMD knew what they were until MSFT was ready to discuss them. Now ask yourself why?

    • kreator

      Because they (Microsoft) learned their lesson. Don’t run ya mouth about things; Sony always burnt them cause of this! Now they need to stop going first at E3 as well.

    • GHz

      “If Witcher 3 after patch 1.10 runs better than Xbox One on the PS4, that says that the hardware is comparable if not better than Xbox One.”

      The problem with all this talk about graphics and frame rate is that all these games were built on old ideas. PS4 will win as long as they keep developing games in DX9-DX11. As long as these engines exist, XB1 will NOT perform @ top peak. These engines keep XB1 from taking advantage of better parallelization and more efficient draw calls. PS4 is already better @ these things.
      Brad spoke out of turn and said DX12 will optimize the XB1 GPU and double its performance. That’s what the man said. When we get games that are built from the ground up in DX12 for the XB1, then we can come back to this conversation. Under DX11, only XB1’s left brain is functioning. That’s the PS4 vs XB1 in a nutshell. Is it a fair fight?

    • kreator

      It’s not a fair fight at all. But it’s also M$’s fault!

    • GHz

      Agreed!

    • Philip Tam

      But honestly, consoles don’t need to be the most powerful to be successful. Look at the PS2, the best-selling console of all time and the weakest of all 3 systems. The consumers will reap the most reward in the end.

    • GHz

      The topic wasn’t about a console’s recipe for success. It was about how they will use different forms of cloud compute. In regards to the PS4, I was just asking based on Cerny’s & Yoshida’s input.
      Both the XB1 & PS4 are already on their way to being successful consoles, with the PS4 being the more exceptional sales-wise. With XB1 selling above normal and PS4 selling above normal x2. No worries there.

    • ShowanW

      Mark Cerny and Yoshida-san are very smart people… VERY SMART people…
      But when it comes to software (not hardware), I will believe Microsoft over Sony all day every day. This doesn’t mean I think less of Sony, but software is what Microsoft does.
      Remember when Microsoft were first showing their tech for using the cloud, when they were doing that asteroid belt simulation (this was around the time of the Xbox One’s reveal/E3 2013)? They had me right then and there.
      Mark Cerny and Yoshida-san couldn’t grasp it, because at the time nobody was doing it. Technology shifts so fast these days, I’m actually shocked they doubted Microsoft.

    • GHz

      Yeah, I couldn’t believe his answer either. I thought he was going to say a simple “yes.” So after that I thought, if Cerny says it’s not going to work, and Yoshida said it was confusing, then most likely the PS4 wasn’t designed to take advantage of the cloud the way the XB1 can.

      I do believe that the PS4 can take advantage of other solutions though. Like Shinra Technologies’ solution to cloud compute. The PS4 wouldn’t be doing any rendering; it would just play the role of a host. But that would mean that the console would need to be connected @ all times. And that goes against Sony’s initial stance against an always-online-dependent console. They and their followers beat MSFT over the head for entertaining such an idea.

    • Psionicinversion

      because Cerny hasn’t got a clue about this stuff, that’s why he said it won’t work

    • Psionicinversion

      simple, it’s just computing physics, which only requires compute power to crunch the numbers and have the results sent back to the X1 for rendering

    • GHz

      That part I got clearly.

  • Triton

    Why can’t I link this on Facebook with text and a picture just by pasting the link there… very annoying. It only says “Gamingbolt” and no one will read it.

    • kreator

      LOL

  • Alistein

    It seems Azure will be handling graphics also.
    Jen-Hsun Huang speaks about GPUs for Microsoft Azure Cloud …http://www.youtube.com/watch?v=7Fq21oguB9g

    • GHz

      Nvidia agrees that this is the way to go. AMD agrees that this is the way to go. Intel agrees, Qualcomm agrees, Square Enix agrees. MSFT have been telling us for years this is the way to go, and that the XB1 was designed to take advantage of “the future of computing.” Why, when MSFT says it, don’t people want to believe? NextGen starts with XB1! 😉

    • Mark

      Yeah next year I expect to see more cloud assist developers start revealing things. I’ve been following this for years. Square Enix is without a doubt on board.

    • GHz

      So have I. Been following the tech for years too.

    • vishmarx

      lemme know when it actually does….
      people don’t wanna believe because, two years into its life cycle, there is nothing but a tech demo video of it. wasn’t the xbox designed with this in mind as one of its primary features? and by the time this tech picks up any commercial popularity, the gen will be close to its end.

    • Tech junkie

      not necessarily, this could be the last console gen ever made. If the processing can be done off-site, what’s the point of buying new hardware?

    • GHz

      Maybe. Not all devs are ready to take the leap. But pay attention to what companies like Cloudgine and Shinra Tech are saying. They designed their platforms so devs don’t have to know the ins and outs of coding for the cloud. Their services pretty much act as middleware, which handles the most technical aspects of cloud game development. All the devs have to do is build their games like they normally would.
      XB1 can’t live by DX9-DX11 alone and it won’t this gen. We’re getting Crackdown in 2016. It’s a good start if it’s not delayed.

    • Mark

      Yeah, I think graphics rendering over the net for gaming is a bit further into the future, as opposed to the CPU-dependent implementation in Crackdown.

      Don’t get me wrong, Nvidia GRID 2.0 allows for 1080p 60fps gaming as we speak, but I think Microsoft may be working on, or at least thinking about, running GPU-side computation on Azure a bit further out. I’d say 2017 we should see something, but who knows, there may be a reveal at next E3…

    • Alistein

      What they are working to achieve with CPU computation in Crackdown is phenomenal: a fully destructible city. I imagine when they decide to unveil the GPU computing side of things, we’d probably see something just as phenomenal. I do agree though that we won’t be seeing its use till at least 2017, probably around the time we’ll be seeing the first fully developed DX12 game.

    • Mark

      Agreed

    • Mark

      GeForce Now is a beast! Ignore first pic lol.

  • andy

    At this stage I don’t think any dev is depending on the awful bottleneck in the Xbone that is the eSRAM. The last 2 years of underperformance and disappointment in graphics/resolution output, even on launch/early-gen games, is a pure sign of this.

    • mwalker

      I haven’t purchased a game yet that I was disappointed with. Besides Unity. That was horrific across all platforms though.

    • GHz

      But yet Xbox gamers are enjoying the most advanced driving game on the planet on console and PC. And by that I mean everything that has been put into Forza 6 to make it the most unique racer on any platform.

      Then there is the fact that the software for utilizing eSRAM still hasn’t matured. XB1 is still using DX11, so it suffers when it comes to making draw calls efficiently. PS4 excels in that department because its API allows it to have better parallelization capabilities, a feature which will be introduced to XB1 when it gets DX12.

      All that is lacking on XB1, but still MSFT’s premier 1st-party titles show that PS4 has no advantage when it comes to power, and that Sony strips its games to allow a higher fidelity in graphics, leaving their games feeling like unfinished demos instead of full-fledged games. All Sony has is 3rd-party dominance FOR NOW.

      At the end of the day, XB1 is the platform that is showing off the most advanced games. That history is documented and no matter how much you want to deny it, I can drop 100 citable links with the biggest people in the industry agreeing on this.

    • Philip Tam

      Wait until Gran Turismo 7 hits PS4. It should be comparable to Forza 6. Imho, Forza 6 looks amazing, but nothing that cannot be done on PS4 graphics wise. Driveclub is not a testament of what PS4 can do. GT7 is. If Sony uses Amazon for cloud engines, it can do what Forza 6 can do.

    • GHz

      That’s what we hope. But the reality is, that’s a bunch of ifs, especially when you throw in Amazon. Sony haven’t announced anything in that regard. Polyphony Digital is one of Sony’s premier developers. GT7 will no doubt look great on PS4. We’ll just have to wait and see how they’ll use the PS4’s beefier GPU and what balanced approach they’ll take to better harness its weaker CPU. It’s quite the balancing act when you start throwing in a wealth of features to make your game more meaty.

    • kreator

      lmao

    • Psionicinversion

      that’s only if Sony want to pay for it, which is highly doubtful

    • angh

      “But yet Xbox gamers are enjoying the most advance driving game on the planet on console and PC”

      “..and PC”
      “..PC”

      Sorry, but have you ever heard of iRacing, Assetto Corsa or rFactor?
      Are you seriously trying to compare a pseudo-sim like Forza with those games?
      But I guess having only 3 full AAA exclusives on the system (Forza 5, Forza 6 and Sunset Overdrive, if you don’t count remasters), you have to grab on to something.

    • Tech junkie

      AC and rFactor are considered Forza-level. iRacing is a better game, but have you seen what iRacing costs? It better be better than Forza.

    • angh

      From many sites and opinions I’ve seen, I never found a person who would consider Forza on the same level as AC or rFactor 2. Forza is just way too arcade to compare. It has to be tuned to provide a good gaming experience with gamepad only, and this alone tells the story. It is a great game, of course, but I wouldn’t put it on par with any of those 3 sims. And apart from this, the original post I was answering was just ridiculous.

    • Tech junkie

      According to Inside Sim Racing, Forza has great wheel feedback

    • Tech junkie

      Forza is not arcade at all; it’s considered a PC-quality sim by many. While it does have the ability to make it easier with assists, with the assists removed and a wheel attached it’s a sim.

      Have you ever played those games? I have. I have a wheel for PC as well. I don’t have one for Xbox One, but I have used a wheel on Xbox One. Forza is on par.

    • Zarbor

      Clowns think that “most advanced” = the best graphics. That is as far as their brains can handle. It’s why many of them feel like DriveClub is the more advanced game… when in reality it’s shallow, like most of them.

    • sublimetalmsg

      The Forza series being the best and most advanced is your opinion. GT is a far better simulator. In contrast, Forza feels more arcade-like. Plus GT has more cars and more tracks by a long shot.

    • Mark

      Last I saw, Rise of the Tomb Raider, Fallout 4, Fifa 16, NBA 2K, Forza 6, Elite Dangerous and others all run 1080p……no way they’d get there using just the slow DDR3 sticks of memory, for high resolution assets.

    • kreator

      You must be enjoying all the 1080p 60fps exclusive AAA games on the PS4 right?

    • Tech junkie

      esram is the opposite of a bottleneck. It allows for very low latency data transfer. It’s just more difficult to program for. DDR5 on the other hand, while very high bandwidth, is also very high latency.
      So while it may make the GPU happy, it’s a problem for the CPU. It is why you don’t see DDR5 used for CPUs in a PC application; it’s not like it would be hard to install.

    • Hellfire

      You’re right about eSRAM, its purpose is to resolve the slow bandwidth of DDR3; to call it a bottleneck shows a complete lack of understanding.

      DDR5 does not exist yet, DDR4 is in its early consumer stages; you’re talking about GDDR5, which is dedicated GPU RAM. You do not use GPU memory for the CPU (high latency)… it’s important to know the difference between DDR and GDDR.

    • Tech junkie

      Yes, I didn’t put the G. It doesn’t change the fact the GDDR5 in the PS4 is high latency. And there is a reason it’s used in video cards and has not been adapted for use on CPUs in PC applications.

    • Hellfire

      They’re different types of memory so the G is very much required, DDR is system RAM, and GDDR is VRAM.

      I stated the reason in my first reply, “you do not use GPU memory for the CPU (high latency)”. So yes GDDR is high latency / high bandwidth and DDR is low latency / low bandwidth relative to GDDR.

      It’s only in the console space people think these are competing versions of RAM.. my PC has 16GB of DDR3 and 4GB of GDDR5. Games use both, and it’s a big part of why PC games run much better.

    • Tech junkie

      My point is the PS4 uses GDDR5 RAM for its CPU, which is less than optimal. I’m well aware of what the types of RAM are, I just didn’t put the G.

      Doesn’t change the fact Sony is using GDDR5 for its CPU and GPU, which is good for the GPU and not good for the CPU. The latency from the GDDR5 to the CPU is the bottleneck in Sony’s hardware design, and I believe it is why you see frame drops at CPU-intense points in games on the PS4.

      The guy I originally replied to thinks the eSRAM is a bottleneck. Since most people describe PS4 RAM as DDR5, it would likely have confused him if I put the G in it.

    • Hellfire

      At least GDDR5 looks different… DDR5 looks like it’s a couple of generations ahead of DDR3 (when GDDR5 absolutely is not).

      Yes, GDDR5 sucks for the CPU’s time-sensitive multitasking, and is a big reason for frame drops on the PS4; it’s also the reason the PS4 uses a secondary lower-power processor with 256MB of DDR3.

      So I do agree with your whole argument that in fact GDDR5 is a bottleneck on the system latency, and eSRAM is actually there to alleviate a bandwidth bottleneck.

      I do understand why you’d try to simplify, and thankfully next-gen GPU RAM will be HBM so there’ll be no confusion.

    • Tech junkie

      Yeah, apparently they stack the HBM right on top of the GPU with insane bandwidth. Shows how far behind these consoles are.

    • Roger Larsson

      Do not compare memory clock cycles when comparing latency.

      Compare actual nanoseconds…
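Roger's correction can be put in numbers: absolute latency is cycle count times clock period, so a memory type with more latency cycles but a faster clock can land in the same ballpark in nanoseconds, and comparing raw cycle counts misleads. The figures below are illustrative round numbers, not the consoles' actual datasheet timings.

```python
def latency_ns(cycles, clock_mhz):
    """Absolute latency: cycle count times the clock period (1000 / MHz ns)."""
    return cycles * (1000.0 / clock_mhz)

# Illustrative figures only, not XB1/PS4 datasheet values:
ddr3 = latency_ns(11, 1066)   # fewer latency cycles at a lower clock
gddr5 = latency_ns(15, 1375)  # more latency cycles at a higher clock
# Both land around 10-11 ns, despite 15 > 11 in raw cycles.
```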

  • 2econd gpu unlocking

    Another reason why have agreed on the NDA been extended for the whole generation now, on our second GPU isn’t really needed because of the cloud.
    Second we can do our 2nd GPU virtually with DX12 anyway.
    We have been playing weak 720p from launch and slowly asked for layer unlocking to get 1080p, Next we will be games only using the cloud to unlock 4k.

    • GHz

      MSFT said there is no 2nd GPU. The only hint we got as to what kind of power the XB1 is really packing is by way of Brad Wardell explaining what he thought DX12 meant for the XB1. He said…

      “It effectively gives every Xbox One owner a new GPU that is twice as fast as the old one.” – Brad Wardell

      “It’s not literally (it’s software, not hardware) but yes, DX12 games will likely be more than 2x as fast.” – Brad Wardell

      I’ve said in the past that DX11 talks only to the left brain of the XB1 and that DX12 will access the right side of the XB1’s brain, figuratively speaking. So MSFT is not holding anything back on purpose. DX12 was meant to be released at the end of this year. We just got the XB1 early, that’s all.

      As far as 4K gaming? I don’t know. Even though Major Nelson and Yusuf Mehdi say yes, if the game devs want to they can. We can cite them on that. But me personally, I don’t believe it. Crackdown, now that I knew was possible.

    • Mark

      He’s a troll man lol, impersonating MisterX Media

    • 2econd gpu unlocking

      We’ve agreed the 2nd GPU is now software based as the Hardware shown in the stacked die is now locked out.
      Yes it was held back to allow Sony to recover.And to show our cards as a
      Pretend weaker system to get Sony to launch early which has the weaker hardware now.
      The power to do 4k is there, it’s when the time is right to unlock another layer.

    • Philip Tam

      Stop spouting nonsense; the entire SoC has been under the microscope, and there is no second dGPU in the Xbox One. If anything, the DX12 API will help the Xbox One achieve 1080p more often, but by the looks of it, with Halo 5 requiring downscaling of the graphics in order to hit 60 fps, I cannot see why Microsoft would hold back the Xbox One IF it had a second GPU. Why wouldn’t they just unlock it if it’s there? That would give Sony a major headache. COZ it DOES not exist!

    • Mark

      Phil, u guys have gotta stop feeding 2ndGpuUnlocking; he’s just impersonating and mocking MisterX Media, tryna start senseless arguing… and I can see it’s working. Lol. The dude makes me laugh so hard cuz that’s what MisterX sounds like somewhat. He’s just a troll man.

    • GHz

      Giving Sony a chance to recover I get, because MSFT have done that before with Apple. They understood the importance of having two major players in the PC arena. They knew 2 was better than 1. So they helped Apple out of their slump, and the rest was history. But pretending that they were the weaker system is off, because MSFT came out swinging, promising big things. They wouldn’t have said all those things early on if their plan was to pretend that they had a weaker system.

      For instance, in 2013 Yusuf Mehdi was telling Forbes magazine that the XB1 had no limits and that it can do 4K gaming, not just movies. No one understood what he meant. Then you had Group Program Manager of Xbox Incubation & Prototyping Jeff Henshaw, who told the media that when consumers purchase an Xbox One console, they’ll actually have four: one physical, and three virtual that will reside in the cloud. Today we know that the XB1 can have access to as many as 13.

      You don’t say those things if you’re going to pretend you have the weaker system.
      About 4K, the only way I thought that may be possible is when they start utilizing Tiled Resources. Not to be confused with Partially Resident Textures, even though it’s popular to use those terms interchangeably. They are in fact different, with TR being the superior function. But I have to see it to believe it.

    • 2econd gpu unlocking

      True, clouds power means they can keep the second gpu locked out all generation. It’s 5TF without cloud, with could 100TF+
      2 professional research teams x and c both have the data to prove it
      and backed each other.
      Picking the right time, We even support screen tearing on games if it means
      Sony will recover faster with the better game reviews.
      The Faster Sony are back in the green, then more hardware will be shown.
      If they don’t, 4k will not be unlocked.

    • GHz

      I refuse to believe that you’re an Xbox fan. You’re just trying to make us look stupid. You sound like those pony idiots on here. Say hello to team XbotMk1 for me. 😉

    • 2econd gpu unlocking

      We have Forza 6. Vs their racer the #delayclub
      We have quantum break. Vs their shooter the order QuickTime 1886
      Why wouldn’t we be Xbox fans

    • Mark

      Lol!

    • kreator

      I believe they will end up streaming in 4k.

    • Psionicinversion

      don’t bother with that idiot, he just talks crap constantly

  • Mr Xrat

    “The technology is completely hardware agnostic.”

    But Xbox fanboys told me!

    Not that it even matters, only a few first party games’ MP modes will ever utilise it.

    • Psionicinversion

      yeah, because developers can’t be bothered pushing visuals; they’ll just keep costs down by developing for pathetic console hardware

    • GHz

      “completely hardware agnostic”

      Absolutely! Because the platform, as a service, specializes in offering cloud integration to existing game creation technologies. And those game creation technologies vary, like the engine that was built to develop Crackdown. That game is built from the ground up to utilize the MSFT cloud. It doesn’t mean that you can throw that same code at the PS4 and Gaikai and BOOM, the PS4 will give you similar results. Cerny claimed that wouldn’t work well in the cloud. However, Sony could integrate Cloudgine to handle everything, calculations and rendering. No rendering would be done on the PS4 side if we go by what Yoshida and Cerny said. The XB1 does the rendering. It’s not streaming the visuals, FX, destruction polygons etc. Don’t get it twisted. If you don’t understand, shut up.

    • Zarbor

      I agree…. “If you don’t understand, shut up.”
      Not too long ago many idiots were claiming that it can’t be done. Now that the tech is real, they still don’t get it, yet keep flapping their gums with pure nonsense.

    • sublimetalmsg

      it’s not real for me yet. I have not seen or played it first-hand =)

    • Zarbor

      Neither are Halo 5, Uncharted 4, Tomb Raider, or Fallout, but I can assure you those are equally as real.



Copyright © 2009-2017 GamingBolt.com. All Rights Reserved.