Google Stadia is going to be more responsive than local hardware … eventually. At least, that’s the claim Stadia engineering boss Majd Bakar is making in an interview in the latest issue of Edge magazine. That doesn’t seem feasible, right? Latency is inherent to streaming video over the internet. You can’t break the laws of physics. Then again, the laws of physics never accounted for what Google is calling “negative latency.”
Negative latency is a set of techniques that Google will use to reduce the lag between your screen and Stadia’s servers. The idea is that Stadia’s network of super-powerful gaming GPUs and CPUs will often have enough spare power for some clever tricks.
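One widely reported example of such a trick is speculative input prediction: with spare capacity, a server can render a frame for the player's most likely next input before that input even arrives, then serve it instantly if the guess was right. The sketch below illustrates the general idea only; the function names, the naive "repeat the last input" predictor, and the data are all hypothetical, not Google's actual implementation.

```python
# Illustrative sketch of speculative input prediction, one of the ideas
# behind "negative latency". All names and logic here are hypothetical.

def predict_next_input(history):
    """Naive predictor: assume the player repeats their last input."""
    return history[-1] if history else "idle"

def speculative_frame_server(history, actual_input, render):
    predicted = predict_next_input(history)
    # Frame rendered speculatively, before the real input arrives.
    cache = {predicted: render(predicted)}
    if actual_input in cache:
        return cache[actual_input], True   # hit: frame is ready immediately
    return render(actual_input), False     # miss: render on demand as usual

frame, was_predicted = speculative_frame_server(
    history=["move_left", "move_left"],
    actual_input="move_left",
    render=lambda inp: f"frame:{inp}",
)
# frame == "frame:move_left", was_predicted == True
```

On a misprediction the speculative frame is simply discarded and the server falls back to rendering normally, so the worst case is no worse than not predicting at all.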
Plenty of gamers are skeptical about cloud-based gaming. Fast internet connections just aren’t prevalent enough, the argument goes, and lag will kill the gameplay experience. Google is diving in headfirst with Stadia, though. In an interview with Edge, Stadia’s VP of Engineering Majd Bakar said the platform will eventually be more responsive than consoles. And while the technology to do such things sounds ridiculous, the underlying ideas may prove effective.
Now Google is promising something that sounds even better: “negative latency.” While that term sounds like it literally translates to “time travel,” what Stadia’s head of engineering, Majd Bakar, meant when he said it was that emerging technology will eventually allow Stadia to reduce latency to the point where it’s basically nonexistent, making games on the service more responsive than even those on PCs and consoles.
The feature will purportedly be compatible with every applicable game on the platform, using Steam’s existing remote play technology to simulate everyone being in the same room together, playing on the same instance of the game. There’s no simulating trying to steal another player’s controller when they’re kicking your ass, though.
For Google, negative latency isn’t about transmitting information and frames to and from the server faster than the speed of light. It’s about reducing all of the other sources of latency. So will that end up feeling more responsive than local hardware? Absolutely, if your definition of local hardware is a console playing a game on a TV at 30 frames per second. And it’s likely that’s exactly what Google means.
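The 30-frames-per-second comparison can be made concrete with a bit of arithmetic: at 30 fps a single frame takes about 33 ms, so a cloud server rendering much faster can spend part of that saved time on the network round trip and still come out ahead. The numbers below (120 fps server-side, a 20 ms round trip) are purely illustrative assumptions, not measured Stadia figures.

```python
# Back-of-the-envelope latency comparison with assumed numbers:
# frame time alone is 1000 / fps milliseconds.

def frame_time_ms(fps):
    return 1000.0 / fps

local_console = frame_time_ms(30)            # ~33.3 ms per frame, no network
cloud_stream = frame_time_ms(120) + 20.0     # ~8.3 ms frame + assumed 20 ms RTT

print(round(local_console, 1), round(cloud_stream, 1))  # prints: 33.3 28.3
```

Under these assumptions the streamed frame arrives roughly 5 ms sooner than the locally rendered one, which is the shape of the argument Bakar is making, even if the real-world numbers will vary with connection quality.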