In video games, non-player characters (NPCs) can be fairly expendable. An NPC might amble down one block, pass a streetlight, and then blink out of existence at the next. An NPC might walk into the player character’s fist, or kick a wall 400 times, never learning that the wall won’t kick back.
Unity Technologies is in the NPC business. Founded in 2004, Unity has powered hundreds of video games with its namesake engine, built on real-time 3D graphics technology. Unity also offers a host of tools that integrate with the engine, including AI tools. In the Unity engine, developers lay out their 3D city blocks and streetlights; model their NPCs; animate their fists; and, perhaps with the help of Unity’s artificial intelligence, teach them when to stop kicking.
Five years ago, Unity executives realized that many real-world situations would benefit greatly from NPCs. Consider designing a roller coaster. Engineers can’t put humans on a coaster heading into a sharp turn just to test whether they’ll fly off, and they certainly can’t ask them to do it 100 or 1,000 times just to make sure. But if an NPC has all the relevant qualities of a human (weight, movement, even a little impulsiveness), an engineer can whip it around the track 100,000 times, like a kid playing RollerCoaster Tycoon, to discern the circumstances under which a rider would be ejected. The coaster, of course, is digital too, its metal flexing over time and its cars dipping and rising at speeds that depend on the number of passengers.
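To make the repeated-trial idea concrete, here is a minimal, hypothetical sketch (plain Python, not Unity code) of that kind of simulation: run a simplified physics model over many randomized virtual riders and record which conditions lead to ejection. Every function name, parameter range, and threshold below is an illustrative assumption, not anything from Unity.

```python
import random

# Hypothetical sketch: repeatedly simulate a rider on a sharp turn and
# count how often the lateral force would exceed a restraint's limit.
# The physics is deliberately simplified; a real digital twin would
# model the track, car, and rider in far greater fidelity.

GRAVITY = 9.81  # m/s^2

def lateral_g(speed_mps: float, turn_radius_m: float) -> float:
    """Centripetal acceleration in the turn, expressed in g's."""
    return (speed_mps ** 2 / turn_radius_m) / GRAVITY

def rider_ejected(speed_mps: float, turn_radius_m: float,
                  restraint_limit_g: float) -> bool:
    """A rider counts as 'ejected' if the force exceeds the limit."""
    return lateral_g(speed_mps, turn_radius_m) > restraint_limit_g

def run_trials(n: int = 100_000) -> float:
    """Fraction of randomized virtual riders who would be ejected."""
    ejections = 0
    for _ in range(n):
        speed = random.uniform(10.0, 40.0)   # m/s entering the turn (assumed)
        radius = random.uniform(8.0, 30.0)   # meters (assumed)
        restraint = random.gauss(4.0, 0.5)   # per-rider tolerance in g's (assumed)
        if rider_ejected(speed, radius, restraint):
            ejections += 1
    return ejections / n

if __name__ == "__main__":
    print(f"Ejection rate over 100,000 virtual riders: {run_trials():.2%}")
```

The point of the sketch is the workflow, not the numbers: because the riders are virtual, the engineer can rerun the experiment cheaply until the dangerous region of the parameter space is mapped out.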
Unity has folded this idea into its business and now uses its game-engine technology to help clients create “digital twins” of real-life objects, environments, and, most recently, characters. “The real world is so limited,” Danny Lange, Unity’s senior vice president of artificial intelligence, told me last October from Unity’s San Francisco headquarters. “In a synthetic world, you basically rebuild a world for the training system that is better than the real world. I can use this data in Unity to create more scenes.”
Digital twins are virtual clones of real-life things that behave and react in virtual space just as their physical counterparts do. Or at least, that’s what the term promises. The word “twin” does a lot of heavy lifting: it will be a long time before simulations achieve one-to-one fidelity with their real-world referents, and these “twins” take a great deal of human effort to build. Even so, dozens of companies are already using Unity to model digital twins of robots, production lines, buildings, and even wind turbines, in order to virtually design, operate, monitor, optimize, and train them. These twins rust in the rain and run smoother with fresh lubricant. They learn to avoid bumps or to identify damaged gears. With a sufficiently accurate digital twin, Unity’s game engine can even collect “synthetic data” from simulations to better understand and improve the twin’s real-life counterpart, Lange said.
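As a loose illustration of what collecting “synthetic data” from a simulation might look like, the hypothetical sketch below steps a toy gearbox model and logs labeled sensor readings that could later train a damage classifier. None of this is Unity’s API; every name, unit, and number is an assumption made for the example.

```python
import csv
import random

# Hypothetical sketch of harvesting synthetic training data from a toy
# "digital twin" of a gearbox. Each simulated step yields a labeled
# sensor reading; damaged gears vibrate more, plus noise. This only
# illustrates the synthetic-data idea, not Unity's actual tooling.

def simulate_step(damaged: bool) -> dict:
    base_vibration = 0.8 if damaged else 0.2        # arbitrary units (assumed)
    reading = base_vibration + random.gauss(0, 0.05)
    temperature = random.gauss(70.0, 2.0)           # degrees C (assumed)
    return {"vibration": round(reading, 4),
            "temp_c": round(temperature, 2),
            "label": int(damaged)}

def generate_dataset(path: str, n: int = 10_000) -> None:
    """Write n labeled synthetic samples to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["vibration", "temp_c", "label"])
        writer.writeheader()
        for _ in range(n):
            # Assume roughly 10% of simulated gears are damaged.
            writer.writerow(simulate_step(damaged=random.random() < 0.1))

if __name__ == "__main__":
    generate_dataset("synthetic_gearbox.csv")
```

In a real pipeline the labels would come for free from the simulation’s ground truth, which is exactly what makes synthetic data cheaper to gather than instrumenting physical machines.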
“We’re actually a massive data company,” Lange said. “We realized early on that, at the end of the day, real-time 3D is about data, not just graphics.” Unity’s main digital-twin customers are in industrial machinery, where digital simulations can replace more expensive physical models. Unity executives believe the company’s real-time 3D technology and artificial intelligence capabilities let it compete with the many other companies entering the $3.2 billion space, including IBM, Oracle, and Microsoft. Unity’s senior vice president of digital twins, David Rhodes, said his goal is for Unity to one day host a “digital twin of the world.”