Los Angeles, California - Want to see the future of gaming? Look in the mirror.
Video games are increasingly allowing players to custom-design their own characters – often with the intention of inserting themselves into the game. Until now, players have relied on pre-designed faces and body types provided by a game’s creators.
Researchers at the USC Institute for Creative Technologies are making character design more personal. They’ve released a set of free tools to allow players to upload their own face and body into a game. It takes just four minutes to scan and upload a digital avatar of yourself, and the kit supports a range of game engines, including Unity and Unreal.
The digital toolkit includes three components and relies on the Microsoft Kinect to scan the player with a high degree of photorealistic detail. These are scanning software; automatic rigging software, which converts 3D models into game- or simulation-ready characters; and simulation software, called SmartBody, which allows you to animate and control the 3D character. SmartBody provides a variety of ready-made animations: players can watch themselves running, interacting with in-game objects, lip-synching to pre-recorded speech and even performing non-verbal behavior like gestures. More complex facial expressions are on the way.
“We’re giving everyone the ability to scan and animate themselves for free,” said Ari Shapiro, one of the leads on the project who heads the Character Animation and Simulation research group. He said the team, which includes Evan Suma and Andrew Feng, is interested in putting the software in the public domain to see what creative uses people come up with for it.
“We’re trying to foster innovation,” Shapiro said. “While tools to create games and capture 3D exist, the toolchain to bring the entire process together typically requires expert artistic intervention and a complicated set of processes. We are providing tools and software that, without any expertise, allow you to create and animate a 3D version of yourself in four minutes.
“The community can now develop interesting applications with it. The applications could extend beyond games and into social media, communication, training and more.”
In-game avatars have been growing increasingly complex. Games like the popular Fallout 4 allow players to choose from pre-selected faces, genders and skin tones. Character-creation videos on YouTube range from recreations of celebrities to Halloween-mask grotesques.
Why bother customizing a character to this degree? Shapiro says all sorts of social motivations drive gameplay. Having a personalized character communicates identity to the other players around you.
It also may affect how people experience the game themselves. Shapiro and his colleagues have been researching whether people make different decisions or have different emotional reactions when playing a game with a personalized avatar. They’re interested in learning if people are more invested in a simulation when the player character looks exactly like them. If something bad happens to the player character, does it make the experience feel more personal?
Shapiro sees a range of uses for personalized avatars. The U.S. Army Research Lab, which funds this research, is especially interested in training simulations. Virtual avatars could also play a large role in the future of communications tools. Shapiro notes that the Oculus Rift has an app, Oculus Social, where players can interact with each other in a virtual space. Right now, their avatars are generic. But how would people’s behavior change if they could insert their own likeness into a virtual room?
“I can see a revolution in social interaction using your own 3D avatar as a means of communication,” Shapiro said. “Face-to-face interactions have the potential for more complex and nuanced kinds of communication. A 3D avatar of yourself could provide some of that complexity in virtual scenarios.”