Session Name: Machine Learning Summit: Full-Body Animation Generation for Expressive NPCs
Speaker(s): Yu Ding
Company Name(s): Netease
Track / Format: Machine Learning Summit

Overview: Animation is essential to the believability and effectiveness of non-player characters (NPCs). This session describes a novel deep-learning approach that automatically synthesizes high-quality, lifelike full-body animation for talking NPCs, covering the lips and chin, the upper face (eyebrows, upper and lower eyelids, and eyeballs), head rotation, torso and hand gestures, and the legs and feet. The synthesized animation reflects speech prosody, conveys the emotional state carried by the speech, and matches an NPC's specific personality and profession. Without any manual intervention, computing the full-body animation trajectories for an utterance of about 5 to 20 seconds takes less than 500 ms. The automatic generator frees animators and artists from hand-authoring animations and from processing motion-capture data.
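The abstract stops short of model details, but for readers who want a concrete picture, here is a minimal sketch of the general shape such a speech-to-animation generator could take: per-frame acoustic features plus emotion and personality conditioning go in, and animation curves for each body region named above come out. Every module, dimension, and name below is an illustrative assumption, not the presented system.

```python
# Hypothetical sketch of a conditioned speech-to-animation model.
# All layer sizes, feature counts, and output dimensions are assumptions.
import torch
import torch.nn as nn

class FullBodyAnimationGenerator(nn.Module):
    def __init__(self, n_audio_features=80, n_emotions=8,
                 n_personalities=16, hidden=256):
        super().__init__()
        # Embeddings let one network serve NPCs with different emotional
        # states, personalities, and professions.
        self.emotion_emb = nn.Embedding(n_emotions, 32)
        self.personality_emb = nn.Embedding(n_personalities, 32)
        # Sequence encoder over per-frame acoustic features (e.g. mel bands),
        # capturing the speech prosody that drives the motion.
        self.encoder = nn.GRU(n_audio_features + 64, hidden,
                              batch_first=True, bidirectional=True)
        # One decoder head per body region named in the abstract.
        def head(n_out):
            return nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU(),
                                 nn.Linear(hidden, n_out))
        self.lip_chin = head(20)      # lip and chin blendshape weights
        self.upper_face = head(16)    # brows, eyelids, eyeballs
        self.head_rot = head(3)       # head rotation (Euler angles)
        self.torso_hands = head(42)   # torso and hand gesture joints
        self.legs_feet = head(14)     # leg and foot joints

    def forward(self, audio, emotion_id, personality_id):
        # audio: (batch, frames, n_audio_features)
        B, T, _ = audio.shape
        style = torch.cat([self.emotion_emb(emotion_id),
                           self.personality_emb(personality_id)], dim=-1)
        style = style.unsqueeze(1).expand(B, T, -1)  # broadcast over frames
        h, _ = self.encoder(torch.cat([audio, style], dim=-1))
        return {
            "lip_chin": self.lip_chin(h),
            "upper_face": self.upper_face(h),
            "head_rotation": self.head_rot(h),
            "torso_hands": self.torso_hands(h),
            "legs_feet": self.legs_feet(h),
        }

# Usage: a 10-second utterance at 30 animation frames per second.
model = FullBodyAnimationGenerator().eval()
with torch.no_grad():
    curves = model(torch.randn(1, 300, 80),
                   torch.tensor([2]), torch.tensor([5]))
print({k: tuple(v.shape) for k, v in curves.items()})
```

A single pass over the whole utterance, rather than autoregressive frame-by-frame generation, is one plausible way to hit the sub-500 ms budget the abstract cites; the actual mechanism is not stated in the session description.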

Game Developers Conference 2021

Programming