Invention Grant
- Patent Title: Systems configured to control digital characters utilizing real-time facial and/or body motion capture and methods of use thereof
- Application No.: US17718016
- Application Date: 2022-04-11
- Publication No.: US11615571B2
- Publication Date: 2023-03-28
- Inventor: Amanda Legge, Chen Shen
- Applicant: Capital One Services, LLC
- Applicant Address: McLean, VA, US
- Assignee: Capital One Services, LLC
- Current Assignee: Capital One Services, LLC
- Current Assignee Address: McLean, VA, US
- Agency: Greenberg Traurig, LLP
- Main IPC: G06T13/40
- IPC: G06T13/40 ; G06F3/01

Abstract:
In some embodiments, the present disclosure provides an exemplary technically improved system and method for controlling the body movements and facial expressions of a digital character in real time by using: a body-motion capture system comprising a headset configured to be worn on a head of a user and comprising controllers and sensors that track at least one head or body motion of the user (including arms and hands) and correspondingly control at least one head or body motion (including arms and hands) of a digital character in real time based, at least in part, on the captured motion data; a mobile computing device configured to track and capture facial expression data relating to at least one facial expression of the user and to use at least that data to correspondingly control at least one facial expression of the digital character in real time; a microphone configured to capture an audio output of the user and control an audio output of the digital character in real time based, at least in part, on the captured audio output; an integration computing device configured to integrate the audio output, the motion data, and the facial expression data to control the audio output, the motion, and the facial expression of the digital character; and a vest configured to be worn on a body of the user, with a structural member attached to the vest and configured to hold the mobile computing device at a predetermined distance from a face of the user so that the mobile computing device can track and capture the at least one facial expression of the user.
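The abstract describes an architecture in which an integration computing device merges three capture streams (headset/controller motion, mobile-device facial expression data, and microphone audio) into real-time updates for the digital character. The sketch below is purely illustrative of that data-integration step under assumed data shapes; every class, field, and function name here is hypothetical and is not taken from the patent or any particular capture SDK.

```python
# Illustrative sketch only: merging the three input streams named in the
# abstract (headset motion, mobile-device facial capture, microphone audio)
# into a single per-frame update for a digital character. All names are
# invented for illustration.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class MotionFrame:
    """Head/body pose data from the headset and hand controllers (assumed format)."""
    head_rotation: Tuple[float, float, float]                  # (pitch, yaw, roll) in degrees
    joint_positions: Dict[str, Tuple[float, float, float]] = field(default_factory=dict)


@dataclass
class FaceFrame:
    """Facial expression data captured by the mobile computing device (assumed format)."""
    blendshapes: Dict[str, float] = field(default_factory=dict)  # name -> weight in [0, 1]


@dataclass
class AudioFrame:
    """A short buffer of audio samples captured by the microphone."""
    samples: List[float] = field(default_factory=list)


@dataclass
class CharacterState:
    """The digital character's combined state for one rendered frame."""
    head_rotation: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    joint_positions: Dict[str, Tuple[float, float, float]] = field(default_factory=dict)
    blendshapes: Dict[str, float] = field(default_factory=dict)
    audio_samples: List[float] = field(default_factory=list)


def integrate(motion: MotionFrame, face: FaceFrame, audio: AudioFrame) -> CharacterState:
    """Combine the three streams into one character update, mirroring the role
    of the 'integration computing device' described in the abstract."""
    return CharacterState(
        head_rotation=motion.head_rotation,
        joint_positions=dict(motion.joint_positions),
        blendshapes=dict(face.blendshapes),
        audio_samples=list(audio.samples),
    )


if __name__ == "__main__":
    # One synthetic frame from each capture source.
    motion = MotionFrame(head_rotation=(5.0, -12.0, 0.0),
                         joint_positions={"left_hand": (0.2, 1.1, 0.4)})
    face = FaceFrame(blendshapes={"jawOpen": 0.6, "browInnerUp": 0.3})
    audio = AudioFrame(samples=[0.01, 0.02, -0.01])

    state = integrate(motion, face, audio)
    print(state.head_rotation, state.blendshapes["jawOpen"])
```

In a real pipeline the three sources would arrive asynchronously and need timestamp alignment before a merge like this; the sketch assumes already-synchronized frames to keep the integration step itself visible.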