ONLINE MODELING FOR REAL-TIME FACIAL ANIMATION
Abstract
Embodiments relate to a method for real-time facial animation, and a processing device for real-time facial animation. The method includes providing a dynamic expression model, receiving tracking data corresponding to a facial expression of a user, estimating tracking parameters based on the dynamic expression model and the tracking data, and refining the dynamic expression model based on the tracking data and estimated tracking parameters. The method may further include generating a graphical representation corresponding to the facial expression of the user based on the tracking parameters. Embodiments pertain to a real-time facial animation system.
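The abstract describes a blendshape-style dynamic expression model whose parameters are estimated from tracking data. As an illustrative sketch only (the patent specifies no implementation; all function and variable names here are hypothetical), a dynamic expression model can be represented as a neutral mesh plus weighted blendshape offsets, with the weights fit to tracked points by least squares:

```python
import numpy as np

# Illustrative sketch, not the patented implementation: a face is modeled
# as neutral vertices plus a weighted sum of blendshape offset vectors.

def evaluate_model(neutral, offsets, weights):
    """Reconstruct face coordinates: neutral + sum_i w_i * offset_i."""
    return neutral + offsets @ weights

def estimate_weights(neutral, offsets, tracked_points):
    """Least-squares fit of blendshape weights to tracking data."""
    w, *_ = np.linalg.lstsq(offsets, tracked_points - neutral, rcond=None)
    return np.clip(w, 0.0, 1.0)  # blendshape weights conventionally lie in [0, 1]

# Toy example: 3 vertices (9 flattened coordinates), 2 blendshapes.
neutral = np.zeros(9)
offsets = np.random.default_rng(0).normal(size=(9, 2))
true_w = np.array([0.3, 0.7])
tracked = evaluate_model(neutral, offsets, true_w)
est_w = estimate_weights(neutral, offsets, tracked)
```

With noise-free synthetic tracking data that lies in the model's span, the least-squares fit recovers the generating weights exactly.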
27 Claims
1. A method for real-time facial animation, comprising:
providing a dynamic expression model;
receiving tracking data corresponding to a facial expression of a user;
estimating tracking parameters based on the dynamic expression model and the tracking data; and
refining the dynamic expression model based on the tracking data and the estimated tracking parameters to produce a refined dynamic expression model.
Dependent claims: 2-16.
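The refining step of claim 1 updates the dynamic expression model using the tracking data together with the already-estimated tracking parameters. One hypothetical way to sketch this (a single gradient step on the reconstruction error with respect to the blendshape offsets; a real system would use a more elaborate solver):

```python
import numpy as np

# Illustrative sketch of claim 1's "refining" step, not the patented method:
# nudge the blendshape offsets so the model better reproduces the tracked
# expression, holding the estimated weights fixed.

def refine_blendshapes(neutral, offsets, weights, tracked, step=0.1):
    residual = neutral + offsets @ weights - tracked  # model-vs-data error
    # Gradient of ||neutral + B w - tracked||^2 w.r.t. B is 2 * residual * w^T.
    return offsets - step * np.outer(residual, weights)

rng = np.random.default_rng(1)
neutral = np.zeros(6)
offsets = rng.normal(size=(6, 2))   # initial (unrefined) model
weights = np.array([0.4, 0.6])      # tracking parameters from estimation
tracked = rng.normal(size=6)        # observed expression (toy data)

before = np.linalg.norm(neutral + offsets @ weights - tracked)
refined = refine_blendshapes(neutral, offsets, weights, tracked)
after = np.linalg.norm(neutral + refined @ weights - tracked)
```

Each such step shrinks the reconstruction error by a factor of 1 - step * ||w||^2, so the refined model fits the observed expression strictly better.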
17. A processing device, comprising:
an input interface configured to receive tracking data corresponding to a facial expression of a user;
a memory configured to store a dynamic expression model; and
a processing component coupled to the input interface and the memory, wherein the processing component is configured to:
estimate tracking parameters based on the dynamic expression model and the tracking data, and
refine the dynamic expression model based on the tracking data and the estimated tracking parameters to produce a refined dynamic expression model.
Dependent claims: 18-25.
26. A real-time facial animation system, comprising:
a camera device, wherein the camera device is configured to generate tracking data corresponding to a facial expression of a user; and
a processing device, wherein the processing device comprises:
an input interface, wherein the input interface is coupled to the camera device and configured to receive the tracking data,
a memory, wherein the memory is configured to store a dynamic expression model, wherein the dynamic expression model includes a plurality of blendshapes, and
a processing component, wherein the processing component is coupled to the input interface and the memory,
wherein the processing component is configured to estimate a corresponding plurality of weights for the plurality of blendshapes of the dynamic expression model based on the tracking data,
wherein the processing component is configured to generate a graphical representation corresponding to the facial expression of the user based on the plurality of weights, and
wherein the processing component is configured to refine the dynamic expression model based on the tracking data and the plurality of weights for the blendshapes.
Dependent claim: 27.
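Claim 26 combines the pieces into a per-frame system loop: for each frame of tracking data from the camera, estimate blendshape weights, generate a graphical representation from those weights, and refine the model online. A minimal sketch of such a loop, under the same illustrative (non-patent) assumptions as above:

```python
import numpy as np

# Illustrative per-frame pipeline sketch corresponding to the system of
# claim 26; names and the solver are hypothetical placeholders.

def track_frames(frames, neutral, offsets, refine_step=0.05):
    """For each frame: estimate weights, reconstruct geometry, refine model."""
    outputs = []
    for tracked in frames:
        # Estimate blendshape weights from this frame's tracking data.
        w, *_ = np.linalg.lstsq(offsets, tracked - neutral, rcond=None)
        w = np.clip(w, 0.0, 1.0)
        # "Graphical representation": the reconstructed vertex positions.
        outputs.append(neutral + offsets @ w)
        # Online refinement of the model from this frame's data.
        residual = neutral + offsets @ w - tracked
        offsets = offsets - refine_step * np.outer(residual, w)
    return outputs, offsets

rng = np.random.default_rng(2)
neutral = np.zeros(6)
model = rng.normal(size=(6, 2))                 # initial blendshape offsets
frames = [rng.normal(size=6) for _ in range(4)]  # toy tracking data per frame
outputs, refined = track_frames(frames, neutral, model)
```

Refining inside the tracking loop is what makes the modeling "online": the model improves as more expressions of the user are observed, rather than requiring a separate calibration session.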
Specification