WO2002009040A1 - Method and system for generating an avatar animation transform using a neutral face image - Google Patents
- Publication number
- WO2002009040A1 WO2002009040A1 PCT/US2001/041397 US0141397W WO0209040A1 WO 2002009040 A1 WO2002009040 A1 WO 2002009040A1 US 0141397 W US0141397 W US 0141397W WO 0209040 A1 WO0209040 A1 WO 0209040A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- avatar
- generating
- head image
- animation transform
- transform
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
Definitions
- The present invention relates to avatar animation, and more particularly, to generation of an animation transform using a neutral face image.
- Virtual spaces filled with avatars are an attractive way to experience a shared environment.
- However, manual creation of a photo-realistic avatar is time consuming, and automated avatar creation is prone to artifacts and feature distortion.
- The present invention is embodied in a method, and related system, for generating an avatar animation transform using a neutral face image.
- The method may include providing a neutral-face front head image and a side head image for generating an avatar, and automatically finding head feature locations on the front head image and the side head image using elastic bunch graph matching. Nodes are automatically positioned at the feature locations on the front head image and the side head image. The node positions are manually reviewed and corrected to remove artifacts and minimize distorted features in the avatar generated based on the node positions.
- The method may further include generating an animation transform based on the corrected node positions for the neutral face.
- The method may also include applying the animation transform to facial expression avatar meshes for generating the avatar.
- FIG. 1 is a flow diagram for illustrating a method for generating an avatar animation transform using a neutral face image, according to the present invention.
- FIG. 2 is an image of an avatar editor for generating an avatar, according to the present invention.
- FIG. 3 is an image of a rear view of an avatar generated using anchor points provided by the avatar editor of FIG. 2.
- FIG. 4 is an image of an avatar editor for generating an avatar using anchor point positions corrected to remove artifacts and distortions from the avatar image, according to the present invention.
- FIG. 5 is an image of a rear view of an avatar generated using the corrected anchor point positions shown in FIG. 4, according to the present invention.
- FIG. 6 is a graph of facial expression features versus avatar mesh for linear regression mapping of sensed facial features to an avatar mesh.
- The present invention is embodied in a method, shown in FIG. 1, and a system for generating an animation transform using a neutral face image.
- An avatar editor uses a frontal head image and a side head image of a neutral face model for generating an avatar (block 12).
- The avatar is generated by automatically finding head feature locations on the front and side head images using elastic bunch graph matching (block 14). Locating features in an image using elastic bunch graph matching is described in U.S. patent application serial number 09/188,079.
- An image is transformed into Gabor space using a wavelet transformation based on Gabor wavelets.
- The transformed image is represented by complex wavelet component values associated with each pixel of the original image.
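- As an illustration of this wavelet transform, the following Python sketch convolves an image with a small family of complex Gabor kernels at several wavelengths and orientations, so that each pixel becomes associated with a vector of complex responses (a "jet"). This is only an assumption-laden outline, not the implementation of the referenced patent application; the use of NumPy/SciPy, the function names, and the kernel parameters are choices made here for illustration.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(size, wavelength, orientation, sigma):
    """Complex Gabor kernel: a plane-wave carrier under a Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    rotated = x * np.cos(orientation) + y * np.sin(orientation)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    carrier = np.exp(1j * 2.0 * np.pi * rotated / wavelength)
    return envelope * carrier

def gabor_jets(image, wavelengths=(4, 8, 16), n_orientations=8):
    """Convolve the image with every kernel; the jet at pixel (y, x) is the
    vector of complex responses jets[:, y, x]."""
    responses = []
    for wavelength in wavelengths:
        sigma = 0.5 * wavelength              # envelope width tied to the wavelength
        size = int(4 * sigma) | 1             # odd kernel size
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            kernel = gabor_kernel(size, wavelength, theta, sigma)
            responses.append(fftconvolve(image, kernel, mode="same"))
    return np.stack(responses)                # shape: (n_kernels, height, width)
```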
- Elastic bunch graph matching automatically places node graphs having anchor points on the front and side head images, respectively.
- The anchor points are placed at the general location of facial features found using the matching process (block 16).
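- A greatly simplified sketch of placing one node at a facial feature is shown below: the node is moved to the nearby pixel whose jet best matches any of a "bunch" of example jets taken from training faces. The elastic graph deformation and the exact similarity function of the cited application are not reproduced; the magnitude-only similarity, the local search window, and the function names are assumptions made for illustration.

```python
import numpy as np

def jet_similarity(jet_a, jet_b):
    """Normalized dot product of jet magnitudes (phase is ignored here)."""
    a, b = np.abs(jet_a), np.abs(jet_b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def place_node(image_jets, bunch, start, radius=10):
    """Search a small window around `start` for the pixel whose jet best
    matches any example jet in the bunch; image_jets has shape (n_kernels, H, W)."""
    _, height, width = image_jets.shape
    (y0, x0), best_pos, best_sim = start, start, -1.0
    for y in range(max(0, y0 - radius), min(height, y0 + radius + 1)):
        for x in range(max(0, x0 - radius), min(width, x0 + radius + 1)):
            sim = max(jet_similarity(image_jets[:, y, x], example) for example in bunch)
            if sim > best_sim:
                best_sim, best_pos = sim, (y, x)
    return best_pos
```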
- An avatar editor window 26, shown in FIG. 2, allows a user to generate an avatar that appears similar to a model.
- A new avatar 28 is generated based on the front head image 30 and the side head image 32 of the model.
- An existing avatar may be edited to the satisfaction of the user.
- The front and side images are mapped onto an avatar mesh.
- The avatar may be animated or driven by moving drive control points on the mesh. The motion of the drive control points may be directed by facial feature tracking.
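- One plausible way to realize this, sketched below, is to move each drive control point to its tracked feature position and let the remaining mesh vertices follow with distance-based weights. The Gaussian falloff, the function signature, and the weighting scheme are assumptions for illustration; the text does not specify this particular deformation.

```python
import numpy as np

def deform_mesh(vertices, control_indices, tracked_positions, falloff=30.0):
    """Move the drive control points to the tracked feature positions and let
    every other vertex follow with Gaussian distance weighting."""
    vertices = np.asarray(vertices, dtype=float)          # (n_vertices, 3)
    controls = vertices[control_indices]                  # current drive control points
    deltas = np.asarray(tracked_positions, dtype=float) - controls
    deformed = vertices.copy()
    for i, v in enumerate(vertices):
        d = np.linalg.norm(controls - v, axis=1)          # distance to each drive point
        w = np.exp(-(d / falloff) ** 2)
        if w.sum() > 1e-9:
            deformed[i] = v + (w[:, None] * deltas).sum(axis=0) / w.sum()
    return deformed
```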
- The avatar editor window 26 includes a wizard (not shown) that leads the user through a sequence of steps allowing the user to improve the tracking accuracy of an avatar tracker.
- The avatar wizard may include a tutor face that prompts the user to make a number of expressions and varying head poses. An image is taken for each expression or pose, and facial features are automatically located for each face image. However, certain artifacts in the image may cause the feature-finding process to place feature nodes at erroneous locations. In addition, correctly located nodes may still produce artifacts that detract from a photo-realistic avatar. Accordingly, the user has the opportunity to manually correct the positions of the automatically located features (block 18).
- The front and side head images, 30 and 32, shown in FIG. 2 have a shadow outline that is erroneously detected as the profile outline of the side head image 32.
- Certain features, such as the model's ears, have numerous patterns that may cause erroneous node placement.
- The avatar 28 may have artificial eye and teeth inserts that are "exposed" while the eyes and/or the mouth are open. Accordingly, although the matching process is able to correctly locate the nodes, the resulting avatar may have distracting features.
- Empirical adjustment of the node locations may result in a more photo-realistic avatar.
- A rear view of the avatar 28, shown in FIG. 3, is generated using the node locations shown in the avatar editor window 26 of FIG. 2.
- A particularly distracting artifact is a white patch 34 on the rear of the head.
- The white patch appears because the automatically placed node locations cause a portion of the white background of the side head image 32 to be patched onto the rear of the avatar.
- The incorrectly placed nodes may be manually adjusted, as shown in FIG. 4, for more accurate placement of the nodes at the corresponding features.
- Generic head models 36 and 38 have the node locations indicated so that a user may correctly place the node locations on the front and side head images.
- A node is moved by clicking on the node with a pointing device, such as a mouse, and dragging the node to the desired position.
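- In editor terms, this interaction amounts to picking the node nearest the pointer click and replacing its position when the drag ends, as in the minimal sketch below; the function names and the pick radius are hypothetical.

```python
import numpy as np

def pick_node(node_positions, click, max_dist=8.0):
    """Return the index of the node nearest the pointer click, or None."""
    nodes = np.asarray(node_positions, dtype=float)
    d = np.linalg.norm(nodes - np.asarray(click, dtype=float), axis=1)
    nearest = int(d.argmin())
    return nearest if d[nearest] <= max_dist else None

def drag_node(node_positions, index, release_position):
    """Replace the selected node's position with the drag-release position."""
    corrected = np.array(node_positions, dtype=float)
    corrected[index] = release_position
    return corrected
```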
- The avatar based on the corrected node positions is more photo-realistic.
- The node locations at the back of the head on the side head image are adjusted to eliminate the distracting white patch, as shown in FIG. 5.
- The model images shown in FIGS. 2-5 are of a neutral face.
- Using several avatar meshes corresponding to a variety of facial expressions allows for more accurate depiction of sensed facial expressions.
- Meshes for different expressions may be referred to as morph targets.
- One avatar mesh M_SMILE may be generated using features f_SMILE from smiling face images.
- Another avatar mesh M_EXCL may be generated using facial features f_EXCL from face images showing surprise or exclamation.
- The neutral facial features f_NEUTRAL correspond to the avatar mesh M_NEUTRAL.
- Sensed facial features f_SENSED may be mapped to a corresponding avatar mesh M_SENSED using linear regression.
- A photo-realistic avatar may require as many as 14 to 18 expression-based avatar meshes.
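- The linear regression mapping of FIG. 6 can be read as a least-squares blend: solve for weights that express the sensed features as a combination of the stored expression feature sets, then apply the same weights to the corresponding avatar meshes. The sketch below is a minimal interpretation under that assumption; the function name and the least-squares formulation are not taken from the patent text.

```python
import numpy as np

def blend_avatar_mesh(f_sensed, feature_sets, meshes):
    """Solve F @ w ~= f_sensed by least squares, then blend the expression
    meshes with the same weights to obtain M_SENSED."""
    F = np.column_stack([np.ravel(f) for f in feature_sets])    # one column per expression
    w, *_ = np.linalg.lstsq(F, np.ravel(f_sensed), rcond=None)  # regression weights
    M = np.stack([np.asarray(m, dtype=float) for m in meshes])  # (n_expr, n_vertices, 3)
    return np.tensordot(w, M, axes=1)                           # weighted sum of meshes
```

- For example, with three expressions one might call blend_avatar_mesh(f_sensed, [f_neutral, f_smile, f_excl], [m_neutral, m_smile, m_excl]) to obtain a mesh corresponding to the sensed expression.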
- The animation transform for the neutral face features may be applied to the other facial expression avatar meshes to improve the quality of the resulting avatars (block 22).
- The avatar mesh associated with a smile may be transformed by the neutral-face animation transform P, as indicated in Equation 2.
- M'_SMILE = P · M_SMILE    (Equation 2)
- The neutral face-based animation transform provides significant improvement to the facial expression head models without the significant editing time that would be incurred by generating a separate animation transform for each particular facial expression (and/or pose).
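- A minimal sketch of applying Equation 2 to a set of expression meshes follows, assuming the neutral-face animation transform P can be represented as a 4x4 matrix acting on homogeneous vertex coordinates; the concrete form of P is not stated in this text, so that representation and the function name are assumptions.

```python
import numpy as np

def apply_neutral_transform(P, expression_meshes):
    """Apply the neutral-face animation transform P to each expression mesh,
    i.e. M'_EXPR = P . M_EXPR (Equation 2), with P assumed to be a 4x4 matrix
    acting on homogeneous vertex coordinates."""
    transformed = {}
    for name, mesh in expression_meshes.items():                 # mesh: (n_vertices, 3)
        mesh = np.asarray(mesh, dtype=float)
        homogeneous = np.hstack([mesh, np.ones((mesh.shape[0], 1))])
        transformed[name] = (homogeneous @ P.T)[:, :3]           # back to 3-D coordinates
    return transformed
```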
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Processing Or Creating Images (AREA)
- Image Analysis (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002514667A JP2004509391A (en) | 2000-07-24 | 2001-07-24 | Avatar video conversion method and device using expressionless facial image |
AU2001281335A AU2001281335A1 (en) | 2000-07-24 | 2001-07-24 | Method and system for generating an avatar animation transform using a neutral face image |
KR10-2003-7001108A KR20030029638A (en) | 2000-07-24 | 2001-07-24 | Method and system for generating an avatar animation transform using a neutral face image |
EP01959816A EP1436781A1 (en) | 2000-07-24 | 2001-07-24 | Method and system for generating an avatar animation transform using a neutral face image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US22033000P | 2000-07-24 | 2000-07-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2002009040A1 true WO2002009040A1 (en) | 2002-01-31 |
Family
ID=22823129
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2001/041397 WO2002009040A1 (en) | 2000-07-24 | 2001-07-24 | Method and system for generating an avatar animation transform using a neutral face image |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP1436781A1 (en) |
JP (1) | JP2004509391A (en) |
KR (1) | KR20030029638A (en) |
AU (1) | AU2001281335A1 (en) |
WO (1) | WO2002009040A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102565755B1 (en) | 2018-02-23 | 2023-08-11 | 삼성전자주식회사 | Electronic device for displaying an avatar performed a motion according to a movement of a feature point of a face and method of operating the same |
-
2001
- 2001-07-24 WO PCT/US2001/041397 patent/WO2002009040A1/en active Search and Examination
- 2001-07-24 AU AU2001281335A patent/AU2001281335A1/en not_active Abandoned
- 2001-07-24 EP EP01959816A patent/EP1436781A1/en not_active Withdrawn
- 2001-07-24 JP JP2002514667A patent/JP2004509391A/en not_active Withdrawn
- 2001-07-24 KR KR10-2003-7001108A patent/KR20030029638A/en not_active Application Discontinuation
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1998001830A1 (en) * | 1996-07-05 | 1998-01-15 | British Telecommunications Public Limited Company | Image processing |
WO1999053443A1 (en) * | 1998-04-13 | 1999-10-21 | Eyematic Interfaces, Inc. | Wavelet-based facial motion capture for avatar animation |
Non-Patent Citations (2)
Title |
---|
LEE Y ET AL: "CONSTRUCTING PHYSICS-BASED FACIAL MODELS OF INDIVIDUALS", PROCEEDINGS/COMPTE RENDU GRAPHICS INTERFACE, XX, XX, 1993, pages 1 - 8, XP002064612 * |
YUENCHENG LEE ET AL: "REALISTIC MODELING FOR FACIAL ANIMATION", COMPUTER GRAPHICS PROCEEDINGS. LOS ANGELES, AUG. 6 - 11, 1995, COMPUTER GRAPHICS PROCEEDINGS (SIGGRAPH), NEW YORK, IEEE, US, 6 August 1995 (1995-08-06), pages 55 - 62, XP000546216, ISBN: 0-89791-701-4 * |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007059077A2 (en) | 2005-11-14 | 2007-05-24 | E.I.Du Pont De Nemours And Company | Compositions and methods for altering alpha- and beta-tocotrienol content |
WO2008147935A2 (en) | 2007-05-24 | 2008-12-04 | E. I. Du Pont De Nemours And Company | Dgat genes from yarrowia lipolytica for increased seed storage lipid production and altered fatty acid profiles in soybean |
EP2730656A1 (en) | 2007-05-24 | 2014-05-14 | E. I. du Pont de Nemours and Company | Soybean meal from beans having DGAT genes from yarrowia lipolytica for increased seed storage lipid production and altered fatty acid profiles in soybean |
EP2620502A2 (en) | 2008-05-23 | 2013-07-31 | E. I. du Pont de Nemours and Company | DGAT genes from oleaginous organisms for increased seed storage lipid production and altered fatty acid profiles in oilseed plants |
EP2620500A2 (en) | 2008-05-23 | 2013-07-31 | E. I. du Pont de Nemours and Company | DGAT genes from oleaginous organisms for increased seed storage lipid production and altered fatty acid profiles in oilseed plants |
EP2620501A2 (en) | 2008-05-23 | 2013-07-31 | E. I. du Pont de Nemours and Company | DGAT genes from oleaginous organisms for increased seed storage lipid production and altered fatty acid profiles in oilseed plants |
WO2011062748A1 (en) | 2009-11-23 | 2011-05-26 | E.I. Du Pont De Nemours And Company | Sucrose transporter genes for increasing plant seed lipids |
US11303850B2 (en) | 2012-04-09 | 2022-04-12 | Intel Corporation | Communication using interactive avatars |
US11595617B2 (en) | 2012-04-09 | 2023-02-28 | Intel Corporation | Communication using interactive avatars |
US11295502B2 (en) | 2014-12-23 | 2022-04-05 | Intel Corporation | Augmented facial animation |
US9830728B2 (en) | 2014-12-23 | 2017-11-28 | Intel Corporation | Augmented facial animation |
US10540800B2 (en) | 2014-12-23 | 2020-01-21 | Intel Corporation | Facial gesture driven animation of non-facial features |
US9799133B2 (en) | 2014-12-23 | 2017-10-24 | Intel Corporation | Facial gesture driven animation of non-facial features |
WO2016101131A1 (en) * | 2014-12-23 | 2016-06-30 | Intel Corporation | Augmented facial animation |
US9824502B2 (en) | 2014-12-23 | 2017-11-21 | Intel Corporation | Sketch selection for rendering 3D model avatar |
US11887231B2 (en) | 2015-12-18 | 2024-01-30 | Tahoe Research, Ltd. | Avatar animation system |
WO2019177870A1 (en) * | 2018-03-15 | 2019-09-19 | Magic Leap, Inc. | Animating virtual avatar facial movements |
US11430169B2 (en) | 2018-03-15 | 2022-08-30 | Magic Leap, Inc. | Animating virtual avatar facial movements |
US12210666B2 (en) | 2018-03-15 | 2025-01-28 | Magic Leap, Inc. | Animating virtual avatar facial movements |
US11069115B2 (en) | 2019-02-20 | 2021-07-20 | Samsung Electronics Co., Ltd. | Method of controlling display of avatar and electronic device therefor |
WO2025038916A1 (en) * | 2023-08-16 | 2025-02-20 | Roblox Corporation | Automatic personalized avatar generation from 2d images |
Also Published As
Publication number | Publication date |
---|---|
JP2004509391A (en) | 2004-03-25 |
KR20030029638A (en) | 2003-04-14 |
AU2001281335A1 (en) | 2002-02-05 |
EP1436781A1 (en) | 2004-07-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20020067362A1 (en) | Method and system generating an avatar animation transform using a neutral face image | |
US20210045701A1 (en) | Simulated orthodontic treatment via augmented visualization in real-time | |
EP2043049B1 (en) | Facial animation using motion capture data | |
US7733346B2 (en) | FACS solving in motion capture | |
EP1436781A1 (en) | Method and system for generating an avatar animation transform using a neutral face image | |
US8659596B2 (en) | Real time generation of animation-ready 3D character models | |
US6714661B2 (en) | Method and system for customizing facial feature tracking using precise landmark finding on a neutral face image | |
US20150193975A1 (en) | Real Time Concurrent Design of Shape, Texture, and Motion for 3D Character Animation | |
US7567251B2 (en) | Techniques for creating facial animation using a face mesh | |
EP2718903B1 (en) | Controlling objects in a virtual environment | |
WO2010060113A1 (en) | Real time generation of animation-ready 3d character models | |
US7961947B2 (en) | FACS cleaning in motion capture | |
EP2615583B1 (en) | Method and arrangement for 3D model morphing | |
CN107230250B (en) | Forming method for direct three-dimensional modeling by referring to solid specimen | |
EP2047429B1 (en) | Facs solving in motion capture | |
GB2632743A (en) | Techniques for re-aging faces in images and video frames | |
CN114387392B (en) | Method for reconstructing three-dimensional human body posture according to human shadow | |
CN118864736A (en) | Method and device for molding oral prosthesis model | |
AU2001277148B2 (en) | Method and system for customizing facial feature tracking using precise landmark finding on a neutral face image | |
US20240402827A1 (en) | Method for controlling at least one characteristic of a controllable object, a related system and related device | |
JP2022152058A (en) | Information processing system, information processing method and information processing program | |
JP2002157605A (en) | Device and method for image processing, and recording medium with recorded program for image processing | |
Radford | Virtual history: the secret plot to kill Hitler | |
WO2024103190A1 (en) | Method and system for image processing across multiple frames using machine learning | |
CN116188318A (en) | Face side face beautifying method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2001281335 Country of ref document: AU |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020037001108 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2001959816 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1020037001108 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2001959816 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2001959816 Country of ref document: EP |
|
DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) |