US20090177976A1 - Managing and presenting avatar mood effects in a virtual world - Google Patents
- Publication number: US20090177976A1
- Application number: US 11/971,508
- Authority: US (United States)
- Prior art keywords: mood, avatar, user, virtual world, effect
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
Abstract
A method for managing and presenting avatar mood effects in a virtual world may include allowing a mood effect to be associated with a user's avatar in the virtual world from a plurality of predefined mood effects. The method may also include presenting the associated mood effect to other users of the virtual world.
Description
- The present invention relates to virtual worlds, such as simulations of the real-world or real-life, and the like, and more particularly to managing and presenting avatar mood effects in a virtual world.
- Computer based simulations are becoming more ubiquitous. Simulations may be used for training purposes, for entertainment, for commerce or for other purposes. Computer simulations such as Second Life or similar simulations present a virtual world which allows users or players to be represented by characters known as avatars. Second Life is an Internet-based virtual world launched in 2003 by Linden Research, Inc. A downloadable client program called the Second Life Viewer enables users, called “Residents”, to interact with others in the virtual world through motional avatars. The virtual world basically simulates the real world or environment. The users or residents, via their avatars, can explore the virtual world, meet other users or residents, socialize, participate in individual and group activities, and create and trade items (virtual property) and services with one another. The challenge with respect to such simulations or virtual worlds is to make them as realistic or as much like the real-world or real-life as possible. This increases the utility of such simulations as a training tool and the enjoyment of participants or users of them as an entertainment medium. Current virtual worlds enable only certain limited capabilities for simulating real-world interactions, such as personalization of avatars based on clothing, facial features and physique. More engaging experiences, such as moods or emotions, are typically not taken into account. For example, how moods are defined and affect personal features, such as dress, facial expressions or other features, and personal interactions is lacking. Second Life is a trademark of Linden Research, Inc. in the United States, other countries or both.
- In accordance with an embodiment of the present invention, a method for managing and presenting avatar moods or mood effects in a virtual world may include allowing a mood or mood effect to be associated with a user's avatar in the virtual world from a plurality of predefined moods or mood effects. The method may also include presenting the associated mood or mood effect to other users of the virtual world.
- In accordance with another embodiment of the present invention, a method for managing and presenting avatar moods or mood effects in a virtual world may include profiling a set of mood or mood effect changes to portray different real-world emotions and moods by a user's avatar to other users in the virtual world. The method may also include defining a predetermined action based on each mood or mood effect change, the predetermined action to be performed by the user's avatar in the virtual world in response to the user's avatar being associated with a mood corresponding to the mood effect change. The method may further include triggering the change in the mood or mood effect in response to occurrence of a predetermined event.
- In accordance with another embodiment of the present invention, a method for managing and presenting a mood of a user's avatar in a virtual world may include presenting a mood or mood effect of the user's avatar to other users in the virtual world. Presenting the mood or mood effect may include presenting a predefined script in association with the user's avatar to other users in the virtual world to indicate the mood of the user's avatar. Presenting the mood or mood effect may also include performing a predefined action by the user's avatar to indicate the mood of the user's avatar. Presenting the mood or mood effect may additionally include presenting the user's avatar with a predetermined appearance to indicate the mood of the user's avatar.
- Other aspects and features of the present invention, as defined solely by the claims, will become apparent to those ordinarily skilled in the art upon review of the following non-limiting detailed description of the invention in conjunction with the accompanying figures.
- FIG. 1 is a flow chart of an example of a method for managing and presenting mood effects in a virtual world in accordance with an embodiment of the present invention.
- FIG. 2 is an example of a graphical user interface for defining or editing mood effects in a virtual world in accordance with an embodiment of the present invention.
- FIG. 3 is an illustration of a tagging mechanism or visual identifier to present a mood effect of a user's avatar in the virtual world in accordance with an embodiment of the present invention.
- FIG. 4 is a diagram of an example of a system for defining, managing and presenting mood effects in a virtual world in accordance with an embodiment of the present invention.
- FIG. 5 is a block schematic diagram of an example of a system for managing and presenting mood effects in a virtual world in accordance with an embodiment of the present invention.
- The following detailed description of embodiments refers to the accompanying drawings, which illustrate specific embodiments of the invention. Other embodiments having different structures and operations do not depart from the scope of the present invention.
- As will be appreciated by one of skill in the art, the present invention may be embodied as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, the present invention may take the form of a computer program product on a computer-usable storage medium having computer-usable program code embodied in the medium.
- Any suitable computer usable or computer readable medium may be utilized. The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires, a tangible medium such as a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), or other tangible optical or magnetic storage device; or transmission media such as those supporting the Internet or an intranet. Note that the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave. The computer usable program code may be transmitted using any appropriate medium, including but not limited to the Internet, wireline, optical fiber cable, radio frequency (RF) or other means.
- Computer program code for carrying out operations of the present invention may be written in an object oriented programming language such as Java, Smalltalk, C++ or the like. However, the computer program code for carrying out operations of the present invention may also be written in conventional procedural programming languages, such as the “C” programming language or similar programming languages, or in functional programming languages, such as Haskell, Standard Meta Language (SML) or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- The present invention is described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
- The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- FIG. 1 is a flow chart of an example of a method 100 for managing and presenting moods or mood effects in a virtual world in accordance with an embodiment of the present invention. The terms mood, mood effects, emotions or similar terms may be used interchangeably herein to describe a state of mind, feeling, mood, emotion or the like of a user, a user's avatar in the virtual world or both, and to represent or convey this mood, emotion, state of mind or the like to other users in the virtual world. In block 102, a graphical user interface (GUI) may be presented to define or edit a mood effect or mood effects. The mood effects GUI may be presented in response to a user selecting a feature, such as a menu, icon or other mechanism in a virtual world being presented on a display of the user's computer system. The GUI may be presented by the user selecting or clicking on the menu, icon or other mechanism using a computer pointing device or similar device.
- Referring also to FIG. 2, FIG. 2 is an example of a GUI 200 for defining or editing mood effects in a virtual world in accordance with an embodiment of the present invention. As illustrated in FIG. 2, each mood 201 or mood effect may be defined as a profile 202, such as “happy” 204 or “sad” 206. These profiles 202 may become extremely complex, similar to human moods and emotions. Examples of more complex profiles may include “very happy”, “best day of my life”, “don't talk to me”, “bad” or any other moods or emotions. The profiles may define certain actions based on these moods. For example, as illustrated in FIG. 2, an action that may be associated with the happy profile 204 may be an avatar performing a wave action 208. An example of an action that may be associated with the sad profile 206 may be utterance of a grunt 210 by an avatar.
- As illustrated in block 102, mood effects may be attributes standardized by a virtual world owner or operator. The standardized mood effect attributes may be shared by other avatars in the virtual world. As described herein, the standardized moods or mood effect attributes may be stored on a system or server to permit the mood effect attributes to be shared or associated with other avatars. In accordance with an embodiment of the present invention, users or participants in the virtual system may query or request the mood effect attributes of another avatar. Responses to such queries or requests may assist in automated responses, provide a better understanding of the person or avatar with which a user's avatar may be engaged in the virtual world and facilitate interactions between users and users' avatars in the virtual world.
- In block 104, a profile defining each mood or mood effect may be received by a virtual world system or stored on a user's computer system for managing and presenting mood effects. An identification or characterization of the mood effect may be associated with each profile, such as happy, sad, angry or similar moods or emotions. As previously discussed, the profiles can be relatively complex and may include different human moods and emotions and may include different levels for each mood or emotion.
- In block 106, any scripts, gestures, actions, or appearances of an avatar, or other attributes, may be received and stored by a system to be associated with each mood effect as part of the mood effect profile to define the mood effect. The GUI presented to define each mood effect or mood effect profile in block 102 may permit a script to be entered and stored in association with each mood effect. The script may be a visual or verbal utterance or other form of communication. The GUI may also permit an action, such as a gesture or other action, to be entered and stored in association with each mood effect. A particular action may be performed by a user's avatar while the mood effect corresponding with the particular action is associated with the user's avatar. The GUI may also permit an appearance of a user's avatar to be associated with each mood effect. For example, clothing worn by the user's avatar may be different depending upon a mood of the user's avatar.
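- As an illustration of the kind of profile block 106 describes, the sketch below groups a script, a gesture or action, and an appearance attribute under a named mood effect. It is a minimal, hypothetical example; the class name, field names and sample values are assumptions made for illustration and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class MoodEffectProfile:
    """Hypothetical container for the attributes associated with a mood effect."""
    name: str                         # identification of the mood, e.g. "happy" or "sad"
    script: Optional[str] = None      # visual or verbal utterance presented with the avatar
    gesture: Optional[str] = None     # action performed while this mood is associated
    appearance: Optional[str] = None  # e.g. clothing worn while in this mood


# Example profiles loosely mirroring FIG. 2: a happy profile whose action is a wave,
# and a sad profile whose associated action is the utterance of a grunt.
HAPPY = MoodEffectProfile(name="happy", gesture="wave",
                          appearance="bright colored clothing")
SAD = MoodEffectProfile(name="sad", script="(grunt)",
                        appearance="gray colored clothing")

if __name__ == "__main__":
    for profile in (HAPPY, SAD):
        print(profile)
```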
- In block 108, any action or actions received and stored by the virtual world system or user's computer system in association with each mood or mood effect may be configured to be triggered by occurrence of a predetermined event, such as a virtual world event or other occurrence. For example, an action may be performed by a user's avatar in response to entering an event or location, leaving an event or location, entering a mood, or leaving a mood. A particular action may also be performed by the user's avatar in response to the user's avatar coming into contact with another avatar with predetermined matched or correlated rules. For example, each of the avatars may be in a particular mood, have a particular company or organization affiliation in the virtual world, have the same or similar virtual world experiences, or have other characteristics relative to one another that prompt a predefined action. The action may be triggered in response to the avatars coming within a predetermined proximity range of one another in the virtual world.
- In block 110, a user may be allowed to select a mood effect from a plurality of predefined mood effects to be associated with the user's avatar in the virtual world. Each of the plurality of mood effects may be defined as previously discussed. The mood effects may be defined by the user or the mood effects may be standardized by the virtual world system operator or owner. The mood effect may be selected from a menu or dropdown list using a computer pointing device, or may be selected by some other mechanism known in relation to virtual world systems, simulations or other computer applications.
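- The event-driven triggering of block 108 might reduce to a small rule table consulted when something happens in the virtual world, as in this sketch. The event names, rule fields, proximity value and helper functions are hypothetical assumptions, not part of the claimed method.

```python
from dataclasses import dataclass

PROXIMITY_RANGE = 5.0  # assumed predetermined proximity range, in virtual-world units


@dataclass
class TriggerRule:
    event: str   # e.g. "enter_location", "leave_location", "enter_mood", "proximity"
    mood: str    # mood the rule applies to
    action: str  # predefined action the avatar should perform


RULES = [
    TriggerRule(event="enter_mood", mood="happy", action="wave"),
    TriggerRule(event="proximity", mood="sad", action="grunt"),
]


def actions_for(event: str, mood: str) -> list:
    """Return the predefined actions configured for this event and mood."""
    return [rule.action for rule in RULES if rule.event == event and rule.mood == mood]


def on_avatars_near(distance: float, mood: str) -> list:
    """Trigger proximity rules when two avatars come within the predetermined range."""
    return actions_for("proximity", mood) if distance <= PROXIMITY_RANGE else []


if __name__ == "__main__":
    print(actions_for("enter_mood", "happy"))  # ['wave']
    print(on_avatars_near(3.2, "sad"))         # ['grunt']
```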
- In accordance with another embodiment of the present invention, the mood effect may be manually set based on inputs from the user or selection of criteria by the user. The mood effect may be tagged to the avatar by a tagging mechanism or visual identifier to present the mood or mood effect of the user's avatar to users of other avatars in the virtual world. Referring also to FIG. 3, FIG. 3 is an illustration of a tagging mechanism 300 or visual identifier to present a mood effect 302 or 304 of a user's avatar 306 or 308 in the virtual world 310 in accordance with an embodiment of the present invention. As illustrated in FIG. 3, a tag such as a smiley face 312 may represent a happy mood effect 302 and a frown 314 may represent a sad mood effect 304. As previously discussed, a script, such as scripts 316 and 318, may also be used to represent the mood or mood effect, either alone or in addition to any other representations or expressions of an avatar's mood or mood effects. Each of the scripts 316 and 318 may be visually presented in a balloon or similar means, as illustrated in FIG. 3, or the script 316 or 318 may be audibly presented and actually spoken by the corresponding avatar via speakers of a computer system, or the scripts 316 and 318 may be both visually and audibly presented.
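- The tagging mechanism of FIG. 3 could be approximated as below: a mood tag is stored per avatar and rendered as a visual identifier (a smiley face or a frown) together with an optional script balloon. The data structures, function names and sample text are assumptions for illustration only.

```python
from typing import Optional

# Hypothetical visual identifiers corresponding to the tags shown in FIG. 3.
VISUAL_TAGS = {"happy": "\N{WHITE SMILING FACE}", "sad": "\N{WHITE FROWNING FACE}"}

# Current mood effect tagged to each avatar, keyed by avatar name.
avatar_mood_tags = {}


def tag_mood(avatar: str, mood: str) -> None:
    """Tag a mood effect to an avatar so it can be presented to other users."""
    avatar_mood_tags[avatar] = mood


def present_mood(avatar: str, script: Optional[str] = None) -> str:
    """Build the presentation other users would see: tag plus an optional script balloon."""
    mood = avatar_mood_tags.get(avatar, "neutral")
    tag = VISUAL_TAGS.get(mood, "?")
    balloon = f' says: "{script}"' if script else ""
    return f"{avatar} [{tag} {mood}]{balloon}"


if __name__ == "__main__":
    tag_mood("avatar_306", "happy")
    print(present_mood("avatar_306", script="What a great day!"))
```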
- Referring back to FIG. 1, in block 112, in accordance with an embodiment of the present invention, an indication of the mood or mood effect may be presented to other users in the virtual world in response to a predetermined virtual world event, the user's avatar entering a predetermined location, or other stimuli. A user may be requested to indicate or select a mood or mood effect for the user's avatar when logging into the virtual world. Other users may also submit queries to determine a mood of other avatars and/or to obtain more detailed information about the other avatar's mood and possible reasons for the other avatar's mood. The information may be presented to the user in response to the query. This may provide a better understanding of the person or avatar with whom the user is engaging in the virtual world.
- In block 114, a mood effect associated with the user's avatar may be changed in response to an input of some criteria corresponding to the mood effect or selection of a different mood effect, similar to that previously described. A change in the mood effect may also be triggered by an occurrence, an event, an entry into a location in the virtual world by the user's avatar or other stimuli similar to that previously discussed. An example of a method and system for automated avatar moods in a virtual world is described in U.S. patent application Ser. No. ______ (IBM Docket No. RSW920070211US1), entitled “Automated Avatar Moods in a Virtual World”, by Steven K. Speicher et al., which is assigned to the same assignee as the present invention and is incorporated herein by reference.
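- The query described for block 112 above might be answered from a per-avatar record of the current mood and an optional reason, roughly as sketched here; the record fields and function name are assumptions rather than anything defined by the patent.

```python
from typing import Optional

# Hypothetical per-avatar mood records, each with an optional reason for the mood.
MOOD_RECORDS = {
    "avatar_306": {"mood": "happy", "reason": "just completed a virtual-property trade"},
    "avatar_308": {"mood": "sad", "reason": None},
}


def query_mood(avatar: str, detailed: bool = False) -> Optional[str]:
    """Answer another user's query about an avatar's mood.

    Returns just the mood, or a more detailed description when requested and a
    reason has been recorded; returns None for unknown avatars.
    """
    record = MOOD_RECORDS.get(avatar)
    if record is None:
        return None
    if detailed and record["reason"]:
        return f"{record['mood']} ({record['reason']})"
    return record["mood"]


if __name__ == "__main__":
    print(query_mood("avatar_306"))                 # happy
    print(query_mood("avatar_306", detailed=True))  # happy (just completed a ...)
```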
- In block 116, changes in mood effects may be mined or recorded and tracked along with any actions taken by avatars in response to a mood effect or change in mood effect. This data may be analyzed by the virtual world owner or operator or other entities providing services in the virtual world for market data intelligence or for other purposes.
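- Mining and tracking mood changes, as in block 116, could start with nothing more than structured log records that are summarized later; the in-memory log and field names below are assumptions for illustration.

```python
from collections import Counter
from datetime import datetime, timezone

mood_change_log = []


def record_mood_change(avatar: str, old_mood: str, new_mood: str, action: str = "") -> None:
    """Record a mood change and any action taken in response, for later analysis."""
    mood_change_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "avatar": avatar,
        "old_mood": old_mood,
        "new_mood": new_mood,
        "action": action,
    })


def transition_counts() -> Counter:
    """A simple mining step: count how often each mood transition occurs."""
    return Counter((entry["old_mood"], entry["new_mood"]) for entry in mood_change_log)


if __name__ == "__main__":
    record_mood_change("avatar_306", "sad", "happy", action="wave")
    record_mood_change("avatar_308", "happy", "sad")
    print(transition_counts())
```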
- FIG. 4 is an example of a diagram of a system 400 for defining, managing and presenting mood effects in a virtual world in accordance with an embodiment of the present invention. The method 100 may be embodied in and performed by the system 400. The system 400 may include a define mood component 402 or module to allow a user or someone else, such as a virtual world operator and/or owner, to define a mood or plurality of moods or mood effects that may be associated with or tagged to the user's avatar 404 in the virtual world 406. Similar to that previously discussed, standardized moods or mood effects may be defined by a virtual world owner or operator and may be shared and tagged to avatars of other users in the virtual world.
- The moods or mood effects can be defined as simple profiles, such as a “Happy Avatar” 408 and a “Sad Avatar” 410, either of which may be associated with, or tagged to, the user's avatar 404 to identify the avatar's mood, visually, audibly or both, to other users in the virtual world 406. The avatar mood or mood effect may be tagged to the avatar 404 using a tagging mechanism, visual identifier or other mechanism. The tagging mechanism or visual identifier may be any such mechanism used in virtual worlds or simulations to associate information or attributes with an avatar and to present such information to other users or participants in the virtual world.
- The system 400 may also include a change mood component 412 or module to cause changes in moods or mood effects associated with or tagged to an avatar. As previously discussed, the change in moods may be in response to some event or condition or actions by the avatar's user, such as selecting a different mood or mood effect from a plurality of moods.
- The system 400 may also include subsystems for defining the mood and changes in the mood or mood effects. For example, the system 400 may include a scripts subsystem 414, a gesture or action subsystem 416, and an appearance subsystem 418. The scripts subsystem 414 may permit a script to be entered and associated with a mood or mood effect in defining the mood or mood effect. The scripts subsystem 414 may also control presentation of the script in association with an avatar tagged with the mood corresponding to the script. The script may be words or sounds, such as grunts, groans, a laugh or other utterances, which may be spoken or expressed by an avatar that has been tagged with the particular mood or mood effect. The script may be presented in visual or audible form. For example, the script may be presented in a balloon, similar to balloons 316 and 318 illustrated in FIG. 3, or the script may actually be spoken by the avatar and presented through speakers of a user's computer system.
- The gesture or action subsystem 416 may permit a specific gesture or action to be entered and associated with a mood or mood effect in defining the mood or mood effect. The gesture or action subsystem 416 may also control performance of the gesture or action by an avatar tagged with the mood corresponding to the gesture or action. Examples of avatar actions or gestures that may be associated with a mood or mood effect may include the avatar's head being down and/or shoulders slumped forward to indicate a sad mood, clenched fists to indicate an angry mood, arms held overhead to indicate a happy mood, the avatar jumping up and down to express a happy mood, other movements of the arms, legs or other body parts or body language that may connote a particular mood or emotion, or any other actions or gestures that may express a particular type of mood or emotion.
- The appearance subsystem 418 may permit a specific appearance of an avatar to be entered and associated with a mood or mood effect in defining the mood or mood effect. The appearance subsystem 418 may also control the appearance of an avatar tagged with the particular mood corresponding to the appearance. Examples of avatar appearances that may express a mood or emotion may include avatar facial expressions, bright colored clothing to express a happy mood, dark, black or gray colored clothing in association with a sad mood, or any other visual effects associated with the appearance of an avatar in the virtual world that may suggest a mood of the avatar or the user associated with the avatar.
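- Taken together, the scripts subsystem 414, the gesture or action subsystem 416 and the appearance subsystem 418 might be coordinated roughly as sketched below when a mood is applied to a tagged avatar. The class names and the print-based presentation are placeholders for whatever rendering a real virtual world client performs.

```python
class ScriptsSubsystem:
    def present(self, avatar: str, script: str) -> None:
        # A real client could show the script in a balloon or speak it through speakers.
        print(f'{avatar} says: "{script}"')


class GestureSubsystem:
    def perform(self, avatar: str, gesture: str) -> None:
        print(f"{avatar} performs gesture: {gesture}")


class AppearanceSubsystem:
    def apply(self, avatar: str, appearance: str) -> None:
        print(f"{avatar} now appears with: {appearance}")


def apply_mood(avatar: str, mood: dict) -> None:
    """Dispatch the attributes of a tagged mood to the three subsystems."""
    if mood.get("script"):
        ScriptsSubsystem().present(avatar, mood["script"])
    if mood.get("gesture"):
        GestureSubsystem().perform(avatar, mood["gesture"])
    if mood.get("appearance"):
        AppearanceSubsystem().apply(avatar, mood["appearance"])


if __name__ == "__main__":
    happy = {"script": "Best day of my life!", "gesture": "arms held overhead",
             "appearance": "bright colored clothing"}
    apply_mood("avatar_404", happy)
```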
- FIG. 5 is a block schematic diagram of another example of a system 500 for managing and presenting mood effects in a virtual world in accordance with an embodiment of the present invention. The method 100 of FIG. 1 may be embodied in or performed by the system 500. The system 500 may include a module 502 for managing and presenting moods or mood effects. The module 502 may be stored on a file system 504 of an exemplary computer system 506. The exemplary computer system 506 may be used by a user, such as user 508, to access or enter a virtual world as described herein. A plurality of users 508 may each use a computer system, similar to the exemplary computer system 506, to participate in the virtual world by controlling an avatar in the virtual world associated with the user 508.
- The module 502 for managing and presenting avatar mood effects in the virtual world may include a define mood sub-module or component 510. The define mood component 510 may be similar to the define mood component 402 described with reference to FIG. 4. The define mood component 510 may generate and present a GUI to the user 508 to permit a plurality of moods or mood effects to be defined by the user and associated with the user's avatar. The GUI may include provisions for entering or defining scripts, gestures, avatar appearance or similar attributes that may connote the mood of a user's avatar to other users in the virtual world. As previously described, the mood of a user's avatar may be presented to another user's avatar in response to the avatars coming within a predetermined proximity range of one another. In accordance with another embodiment of the present invention, the mood of a user's avatar may be presented to another user in response to the other user querying the mood of the user's avatar in the virtual world. The querying operation may be performed by activating or selecting the feature in a menu or by other means commonly known for selecting features in virtual worlds or computing technology.
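- The proximity-based presentation mentioned above amounts to a distance check between avatar positions, as in this sketch; the coordinate representation and the range value are assumptions.

```python
import math

PROXIMITY_RANGE = 5.0  # assumed predetermined proximity range, in virtual-world units


def within_range(pos_a, pos_b) -> bool:
    """Return True when two avatar positions are within the predetermined range."""
    return math.dist(pos_a, pos_b) <= PROXIMITY_RANGE


def maybe_present_mood(avatar: str, mood: str, pos_a, pos_b) -> None:
    """Present the avatar's mood to the other user only when the avatars are close."""
    if within_range(pos_a, pos_b):
        print(f"{avatar} appears to be {mood}")


if __name__ == "__main__":
    maybe_present_mood("avatar_404", "happy", (0.0, 0.0, 0.0), (3.0, 1.0, 0.0))   # shown
    maybe_present_mood("avatar_404", "happy", (0.0, 0.0, 0.0), (30.0, 0.0, 0.0))  # too far
```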
- The module for managing and presenting avatar mood effects in a virtual world may also include a mood inventory 512. The mood inventory 512 may store all of the moods and associated mood effects or attributes, such as scripts, gestures or actions, appearances or other attributes associated with each mood, for application to the user's avatar. As previously discussed, the moods and associated mood effects or attributes may be defined by the user, or the virtual world owner and/or operator may define standardized moods and associated mood effects or attributes that may be shared among all avatars in the virtual world.
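Reusing the hypothetical MoodEffect object sketched earlier, a mood inventory such as the mood inventory 512 could be as simple as a keyed collection of mood definitions; the use of a map keyed by mood name is an assumption for illustration only.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Sketch of a mood inventory keyed by mood name. Whether the moods are user-defined
// or standardized by the virtual world owner/operator, they may be stored the same way.
public class MoodInventory {

    private final Map<String, MoodEffect> moods = new HashMap<>();

    /** Add or replace a mood definition in the inventory. */
    public void addMood(MoodEffect effect) { moods.put(effect.getMoodName(), effect); }

    /** Look up the full definition (script, gesture, appearance) for a mood name. */
    public MoodEffect getMood(String moodName) { return moods.get(moodName); }

    /** Names of all moods available for tagging to the user's avatar. */
    public Set<String> availableMoods() { return moods.keySet(); }
}
```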
- The module for managing and presenting avatar mood effects in a virtual world may also include a component 514 to control mood changes in response to predetermined stimuli. For example, mood changes may occur in response to selection of another mood by the user of the avatar, certain events in the virtual world or the real world, entering or leaving a location in the virtual world, interaction with another avatar, or any other stimuli that may elicit a mood change. Predetermined rules may be created or defined to evoke specific mood changes. As previously discussed, an example of a system and method for automated changes in avatar moods in a virtual world is described in U.S. patent application Ser. No. ______ (IBM Docket No. RSW920070211US1).
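Such predetermined rules might be expressed, purely as an illustrative sketch, as a mapping from a stimulus to a target mood, as in the following Java outline. The Stimulus values and the MoodChangeListener callback are assumed names, not elements of the referenced application.

```java
import java.util.EnumMap;
import java.util.Map;

// Sketch of a mood change component driven by predetermined rules.
// The Stimulus values and MoodChangeListener interface are illustrative assumptions.
public class MoodChangeComponent {

    public enum Stimulus { USER_SELECTION, VIRTUAL_WORLD_EVENT, REAL_WORLD_EVENT,
                           ENTERED_LOCATION, LEFT_LOCATION, AVATAR_INTERACTION }

    public interface MoodChangeListener { void moodChanged(String avatarName, String newMood); }

    private final Map<Stimulus, String> rules = new EnumMap<>(Stimulus.class);
    private final MoodChangeListener listener;

    public MoodChangeComponent(MoodChangeListener listener) { this.listener = listener; }

    /** Define a predetermined rule, e.g. entering a particular location evokes a happy mood. */
    public void addRule(Stimulus stimulus, String targetMood) { rules.put(stimulus, targetMood); }

    /** Apply the matching rule, if any, when a stimulus occurs for an avatar. */
    public void onStimulus(String avatarName, Stimulus stimulus) {
        String targetMood = rules.get(stimulus);
        if (targetMood != null) {
            listener.moodChanged(avatarName, targetMood);
        }
    }
}
```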
- The computer system 506 may also include a processor 516. The module 502 for managing and presenting avatar mood effects may be accessed from the file system 504 and run on the processor 516.
- The computer system may also include a display 518, a speaker system 520, and one or more input devices, output devices or combination input/output devices, collectively I/O devices 522. The I/O devices 522 may include a keyboard, a pointing device, such as a mouse, disk drives and any other devices to permit a user to interface with and control operation of the computer system and to access and participate in the virtual world through the user's avatar. The display 518 may present the user's avatar and other users' avatars in the virtual world and present the features described herein for defining and managing moods or mood effects. The speaker 520 may present any sounds associated with the virtual world, such as audible mood effects or other sounds.
- The system 500 may also include a server 524. A virtual world system 526 may reside and operate on the server 524. Users 508, via browsers (not shown in FIG. 5) on their computer systems 506, may access the virtual world system 526 via a network 528. The network 528 may be the Internet, an intranet or another private or proprietary network.
- Other embodiments of the present invention are not limited to a server-based arrangement, and the system and features described herein may take one of many forms. Examples may include a client, configurations that support peer-to-peer communications, a wireless solution or other arrangements.
- In accordance with an embodiment of the present invention, a repository or inventory 528 of standardized moods, mood effects and any associated attributes may be associated with the server 524. The standardized mood inventory 528 may be contained on the server 524 or may be a separate component from the server 524. A mood or visual identification tagging mechanism 530 may also be operable on the server 524 to tag mood effects to respective avatars and to control and maintain any mood changes. Control, tracking and recording of moods and mood changes may be coordinated between the mood change component 514, which may be operable on the computer system 506 of each user 508, and the mood or visual identification tagging mechanism 530. In accordance with an embodiment of the present invention, the module 502 for managing and presenting avatar mood effects may be operable on the server 524, or some of the features or operations described with respect to module 502 may be performed in the computer system 506 and others on the server 524 (a sketch of such a tagging mechanism follows the next paragraph).
- The flowcharts and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
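As a rough illustration only, a server-side tagging mechanism along the lines of the mood or visual identification tagging mechanism 530 might tag a mood to an avatar, record the change for later tracking or mining, and notify connected clients, as in the Java sketch below. The identifiers, the change log and the ClientNotifier callback are assumptions made for this sketch.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of a server-side mood tagging mechanism that tags a mood to an avatar,
// records mood changes, and notifies connected clients. All names are illustrative.
public class MoodTaggingMechanism {

    public interface ClientNotifier { void notifyMoodChange(String avatarId, String moodName); }

    private final Map<String, String> currentMoods = new HashMap<>();  // avatarId -> current mood
    private final List<String> changeLog = new ArrayList<>();          // simple history for tracking/mining
    private final ClientNotifier notifier;

    public MoodTaggingMechanism(ClientNotifier notifier) { this.notifier = notifier; }

    /** Tag a mood effect to an avatar, record the change, and notify other users' clients. */
    public synchronized void tagMood(String avatarId, String moodName) {
        String previous = currentMoods.put(avatarId, moodName);
        changeLog.add(avatarId + ": " + previous + " -> " + moodName);
        notifier.notifyMoodChange(avatarId, moodName);
    }

    /** Current mood tagged to an avatar, or null if none has been set. */
    public synchronized String getMood(String avatarId) { return currentMoods.get(avatarId); }

    /** Copy of the recorded mood changes, e.g. for mining mood change history. */
    public synchronized List<String> getChangeLog() { return new ArrayList<>(changeLog); }
}
```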
- The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
- Although specific embodiments have been illustrated and described herein, those of ordinary skill in the art appreciate that any arrangement which is calculated to achieve the same purpose may be substituted for the specific embodiments shown and that the invention has other applications in other environments. This application is intended to cover any adaptations or variations of the present invention. The following claims are in no way intended to limit the scope of the invention to the specific embodiments described herein.
Claims (20)
1. A method for managing and presenting avatar mood effects in a virtual world, comprising:
allowing a mood effect to be associated with a user's avatar in the virtual world from a plurality of predefined mood effects; and
presenting the associated mood effect to other users of the virtual world.
2. The method of claim 1 , further comprising presenting an interface to define each of the plurality of mood effects.
3. The method of claim 2 , wherein presenting the interface to define each of the plurality of mood effects comprises:
permitting a script to be associated with each mood effect;
permitting an action to be associated with each mood effect, wherein the action associated with a particular mood effect is performable by the user's avatar while the particular mood effect is associated with the user's avatar; and
permitting an appearance of the user's avatar to be associated with each mood effect.
4. The method of claim 1 , wherein allowing a mood effect to be associated with the user's avatar from a plurality of predefined mood effects comprises:
allowing a happy mood effect to be selected and associated with the user's avatar; and
allowing a sad mood effect to be selected and associated with the user's avatar.
5. The method of claim 1 , further comprising presenting a visual indicator associated with the user's avatar to other users in the virtual world in response to the associated mood effect being selected and associated with the user's avatar.
6. The method of claim 1 , further comprising changing the mood effect presentable to other users in the virtual world in response to the user selecting another mood effect from the plurality of mood effects to be associated with the user's avatar.
7. The method of claim 1 , further comprising at least one of presenting a predefined script in association with the user's avatar to other users in the virtual world to indicate the selected mood of the user's avatar in the virtual world, the user's avatar performing a predefined action to indicate the selected mood of the user's avatar in the virtual world, and the user's avatar having a predetermined appearance in the virtual world to indicate the selected mood of the user's avatar.
8. The method of claim 1 , further comprising:
requesting the user to select a mood of the user's avatar in response to the user logging into the virtual world; and
associating the mood effect corresponding to the mood selected by the user with the user's avatar.
9. The method of claim 1 , further comprising:
standardizing the plurality of mood effects as avatar mood effect attributes within the virtual world; and
permitting the mood effect attributes to be shared by other avatars in the virtual world.
10. The method of claim 1 , further comprising permitting a query by another user of the mood effect of the user's avatar to assist in interacting within the virtual world.
11. The method of claim 1 , further comprising allowing mining of mood effect changes and any actions taken by the user's avatar while in a particular mood effect.
12. The method of claim 1 , further comprising permitting an action to be associated with the mood effect, wherein the action is performable by the user's avatar when in the mood effect.
13. The method of claim 12 , further comprising triggering the action to be performable by the user's avatar in response to occurrence of a predetermined event in the virtual world.
14. A method for managing and presenting avatar mood effects in a virtual world, comprising:
profiling a set of mood effect changes to portray different real-world emotions and moods by a user's avatar to other users of the virtual world;
defining a predetermined action based on each mood effect change, the predetermined action to be performed by the user's avatar in the virtual world in response to the user's avatar being associated with a mood corresponding to the mood effect change; and
triggering the change in the mood effect in response to occurrence of a predetermined event.
15. The method of claim 14 , further comprising at least one of:
changing a script in response to each mood effect change;
changing actions performable by the user's avatar in response to each mood effect change; and
changing an appearance of the user's avatar in response to each mood effect change.
16. The method of claim 15 , wherein changing an appearance of the user's avatar comprises changing clothing of the user's avatar in the virtual world.
17. The method of claim 15 , wherein changing a script comprises changing one of a verbal and a visual utterance expressible by the user's avatar.
18. A method for managing and presenting a mood of a user's avatar in a virtual world, comprising:
presenting a mood effect of the user's avatar to other users in the virtual world, wherein presenting the mood effect comprises at least one of:
presenting a predefined script in association with the user's avatar to other users in the virtual world to indicate a mood of the user's avatar;
performing a predefined action by the user's avatar to indicate the mood of the user's avatar; and
presenting the user's avatar with a predetermined appearance to indicate the mood of the user's avatar.
19. The method of claim 18 , further comprising triggering a mood change in response to an event.
20. The method of claim 18 , further comprising:
standardizing a set of mood effects in the virtual world; and
permitting the mood effects to be shared by each of the avatars in the virtual world.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/971,508 US20090177976A1 (en) | 2008-01-09 | 2008-01-09 | Managing and presenting avatar mood effects in a virtual world |
US12/330,829 US9568993B2 (en) | 2008-01-09 | 2008-12-09 | Automated avatar mood effects in a virtual world |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/971,508 US20090177976A1 (en) | 2008-01-09 | 2008-01-09 | Managing and presenting avatar mood effects in a virtual world |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090177976A1 true US20090177976A1 (en) | 2009-07-09 |
Family
ID=40845571
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/971,508 Abandoned US20090177976A1 (en) | 2008-01-09 | 2008-01-09 | Managing and presenting avatar mood effects in a virtual world |
US12/330,829 Active 2030-12-06 US9568993B2 (en) | 2008-01-09 | 2008-12-09 | Automated avatar mood effects in a virtual world |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/330,829 Active 2030-12-06 US9568993B2 (en) | 2008-01-09 | 2008-12-09 | Automated avatar mood effects in a virtual world |
Country Status (1)
Country | Link |
---|---|
US (2) | US20090177976A1 (en) |
Cited By (218)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090156907A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying an avatar |
US20090157660A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems employing a cohort-linked avatar |
US20090157625A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for identifying an avatar-linked population cohort |
US20090157751A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying an avatar |
US20090157482A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for indicating behavior in a population cohort |
US20090157481A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying a cohort-linked avatar attribute |
US20090164549A1 (en) * | 2007-12-20 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for determining interest in a cohort-linked avatar |
US20090164131A1 (en) * | 2007-12-20 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying a media content-linked population cohort |
US20090164503A1 (en) * | 2007-12-20 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying a media content-linked population cohort |
US20090172540A1 (en) * | 2007-12-31 | 2009-07-02 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Population cohort-linked avatar |
US20090171164A1 (en) * | 2007-12-17 | 2009-07-02 | Jung Edward K Y | Methods and systems for identifying an avatar-linked population cohort |
US20090241049A1 (en) * | 2008-03-18 | 2009-09-24 | International Business Machines Corporation | Method and computer program product for implementing automatic avatar status indicators |
US20100050088A1 (en) * | 2008-08-22 | 2010-02-25 | Neustaedter Carman G | Configuring a virtual world user-interface |
US20100131876A1 (en) * | 2008-11-21 | 2010-05-27 | Nortel Networks Limited | Ability to create a preferred profile for the agent in a customer interaction experience |
US20100146407A1 (en) * | 2008-01-09 | 2010-06-10 | Bokor Brian R | Automated avatar mood effects in a virtual world |
US20100156909A1 (en) * | 2008-12-19 | 2010-06-24 | International Business Machines Corporation | Enhanced visibility of avatars satisfying a profile |
US20100198924A1 (en) * | 2009-02-03 | 2010-08-05 | International Business Machines Corporation | Interactive avatar in messaging environment |
US20130332859A1 (en) * | 2012-06-08 | 2013-12-12 | Sri International | Method and user interface for creating an animated communication |
US8620850B2 (en) | 2010-09-07 | 2013-12-31 | Blackberry Limited | Dynamically manipulating an emoticon or avatar |
US8788943B2 (en) | 2009-05-15 | 2014-07-22 | Ganz | Unlocking emoticons using feature codes |
US20160074758A1 (en) * | 2013-02-15 | 2016-03-17 | Disney Enterprises, Inc. | Initiate events through hidden interactions |
GB2553607A (en) * | 2016-03-11 | 2018-03-14 | Sony Interactive Entertainment Europe Ltd | Virtual reality |
USD818732S1 (en) * | 2016-01-13 | 2018-05-29 | Paragon Furniture, Inc. | Chair shell |
US10454857B1 (en) * | 2017-01-23 | 2019-10-22 | Snap Inc. | Customized digital avatar accessories |
US10579401B2 (en) * | 2017-06-21 | 2020-03-03 | Rovi Guides, Inc. | Systems and methods for providing a virtual assistant to accommodate different sentiments among a group of users by correlating or prioritizing causes of the different sentiments |
US20200145615A1 (en) * | 2018-11-01 | 2020-05-07 | Honda Motor Co., Ltd. | System and method for providing virtual interpersonal communication |
US10848446B1 (en) | 2016-07-19 | 2020-11-24 | Snap Inc. | Displaying customized electronic messaging graphics |
US10852918B1 (en) | 2019-03-08 | 2020-12-01 | Snap Inc. | Contextual information in chat |
US10861170B1 (en) | 2018-11-30 | 2020-12-08 | Snap Inc. | Efficient human pose tracking in videos |
US10872451B2 (en) | 2018-10-31 | 2020-12-22 | Snap Inc. | 3D avatar rendering |
US10880246B2 (en) | 2016-10-24 | 2020-12-29 | Snap Inc. | Generating and displaying customized avatars in electronic messages |
US10893385B1 (en) | 2019-06-07 | 2021-01-12 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US10895964B1 (en) | 2018-09-25 | 2021-01-19 | Snap Inc. | Interface to display shared user groups |
US10896534B1 (en) | 2018-09-19 | 2021-01-19 | Snap Inc. | Avatar style transformation using neural networks |
US10902661B1 (en) | 2018-11-28 | 2021-01-26 | Snap Inc. | Dynamic composite user identifier |
US10904181B2 (en) | 2018-09-28 | 2021-01-26 | Snap Inc. | Generating customized graphics having reactions to electronic message content |
US10911387B1 (en) | 2019-08-12 | 2021-02-02 | Snap Inc. | Message reminder interface |
US10939246B1 (en) | 2019-01-16 | 2021-03-02 | Snap Inc. | Location-based context information sharing in a messaging system |
US10936066B1 (en) | 2019-02-13 | 2021-03-02 | Snap Inc. | Sleep detection in a location sharing system |
US10936157B2 (en) | 2017-11-29 | 2021-03-02 | Snap Inc. | Selectable item including a customized graphic for an electronic messaging application |
US10952013B1 (en) | 2017-04-27 | 2021-03-16 | Snap Inc. | Selective location-based identity communication |
US10949648B1 (en) | 2018-01-23 | 2021-03-16 | Snap Inc. | Region-based stabilized face tracking |
US10951562B2 (en) | 2017-01-18 | 2021-03-16 | Snap. Inc. | Customized contextual media content item generation |
US10963529B1 (en) | 2017-04-27 | 2021-03-30 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US10964082B2 (en) | 2019-02-26 | 2021-03-30 | Snap Inc. | Avatar based on weather |
US10979752B1 (en) | 2018-02-28 | 2021-04-13 | Snap Inc. | Generating media content items based on location information |
USD916809S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a transitional graphical user interface |
USD916871S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a transitional graphical user interface |
USD916810S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a graphical user interface |
US10984575B2 (en) | 2019-02-06 | 2021-04-20 | Snap Inc. | Body pose estimation |
USD916872S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a graphical user interface |
USD916811S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a transitional graphical user interface |
US10984569B2 (en) | 2016-06-30 | 2021-04-20 | Snap Inc. | Avatar based ideogram generation |
US10992619B2 (en) | 2019-04-30 | 2021-04-27 | Snap Inc. | Messaging system with avatar generation |
US10991395B1 (en) | 2014-02-05 | 2021-04-27 | Snap Inc. | Method for real time video processing involving changing a color of an object on a human face in a video |
US11010022B2 (en) | 2019-02-06 | 2021-05-18 | Snap Inc. | Global event-based avatar |
US11030789B2 (en) | 2017-10-30 | 2021-06-08 | Snap Inc. | Animated chat presence |
US11032670B1 (en) | 2019-01-14 | 2021-06-08 | Snap Inc. | Destination sharing in location sharing system |
US11030813B2 (en) | 2018-08-30 | 2021-06-08 | Snap Inc. | Video clip object tracking |
US11036989B1 (en) | 2019-12-11 | 2021-06-15 | Snap Inc. | Skeletal tracking using previous frames |
US11039270B2 (en) | 2019-03-28 | 2021-06-15 | Snap Inc. | Points of interest in a location sharing system |
US11036781B1 (en) | 2020-01-30 | 2021-06-15 | Snap Inc. | Video generation system to render frames on demand using a fleet of servers |
US11048916B2 (en) | 2016-03-31 | 2021-06-29 | Snap Inc. | Automated avatar generation |
US11055514B1 (en) | 2018-12-14 | 2021-07-06 | Snap Inc. | Image face manipulation |
US11063891B2 (en) | 2019-12-03 | 2021-07-13 | Snap Inc. | Personalized avatar notification |
US11069103B1 (en) | 2017-04-20 | 2021-07-20 | Snap Inc. | Customized user interface for electronic communications |
US11074675B2 (en) | 2018-07-31 | 2021-07-27 | Snap Inc. | Eye texture inpainting |
US11080917B2 (en) | 2019-09-30 | 2021-08-03 | Snap Inc. | Dynamic parameterized user avatar stories |
US11100311B2 (en) | 2016-10-19 | 2021-08-24 | Snap Inc. | Neural networks for facial modeling |
US11103795B1 (en) | 2018-10-31 | 2021-08-31 | Snap Inc. | Game drawer |
US11120601B2 (en) | 2018-02-28 | 2021-09-14 | Snap Inc. | Animated expressive icon |
US11120597B2 (en) | 2017-10-26 | 2021-09-14 | Snap Inc. | Joint audio-video facial animation system |
US11122094B2 (en) | 2017-07-28 | 2021-09-14 | Snap Inc. | Software application manager for messaging applications |
US11128715B1 (en) | 2019-12-30 | 2021-09-21 | Snap Inc. | Physical friend proximity in chat |
US11128586B2 (en) | 2019-12-09 | 2021-09-21 | Snap Inc. | Context sensitive avatar captions |
US11140515B1 (en) | 2019-12-30 | 2021-10-05 | Snap Inc. | Interfaces for relative device positioning |
US11166123B1 (en) | 2019-03-28 | 2021-11-02 | Snap Inc. | Grouped transmission of location data in a location sharing system |
US11169658B2 (en) | 2019-12-31 | 2021-11-09 | Snap Inc. | Combined map icon with action indicator |
US11176737B2 (en) | 2018-11-27 | 2021-11-16 | Snap Inc. | Textured mesh building |
US11189070B2 (en) | 2018-09-28 | 2021-11-30 | Snap Inc. | System and method of generating targeted user lists using customizable avatar characteristics |
US11189098B2 (en) | 2019-06-28 | 2021-11-30 | Snap Inc. | 3D object camera customization system |
US11188190B2 (en) | 2019-06-28 | 2021-11-30 | Snap Inc. | Generating animation overlays in a communication session |
US11199957B1 (en) | 2018-11-30 | 2021-12-14 | Snap Inc. | Generating customized avatars based on location information |
WO2021263208A1 (en) * | 2020-06-25 | 2021-12-30 | Snap Inc. | Updating avatar clothing in a messaging system |
US20210406542A1 (en) * | 2020-06-30 | 2021-12-30 | Ilteris Canberk | Augmented reality eyewear with mood sharing |
WO2021263210A1 (en) * | 2020-06-25 | 2021-12-30 | Snap Inc. | Updating an avatar status in a messaging system |
US11217020B2 (en) | 2020-03-16 | 2022-01-04 | Snap Inc. | 3D cutout image modification |
US11218838B2 (en) | 2019-10-31 | 2022-01-04 | Snap Inc. | Focused map-based context information surfacing |
US11227442B1 (en) | 2019-12-19 | 2022-01-18 | Snap Inc. | 3D captions with semantic graphical elements |
US11229849B2 (en) | 2012-05-08 | 2022-01-25 | Snap Inc. | System and method for generating and displaying avatars |
US11245658B2 (en) | 2018-09-28 | 2022-02-08 | Snap Inc. | System and method of generating private notifications between users in a communication session |
US11263817B1 (en) | 2019-12-19 | 2022-03-01 | Snap Inc. | 3D captions with face tracking |
US20220076504A1 (en) * | 2019-09-06 | 2022-03-10 | Snap Inc. | Context-based virtual object rendering |
US11284144B2 (en) | 2020-01-30 | 2022-03-22 | Snap Inc. | Video generation system to render frames on demand using a fleet of GPUs |
US11294936B1 (en) | 2019-01-30 | 2022-04-05 | Snap Inc. | Adaptive spatial density based clustering |
US11307747B2 (en) | 2019-07-11 | 2022-04-19 | Snap Inc. | Edge gesture interface with smart interactions |
US11310176B2 (en) | 2018-04-13 | 2022-04-19 | Snap Inc. | Content suggestion system |
US11320969B2 (en) | 2019-09-16 | 2022-05-03 | Snap Inc. | Messaging system with battery level sharing |
US11356720B2 (en) | 2020-01-30 | 2022-06-07 | Snap Inc. | Video generation system to render frames on demand |
US11360733B2 (en) | 2020-09-10 | 2022-06-14 | Snap Inc. | Colocated shared augmented reality without shared backend |
US11411895B2 (en) | 2017-11-29 | 2022-08-09 | Snap Inc. | Generating aggregated media content items for a group of users in an electronic messaging application |
US11425062B2 (en) | 2019-09-27 | 2022-08-23 | Snap Inc. | Recommended content viewed by friends |
US11438341B1 (en) | 2016-10-10 | 2022-09-06 | Snap Inc. | Social media post subscribe requests for buffer user accounts |
US11450051B2 (en) | 2020-11-18 | 2022-09-20 | Snap Inc. | Personalized avatar real-time motion capture |
US11455081B2 (en) | 2019-08-05 | 2022-09-27 | Snap Inc. | Message thread prioritization interface |
US11452939B2 (en) | 2020-09-21 | 2022-09-27 | Snap Inc. | Graphical marker generation system for synchronizing users |
US11455082B2 (en) | 2018-09-28 | 2022-09-27 | Snap Inc. | Collaborative achievement interface |
US11460974B1 (en) | 2017-11-28 | 2022-10-04 | Snap Inc. | Content discovery refresh |
US20220366210A1 (en) * | 2018-12-05 | 2022-11-17 | Disney Enterprises, Inc. | Simulated human-like affect-driven behavior by a virtual agent |
US11516173B1 (en) | 2018-12-26 | 2022-11-29 | Snap Inc. | Message composition interface |
US11543939B2 (en) | 2020-06-08 | 2023-01-03 | Snap Inc. | Encoded image based messaging system |
US11544883B1 (en) | 2017-01-16 | 2023-01-03 | Snap Inc. | Coded vision system |
US11544885B2 (en) | 2021-03-19 | 2023-01-03 | Snap Inc. | Augmented reality experience based on physical items |
US11557093B1 (en) * | 2019-09-10 | 2023-01-17 | Meta Platforms Technologies, Llc | Using social connections to define graphical representations of users in an artificial reality setting |
US11562548B2 (en) | 2021-03-22 | 2023-01-24 | Snap Inc. | True size eyewear in real time |
US11580682B1 (en) | 2020-06-30 | 2023-02-14 | Snap Inc. | Messaging system with augmented reality makeup |
US11580700B2 (en) | 2016-10-24 | 2023-02-14 | Snap Inc. | Augmented reality object manipulation |
US11615713B2 (en) * | 2016-05-27 | 2023-03-28 | Janssen Pharmaceutica Nv | System and method for assessing cognitive and mood states of a real world user as a function of virtual world activity |
US11616745B2 (en) | 2017-01-09 | 2023-03-28 | Snap Inc. | Contextual generation and selection of customized media content |
US11615592B2 (en) | 2020-10-27 | 2023-03-28 | Snap Inc. | Side-by-side character animation from realtime 3D body motion capture |
US11619501B2 (en) | 2020-03-11 | 2023-04-04 | Snap Inc. | Avatar based on trip |
US11625873B2 (en) | 2020-03-30 | 2023-04-11 | Snap Inc. | Personalized media overlay recommendation |
US11636662B2 (en) | 2021-09-30 | 2023-04-25 | Snap Inc. | Body normal network light and rendering control |
US11636654B2 (en) | 2021-05-19 | 2023-04-25 | Snap Inc. | AR-based connected portal shopping |
US11651572B2 (en) | 2021-10-11 | 2023-05-16 | Snap Inc. | Light and rendering of garments |
US11651539B2 (en) | 2020-01-30 | 2023-05-16 | Snap Inc. | System for generating media content items on demand |
US11662900B2 (en) | 2016-05-31 | 2023-05-30 | Snap Inc. | Application control using a gesture based trigger |
US11660022B2 (en) | 2020-10-27 | 2023-05-30 | Snap Inc. | Adaptive skeletal joint smoothing |
US11663792B2 (en) | 2021-09-08 | 2023-05-30 | Snap Inc. | Body fitted accessory with physics simulation |
US11670059B2 (en) | 2021-09-01 | 2023-06-06 | Snap Inc. | Controlling interactive fashion based on body gestures |
US11676199B2 (en) | 2019-06-28 | 2023-06-13 | Snap Inc. | Generating customizable avatar outfits |
US11673054B2 (en) | 2021-09-07 | 2023-06-13 | Snap Inc. | Controlling AR games on fashion items |
US11683280B2 (en) | 2020-06-10 | 2023-06-20 | Snap Inc. | Messaging system including an external-resource dock and drawer |
US11704878B2 (en) | 2017-01-09 | 2023-07-18 | Snap Inc. | Surface aware lens |
US11734959B2 (en) | 2021-03-16 | 2023-08-22 | Snap Inc. | Activating hands-free mode on mirroring device |
US11734866B2 (en) | 2021-09-13 | 2023-08-22 | Snap Inc. | Controlling interactive fashion based on voice |
US11734894B2 (en) | 2020-11-18 | 2023-08-22 | Snap Inc. | Real-time motion transfer for prosthetic limbs |
US11748931B2 (en) | 2020-11-18 | 2023-09-05 | Snap Inc. | Body animation sharing and remixing |
US11748958B2 (en) | 2021-12-07 | 2023-09-05 | Snap Inc. | Augmented reality unboxing experience |
US11763481B2 (en) | 2021-10-20 | 2023-09-19 | Snap Inc. | Mirror-based augmented reality experience |
US11790614B2 (en) | 2021-10-11 | 2023-10-17 | Snap Inc. | Inferring intent from pose and speech input |
US11790531B2 (en) | 2021-02-24 | 2023-10-17 | Snap Inc. | Whole body segmentation |
US11798201B2 (en) | 2021-03-16 | 2023-10-24 | Snap Inc. | Mirroring device with whole-body outfits |
US11798238B2 (en) | 2021-09-14 | 2023-10-24 | Snap Inc. | Blending body mesh into external mesh |
US11809633B2 (en) | 2021-03-16 | 2023-11-07 | Snap Inc. | Mirroring device with pointing based navigation |
US11818286B2 (en) | 2020-03-30 | 2023-11-14 | Snap Inc. | Avatar recommendation and reply |
US11823346B2 (en) | 2022-01-17 | 2023-11-21 | Snap Inc. | AR body part tracking system |
US11830209B2 (en) | 2017-05-26 | 2023-11-28 | Snap Inc. | Neural network-based image stream modification |
US11836866B2 (en) | 2021-09-20 | 2023-12-05 | Snap Inc. | Deforming real-world object using an external mesh |
US11836862B2 (en) | 2021-10-11 | 2023-12-05 | Snap Inc. | External mesh with vertex attributes |
US11842411B2 (en) | 2017-04-27 | 2023-12-12 | Snap Inc. | Location-based virtual avatars |
US11852554B1 (en) | 2019-03-21 | 2023-12-26 | Snap Inc. | Barometer calibration in a location sharing system |
US11854069B2 (en) | 2021-07-16 | 2023-12-26 | Snap Inc. | Personalized try-on ads |
US11863513B2 (en) | 2020-08-31 | 2024-01-02 | Snap Inc. | Media content playback and comments management |
US11870745B1 (en) | 2022-06-28 | 2024-01-09 | Snap Inc. | Media gallery sharing and management |
US11868414B1 (en) | 2019-03-14 | 2024-01-09 | Snap Inc. | Graph-based prediction for contact suggestion in a location sharing system |
US11875439B2 (en) | 2018-04-18 | 2024-01-16 | Snap Inc. | Augmented expression system |
US11880947B2 (en) | 2021-12-21 | 2024-01-23 | Snap Inc. | Real-time upper-body garment exchange |
US11887260B2 (en) | 2021-12-30 | 2024-01-30 | Snap Inc. | AR position indicator |
US11888795B2 (en) | 2020-09-21 | 2024-01-30 | Snap Inc. | Chats with micro sound clips |
US11893166B1 (en) | 2022-11-08 | 2024-02-06 | Snap Inc. | User avatar movement control using an augmented reality eyewear device |
US11900506B2 (en) | 2021-09-09 | 2024-02-13 | Snap Inc. | Controlling interactive fashion based on facial expressions |
US11908243B2 (en) | 2021-03-16 | 2024-02-20 | Snap Inc. | Menu hierarchy navigation on electronic mirroring devices |
US11910269B2 (en) | 2020-09-25 | 2024-02-20 | Snap Inc. | Augmented reality content items including user avatar to share location |
US11908083B2 (en) | 2021-08-31 | 2024-02-20 | Snap Inc. | Deforming custom mesh based on body mesh |
US11922010B2 (en) | 2020-06-08 | 2024-03-05 | Snap Inc. | Providing contextual information with keyboard interface for messaging system |
US11928783B2 (en) | 2021-12-30 | 2024-03-12 | Snap Inc. | AR position and orientation along a plane |
US11941227B2 (en) | 2021-06-30 | 2024-03-26 | Snap Inc. | Hybrid search system for customizable media |
US11954762B2 (en) | 2022-01-19 | 2024-04-09 | Snap Inc. | Object replacement system |
US11956190B2 (en) | 2020-05-08 | 2024-04-09 | Snap Inc. | Messaging system with a carousel of related entities |
US11960784B2 (en) | 2021-12-07 | 2024-04-16 | Snap Inc. | Shared augmented reality unboxing experience |
US11969075B2 (en) | 2020-03-31 | 2024-04-30 | Snap Inc. | Augmented reality beauty product tutorials |
US11978283B2 (en) | 2021-03-16 | 2024-05-07 | Snap Inc. | Mirroring device with a hands-free mode |
US11983826B2 (en) | 2021-09-30 | 2024-05-14 | Snap Inc. | 3D upper garment tracking |
US11983462B2 (en) | 2021-08-31 | 2024-05-14 | Snap Inc. | Conversation guided augmented reality experience |
US11991419B2 (en) | 2020-01-30 | 2024-05-21 | Snap Inc. | Selecting avatars to be included in the video being generated on demand |
US11995757B2 (en) | 2021-10-29 | 2024-05-28 | Snap Inc. | Customized animation from video |
US11996113B2 (en) | 2021-10-29 | 2024-05-28 | Snap Inc. | Voice notes with changing effects |
US12002146B2 (en) | 2022-03-28 | 2024-06-04 | Snap Inc. | 3D modeling based on neural light field |
US12008811B2 (en) | 2020-12-30 | 2024-06-11 | Snap Inc. | Machine learning-based selection of a representative video frame within a messaging application |
US12020358B2 (en) | 2021-10-29 | 2024-06-25 | Snap Inc. | Animated custom sticker creation |
US12020386B2 (en) | 2022-06-23 | 2024-06-25 | Snap Inc. | Applying pregenerated virtual experiences in new location |
US12020384B2 (en) | 2022-06-21 | 2024-06-25 | Snap Inc. | Integrating augmented reality experiences with other components |
US12034680B2 (en) | 2021-03-31 | 2024-07-09 | Snap Inc. | User presence indication data management |
US12046037B2 (en) | 2020-06-10 | 2024-07-23 | Snap Inc. | Adding beauty products to augmented reality tutorials |
US12047337B1 (en) | 2023-07-03 | 2024-07-23 | Snap Inc. | Generating media content items during user interaction |
US12051163B2 (en) | 2022-08-25 | 2024-07-30 | Snap Inc. | External computer vision for an eyewear device |
US12056792B2 (en) | 2020-12-30 | 2024-08-06 | Snap Inc. | Flow-guided motion retargeting |
US12062144B2 (en) | 2022-05-27 | 2024-08-13 | Snap Inc. | Automated augmented reality experience creation based on sample source and target images |
US12062146B2 (en) | 2022-07-28 | 2024-08-13 | Snap Inc. | Virtual wardrobe AR experience |
US12067804B2 (en) | 2021-03-22 | 2024-08-20 | Snap Inc. | True size eyewear experience in real time |
US12070682B2 (en) | 2019-03-29 | 2024-08-27 | Snap Inc. | 3D avatar plugin for third-party games |
US12080065B2 (en) | 2019-11-22 | 2024-09-03 | Snap Inc | Augmented reality items based on scan |
US12086916B2 (en) | 2021-10-22 | 2024-09-10 | Snap Inc. | Voice note with face tracking |
US12096153B2 (en) | 2021-12-21 | 2024-09-17 | Snap Inc. | Avatar call platform |
US12100156B2 (en) | 2021-04-12 | 2024-09-24 | Snap Inc. | Garment segmentation |
US12106486B2 (en) | 2021-02-24 | 2024-10-01 | Snap Inc. | Whole body visual effects |
US12142257B2 (en) | 2022-02-08 | 2024-11-12 | Snap Inc. | Emotion-based text to speech |
US12149489B2 (en) | 2023-03-14 | 2024-11-19 | Snap Inc. | Techniques for recommending reply stickers |
US12148105B2 (en) | 2022-03-30 | 2024-11-19 | Snap Inc. | Surface normals for pixel-aligned object |
US12154232B2 (en) | 2022-09-30 | 2024-11-26 | Snap Inc. | 9-DoF object tracking |
US12166734B2 (en) | 2019-09-27 | 2024-12-10 | Snap Inc. | Presenting reactions from friends |
US12165243B2 (en) | 2021-03-30 | 2024-12-10 | Snap Inc. | Customizable avatar modification system |
US12164109B2 (en) | 2022-04-29 | 2024-12-10 | Snap Inc. | AR/VR enabled contact lens |
US12170638B2 (en) | 2021-03-31 | 2024-12-17 | Snap Inc. | User presence status indicators generation and management |
US12175570B2 (en) | 2021-03-31 | 2024-12-24 | Snap Inc. | Customizable avatar generation system |
US12182583B2 (en) | 2021-05-19 | 2024-12-31 | Snap Inc. | Personalized avatar experience during a system boot process |
US12198398B2 (en) | 2021-12-21 | 2025-01-14 | Snap Inc. | Real-time motion and appearance transfer |
US12198287B2 (en) | 2022-01-17 | 2025-01-14 | Snap Inc. | AR body part tracking system |
US12198664B2 (en) | 2021-09-02 | 2025-01-14 | Snap Inc. | Interactive fashion with music AR |
US12223672B2 (en) | 2021-12-21 | 2025-02-11 | Snap Inc. | Real-time garment exchange |
US12229901B2 (en) | 2022-10-05 | 2025-02-18 | Snap Inc. | External screen streaming for an eyewear device |
US12235991B2 (en) | 2022-07-06 | 2025-02-25 | Snap Inc. | Obscuring elements based on browser focus |
US12236512B2 (en) | 2022-08-23 | 2025-02-25 | Snap Inc. | Avatar call on an eyewear device |
US12242979B1 (en) | 2019-03-12 | 2025-03-04 | Snap Inc. | Departure time estimation in a location sharing system |
US12243266B2 (en) | 2022-12-29 | 2025-03-04 | Snap Inc. | Device pairing using machine-readable optical label |
US12254577B2 (en) | 2022-04-05 | 2025-03-18 | Snap Inc. | Pixel depth determination for object |
US12265692B2 (en) | 2022-10-03 | 2025-04-01 | Snap Inc. | Content discovery refresh |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9741147B2 (en) * | 2008-12-12 | 2017-08-22 | International Business Machines Corporation | System and method to modify avatar characteristics based on inferred conditions |
US20110092287A1 (en) * | 2009-10-15 | 2011-04-21 | Sanders Paul Maurice | Gaming participant attribute tag method and system |
US20110214071A1 (en) * | 2010-02-26 | 2011-09-01 | University Of Southern California | Information channels in mmogs |
US20110225518A1 (en) * | 2010-03-10 | 2011-09-15 | Oddmobb, Inc. | Friends toolbar for a virtual social venue |
US20110225039A1 (en) * | 2010-03-10 | 2011-09-15 | Oddmobb, Inc. | Virtual social venue feeding multiple video streams |
US8572177B2 (en) * | 2010-03-10 | 2013-10-29 | Xmobb, Inc. | 3D social platform for sharing videos and webpages |
US20110225519A1 (en) * | 2010-03-10 | 2011-09-15 | Oddmobb, Inc. | Social media platform for simulating a live experience |
US20110239136A1 (en) * | 2010-03-10 | 2011-09-29 | Oddmobb, Inc. | Instantiating widgets into a virtual social venue |
US8667402B2 (en) * | 2010-03-10 | 2014-03-04 | Onset Vi, L.P. | Visualizing communications within a social setting |
US20110225515A1 (en) * | 2010-03-10 | 2011-09-15 | Oddmobb, Inc. | Sharing emotional reactions to social media |
US9152734B2 (en) * | 2010-05-24 | 2015-10-06 | Iii Holdings 2, Llc | Systems and methods for identifying intersections using content metadata |
US20120130717A1 (en) * | 2010-11-19 | 2012-05-24 | Microsoft Corporation | Real-time Animation for an Expressive Avatar |
US20120158515A1 (en) * | 2010-12-21 | 2012-06-21 | Yahoo! Inc. | Dynamic advertisement serving based on an avatar |
US9348479B2 (en) * | 2011-12-08 | 2016-05-24 | Microsoft Technology Licensing, Llc | Sentiment aware user interface customization |
US9378290B2 (en) | 2011-12-20 | 2016-06-28 | Microsoft Technology Licensing, Llc | Scenario-adaptive input method editor |
KR20130084543A (en) * | 2012-01-17 | 2013-07-25 | 삼성전자주식회사 | Apparatus and method for providing user interface |
EP2864856A4 (en) | 2012-06-25 | 2015-10-14 | Microsoft Technology Licensing Llc | Input method editor application platform |
WO2014032244A1 (en) | 2012-08-30 | 2014-03-06 | Microsoft Corporation | Feature-based candidate selection |
US9746990B2 (en) * | 2012-09-28 | 2017-08-29 | Intel Corporation | Selectively augmenting communications transmitted by a communication device |
WO2015018055A1 (en) | 2013-08-09 | 2015-02-12 | Microsoft Corporation | Input method editor providing language assistance |
KR20160133154A (en) * | 2015-05-12 | 2016-11-22 | 삼성전자주식회사 | Electronic device and Method for providing graphical user interface of the same |
US10386996B2 (en) | 2015-06-11 | 2019-08-20 | Microsoft Technology Licensing, Llc | Communicating emotional information via avatar animation |
US10884502B2 (en) * | 2016-11-23 | 2021-01-05 | Google Llc | Providing mediated social interactions |
JPWO2018235607A1 (en) * | 2017-06-20 | 2020-04-16 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
US20200019242A1 (en) * | 2018-07-12 | 2020-01-16 | Microsoft Technology Licensing, Llc | Digital personal expression via wearable device |
US11989509B2 (en) * | 2021-09-03 | 2024-05-21 | International Business Machines Corporation | Generative adversarial network implemented digital script modification |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5926179A (en) * | 1996-09-30 | 1999-07-20 | Sony Corporation | Three-dimensional virtual reality space display processing apparatus, a three-dimensional virtual reality space display processing method, and an information providing medium |
US6064383A (en) * | 1996-10-04 | 2000-05-16 | Microsoft Corporation | Method and system for selecting an emotional appearance and prosody for a graphical character |
US6229533B1 (en) * | 1996-08-02 | 2001-05-08 | Fujitsu Limited | Ghost object for a virtual world |
US6329986B1 (en) * | 1998-02-21 | 2001-12-11 | U.S. Philips Corporation | Priority-based virtual environment |
US20020008716A1 (en) * | 2000-07-21 | 2002-01-24 | Colburn Robert A. | System and method for controlling expression characteristics of a virtual agent |
US6359622B1 (en) * | 1995-07-19 | 2002-03-19 | Extempo Systems, Inc. | System and method for directed improvisation by computer controlled characters |
US20050223328A1 (en) * | 2004-01-30 | 2005-10-06 | Ashish Ashtekar | Method and apparatus for providing dynamic moods for avatars |
US20070111795A1 (en) * | 2005-11-15 | 2007-05-17 | Joon-Hyuk Choi | Virtual entity on a network |
US20070218987A1 (en) * | 2005-10-14 | 2007-09-20 | Leviathan Entertainment, Llc | Event-Driven Alteration of Avatars |
US20080059570A1 (en) * | 2006-09-05 | 2008-03-06 | Aol Llc | Enabling an im user to navigate a virtual world |
US20080091692A1 (en) * | 2006-06-09 | 2008-04-17 | Christopher Keith | Information collection in multi-participant online communities |
US7386799B1 (en) * | 2002-11-21 | 2008-06-10 | Forterra Systems, Inc. | Cinematic techniques in avatar-centric communication during a multi-user online simulation |
US20090069084A1 (en) * | 2007-09-12 | 2009-03-12 | Reece Alex D | System and Methods for Monitoring and Controlling the Actions of an Avatar in a Virtual Environment |
US20090147008A1 (en) * | 2007-12-10 | 2009-06-11 | International Business Machines Corporation | Arrangements for controlling activites of an avatar |
US20090209335A1 (en) * | 2007-01-29 | 2009-08-20 | Sony Online Entertainment Llc | System and method of automatic entry creation for blogs, web pages or file-sharing sites based on game events |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6285380B1 (en) | 1994-08-02 | 2001-09-04 | New York University | Method and system for scripting interactive animated actors |
US5880731A (en) | 1995-12-14 | 1999-03-09 | Microsoft Corporation | Use of avatars with automatic gesturing and bounded interaction in on-line chat session |
US5802296A (en) | 1996-08-02 | 1998-09-01 | Fujitsu Software Corporation | Supervisory powers that provide additional control over images on computers system displays to users interactings via computer systems |
IL125432A (en) | 1998-01-30 | 2010-11-30 | Easynet Access Inc | Personalized internet interaction |
US6563503B1 (en) | 1999-05-07 | 2003-05-13 | Nintendo Co., Ltd. | Object modeling for computer simulation and animation |
TW552539B (en) * | 2000-09-29 | 2003-09-11 | Sony Corp | Agent system, agent transaction method, data providing device, and data recording medium |
JP4395687B2 (en) * | 2000-12-20 | 2010-01-13 | ソニー株式会社 | Information processing device |
US7137070B2 (en) * | 2002-06-27 | 2006-11-14 | International Business Machines Corporation | Sampling responses to communication content for use in analyzing reaction responses to other communications |
US20040225640A1 (en) | 2002-06-27 | 2004-11-11 | International Business Machines Corporation | Context searchable communications |
US20070113181A1 (en) * | 2003-03-03 | 2007-05-17 | Blattner Patrick D | Using avatars to communicate real-time information |
US7725419B2 (en) | 2003-09-05 | 2010-05-25 | Samsung Electronics Co., Ltd | Proactive user interface including emotional agent |
JP3625212B1 (en) | 2003-09-16 | 2005-03-02 | 独立行政法人科学技術振興機構 | Three-dimensional virtual space simulator, three-dimensional virtual space simulation program, and computer-readable recording medium recording the same |
US7949738B2 (en) * | 2004-02-12 | 2011-05-24 | Sap Aktiengesellschaft | Graphical interface for generating and previewing a rule |
US7468729B1 (en) | 2004-12-21 | 2008-12-23 | Aol Llc, A Delaware Limited Liability Company | Using an avatar to generate user profile information |
KR100511210B1 (en) | 2004-12-27 | 2005-08-30 | 주식회사지앤지커머스 | Method for converting 2d image into pseudo 3d image and user-adapted total coordination method in use artificial intelligence, and service besiness method thereof |
US20060248461A1 (en) | 2005-04-29 | 2006-11-02 | Omron Corporation | Socially intelligent agent software |
US20070136068A1 (en) | 2005-12-09 | 2007-06-14 | Microsoft Corporation | Multimodal multilingual devices and applications for enhanced goal-interpretation and translation for service providers |
US8683386B2 (en) | 2006-10-03 | 2014-03-25 | Brian Mark Shuster | Virtual environment for computer game |
US20080120558A1 (en) | 2006-11-16 | 2008-05-22 | Paco Xander Nathan | Systems and methods for managing a persistent virtual avatar with migrational ability |
EP2597868B1 (en) | 2007-09-24 | 2017-09-13 | Qualcomm Incorporated | Enhanced interface for voice and video communications |
US20090094517A1 (en) * | 2007-10-03 | 2009-04-09 | Brody Jonathan S | Conversational advertising |
US8892999B2 (en) * | 2007-11-30 | 2014-11-18 | Nike, Inc. | Interactive avatar for social network services |
US20090158170A1 (en) | 2007-12-14 | 2009-06-18 | Rajesh Narayanan | Automatic profile-based avatar generation |
US20090177976A1 (en) * | 2008-01-09 | 2009-07-09 | Bokor Brian R | Managing and presenting avatar mood effects in a virtual world |
US20090300525A1 (en) | 2008-05-27 | 2009-12-03 | Jolliff Maria Elena Romera | Method and system for automatically updating avatar to indicate user's status |
2008
- 2008-01-09 US US11/971,508 patent/US20090177976A1/en not_active Abandoned
- 2008-12-09 US US12/330,829 patent/US9568993B2/en active Active
Cited By (396)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9211077B2 (en) * | 2007-12-13 | 2015-12-15 | The Invention Science Fund I, Llc | Methods and systems for specifying an avatar |
US9495684B2 (en) | 2007-12-13 | 2016-11-15 | The Invention Science Fund I, Llc | Methods and systems for indicating behavior in a population cohort |
US20090157625A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for identifying an avatar-linked population cohort |
US20090157751A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying an avatar |
US20090157482A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for indicating behavior in a population cohort |
US20090157481A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying a cohort-linked avatar attribute |
US8615479B2 (en) | 2007-12-13 | 2013-12-24 | The Invention Science Fund I, Llc | Methods and systems for indicating behavior in a population cohort |
US20090156907A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying an avatar |
US20090157660A1 (en) * | 2007-12-13 | 2009-06-18 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems employing a cohort-linked avatar |
US20090171164A1 (en) * | 2007-12-17 | 2009-07-02 | Jung Edward K Y | Methods and systems for identifying an avatar-linked population cohort |
US20090164549A1 (en) * | 2007-12-20 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for determining interest in a cohort-linked avatar |
US9418368B2 (en) | 2007-12-20 | 2016-08-16 | Invention Science Fund I, Llc | Methods and systems for determining interest in a cohort-linked avatar |
US20090164131A1 (en) * | 2007-12-20 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying a media content-linked population cohort |
US20090164503A1 (en) * | 2007-12-20 | 2009-06-25 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Methods and systems for specifying a media content-linked population cohort |
US9775554B2 (en) | 2007-12-31 | 2017-10-03 | Invention Science Fund I, Llc | Population cohort-linked avatar |
US20090172540A1 (en) * | 2007-12-31 | 2009-07-02 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Population cohort-linked avatar |
US20100146407A1 (en) * | 2008-01-09 | 2010-06-10 | Bokor Brian R | Automated avatar mood effects in a virtual world |
US9568993B2 (en) * | 2008-01-09 | 2017-02-14 | International Business Machines Corporation | Automated avatar mood effects in a virtual world |
US8006182B2 (en) * | 2008-03-18 | 2011-08-23 | International Business Machines Corporation | Method and computer program product for implementing automatic avatar status indicators |
US20090241049A1 (en) * | 2008-03-18 | 2009-09-24 | International Business Machines Corporation | Method and computer program product for implementing automatic avatar status indicators |
US9223469B2 (en) * | 2008-08-22 | 2015-12-29 | Intellectual Ventures Fund 83 Llc | Configuring a virtual world user-interface |
US20100050088A1 (en) * | 2008-08-22 | 2010-02-25 | Neustaedter Carman G | Configuring a virtual world user-interface |
US20100131876A1 (en) * | 2008-11-21 | 2010-05-27 | Nortel Networks Limited | Ability to create a preferred profile for the agent in a customer interaction experience |
US20100156909A1 (en) * | 2008-12-19 | 2010-06-24 | International Business Machines Corporation | Enhanced visibility of avatars satisfying a profile |
US8878873B2 (en) * | 2008-12-19 | 2014-11-04 | International Business Machines Corporation | Enhanced visibility of avatars satisfying a profile |
US9749270B2 (en) | 2009-02-03 | 2017-08-29 | Snap Inc. | Interactive avatar in messaging environment |
US10158589B2 (en) | 2009-02-03 | 2018-12-18 | Snap Inc. | Interactive avatar in messaging environment |
US9105014B2 (en) * | 2009-02-03 | 2015-08-11 | International Business Machines Corporation | Interactive avatar in messaging environment |
US11425068B2 (en) | 2009-02-03 | 2022-08-23 | Snap Inc. | Interactive avatar in messaging environment |
US20100198924A1 (en) * | 2009-02-03 | 2010-08-05 | International Business Machines Corporation | Interactive avatar in messaging environment |
US8788943B2 (en) | 2009-05-15 | 2014-07-22 | Ganz | Unlocking emoticons using feature codes |
US8620850B2 (en) | 2010-09-07 | 2013-12-31 | Blackberry Limited | Dynamically manipulating an emoticon or avatar |
US11925869B2 (en) | 2012-05-08 | 2024-03-12 | Snap Inc. | System and method for generating and displaying avatars |
US11607616B2 (en) | 2012-05-08 | 2023-03-21 | Snap Inc. | System and method for generating and displaying avatars |
US11229849B2 (en) | 2012-05-08 | 2022-01-25 | Snap Inc. | System and method for generating and displaying avatars |
US20130332859A1 (en) * | 2012-06-08 | 2013-12-12 | Sri International | Method and user interface for creating an animated communication |
US10279272B2 (en) * | 2013-02-15 | 2019-05-07 | Disney Enterprise, Inc. | Initiate events through hidden interactions |
US20160074758A1 (en) * | 2013-02-15 | 2016-03-17 | Disney Enterprises, Inc. | Initiate events through hidden interactions |
US10991395B1 (en) | 2014-02-05 | 2021-04-27 | Snap Inc. | Method for real time video processing involving changing a color of an object on a human face in a video |
US11651797B2 (en) | 2014-02-05 | 2023-05-16 | Snap Inc. | Real time video processing for changing proportions of an object in the video |
US11443772B2 (en) | 2014-02-05 | 2022-09-13 | Snap Inc. | Method for triggering events in a video |
USD818732S1 (en) * | 2016-01-13 | 2018-05-29 | Paragon Furniture, Inc. | Chair shell |
EP3427129A1 (en) * | 2016-03-11 | 2019-01-16 | Sony Interactive Entertainment Europe Limited | Virtual reality |
US10559110B2 (en) | 2016-03-11 | 2020-02-11 | Sony Interactive Entertainment Europe Limited | Virtual reality |
US10733781B2 (en) | 2016-03-11 | 2020-08-04 | Sony Interactive Entertainment Europe Limited | Virtual reality |
GB2553607A (en) * | 2016-03-11 | 2018-03-14 | Sony Interactive Entertainment Europe Ltd | Virtual reality |
US10943382B2 (en) | 2016-03-11 | 2021-03-09 | Sony Interactive Entertainment Inc. | Virtual reality |
GB2556347A (en) * | 2016-03-11 | 2018-05-30 | Sony Interactive Entertainment Europe Ltd | Virtual reality |
EP3427130A1 (en) * | 2016-03-11 | 2019-01-16 | Sony Interactive Entertainment Europe Limited | Virtual reality |
GB2556347B (en) * | 2016-03-11 | 2019-08-28 | Sony Interactive Entertainment Europe Ltd | Virtual Reality |
US11048916B2 (en) | 2016-03-31 | 2021-06-29 | Snap Inc. | Automated avatar generation |
US11631276B2 (en) | 2016-03-31 | 2023-04-18 | Snap Inc. | Automated avatar generation |
US11615713B2 (en) * | 2016-05-27 | 2023-03-28 | Janssen Pharmaceutica Nv | System and method for assessing cognitive and mood states of a real world user as a function of virtual world activity |
US12073742B2 (en) | 2016-05-27 | 2024-08-27 | Janssen Pharmaceutica Nv | System and method for assessing cognitive and mood states of a real world user as a function of virtual world activity |
US11662900B2 (en) | 2016-05-31 | 2023-05-30 | Snap Inc. | Application control using a gesture based trigger |
US12131015B2 (en) | 2016-05-31 | 2024-10-29 | Snap Inc. | Application control using a gesture based trigger |
US10984569B2 (en) | 2016-06-30 | 2021-04-20 | Snap Inc. | Avatar based ideogram generation |
US11509615B2 (en) | 2016-07-19 | 2022-11-22 | Snap Inc. | Generating customized electronic messaging graphics |
US11438288B2 (en) | 2016-07-19 | 2022-09-06 | Snap Inc. | Displaying customized electronic messaging graphics |
US10855632B2 (en) | 2016-07-19 | 2020-12-01 | Snap Inc. | Displaying customized electronic messaging graphics |
US10848446B1 (en) | 2016-07-19 | 2020-11-24 | Snap Inc. | Displaying customized electronic messaging graphics |
US11418470B2 (en) | 2016-07-19 | 2022-08-16 | Snap Inc. | Displaying customized electronic messaging graphics |
US11962598B2 (en) | 2016-10-10 | 2024-04-16 | Snap Inc. | Social media post subscribe requests for buffer user accounts |
US11438341B1 (en) | 2016-10-10 | 2022-09-06 | Snap Inc. | Social media post subscribe requests for buffer user accounts |
US11100311B2 (en) | 2016-10-19 | 2021-08-24 | Snap Inc. | Neural networks for facial modeling |
US10938758B2 (en) | 2016-10-24 | 2021-03-02 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11580700B2 (en) | 2016-10-24 | 2023-02-14 | Snap Inc. | Augmented reality object manipulation |
US12113760B2 (en) | 2016-10-24 | 2024-10-08 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US12206635B2 (en) | 2016-10-24 | 2025-01-21 | Snap Inc. | Generating and displaying customized avatars in electronic messages |
US11876762B1 (en) | 2016-10-24 | 2024-01-16 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US11218433B2 (en) | 2016-10-24 | 2022-01-04 | Snap Inc. | Generating and displaying customized avatars in electronic messages |
US11843456B2 (en) | 2016-10-24 | 2023-12-12 | Snap Inc. | Generating and displaying customized avatars in media overlays |
US10880246B2 (en) | 2016-10-24 | 2020-12-29 | Snap Inc. | Generating and displaying customized avatars in electronic messages |
US12217374B2 (en) | 2017-01-09 | 2025-02-04 | Snap Inc. | Surface aware lens |
US11616745B2 (en) | 2017-01-09 | 2023-03-28 | Snap Inc. | Contextual generation and selection of customized media content |
US11704878B2 (en) | 2017-01-09 | 2023-07-18 | Snap Inc. | Surface aware lens |
US12028301B2 (en) | 2017-01-09 | 2024-07-02 | Snap Inc. | Contextual generation and selection of customized media content |
US11989809B2 (en) | 2017-01-16 | 2024-05-21 | Snap Inc. | Coded vision system |
US11544883B1 (en) | 2017-01-16 | 2023-01-03 | Snap Inc. | Coded vision system |
US10951562B2 (en) | 2017-01-18 | 2021-03-16 | Snap Inc. | Customized contextual media content item generation |
US11991130B2 (en) | 2017-01-18 | 2024-05-21 | Snap Inc. | Customized contextual media content item generation |
US10454857B1 (en) * | 2017-01-23 | 2019-10-22 | Snap Inc. | Customized digital avatar accessories |
US11870743B1 (en) | 2017-01-23 | 2024-01-09 | Snap Inc. | Customized digital avatar accessories |
US11593980B2 (en) | 2017-04-20 | 2023-02-28 | Snap Inc. | Customized user interface for electronic communications |
US11069103B1 (en) | 2017-04-20 | 2021-07-20 | Snap Inc. | Customized user interface for electronic communications |
US11474663B2 (en) | 2017-04-27 | 2022-10-18 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US12112013B2 (en) | 2017-04-27 | 2024-10-08 | Snap Inc. | Location privacy management on map-based social media platforms |
US10963529B1 (en) | 2017-04-27 | 2021-03-30 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11842411B2 (en) | 2017-04-27 | 2023-12-12 | Snap Inc. | Location-based virtual avatars |
US11418906B2 (en) | 2017-04-27 | 2022-08-16 | Snap Inc. | Selective location-based identity communication |
US12223156B2 (en) | 2017-04-27 | 2025-02-11 | Snap Inc. | Low-latency delivery mechanism for map-based GUI |
US11451956B1 (en) | 2017-04-27 | 2022-09-20 | Snap Inc. | Location privacy management on map-based social media platforms |
US12131003B2 (en) | 2017-04-27 | 2024-10-29 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US12086381B2 (en) | 2017-04-27 | 2024-09-10 | Snap Inc. | Map-based graphical user interface for multi-type social media galleries |
US10952013B1 (en) | 2017-04-27 | 2021-03-16 | Snap Inc. | Selective location-based identity communication |
US12058583B2 (en) | 2017-04-27 | 2024-08-06 | Snap Inc. | Selective location-based identity communication |
US11995288B2 (en) | 2017-04-27 | 2024-05-28 | Snap Inc. | Location-based search mechanism in a graphical user interface |
US11782574B2 (en) | 2017-04-27 | 2023-10-10 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11385763B2 (en) | 2017-04-27 | 2022-07-12 | Snap Inc. | Map-based graphical user interface indicating geospatial activity metrics |
US11392264B1 (en) | 2017-04-27 | 2022-07-19 | Snap Inc. | Map-based graphical user interface for multi-type social media galleries |
US11893647B2 (en) | 2017-04-27 | 2024-02-06 | Snap Inc. | Location-based virtual avatars |
US11830209B2 (en) | 2017-05-26 | 2023-11-28 | Snap Inc. | Neural network-based image stream modification |
US10579401B2 (en) * | 2017-06-21 | 2020-03-03 | Rovi Guides, Inc. | Systems and methods for providing a virtual assistant to accommodate different sentiments among a group of users by correlating or prioritizing causes of the different sentiments |
US11659014B2 (en) | 2017-07-28 | 2023-05-23 | Snap Inc. | Software application manager for messaging applications |
US11882162B2 (en) | 2017-07-28 | 2024-01-23 | Snap Inc. | Software application manager for messaging applications |
US11122094B2 (en) | 2017-07-28 | 2021-09-14 | Snap Inc. | Software application manager for messaging applications |
US12177273B2 (en) | 2017-07-28 | 2024-12-24 | Snap Inc. | Software application manager for messaging applications |
US12182919B2 (en) | 2017-10-26 | 2024-12-31 | Snap Inc. | Joint audio-video facial animation system |
US11120597B2 (en) | 2017-10-26 | 2021-09-14 | Snap Inc. | Joint audio-video facial animation system |
US11610354B2 (en) | 2017-10-26 | 2023-03-21 | Snap Inc. | Joint audio-video facial animation system |
US11930055B2 (en) | 2017-10-30 | 2024-03-12 | Snap Inc. | Animated chat presence |
US12212614B2 (en) | 2017-10-30 | 2025-01-28 | Snap Inc. | Animated chat presence |
US11030789B2 (en) | 2017-10-30 | 2021-06-08 | Snap Inc. | Animated chat presence |
US11706267B2 (en) | 2017-10-30 | 2023-07-18 | Snap Inc. | Animated chat presence |
US11354843B2 (en) | 2017-10-30 | 2022-06-07 | Snap Inc. | Animated chat presence |
US11460974B1 (en) | 2017-11-28 | 2022-10-04 | Snap Inc. | Content discovery refresh |
US10936157B2 (en) | 2017-11-29 | 2021-03-02 | Snap Inc. | Selectable item including a customized graphic for an electronic messaging application |
US11411895B2 (en) | 2017-11-29 | 2022-08-09 | Snap Inc. | Generating aggregated media content items for a group of users in an electronic messaging application |
US12242708B2 (en) | 2017-11-29 | 2025-03-04 | Snap Inc. | Selectable item including a customized graphic for an electronic messaging application |
US11769259B2 (en) | 2018-01-23 | 2023-09-26 | Snap Inc. | Region-based stabilized face tracking |
US10949648B1 (en) | 2018-01-23 | 2021-03-16 | Snap Inc. | Region-based stabilized face tracking |
US11468618B2 (en) | 2018-02-28 | 2022-10-11 | Snap Inc. | Animated expressive icon |
US11120601B2 (en) | 2018-02-28 | 2021-09-14 | Snap Inc. | Animated expressive icon |
US11880923B2 (en) | 2018-02-28 | 2024-01-23 | Snap Inc. | Animated expressive icon |
US10979752B1 (en) | 2018-02-28 | 2021-04-13 | Snap Inc. | Generating media content items based on location information |
US11688119B2 (en) | 2018-02-28 | 2023-06-27 | Snap Inc. | Animated expressive icon |
US11523159B2 (en) | 2018-02-28 | 2022-12-06 | Snap Inc. | Generating media content items based on location information |
US11310176B2 (en) | 2018-04-13 | 2022-04-19 | Snap Inc. | Content suggestion system |
US12113756B2 (en) | 2018-04-13 | 2024-10-08 | Snap Inc. | Content suggestion system |
US11875439B2 (en) | 2018-04-18 | 2024-01-16 | Snap Inc. | Augmented expression system |
US11074675B2 (en) | 2018-07-31 | 2021-07-27 | Snap Inc. | Eye texture inpainting |
US11715268B2 (en) | 2018-08-30 | 2023-08-01 | Snap Inc. | Video clip object tracking |
US11030813B2 (en) | 2018-08-30 | 2021-06-08 | Snap Inc. | Video clip object tracking |
US12182921B2 (en) | 2018-09-19 | 2024-12-31 | Snap Inc. | Avatar style transformation using neural networks |
US11348301B2 (en) | 2018-09-19 | 2022-05-31 | Snap Inc. | Avatar style transformation using neural networks |
US10896534B1 (en) | 2018-09-19 | 2021-01-19 | Snap Inc. | Avatar style transformation using neural networks |
US10895964B1 (en) | 2018-09-25 | 2021-01-19 | Snap Inc. | Interface to display shared user groups |
US11868590B2 (en) | 2018-09-25 | 2024-01-09 | Snap Inc. | Interface to display shared user groups |
US11294545B2 (en) | 2018-09-25 | 2022-04-05 | Snap Inc. | Interface to display shared user groups |
US11245658B2 (en) | 2018-09-28 | 2022-02-08 | Snap Inc. | System and method of generating private notifications between users in a communication session |
US12105938B2 (en) | 2018-09-28 | 2024-10-01 | Snap Inc. | Collaborative achievement interface |
US10904181B2 (en) | 2018-09-28 | 2021-01-26 | Snap Inc. | Generating customized graphics having reactions to electronic message content |
US11189070B2 (en) | 2018-09-28 | 2021-11-30 | Snap Inc. | System and method of generating targeted user lists using customizable avatar characteristics |
US11704005B2 (en) | 2018-09-28 | 2023-07-18 | Snap Inc. | Collaborative achievement interface |
US11171902B2 (en) | 2018-09-28 | 2021-11-09 | Snap Inc. | Generating customized graphics having reactions to electronic message content |
US11477149B2 (en) | 2018-09-28 | 2022-10-18 | Snap Inc. | Generating customized graphics having reactions to electronic message content |
US11610357B2 (en) | 2018-09-28 | 2023-03-21 | Snap Inc. | System and method of generating targeted user lists using customizable avatar characteristics |
US11824822B2 (en) | 2018-09-28 | 2023-11-21 | Snap Inc. | Generating customized graphics having reactions to electronic message content |
US11455082B2 (en) | 2018-09-28 | 2022-09-27 | Snap Inc. | Collaborative achievement interface |
US11321896B2 (en) | 2018-10-31 | 2022-05-03 | Snap Inc. | 3D avatar rendering |
US10872451B2 (en) | 2018-10-31 | 2020-12-22 | Snap Inc. | 3D avatar rendering |
US11103795B1 (en) | 2018-10-31 | 2021-08-31 | Snap Inc. | Game drawer |
US20200145615A1 (en) * | 2018-11-01 | 2020-05-07 | Honda Motor Co., Ltd. | System and method for providing virtual interpersonal communication |
US10893236B2 (en) * | 2018-11-01 | 2021-01-12 | Honda Motor Co., Ltd. | System and method for providing virtual interpersonal communication |
US11620791B2 (en) | 2018-11-27 | 2023-04-04 | Snap Inc. | Rendering 3D captions within real-world environments |
US12106441B2 (en) | 2018-11-27 | 2024-10-01 | Snap Inc. | Rendering 3D captions within real-world environments |
US11176737B2 (en) | 2018-11-27 | 2021-11-16 | Snap Inc. | Textured mesh building |
US12020377B2 (en) | 2018-11-27 | 2024-06-25 | Snap Inc. | Textured mesh building |
US11836859B2 (en) | 2018-11-27 | 2023-12-05 | Snap Inc. | Textured mesh building |
US20220044479A1 (en) | 2018-11-27 | 2022-02-10 | Snap Inc. | Textured mesh building |
US11887237B2 (en) | 2018-11-28 | 2024-01-30 | Snap Inc. | Dynamic composite user identifier |
US10902661B1 (en) | 2018-11-28 | 2021-01-26 | Snap Inc. | Dynamic composite user identifier |
US12153788B2 (en) | 2018-11-30 | 2024-11-26 | Snap Inc. | Generating customized avatars based on location information |
US11315259B2 (en) | 2018-11-30 | 2022-04-26 | Snap Inc. | Efficient human pose tracking in videos |
US11783494B2 (en) | 2018-11-30 | 2023-10-10 | Snap Inc. | Efficient human pose tracking in videos |
US11698722B2 (en) | 2018-11-30 | 2023-07-11 | Snap Inc. | Generating customized avatars based on location information |
US12165335B2 (en) | 2018-11-30 | 2024-12-10 | Snap Inc. | Efficient human pose tracking in videos |
US11199957B1 (en) | 2018-11-30 | 2021-12-14 | Snap Inc. | Generating customized avatars based on location information |
US10861170B1 (en) | 2018-11-30 | 2020-12-08 | Snap Inc. | Efficient human pose tracking in videos |
US20220366210A1 (en) * | 2018-12-05 | 2022-11-17 | Disney Enterprises, Inc. | Simulated human-like affect-driven behavior by a virtual agent |
US12242944B2 (en) * | 2018-12-05 | 2025-03-04 | Disney Enterprises, Inc. | Simulated human-like affect-driven behavior by a virtual agent |
US11055514B1 (en) | 2018-12-14 | 2021-07-06 | Snap Inc. | Image face manipulation |
US11798261B2 (en) | 2018-12-14 | 2023-10-24 | Snap Inc. | Image face manipulation |
US11516173B1 (en) | 2018-12-26 | 2022-11-29 | Snap Inc. | Message composition interface |
US11032670B1 (en) | 2019-01-14 | 2021-06-08 | Snap Inc. | Destination sharing in location sharing system |
US11877211B2 (en) | 2019-01-14 | 2024-01-16 | Snap Inc. | Destination sharing in location sharing system |
US12213028B2 (en) | 2019-01-14 | 2025-01-28 | Snap Inc. | Destination sharing in location sharing system |
US11751015B2 (en) | 2019-01-16 | 2023-09-05 | Snap Inc. | Location-based context information sharing in a messaging system |
US10945098B2 (en) | 2019-01-16 | 2021-03-09 | Snap Inc. | Location-based context information sharing in a messaging system |
US12192854B2 (en) | 2019-01-16 | 2025-01-07 | Snap Inc. | Location-based context information sharing in a messaging system |
US10939246B1 (en) | 2019-01-16 | 2021-03-02 | Snap Inc. | Location-based context information sharing in a messaging system |
US11294936B1 (en) | 2019-01-30 | 2022-04-05 | Snap Inc. | Adaptive spatial density based clustering |
US11693887B2 (en) | 2019-01-30 | 2023-07-04 | Snap Inc. | Adaptive spatial density based clustering |
US12131006B2 (en) | 2019-02-06 | 2024-10-29 | Snap Inc. | Global event-based avatar |
US11557075B2 (en) | 2019-02-06 | 2023-01-17 | Snap Inc. | Body pose estimation |
US11714524B2 (en) | 2019-02-06 | 2023-08-01 | Snap Inc. | Global event-based avatar |
US11010022B2 (en) | 2019-02-06 | 2021-05-18 | Snap Inc. | Global event-based avatar |
US12136158B2 (en) | 2019-02-06 | 2024-11-05 | Snap Inc. | Body pose estimation |
US10984575B2 (en) | 2019-02-06 | 2021-04-20 | Snap Inc. | Body pose estimation |
US11809624B2 (en) | 2019-02-13 | 2023-11-07 | Snap Inc. | Sleep detection in a location sharing system |
US11275439B2 (en) | 2019-02-13 | 2022-03-15 | Snap Inc. | Sleep detection in a location sharing system |
US10936066B1 (en) | 2019-02-13 | 2021-03-02 | Snap Inc. | Sleep detection in a location sharing system |
US10964082B2 (en) | 2019-02-26 | 2021-03-30 | Snap Inc. | Avatar based on weather |
US11574431B2 (en) | 2019-02-26 | 2023-02-07 | Snap Inc. | Avatar based on weather |
US11301117B2 (en) | 2019-03-08 | 2022-04-12 | Snap Inc. | Contextual information in chat |
US10852918B1 (en) | 2019-03-08 | 2020-12-01 | Snap Inc. | Contextual information in chat |
US12242979B1 (en) | 2019-03-12 | 2025-03-04 | Snap Inc. | Departure time estimation in a location sharing system |
US12141215B2 (en) | 2019-03-14 | 2024-11-12 | Snap Inc. | Graph-based prediction for contact suggestion in a location sharing system |
US11868414B1 (en) | 2019-03-14 | 2024-01-09 | Snap Inc. | Graph-based prediction for contact suggestion in a location sharing system |
US11852554B1 (en) | 2019-03-21 | 2023-12-26 | Snap Inc. | Barometer calibration in a location sharing system |
US11039270B2 (en) | 2019-03-28 | 2021-06-15 | Snap Inc. | Points of interest in a location sharing system |
US11166123B1 (en) | 2019-03-28 | 2021-11-02 | Snap Inc. | Grouped transmission of location data in a location sharing system |
US11638115B2 (en) | 2019-03-28 | 2023-04-25 | Snap Inc. | Points of interest in a location sharing system |
US12070682B2 (en) | 2019-03-29 | 2024-08-27 | Snap Inc. | 3D avatar plugin for third-party games |
US11973732B2 (en) | 2019-04-30 | 2024-04-30 | Snap Inc. | Messaging system with avatar generation |
US10992619B2 (en) | 2019-04-30 | 2021-04-27 | Snap Inc. | Messaging system with avatar generation |
USD916871S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a transitional graphical user interface |
USD916872S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a graphical user interface |
USD916810S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a graphical user interface |
USD916809S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a transitional graphical user interface |
USD916811S1 (en) | 2019-05-28 | 2021-04-20 | Snap Inc. | Display screen or portion thereof with a transitional graphical user interface |
US11917495B2 (en) | 2019-06-07 | 2024-02-27 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11601783B2 (en) | 2019-06-07 | 2023-03-07 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US10893385B1 (en) | 2019-06-07 | 2021-01-12 | Snap Inc. | Detection of a physical collision between two client devices in a location sharing system |
US11823341B2 (en) | 2019-06-28 | 2023-11-21 | Snap Inc. | 3D object camera customization system |
US11676199B2 (en) | 2019-06-28 | 2023-06-13 | Snap Inc. | Generating customizable avatar outfits |
US11189098B2 (en) | 2019-06-28 | 2021-11-30 | Snap Inc. | 3D object camera customization system |
US12211159B2 (en) | 2019-06-28 | 2025-01-28 | Snap Inc. | 3D object camera customization system |
US12056760B2 (en) | 2019-06-28 | 2024-08-06 | Snap Inc. | Generating customizable avatar outfits |
US12147644B2 (en) | 2019-06-28 | 2024-11-19 | Snap Inc. | Generating animation overlays in a communication session |
US11188190B2 (en) | 2019-06-28 | 2021-11-30 | Snap Inc. | Generating animation overlays in a communication session |
US11443491B2 (en) | 2019-06-28 | 2022-09-13 | Snap Inc. | 3D object camera customization system |
US12147654B2 (en) | 2019-07-11 | 2024-11-19 | Snap Inc. | Edge gesture interface with smart interactions |
US11714535B2 (en) | 2019-07-11 | 2023-08-01 | Snap Inc. | Edge gesture interface with smart interactions |
US11307747B2 (en) | 2019-07-11 | 2022-04-19 | Snap Inc. | Edge gesture interface with smart interactions |
US11455081B2 (en) | 2019-08-05 | 2022-09-27 | Snap Inc. | Message thread prioritization interface |
US12099701B2 (en) | 2019-08-05 | 2024-09-24 | Snap Inc. | Message thread prioritization interface |
US11588772B2 (en) | 2019-08-12 | 2023-02-21 | Snap Inc. | Message reminder interface |
US10911387B1 (en) | 2019-08-12 | 2021-02-02 | Snap Inc. | Message reminder interface |
US11956192B2 (en) | 2019-08-12 | 2024-04-09 | Snap Inc. | Message reminder interface |
US20220076504A1 (en) * | 2019-09-06 | 2022-03-10 | Snap Inc. | Context-based virtual object rendering |
US11557093B1 (en) * | 2019-09-10 | 2023-01-17 | Meta Platforms Technologies, Llc | Using social connections to define graphical representations of users in an artificial reality setting |
US12099703B2 (en) | 2019-09-16 | 2024-09-24 | Snap Inc. | Messaging system with battery level sharing |
US11822774B2 (en) | 2019-09-16 | 2023-11-21 | Snap Inc. | Messaging system with battery level sharing |
US11662890B2 (en) | 2019-09-16 | 2023-05-30 | Snap Inc. | Messaging system with battery level sharing |
US11320969B2 (en) | 2019-09-16 | 2022-05-03 | Snap Inc. | Messaging system with battery level sharing |
US12166734B2 (en) | 2019-09-27 | 2024-12-10 | Snap Inc. | Presenting reactions from friends |
US11425062B2 (en) | 2019-09-27 | 2022-08-23 | Snap Inc. | Recommended content viewed by friends |
US11270491B2 (en) | 2019-09-30 | 2022-03-08 | Snap Inc. | Dynamic parameterized user avatar stories |
US11080917B2 (en) | 2019-09-30 | 2021-08-03 | Snap Inc. | Dynamic parameterized user avatar stories |
US11676320B2 (en) | 2019-09-30 | 2023-06-13 | Snap Inc. | Dynamic media collection generation |
US11218838B2 (en) | 2019-10-31 | 2022-01-04 | Snap Inc. | Focused map-based context information surfacing |
US12080065B2 (en) | 2019-11-22 | 2024-09-03 | Snap Inc | Augmented reality items based on scan |
US11563702B2 (en) | 2019-12-03 | 2023-01-24 | Snap Inc. | Personalized avatar notification |
US11063891B2 (en) | 2019-12-03 | 2021-07-13 | Snap Inc. | Personalized avatar notification |
US11582176B2 (en) | 2019-12-09 | 2023-02-14 | Snap Inc. | Context sensitive avatar captions |
US11128586B2 (en) | 2019-12-09 | 2021-09-21 | Snap Inc. | Context sensitive avatar captions |
US12198372B2 (en) | 2019-12-11 | 2025-01-14 | Snap Inc. | Skeletal tracking using previous frames |
US11036989B1 (en) | 2019-12-11 | 2021-06-15 | Snap Inc. | Skeletal tracking using previous frames |
US11594025B2 (en) | 2019-12-11 | 2023-02-28 | Snap Inc. | Skeletal tracking using previous frames |
US11908093B2 (en) | 2019-12-19 | 2024-02-20 | Snap Inc. | 3D captions with semantic graphical elements |
US11263817B1 (en) | 2019-12-19 | 2022-03-01 | Snap Inc. | 3D captions with face tracking |
US11636657B2 (en) | 2019-12-19 | 2023-04-25 | Snap Inc. | 3D captions with semantic graphical elements |
US11810220B2 (en) | 2019-12-19 | 2023-11-07 | Snap Inc. | 3D captions with face tracking |
US12175613B2 (en) | 2019-12-19 | 2024-12-24 | Snap Inc. | 3D captions with face tracking |
US11227442B1 (en) | 2019-12-19 | 2022-01-18 | Snap Inc. | 3D captions with semantic graphical elements |
US11128715B1 (en) | 2019-12-30 | 2021-09-21 | Snap Inc. | Physical friend proximity in chat |
US12063569B2 (en) | 2019-12-30 | 2024-08-13 | Snap Inc. | Interfaces for relative device positioning |
US11140515B1 (en) | 2019-12-30 | 2021-10-05 | Snap Inc. | Interfaces for relative device positioning |
US11893208B2 (en) | 2019-12-31 | 2024-02-06 | Snap Inc. | Combined map icon with action indicator |
US11169658B2 (en) | 2019-12-31 | 2021-11-09 | Snap Inc. | Combined map icon with action indicator |
US11991419B2 (en) | 2020-01-30 | 2024-05-21 | Snap Inc. | Selecting avatars to be included in the video being generated on demand |
US11356720B2 (en) | 2020-01-30 | 2022-06-07 | Snap Inc. | Video generation system to render frames on demand |
US11263254B2 (en) | 2020-01-30 | 2022-03-01 | Snap Inc. | Video generation system to render frames on demand using a fleet of servers |
US12231709B2 (en) | 2020-01-30 | 2025-02-18 | Snap Inc. | Video generation system to render frames on demand using a fleet of GPUs |
US12111863B2 (en) | 2020-01-30 | 2024-10-08 | Snap Inc. | Video generation system to render frames on demand using a fleet of servers |
US11831937B2 (en) | 2020-01-30 | 2023-11-28 | Snap Inc. | Video generation system to render frames on demand using a fleet of GPUs |
US11284144B2 (en) | 2020-01-30 | 2022-03-22 | Snap Inc. | Video generation system to render frames on demand using a fleet of GPUs |
US11729441B2 (en) | 2020-01-30 | 2023-08-15 | Snap Inc. | Video generation system to render frames on demand |
US11651539B2 (en) | 2020-01-30 | 2023-05-16 | Snap Inc. | System for generating media content items on demand |
US11036781B1 (en) | 2020-01-30 | 2021-06-15 | Snap Inc. | Video generation system to render frames on demand using a fleet of servers |
US11651022B2 (en) | 2020-01-30 | 2023-05-16 | Snap Inc. | Video generation system to render frames on demand using a fleet of servers |
US11619501B2 (en) | 2020-03-11 | 2023-04-04 | Snap Inc. | Avatar based on trip |
US11775165B2 (en) | 2020-03-16 | 2023-10-03 | Snap Inc. | 3D cutout image modification |
US11217020B2 (en) | 2020-03-16 | 2022-01-04 | Snap Inc. | 3D cutout image modification |
US11625873B2 (en) | 2020-03-30 | 2023-04-11 | Snap Inc. | Personalized media overlay recommendation |
US11978140B2 (en) | 2020-03-30 | 2024-05-07 | Snap Inc. | Personalized media overlay recommendation |
US11818286B2 (en) | 2020-03-30 | 2023-11-14 | Snap Inc. | Avatar recommendation and reply |
US12226001B2 (en) | 2020-03-31 | 2025-02-18 | Snap Inc. | Augmented reality beauty product tutorials |
US11969075B2 (en) | 2020-03-31 | 2024-04-30 | Snap Inc. | Augmented reality beauty product tutorials |
US11956190B2 (en) | 2020-05-08 | 2024-04-09 | Snap Inc. | Messaging system with a carousel of related entities |
US11922010B2 (en) | 2020-06-08 | 2024-03-05 | Snap Inc. | Providing contextual information with keyboard interface for messaging system |
US11822766B2 (en) | 2020-06-08 | 2023-11-21 | Snap Inc. | Encoded image based messaging system |
US11543939B2 (en) | 2020-06-08 | 2023-01-03 | Snap Inc. | Encoded image based messaging system |
US12046037B2 (en) | 2020-06-10 | 2024-07-23 | Snap Inc. | Adding beauty products to augmented reality tutorials |
US11683280B2 (en) | 2020-06-10 | 2023-06-20 | Snap Inc. | Messaging system including an external-resource dock and drawer |
US12067214B2 (en) | 2020-06-25 | 2024-08-20 | Snap Inc. | Updating avatar clothing for a user of a messaging system |
US12184809B2 (en) | 2020-06-25 | 2024-12-31 | Snap Inc. | Updating an avatar status for a user of a messaging system |
WO2021263210A1 (en) * | 2020-06-25 | 2021-12-30 | Snap Inc. | Updating an avatar status in a messaging system |
WO2021263208A1 (en) * | 2020-06-25 | 2021-12-30 | Snap Inc. | Updating avatar clothing in a messaging system |
US12169968B2 (en) * | 2020-06-30 | 2024-12-17 | Snap Inc. | Augmented reality eyewear with mood sharing |
US12136153B2 (en) | 2020-06-30 | 2024-11-05 | Snap Inc. | Messaging system with augmented reality makeup |
US20210406542A1 (en) * | 2020-06-30 | 2021-12-30 | Ilteris Canberk | Augmented reality eyewear with mood sharing |
US11580682B1 (en) | 2020-06-30 | 2023-02-14 | Snap Inc. | Messaging system with augmented reality makeup |
US11863513B2 (en) | 2020-08-31 | 2024-01-02 | Snap Inc. | Media content playback and comments management |
US11360733B2 (en) | 2020-09-10 | 2022-06-14 | Snap Inc. | Colocated shared augmented reality without shared backend |
US11893301B2 (en) | 2020-09-10 | 2024-02-06 | Snap Inc. | Colocated shared augmented reality without shared backend |
US11888795B2 (en) | 2020-09-21 | 2024-01-30 | Snap Inc. | Chats with micro sound clips |
US11833427B2 (en) | 2020-09-21 | 2023-12-05 | Snap Inc. | Graphical marker generation system for synchronizing users |
US12121811B2 (en) | 2020-09-21 | 2024-10-22 | Snap Inc. | Graphical marker generation system for synchronization |
US11452939B2 (en) | 2020-09-21 | 2022-09-27 | Snap Inc. | Graphical marker generation system for synchronizing users |
US11910269B2 (en) | 2020-09-25 | 2024-02-20 | Snap Inc. | Augmented reality content items including user avatar to share location |
US11615592B2 (en) | 2020-10-27 | 2023-03-28 | Snap Inc. | Side-by-side character animation from realtime 3D body motion capture |
US12243173B2 (en) | 2020-10-27 | 2025-03-04 | Snap Inc. | Side-by-side character animation from realtime 3D body motion capture |
US11660022B2 (en) | 2020-10-27 | 2023-05-30 | Snap Inc. | Adaptive skeletal joint smoothing |
US11450051B2 (en) | 2020-11-18 | 2022-09-20 | Snap Inc. | Personalized avatar real-time motion capture |
US12229860B2 (en) | 2020-11-18 | 2025-02-18 | Snap Inc. | Body animation sharing and remixing |
US11748931B2 (en) | 2020-11-18 | 2023-09-05 | Snap Inc. | Body animation sharing and remixing |
US12002175B2 (en) | 2020-11-18 | 2024-06-04 | Snap Inc. | Real-time motion transfer for prosthetic limbs |
US11734894B2 (en) | 2020-11-18 | 2023-08-22 | Snap Inc. | Real-time motion transfer for prosthetic limbs |
US12169890B2 (en) | 2020-11-18 | 2024-12-17 | Snap Inc. | Personalized avatar real-time motion capture |
US12008811B2 (en) | 2020-12-30 | 2024-06-11 | Snap Inc. | Machine learning-based selection of a representative video frame within a messaging application |
US12056792B2 (en) | 2020-12-30 | 2024-08-06 | Snap Inc. | Flow-guided motion retargeting |
US11790531B2 (en) | 2021-02-24 | 2023-10-17 | Snap Inc. | Whole body segmentation |
US12205295B2 (en) | 2021-02-24 | 2025-01-21 | Snap Inc. | Whole body segmentation |
US12106486B2 (en) | 2021-02-24 | 2024-10-01 | Snap Inc. | Whole body visual effects |
US11809633B2 (en) | 2021-03-16 | 2023-11-07 | Snap Inc. | Mirroring device with pointing based navigation |
US12164699B2 (en) | 2021-03-16 | 2024-12-10 | Snap Inc. | Mirroring device with pointing based navigation |
US11978283B2 (en) | 2021-03-16 | 2024-05-07 | Snap Inc. | Mirroring device with a hands-free mode |
US11798201B2 (en) | 2021-03-16 | 2023-10-24 | Snap Inc. | Mirroring device with whole-body outfits |
US11734959B2 (en) | 2021-03-16 | 2023-08-22 | Snap Inc. | Activating hands-free mode on mirroring device |
US11908243B2 (en) | 2021-03-16 | 2024-02-20 | Snap Inc. | Menu hierarchy navigation on electronic mirroring devices |
US11544885B2 (en) | 2021-03-19 | 2023-01-03 | Snap Inc. | Augmented reality experience based on physical items |
US12175575B2 (en) | 2021-03-19 | 2024-12-24 | Snap Inc. | Augmented reality experience based on physical items |
US11562548B2 (en) | 2021-03-22 | 2023-01-24 | Snap Inc. | True size eyewear in real time |
US12067804B2 (en) | 2021-03-22 | 2024-08-20 | Snap Inc. | True size eyewear experience in real time |
US12165243B2 (en) | 2021-03-30 | 2024-12-10 | Snap Inc. | Customizable avatar modification system |
US12175570B2 (en) | 2021-03-31 | 2024-12-24 | Snap Inc. | Customizable avatar generation system |
US12170638B2 (en) | 2021-03-31 | 2024-12-17 | Snap Inc. | User presence status indicators generation and management |
US12034680B2 (en) | 2021-03-31 | 2024-07-09 | Snap Inc. | User presence indication data management |
US12218893B2 (en) | 2021-03-31 | 2025-02-04 | Snap Inc. | User presence indication data management |
US12100156B2 (en) | 2021-04-12 | 2024-09-24 | Snap Inc. | Garment segmentation |
US11636654B2 (en) | 2021-05-19 | 2023-04-25 | Snap Inc. | AR-based connected portal shopping |
US11941767B2 (en) | 2021-05-19 | 2024-03-26 | Snap Inc. | AR-based connected portal shopping |
US12182583B2 (en) | 2021-05-19 | 2024-12-31 | Snap Inc. | Personalized avatar experience during a system boot process |
US11941227B2 (en) | 2021-06-30 | 2024-03-26 | Snap Inc. | Hybrid search system for customizable media |
US12260450B2 (en) | 2021-07-16 | 2025-03-25 | Snap Inc. | Personalized try-on ads |
US11854069B2 (en) | 2021-07-16 | 2023-12-26 | Snap Inc. | Personalized try-on ads |
US11983462B2 (en) | 2021-08-31 | 2024-05-14 | Snap Inc. | Conversation guided augmented reality experience |
US11908083B2 (en) | 2021-08-31 | 2024-02-20 | Snap Inc. | Deforming custom mesh based on body mesh |
US11670059B2 (en) | 2021-09-01 | 2023-06-06 | Snap Inc. | Controlling interactive fashion based on body gestures |
US12056832B2 (en) | 2021-09-01 | 2024-08-06 | Snap Inc. | Controlling interactive fashion based on body gestures |
US12198664B2 (en) | 2021-09-02 | 2025-01-14 | Snap Inc. | Interactive fashion with music AR |
US11673054B2 (en) | 2021-09-07 | 2023-06-13 | Snap Inc. | Controlling AR games on fashion items |
US11663792B2 (en) | 2021-09-08 | 2023-05-30 | Snap Inc. | Body fitted accessory with physics simulation |
US11900506B2 (en) | 2021-09-09 | 2024-02-13 | Snap Inc. | Controlling interactive fashion based on facial expressions |
US11734866B2 (en) | 2021-09-13 | 2023-08-22 | Snap Inc. | Controlling interactive fashion based on voice |
US11798238B2 (en) | 2021-09-14 | 2023-10-24 | Snap Inc. | Blending body mesh into external mesh |
US12086946B2 (en) | 2021-09-14 | 2024-09-10 | Snap Inc. | Blending body mesh into external mesh |
US11836866B2 (en) | 2021-09-20 | 2023-12-05 | Snap Inc. | Deforming real-world object using an external mesh |
US12198281B2 (en) | 2021-09-20 | 2025-01-14 | Snap Inc. | Deforming real-world object using an external mesh |
US11983826B2 (en) | 2021-09-30 | 2024-05-14 | Snap Inc. | 3D upper garment tracking |
US11636662B2 (en) | 2021-09-30 | 2023-04-25 | Snap Inc. | Body normal network light and rendering control |
US11836862B2 (en) | 2021-10-11 | 2023-12-05 | Snap Inc. | External mesh with vertex attributes |
US11651572B2 (en) | 2021-10-11 | 2023-05-16 | Snap Inc. | Light and rendering of garments |
US11790614B2 (en) | 2021-10-11 | 2023-10-17 | Snap Inc. | Inferring intent from pose and speech input |
US12148108B2 (en) | 2021-10-11 | 2024-11-19 | Snap Inc. | Light and rendering of garments |
US11763481B2 (en) | 2021-10-20 | 2023-09-19 | Snap Inc. | Mirror-based augmented reality experience |
US12217453B2 (en) | 2021-10-20 | 2025-02-04 | Snap Inc. | Mirror-based augmented reality experience |
US12086916B2 (en) | 2021-10-22 | 2024-09-10 | Snap Inc. | Voice note with face tracking |
US12020358B2 (en) | 2021-10-29 | 2024-06-25 | Snap Inc. | Animated custom sticker creation |
US11996113B2 (en) | 2021-10-29 | 2024-05-28 | Snap Inc. | Voice notes with changing effects |
US11995757B2 (en) | 2021-10-29 | 2024-05-28 | Snap Inc. | Customized animation from video |
US12170747B2 (en) | 2021-12-07 | 2024-12-17 | Snap Inc. | Augmented reality unboxing experience |
US11748958B2 (en) | 2021-12-07 | 2023-09-05 | Snap Inc. | Augmented reality unboxing experience |
US11960784B2 (en) | 2021-12-07 | 2024-04-16 | Snap Inc. | Shared augmented reality unboxing experience |
US12198398B2 (en) | 2021-12-21 | 2025-01-14 | Snap Inc. | Real-time motion and appearance transfer |
US12223672B2 (en) | 2021-12-21 | 2025-02-11 | Snap Inc. | Real-time garment exchange |
US11880947B2 (en) | 2021-12-21 | 2024-01-23 | Snap Inc. | Real-time upper-body garment exchange |
US12096153B2 (en) | 2021-12-21 | 2024-09-17 | Snap Inc. | Avatar call platform |
US11887260B2 (en) | 2021-12-30 | 2024-01-30 | Snap Inc. | AR position indicator |
US11928783B2 (en) | 2021-12-30 | 2024-03-12 | Snap Inc. | AR position and orientation along a plane |
US12198287B2 (en) | 2022-01-17 | 2025-01-14 | Snap Inc. | AR body part tracking system |
US11823346B2 (en) | 2022-01-17 | 2023-11-21 | Snap Inc. | AR body part tracking system |
US11954762B2 (en) | 2022-01-19 | 2024-04-09 | Snap Inc. | Object replacement system |
US12142257B2 (en) | 2022-02-08 | 2024-11-12 | Snap Inc. | Emotion-based text to speech |
US12002146B2 (en) | 2022-03-28 | 2024-06-04 | Snap Inc. | 3D modeling based on neural light field |
US12148105B2 (en) | 2022-03-30 | 2024-11-19 | Snap Inc. | Surface normals for pixel-aligned object |
US12254577B2 (en) | 2022-04-05 | 2025-03-18 | Snap Inc. | Pixel depth determination for object |
US12164109B2 (en) | 2022-04-29 | 2024-12-10 | Snap Inc. | AR/VR enabled contact lens |
US12062144B2 (en) | 2022-05-27 | 2024-08-13 | Snap Inc. | Automated augmented reality experience creation based on sample source and target images |
US12020384B2 (en) | 2022-06-21 | 2024-06-25 | Snap Inc. | Integrating augmented reality experiences with other components |
US12020386B2 (en) | 2022-06-23 | 2024-06-25 | Snap Inc. | Applying pregenerated virtual experiences in new location |
US12170640B2 (en) | 2022-06-28 | 2024-12-17 | Snap Inc. | Media gallery sharing and management |
US11870745B1 (en) | 2022-06-28 | 2024-01-09 | Snap Inc. | Media gallery sharing and management |
US12235991B2 (en) | 2022-07-06 | 2025-02-25 | Snap Inc. | Obscuring elements based on browser focus |
US12062146B2 (en) | 2022-07-28 | 2024-08-13 | Snap Inc. | Virtual wardrobe AR experience |
US12236512B2 (en) | 2022-08-23 | 2025-02-25 | Snap Inc. | Avatar call on an eyewear device |
US12051163B2 (en) | 2022-08-25 | 2024-07-30 | Snap Inc. | External computer vision for an eyewear device |
US12154232B2 (en) | 2022-09-30 | 2024-11-26 | Snap Inc. | 9-DoF object tracking |
US12265692B2 (en) | 2022-10-03 | 2025-04-01 | Snap Inc. | Content discovery refresh |
US12229901B2 (en) | 2022-10-05 | 2025-02-18 | Snap Inc. | External screen streaming for an eyewear device |
US11893166B1 (en) | 2022-11-08 | 2024-02-06 | Snap Inc. | User avatar movement control using an augmented reality eyewear device |
US12243266B2 (en) | 2022-12-29 | 2025-03-04 | Snap Inc. | Device pairing using machine-readable optical label |
US12149489B2 (en) | 2023-03-14 | 2024-11-19 | Snap Inc. | Techniques for recommending reply stickers |
US12047337B1 (en) | 2023-07-03 | 2024-07-23 | Snap Inc. | Generating media content items during user interaction |
Also Published As
Publication number | Publication date |
---|---|
US20100146407A1 (en) | 2010-06-10 |
US9568993B2 (en) | 2017-02-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20090177976A1 (en) | Managing and presenting avatar mood effects in a virtual world | |
US20240207730A1 (en) | Generating a mini-game of a video game from a game play recording | |
US11398067B2 (en) | Virtual reality presentation of body postures of avatars | |
US10860345B2 (en) | System for user sentiment tracking | |
US9063565B2 (en) | Automated avatar creation and interaction in a virtual world | |
US9164664B2 (en) | System and method for avatar cloning | |
JP2023540874A (en) | Artificial reality collaborative work environment | |
US10300394B1 (en) | Spectator audio analysis in online gaming environments | |
WO2023049053A1 (en) | Content linking for artificial reality environments | |
US9086776B2 (en) | Modifying avatar attributes | |
DE112021001301T5 (en) | DIALOGUE-BASED AI PLATFORM WITH RENDERED GRAPHIC OUTPUT | |
US8453062B2 (en) | Virtual world viewer | |
KR20170085422A (en) | Apparatus and method for operating personal agent | |
US9299178B2 (en) | Generation of animated gesture responses in a virtual world | |
US9220981B2 (en) | Controlling attribute expression within a virtual environment | |
US9223399B2 (en) | Translation of gesture responses in a virtual world | |
US20220161145A1 (en) | Modifying user interface of application during recording session | |
US10671151B2 (en) | Mitigating digital reality leakage through session modification | |
CN111880874A (en) | Media file sharing method, device and equipment and computer readable storage medium | |
US11521653B2 (en) | Video sequence layout method, electronic device and storage medium | |
WO2023049052A1 (en) | Visual navigation elements for artificial reality environments | |
CN111273764A (en) | Human-like emotion-driven behavior simulated by virtual agents | |
CN114527912B (en) | Information processing method, information processing device, computer readable medium and electronic equipment | |
JP2023120130A (en) | Conversation-type AI platform using extraction question response |
US10296723B2 (en) | Managing companionship data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BOKOR, BRIAN R.;SMITH, ANDREW B.;SPEICHER, STEVEN K.;REEL/FRAME:020342/0793;SIGNING DATES FROM 20080108 TO 20080109 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |