US20170269946A1 - System and method for an interactive query utilizing a simulated personality - Google Patents
System and method for an interactive query utilizing a simulated personality
- Publication number
- US20170269946A1 (application US 15/345,327)
- Authority
- US
- United States
- Prior art keywords
- user
- personality
- simulated
- image
- personalities
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F9/4446—
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/01—Social networking
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
- G06F9/453—Help systems
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/242—Query formulation
- G06F16/2423—Interactive query statement specification based on a database schema
- G06F17/30392—
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L21/00—Speech or voice signal processing techniques to produce another audible or non-audible signal, e.g. visual or tactile, in order to modify its quality or its intelligibility
- G10L21/06—Transformation of speech into a non-audible representation, e.g. speech visualisation or speech processing for tactile aids
- G10L21/10—Transforming into visible information
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L21/00—Speech or voice signal processing techniques to produce another audible or non-audible signal, e.g. visual or tactile, in order to modify its quality or its intelligibility
- G10L21/06—Transformation of speech into a non-audible representation, e.g. speech visualisation or speech processing for tactile aids
- G10L21/10—Transforming into visible information
- G10L2021/105—Synthesis of the lips movements from speech, e.g. for talking heads
Definitions
- the present invention relates to a system and method for an interactive query utilizing a simulated personality. More specifically, the present invention relates to systems and methods of simulating a personality of a user, providing one or more still or animated images of a user, and providing interactive queries with another user, preferably in the context of social media or social networking.
- FIG. 1 depicts how Egges, Kshirsagar and Magnenat-Thalmann view the role of personality and emotion as glue between perception, dialogue and expression.
- a first step towards such an approach is a system that can simulate personalized facial animation with speech and expressions, modulated through mood.
- Velásquez proposed a model of emotions, mood and temperament that provides a flexible way of controlling the behaviour of the autonomous entities.
- moods and emotions are only differentiated in terms of levels of arousal.
- Simple models have been proposed for blending, mixing and decaying emotions to subsequently select actions of the agent.
- Personality and emotion are basically the same mechanisms, differentiated only by the cognitive variables of time and duration, and personality can be seen as a consistent expression of emotion.
- the abstract entity that represents the individual at a time t can be called I_t.
- an individual has a personality and an emotional state (not yet taking mood into consideration). The model based on this assumption is called PE.
- I_t can be defined as a tuple (p, e_t), where p represents the personality and e_t represents the emotional state at time t.
- a person will portray emotions (that change over time) based on what happens, but how she obtains these emotions and the behavior that results from it, depends on a static part of her being, the personality.
- a personality has n dimensions, where each dimension is represented by a value in the interval [0, 1];
- a value of 0 corresponds to an absence of the dimension in the personality;
- a value of 1 corresponds to a maximum presence of the dimension in the personality.
- the personality p of an individual can then be represented by the vector p=(p_1, p_2, . . . , p_n).
- the emotional state is a set of emotions with a certain intensity. The size of this set depends on the theory that is used. For example, in the OCC model, 22 emotions are defined, while others may define 6 that are used as a basis for facial expression classification.
- the emotional state is something that can change over time (for example due to a decay factor). Therefore, an emotional state can be relative to a time t.
- the emotional state can be defined as e_t, an m-dimensional vector, where all m emotion intensities are represented by a value in the interval [0, 1]. A value of 0 corresponds to an absence of the emotion; a value of 1 corresponds to a maximum intensity of the emotion. This vector is given as e_t=(e_t,1, e_t,2, . . . , e_t,m).
- an emotional state history Ω_t can be defined that contains all emotional states until e_t, thus:
- Ω_t=(e_0, e_1, . . . , e_t)
- An extended version of the PE model can be given by including mood.
- the individual I_t can be defined as a triple (p, m_t, e_t), where m_t represents the mood at a time t.
- Mood has been accepted alongside the notions of personality and emotional state. Mood is less static than personality and less fluid than the emotional state. It is an intermediate form that exists between the two and that describes a rather static state of being that lasts longer than the average emotion, as illustrated in FIG. 2. This state of being can be one-dimensional (being in a good or a bad mood) or perhaps multi-dimensional (feeling in love, feeling depressed).
- the possibility of having multiple mood dimensions is left open, so that how many dimensions mood actually has can be selected.
- a mood dimension can be defined as a value that is either negative or positive and that lies in the interval [−1, 1]. Supposing that there are k mood dimensions, the mood can be described by a vector m_t=(m_t,1, m_t,2, . . . , m_t,k).
- the mood history σ_t=(m_0, m_1, . . . , m_t) contains all moods until m_t.
- a system for an interactive query comprises a first input module capable of receiving input for creating a simulated personality for a first user; an expert system capable of creating and storing the simulated personality; an output module for presenting the simulated personality to a second user; and an interactive query module capable of allowing the second user to communicate with the simulated personality of the first user.
- a method for an interactive query comprises: receiving input for creating a simulated personality for a first user; creating and storing the simulated personality; presenting the simulated personality to a second user; and allowing the second user to communicate with the simulated personality of the first user.
- FIG. 1 is a diagrammatic overview of a prior art intelligent agent framework for an emotional state and personality;
- FIG. 2 is a prior art personality, mood and emotion scale from static to dynamic according to the prior art system of FIG. 1;
- FIG. 3 is a flow diagram illustrating the steps performed by a system and method for an interactive query utilizing a simulated personality in accord with one embodiment;
- FIG. 4 is a flow diagram illustrating data flow for providing close simulation of a personality of a user by an expert algorithm according to the embodiment of FIG. 3;
- FIG. 5 is a diagrammatic representation of an interaction between a user, a knowledge bank and an interactive query with another user according to the embodiment of FIG. 3;
- FIGS. 6A-6C illustrate a dataflow diagram showing dataflow in a social relationship according to the embodiment of FIG. 3;
- FIGS. 7A-7C illustrate a dataflow diagram showing dataflow during the interaction of the image personality of a user in an employment screening according to the embodiment of FIG. 3;
- FIG. 8 is a dataflow diagram illustrating the data flow during application of the embodiment of FIG. 3 in an advertising system using social media in which a set of first users are targeted by virtue of their image personalities and genders;
- FIGS. 9A-9C illustrate a dataflow diagram showing dataflow during application of image personalities and genders of a set of first users in the context of real-time location-based social advertisement or social commerce according to the embodiment of FIGS. 3 and 8; and
- FIG. 10 is a flow diagram illustrating the steps performed by a system and method for an interactive query utilizing a simulated personality in accord with one embodiment to aid a shopper in buying a gift.
- Fundamental characteristics of personality of a user 100 can include: (1) consistency—essentially, a user acts in similar ways in a variety of situations, and there is a recognizable order and regularity to behaviors; (2) psychological and physiological—personality is a psychological construct, but it can be influenced by biological processes and needs; (3) behaviors and actions—personality not only influences how a user responds in a particular environment, but also causes a user to act in certain ways; and (4) multiple expressions—personality is displayed in more than just behavior; it can also be seen in thoughts, feelings, close relationships and other social interactions.
- Various parameters can be used and updated in order to simulate personality of a user at any point in their life. These parameters may include, by way of example, and not by way of limitation: (1) relational psychology—situations and decisions that can prove revealing, (2) personality profiles—questions that define the character, (3) inner traits—analysis of preferences such as drawing style and handshake, (4) love test—scenarios that attempt to explain outlook on love, (5) shape test—shapes and symbols that reveal personality characteristics, (6) food test—how food and drink preferences relate to personality, (7) color test—color preferences, (8) internal beliefs -values, moral beliefs, political beliefs and spiritual beliefs, (9) personal details—age, family background, education, profession and external environment, (10) personal interviews—utilizing a speech recognition algorithm and/or a natural language processing algorithm, (11) personal library of audios and videos, and (12) what others think & say.
- An expert algorithm by way of example, and not by way of limitation, an artificial intelligence algorithm and/or a neural network algorithm, can synthesize results of the above parameters and can simulate closely the personality of a user.
- a flow diagram illustrates the steps performed by a system and method for an interactive query utilizing a simulated personality in accord with one embodiment.
- a family member, friend, coworker or employer by way of example, and not by way of limitation, a son or daughter, logs into a website.
- an image of the user in this example, a virtual father, may appear on a screen.
- the son or daughter may ask questions. Such questions may include life's questions, by way of example and not by way of limitation, "Should I buy a house?", posed to the virtual father.
- a system may access the simulated personality of a virtual father, integrated with a knowledge bank 440 .
- in step 1008, the virtual father asks questions, by way of example and not by way of limitation, "Did you find a stable job?", of the son or daughter for clarification and background on the question asked by the son or daughter.
- in step 1010, the virtual father responds to the son's or daughter's question.
- in step 1012, if the son or daughter is satisfied with the virtual father's answer, then the son or daughter stops; otherwise the process is reiterated until the son or daughter is satisfied with the answers from the virtual father.
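The query loop of steps 1000-1012 can be outlined in a few lines of Python. This is only an illustrative sketch under assumed interfaces: clarifying_questions and answer are hypothetical methods standing in for the simulated-personality and knowledge-bank components; nothing here is taken from an actual implementation of the embodiment.

def interactive_query(image_personality, knowledge_bank, ask_user, show):
    """Sketch of the FIG. 3 loop: question, clarification, answer, repeat until satisfied."""
    while True:
        question = ask_user("Ask the virtual father a question: ")            # step 1004
        # Step 1008: the simulated personality may ask clarifying questions first.
        context = {q: ask_user(q + " ")
                   for q in image_personality.clarifying_questions(question)}
        # Steps 1006 and 1010: answer using the personality integrated with the knowledge bank.
        show(image_personality.answer(question, context, knowledge_bank))
        if ask_user("Satisfied with the answer? (y/n) ").lower().startswith("y"):  # step 1012
            break

# interactive_query(virtual_father, knowledge_bank, input, print) would drive the loop
# from a console, given objects that provide the assumed methods.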
- a flow diagram illustrates data flow for providing close simulation of a personality of a first user (such as the virtual father of FIG. 3 ) by an expert algorithm, artificial intelligence algorithm and/or a neural network algorithm.
- the closely simulated personality 120 of a user 100 may be configured to synthesize various input modules 160 to 380 , as listed below.
- Input Modules | Description of Input Modules for a User:
160 | Relational Psychology
180 | Personality Profiles
200 | Inner Traits
220 | Love Test
240 | Shape Test
260 | Food Test
280 | Color Test
300 | Internal Beliefs
320 | Personal Details
340 | Personal Interviews
360 | Personal Library of Audios, Videos & Wisdom
380 | What Others Think & Say
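As one possible illustration of how an expert algorithm could fold the input modules 160-380 into a single stored personality, the sketch below simply averages per-module trait estimates. The trait names and the equal weighting are assumptions for illustration; the embodiment contemplates an artificial intelligence and/or neural network algorithm rather than this fixed rule.

# Hypothetical synthesis of the input modules 160-380 into one personality profile.
INPUT_MODULES = {
    160: "relational psychology", 180: "personality profiles", 200: "inner traits",
    220: "love test", 240: "shape test", 260: "food test", 280: "color test",
    300: "internal beliefs", 320: "personal details", 340: "personal interviews",
    360: "personal library of audios, videos & wisdom", 380: "what others think & say",
}
TRAITS = ("openness", "conscientiousness", "extraversion", "agreeableness", "neuroticism")

def synthesize_personality(module_estimates):
    """Average trait estimates across modules; each estimate maps trait -> [0, 1] score."""
    totals = {t: 0.0 for t in TRAITS}
    for estimates in module_estimates.values():
        for t in TRAITS:
            totals[t] += estimates.get(t, 0.0)
    n = max(len(module_estimates), 1)
    return {t: totals[t] / n for t in TRAITS}

# Example: estimates keyed by module reference numeral (only two modules shown).
profile = synthesize_personality({
    180: {"extraversion": 0.8, "agreeableness": 0.6},
    340: {"extraversion": 0.6, "openness": 0.7},
})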
- One system that may use the above inputs to create a closely simulated personality 120 is available from Electronic Arts of Redwood City, Calif., United States, as used in The Sims® series of video games.
- personality is split into 5 sections: Niceness, neatness or cleanliness, outgoingness, activeness, and playfulness. It is normally related to a Sim's Zodiac sign, but there are exceptions.
- Each section uses a zero-to-ten scale.
- the scale is actually 0 to 10.00 in The Sims® game, which could be applied to input modules 160, 180, 200, 220, 240, 260, 280, 300, 320, 340, 360 and 380.
- each section is split into three parts; 0-2, 3-7, and 8-10, each of which has a name and a description.
- the Sims® 2 uses personality scales that can be divided into low, medium, and high, which can be alternatively used in the present system to store the personality 120 too.
- low is generally less than 4
- high is generally 8 (or 8.01) and above.
- the personality points system was replaced with traits, which could also be used as an alternative to store the personality 120 .
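A short sketch of the alternative storage schemes mentioned above: a 0-10.00 point score per personality section that can also be reported in the low/medium/high bands used by The Sims® 2 (low below 4, high at 8 or above, per the text). The cut-offs follow the description; everything else is illustrative.

def band(points: float) -> str:
    """Map a 0-10.00 personality-section score to the low/medium/high bands described above."""
    if not 0.0 <= points <= 10.0:
        raise ValueError("points must lie on the 0 to 10.00 scale")
    if points < 4.0:
        return "low"
    if points >= 8.0:
        return "high"
    return "medium"

# Example: a niceness score of 9.2 is "high", a playfulness score of 3.5 is "low".
print(band(9.2), band(3.5))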
- in The Sims®: Livin' Large®, the chemistry set can make a potion that reverses personality points. The only way to get them back would be to recreate the potion. A Sim abducted by aliens would return with an altered personality. Sims can change their personality one point at a time by using what is called the Crystal Ball. In The Sims® 2, aspects of a simulated personality 120 may be changed via what is called the Encourage interaction.
- a snapshot of the user's simulated personality 120 may be stored.
- a user may decide to add another snapshot simulated personality 120 to the system, along with the user's age at the time in order to organize the snapshots into age personality simulation 120 snapshots.
- integration of a simulated personality 120 of a user 100 with one or more image(s) of the person may further be taken and stored.
- a three-dimensional (3-D) telepresence-like holographic image may be captured in near-real-time to represent the user to produce a combination 3-D representation and combined simulated personality 120 that can be termed an image personality ( 420 in FIG. 5 ) of a first user for each time t n .
- the image personality may include only two dimensional (2-D) representations of the user at each time t n .
- this will give the user's relatives and friends the ability to converse with the user's simulated personalities 120 based on the age of the user.
- a user's friends and family may be able to interact with the user's simulated personality 120 at the age of 10, then again at the age of 15, and then at the age of 20, 35, and so on, at their choosing.
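The snapshot scheme described above can be illustrated with a small data structure that keeps, for each time t_n, the simulated personality 120, the user's age, and a reference to the stored 2-D or 3-D representation 400, which together form the image personality 420. The field layout below is an assumption for illustration only.

from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class ImagePersonality:
    """One snapshot: simulated personality 120 plus a stored 2-D/3-D representation 400."""
    age: int                        # user's age when the snapshot was taken
    personality: Tuple[float, ...]  # trait values in [0, 1]
    image_ref: str                  # assumed path or URL of the captured representation

@dataclass
class PersonalityArchive:
    """All snapshots of one user, keyed by time index t_n."""
    snapshots: Dict[int, ImagePersonality] = field(default_factory=dict)

    def add(self, t_n: int, snapshot: ImagePersonality) -> None:
        self.snapshots[t_n] = snapshot

archive = PersonalityArchive()
archive.add(1, ImagePersonality(age=10, personality=(0.8, 0.3, 0.7, 0.6, 0.2),
                                image_ref="captures/age_10.glb"))
archive.add(2, ImagePersonality(age=15, personality=(0.7, 0.4, 0.7, 0.6, 0.2),
                                image_ref="captures/age_15.glb"))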
- Each image personality of a user may be stored in a cloud-based server to provide network-based access to friends or family members.
- an internet server may be connected to the internet for access through a secure connection such as secure socket layer (SSL) to the image personality snapshots.
- Social media connections may be used such as through Google+® or Facebook®.
- FIG. 5 shows a diagrammatic representation of an interaction between the image personality 420 of a first user, a knowledge bank 440 and an interactive query with another user 460 according to the embodiment of FIG. 3.
- Interaction between the image personality 420 and the second user 460 may occur via voice, using voice recognition, or text, depending on the system.
- the image personality 420 and second user 460 may interact through voice conversation, wherein the second user 460 interactively communicates with a holographic 3D representation 400 of the first user.
- interaction between the image personality 420 and the second user 460 may occur by means of e-mail or text communications.
- interaction may occur with real-time chat through a Facebook®-type interface or other social-media-type chat interface.
- communications with the user 460 may be through a wide area network such as the internet 10 or world-wide-web.
- image 400 may comprise an animated or still 2D image for the image personality 420 at time t n .
- the second user 460 who is communicating with the image personality 420 at time t_1 may see or interact with the representation 400 of the first user that the user uploaded when he or she was 15. If the next time the first user had provided a personality profile was when he or she was 25, then if the second user 460 communicates with the t_2 image personality 420, the second user 460 would be viewing the image of the first user that he or she provided when he or she was 25.
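Continuing the archive sketch above, the choice of which stored image personality a second user converses with can be reduced to picking the snapshot whose recorded age is closest to the request. The nearest-age rule is an assumption, since the text only says that friends and family choose among the stored ages.

def nearest_snapshot_time(snapshot_ages: dict, requested_age: int) -> int:
    """Given {t_n: age at snapshot}, return the t_n whose age is closest to the request."""
    if not snapshot_ages:
        raise LookupError("no snapshots stored for this user")
    return min(snapshot_ages, key=lambda t_n: abs(snapshot_ages[t_n] - requested_age))

# With profiles captured at ages 15 (t_1) and 25 (t_2), a request for age 20 resolves to
# whichever capture is nearer; ties break toward the earlier index here.
print(nearest_snapshot_time({1: 15, 2: 25}, 20))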
- a graphical front-end comprises a 3D talking head capable of rendering speech and facial expressions in synchrony with synthetic speech.
- the facial animation system interprets the emotional tags in the responses, generates lip movements for the speech and blends the appropriate expressions for rendering in real-time with lip synchronization. Facial dynamics are considered during the expression change, and appropriate temporal transition functions are selected for facial animation.
- MPEG-4 facial animation parameters are used as low level facial deformation parameters.
- the details of the deformation algorithm that can be used in one embodiment are explained in S. Kshirsagar, S. Garchery, and N. Magnenat-Thalmann, Deformable Avatars, Feature Point Based Mesh Deformation Applied to MPEG-4 Facial Animation, pages 33-43, Kluwer Academic Publishers, July 2001.
- the principal components are derived from the statistical analysis of the facial motion data and reflect independent facial movements observed during fluent speech. They are used as high level parameters for defining the facial expressions and visemes. The use of principal components facilitates realistic speech animation, especially blended with various facial expressions.
- the main steps incorporated in the visual front-end are the generation of facial animation parameters (FAPs) from text, using available text-to-speech (TTS) software that provides phonemes with temporal information, followed by expression blending.
- Co-articulation rules may be applied based on the algorithm described in M. M. Cohen and D. W. Massaro, Modelling Co Articulation In Synthetic Visual Speech , pages 139-156, Springer-Verlag, 1993, which may be adopted for use with the principal components.
- the dialogue system may output expression tags with the text response. Each expression is associated with an intensity value.
- An attack-sustain-decay-release type of envelope may be applied for the expressions and it is blended with previously calculated co-articulated phoneme trajectories. This blending is based on observed facial dynamics, incorporating the constraints on facial movements wherever necessary in order to avoid excessive/unrealistic deformations.
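The expression blending just described can be sketched as an attack-sustain-decay-release envelope that scales an expression track and is added onto the previously computed co-articulated phoneme trajectory. The piecewise-linear envelope shape and the clamped additive blend are assumptions for illustration; they are not the algorithm of the cited paper.

def asdr_envelope(n_frames, attack, sustain, decay, release, peak=1.0, low=0.3):
    """Attack-sustain-decay-release intensity envelope, sampled once per animation frame."""
    env = [peak * i / max(attack, 1) for i in range(attack)]                    # rise to peak
    env += [peak] * sustain                                                     # hold at peak
    env += [peak - (peak - low) * i / max(decay, 1) for i in range(decay)]      # decay
    env += [low * (1 - i / max(release, 1)) for i in range(release)]            # release to 0
    return (env + [0.0] * n_frames)[:n_frames]

def blend(phoneme_track, expression_track, envelope):
    """Add the enveloped expression onto the co-articulated trajectory, clamped to [0, 1]."""
    return [min(1.0, max(0.0, p + e * w))
            for p, e, w in zip(phoneme_track, expression_track, envelope)]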
- Periodic facial movements: periodic eye-blinks and minor head movements may be applied to the face for increased believability.
- an online knowledge bank may be accessed by the image personality 420 .
- when the first user created the image personality, he or she may have inputted as much as he or she could regarding how he or she would respond to one or more second users 460 who may ask questions of his or her image personality 420; however, it is unlikely that the first user could have provided all of the knowledge necessary for any conceivable present or future knowledge-based question that the second user 460 could ask.
- a second user's grandfather who might be the first user may not have much knowledge of smart phone technology when he created his image personality at time t n .
- the image personality 420 may access a knowledge bank 440 in order to gain knowledge regarding the subject matter of a question to provide an answer that is commensurate with the personality of the image personality 420 at time t n .
- a knowledge bank may comprise, by way of example and not by way of limitation, Wikipedia®, provided by the Wikipedia Foundation, Inc. of San Francisco, Calif., accessed through the internet 10 .
- a cognitive system such as the Watson® system provided by IBM Corporation of Armonk, New York, may be used to then interpret the question from the second user 460 , and access the knowledge bank 440 to understand the question and determine one or more possible responses based on the knowledge bank 440 . Which of the one or more possible responses is actually provided to the second user 460 depends on the image personality 420 of the first user.
- the possible responses may be put into, by way of example, and not by way of limitation, the OCC model discussed by Arjan Egges, Sumedha Kshirsagar and Nadia Magnenat-Thalmann of MIRALab—University of Geneva, in their paper titled Generic Personality and Emotion Simulation for Conversational Agents, Published in Computer Animation and Virtual Worlds archive, Volume 15 Issue 1, March 2004 Pages 1-13.
- the OCC model may suggest to the second user 460 that the cheapest smart phone should be purchased.
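A sketch of the answer-selection idea in this passage: candidate answers assembled from the knowledge bank 440 are re-scored against the stored trait values of the image personality 420 before one is returned. The tag-based scoring table is a placeholder assumption; the embodiment contemplates a cognitive system such as Watson® together with an OCC-style model for this step.

def choose_response(candidates, personality, trait_affinity):
    """Pick the knowledge-bank candidate best aligned with the stored personality.

    candidates     : list of (answer_text, tags) pairs for the question
    personality    : dict of trait -> [0, 1] value from the image personality 420
    trait_affinity : dict of tag -> {trait: weight}; an assumed, hand-built table
    """
    def score(tags):
        return sum(personality.get(trait, 0.0) * weight
                   for tag in tags
                   for trait, weight in trait_affinity.get(tag, {}).items())
    return max(candidates, key=lambda c: score(c[1]))[0]

# Example echoing the passage above: a frugal profile favours the cheapest smart phone.
candidates = [("Buy the cheapest smart phone.", ["frugal"]),
              ("Buy the newest flagship phone.", ["novelty"])]
print(choose_response(candidates,
                      {"conscientiousness": 0.9, "openness": 0.2},
                      {"frugal": {"conscientiousness": 1.0}, "novelty": {"openness": 1.0}}))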
- certain levels of services in one embodiment may be offered for free, with banner, pop-up, impression, or other types of paid advertiser support.
- support may be provided by charging for paid subscription services, including pay-per-question (PPQ) charging for asking questions of the image personality 420 of a user 100 .
- the above-described paid support mechanisms may be combined with, or replaced by set-up fees paid by the first user whose personality is imaged.
- a dataflow diagram shows dataflow in a social media relationship according to the embodiment of FIG. 3 .
- the dataflow diagram of these figures describes a real-life application of the image personality 420, with the second user 460 in a social relationship with the image personality 420.
- the embodiment of FIGS. 6A-6C might describe such a relationship in a dating or friend finder website, although not exclusively.
- the first user 100 logs into a social media or social networking web portal 500 (called a social media portal herein).
- the first user 100 authenticates in social media portal 500 .
- the first user 100 transmits his or her image to social media portal 500 .
- the first user 100 answers (in the form of a voice and/or short message text and/or e-mail command) all questions related to the input modules 160 to 380 .
- the expert system 140, which, by way of example and not by way of limitation, may comprise an artificial intelligence system or a neural network system, simulates the image personality 420 of the first user 100.
- the expert system 140 is shown as co-existing with social media portal 500 server, but those skilled in the art would recognize that the expert system 140 may be located off site from the social media portal 500 server with a connection through the internet ( 10 in FIG. 5 ).
- the first user 100 may search through model or previously entered image personalities that may closely match his or her own image personality. For example, text or voice-driven searches may be performed describing basics of the first user's personality for compatible image personalities 420 of other users in the social media portal 500 . As part of this search, in step 2110 , the first user 100 may ask questions to one or more compatible image personalities 420 of other users in social media portal 500 to determine if one or more of those image personalities 420 are close enough to his or her own personality.
- in step 1214, if the first user 100 is satisfied with answers from his or her search of similar image personalities 420, then the search of compatible image personalities 420 of other users in social media portal 500 is stopped; otherwise, in step 1216, the search of compatible image personalities 420 of other users continues.
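The compatibility search sketched in these steps can be illustrated as a distance comparison between personality vectors; Euclidean distance normalised to a 0-1 similarity is an assumed measure, not one specified by the text.

import math

def compatibility(p_a, p_b):
    """Similarity between two personality vectors in [0, 1]^n; 1.0 means identical."""
    return 1.0 - math.dist(p_a, p_b) / math.sqrt(len(p_a))

def most_compatible(own, others, top_n=5):
    """Rank other users' image personalities 420 by similarity to one's own vector."""
    return sorted(others.items(), key=lambda kv: compatibility(own, kv[1]), reverse=True)[:top_n]

# Example: rank two candidate profiles against the searching user's own profile.
print(most_compatible((0.8, 0.3, 0.7, 0.6, 0.2),
                      {"user_a": (0.7, 0.4, 0.7, 0.5, 0.3),
                       "user_b": (0.1, 0.9, 0.2, 0.2, 0.8)}))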
- a dataflow diagram illustrates dataflow during an interaction of the image personality 420 of a first user 100 applying for employment in an employment system according to the embodiment of FIG. 3 .
- a first user 100 applying for a position logs in to the social media web portal 500 .
- the first user 100 authenticates in social media web portal 500 .
- the first user 100 may transmit his or her image to the social media web portal 500 .
- the first user 100 may transmit images of his or her educational certificates and diplomas to the social media web portal 500 .
- the first user 100 transmits his or her resume to the social media web portal 500 .
- the first user 100 receives recommendations from other users to social media web portal 500 based on the first user's uploaded data.
- the second user 460, a recruiter, asks employment-related questions of the image personalities 420 of all first users 100 who uploaded their job data and created image personalities 420 to seek employment.
- in step 1264, if the second user (recruiter) 460 is satisfied with answers to his or her questions by one of the image personalities 420, then the search of other compatible candidates is stopped; otherwise, in step 1266, the search for other compatible candidates is iterated until the second user 460 is satisfied in finding compatible candidates.
- advertisements for a product and/or service can be targeted to a set of users (men or women) belonging to different groups of image personalities 420 .
- five major personality traits in the image personalities 420 can be targeted using the system in the context of social media and/or social networking.
- men and women can have a difference in personality.
- those five major personality traits might be extraversion, agreeableness, conscientiousness, emotional stability and openness to experience.
- a system dataflow diagram illustrates data flow during application of the embodiment of FIG. 3 in an advertising system using social media in which a set of first users 100 are targeted by virtue of their image personalities 420 and the genders specified within their image personalities 420.
- An automated search agent 540 may be configured to scan the social media web portal 500 for image personalities 420 with, by way of example and not by way of limitation, the five major personality traits referred to above, namely extraversion, agreeableness, conscientiousness, emotional stability and openness to experience. Further, the search agent 540 may also search the knowledge bank 440 to match certain key words and definitions to these five traits and other traits of the image personalities 420 to best provide a set of target users 100 for advertisements. Just as the knowledge bank 440 may comprise an online dictionary accessed through the internet 10, the search agent 540 may also access the knowledge bank 440 and the social media web portal 500 through the internet 10.
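A sketch of the trait matching performed by the search agent 540: users whose stored Big Five values meet every advertiser threshold are selected and can then be segmented by gender. Representing the criteria as minimum trait values is an assumption for illustration.

def target_users(users, trait_thresholds, wanted_gender=None):
    """Select users whose image personalities meet every advertiser trait threshold.

    users            : iterable of dicts like {"id": ..., "gender": ..., "traits": {...}}
    trait_thresholds : dict of trait -> minimum value, e.g. {"extraversion": 0.7}
    wanted_gender    : optional gender filter applied after the trait match
    """
    matched = [u for u in users
               if all(u["traits"].get(t, 0.0) >= v for t, v in trait_thresholds.items())]
    if wanted_gender is not None:
        matched = [u for u in matched if u["gender"] == wanted_gender]
    return matched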
- a dataflow diagram illustrates dataflow during application of image personalities 420 and genders of a set of first users 100 in the context of a real-time location-based social advertisement or social commerce application according to the embodiment of FIGS. 3 and 8.
- the first users 100 who have created image personalities 420 log into the social media web portal 500, presumably either at home or on their smart phones or notebook computers.
- the users authenticate in the social media web portal 500 .
- the respective electronic devices of the first users 100 transmit their location to the social media web portal 500 .
- the advertising search agent 540 may scan for the image personalities 420 of the users 100 with the traits desired by one or more advertisers.
- the advertising search agent 540 may further segment the image personalities 420 found in the step 1286 by gender.
- the advertising search agent 540 matches products and/or services for the first users 100 with their respective image personalities and genders.
- the social media web portal 500 may then be directed by the advertising search agent 540 to transmit coupons or advertisements for the advertiser's products and/or services to the users 100 of the matched image personalities.
- Advertising success may be measured as a response rate percentage. For example, advertisers may consider ten percent (10%) to be a response rate that indicates a successful advertising campaign.
- the social media portal 500 may receive the use or response percentage directly or from the advertisers. In step 1294, the social media portal 500 may determine if a threshold desired response percentage has been met; otherwise the advertisement is reiterated in step 1296, with, for example, adjusted traits searched.
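The step 1294/1296 decision can be sketched as a loop that accepts the campaign once the measured response rate reaches the advertiser's threshold (ten percent in the example above) and otherwise relaxes the searched trait criteria and re-targets. The relaxation rule, and the send_ads and measure_response_rate callables, are assumptions.

def run_campaign(send_ads, measure_response_rate, trait_thresholds,
                 threshold=0.10, relax_step=0.05, max_rounds=5):
    """Re-target with loosened trait thresholds until the response rate is acceptable."""
    rate = 0.0
    for _ in range(max_rounds):
        send_ads(trait_thresholds)                 # steps 1286-1292: match and transmit ads
        rate = measure_response_rate()             # step 1294: measured response percentage
        if rate >= threshold:
            break                                  # campaign considered successful
        # Step 1296: adjust the searched traits; here every minimum is lowered slightly.
        trait_thresholds = {t: max(0.0, v - relax_step) for t, v in trait_thresholds.items()}
    return trait_thresholds, rate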
- a flow diagram illustrates a method in which one embodiment may be used to provide assistance to a shopper trying to purchase a gift for a recipient.
- the user 100 would be a recipient of the gift whose image personality 420 has been stored in the social media portal 500
- second user 460 would be a shopper.
- a shopper logs into the website of the social media portal 500 .
- an image of the first user 100 in this example, a virtual gift recipient, may appear on a screen.
- the shopper (second user 460 ) may ask questions.
- Such questions may include questions regarding likes, dislikes, and other indicators of what the recipient may like, by way of example and not by way of limitation, "Do you like diamonds?" or "What is your birth stone?", posed to the virtual recipient (image personality 420).
- a system may access the simulated personality 420 of the recipient (user 100 ), integrated with the knowledge bank 440 .
- the virtual recipient may ask questions, by way of example and not by way of limitation, "Is there a sale on perfume at the department store?", of the shopper to try to direct the shopper in his or her purchasing decision.
- the virtual recipient may respond to shopper's questions.
- in step 3012, if the shopper has enough information to make a buying decision, then the shopper stops; otherwise the process is reiterated until the shopper is satisfied enough with the answers from the virtual recipient to make such a purchasing decision.
- a first user 100 may enter/answer his or her own personality 420 and, based on his or her wants, needs, likes, or dislikes, the system could recommend purchases and/or deliver answers that are more suited to the first user 100 .
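As an illustration of how the stored likes and dislikes in a recipient's image personality 420 could steer the shopper, the sketch below ranks catalog items by matching tags. The catalog format, the preference structure and the scoring weights are assumptions.

def suggest_gifts(catalog, recipient_profile, top_n=3):
    """Rank catalog items using the recipient's stored likes and dislikes.

    catalog           : list of (item_name, tags) pairs
    recipient_profile : {"likes": set of tags, "dislikes": set of tags}
    """
    def score(tags):
        return (sum(1 for t in tags if t in recipient_profile["likes"])
                - sum(2 for t in tags if t in recipient_profile["dislikes"]))
    ranked = sorted(catalog, key=lambda item: score(item[1]), reverse=True)
    return [name for name, _ in ranked[:top_n]]

# Example: jewelry and birthstone tags are liked, fragrance is disliked.
catalog = [("diamond earrings", {"jewelry", "diamonds"}),
           ("perfume set", {"fragrance"}),
           ("garnet pendant", {"jewelry", "birthstone"})]
print(suggest_gifts(catalog, {"likes": {"jewelry", "birthstone"}, "dislikes": {"fragrance"}}))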
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Strategic Management (AREA)
- Software Systems (AREA)
- Finance (AREA)
- Marketing (AREA)
- Development Economics (AREA)
- General Business, Economics & Management (AREA)
- Economics (AREA)
- Accounting & Taxation (AREA)
- General Engineering & Computer Science (AREA)
- Databases & Information Systems (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- Game Theory and Decision Science (AREA)
- General Health & Medical Sciences (AREA)
- Tourism & Hospitality (AREA)
- Entrepreneurship & Innovation (AREA)
- Primary Health Care (AREA)
- Human Resources & Organizations (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Signal Processing (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Acoustics & Sound (AREA)
- Multimedia (AREA)
- Quality & Reliability (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
A system and method provides for an interactive query comprising a first input module capable of receiving input for creating a simulated personality for a first user. An expert system is capable of creating and storing the simulated personality. An output module is used for presenting the simulated personality to a second user. An interactive query module is capable of allowing the second user to communicate with the simulated personality of the first user.
Description
- The present invention relates to a system and method for an interactive query utilizing a simulated personality. More specifically, the present invention relates to systems and methods of simulating a personality of a user, providing one or more still or animated images of a user, and providing interactive queries with another user, preferably in the context of social media or social networking.
- Techniques for personality and emotion simulation have been contemplated and described. For example, Arjan Egges, Sumedha Kshirsagar and Nadia Magnenat-Thalmann of MIRALab—University of Geneva, describe a generic model for personality, mood and emotion simulation for conversational virtual humans in their paper titled Generic Personality and Emotion Simulation for Conversational Agents, published in Computer Animation and Virtual Worlds archive, Volume 15 Issue 1, March 2004, Pages 1-13. They further present a generic model for updating the parameters related to emotional behavior, as well as a linear implementation of the generic update mechanisms, and describe a prototype system that uses the models in combination with a dialogue system and a talking head with synchronized speech and facial expressions.
- With the emergence of 3D graphics, we are now able to create very believable 3D characters that can move and talk. Multi-modal interaction with such characters is possible as the required technologies are getting mature (speech recognition, natural language dialogues, speech synthesis, animation, and so on). However, an important part often missing in this picture is the definition of the force that drives these techniques: the individuality. Egges, Kshirsagar and Magnenat-Thalmann explore structure of this entity as well as its link with perception, dialogue and expression.
FIG. 1 depicts how they view the role of personality and emotion as a glue between perception, dialogue and expression. - In emotion simulation research so far, appraisal (obtaining emotional information from perceptive data) is popularly done by a system based on the Ortony, Clore, and Collins model (OCC model). This model specifies how events, agents and objects from the universe are appraised according to respectively their desirability, praiseworthiness and appealingness. The latter three factors are decided upon by a set of parameters: the goals, standards and attitudes. The model delivers us emotional information (i.e. the influence on the emotional state) with respect to the universe and the things that happen/exist in it. In order to have a working model for simulation, one is of course obliged to define the goals, standards and attitudes of the simulator. These factors can be considered as the personality of the simulator. In this case, the personality of a simulator is (partly) domain-dependent. However, more recent research—the OCC model dates from 1988—indicates that personality can be modeled in a more abstract, domain-independent way. In this case, personality is an ensemble of factors/dimensions that each denote an influence on how perception takes place and how behavior is shown. An interface between multi-dimensional and domain-independent personality models and the OCC model does not yet exist. In order to create an integrated model that can handle both appraisal and emotion-based behavior, we need to define how we can use a domain-independent personality model and still use the OCC model for appraisal.
- Egges, Kshirsagar and Magnenat-Thalmann, in their paper, investigate the nature of this relationship and propose how to parameterize it so that it can be used in concrete applications.
- The effect of personality and emotion on agent behaviour has been researched quite a lot, whether it concerns a general influence on behaviour, or a more traditional planning-based method. Various rule-based models, probabilistic models and fuzzy logic systems have been reported in the past. The Egges, Kshirsagar and Magnenat-Thalmann model is not targeted for one specific kind of behaviour synthesizer. They developed a personality and emotion simulator that can be used as a separate module and that can be customized depending on which dialogue system, planning system or reasoning system is used. How behaviour should be influenced by personality and emotion depends on the application and the system type that is used, and it is out of the scope of their paper.
- Finally, personality and emotion will have an effect on how behaviour is expressed (speech will have different intonations, a face will make expressions reflecting the emotional state, a body will make different gestures according to the personality and the emotions).
- A first step towards such an approach is a system that can simulate personalized facial animation with speech and expressions, modulated through mood. There have been very few researchers who have tried to simulate mood. Velásquez proposed a model of emotions, mood and temperament that provides a flexible way of controlling the behaviour of the autonomous entities. Generally, moods and emotions are only differentiated in terms of levels of arousal. Simple models have been proposed for blending, mixing and decaying emotions to subsequently select actions of the agent.
- Personality and emotion are basically the same mechanisms, differentiated only by the cognitive variables of time and duration, and personality can be seen as a consistent expression of emotion. An individual is an entity that is constantly changing (having different emotions, moods, etc.). So, when someone speaks of an individual, they always refer to it relative to a time t. The moment that the individual starts existing is defined by t=0. The abstract entity that represents the individual at a time t can be called I_t. In the simple case, an individual has a personality and an emotional state (not yet taking mood into consideration). The model based on this assumption is called PE. In one framework, the personality is constant and initialized with a set of values on t=0. The emotional state is dynamic and it is initialized to 0 at t=0. Thus I_t can be defined as a tuple (p, e_t), where p represents the personality and e_t represents the emotional state at time t. For example, a person will portray emotions (that change over time) based on what happens, but how she obtains these emotions and the behavior that results from it, depends on a static part of her being, the personality.
- From psychology research, there are many personality models that consist of a set of dimensions, where every dimension is a specific property of the personality. Take for example the OCEAN model, which has five dimensions (see Table 1) or the PEN model that has three dimensions.
- TABLE 1: The OCEAN model of personality
Factor | Description | Adjectives used to describe
Openness | Open mindedness, interest in culture | Imaginative, creative, explorative
Conscientiousness | Organized, persistent in achieving goals | Methodical, well organized, dutiful
Extraversion | Preference for and behaviour in social situations | Talkative, energetic, social
Agreeableness | Interactions with others | Trusting, friendly, cooperative
Neuroticism | Tendency to experience negative thoughts | Insecure, emotionally distressed
- Generalizing from these models, it is assumed that a personality has n dimensions, where each dimension is represented by a value in the interval [0, 1]. A value of 0 corresponds to an absence of the dimension in the personality; a value of 1 corresponds to a maximum presence of the dimension in the personality. The personality p of an individual can then be represented by the following vector:
p=(p_1, p_2, . . . , p_n)
- As an example, an OCEAN personality can be specified (thus n=5) that is very open, very extravert but not very conscientious, quite agreeable and not very neurotic:
-
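The vector representation above can be made concrete with a few lines of Python. The numeric values below are illustrative assumptions chosen to match the qualitative description (very open, very extravert, not very conscientious, quite agreeable, not very neurotic); they are not the figures from the cited paper.

# Minimal sketch of the n-dimensional personality vector p with values in [0, 1].
OCEAN = ("openness", "conscientiousness", "extraversion", "agreeableness", "neuroticism")

def make_personality(**levels):
    """Build p as a tuple ordered by the OCEAN dimensions, each value in [0, 1]."""
    values = []
    for dim in OCEAN:
        v = float(levels.get(dim, 0.0))
        if not 0.0 <= v <= 1.0:
            raise ValueError(f"{dim} must lie in [0, 1], got {v}")
        values.append(v)
    return tuple(values)

# Hypothetical numbers matching the qualitative example in the text.
p = make_personality(openness=0.9, conscientiousness=0.2, extraversion=0.9,
                     agreeableness=0.7, neuroticism=0.1)
print(p)   # (0.9, 0.2, 0.9, 0.7, 0.1)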
- Emotional state has a similar structure as personality. The emotional state is a set of emotions with a certain intensity. The size of this set depends on the theory that is used. For example, in the OCC model, 22 emotions are defined, while others may define 6 that are used as a basis for facial expression classification. The emotional state is something that can change over time (for example due to a decay factor). Therefore, an emotional state can be relative to a time t. The emotional state can be defined as e_t, an m-dimensional vector, where all m emotion intensities are represented by a value in the interval [0, 1]. A value of 0 corresponds to an absence of the emotion; a value of 1 corresponds to a maximum intensity of the emotion. This vector is given as follows:
e_t=(e_t,1, e_t,2, . . . , e_t,m)
- Furthermore, an emotional state history Ω_t can be defined that contains all emotional states until e_t, thus:
Ω_t=(e_0, e_1, . . . , e_t)
- An extended version of the PE model can be given by including mood. As such, the individual I_t can be defined as a triple (p, m_t, e_t), where m_t represents the mood at a time t. Mood has been accepted alongside the notions of personality and emotional state. Mood is less static than personality and less fluid than the emotional state. It is an intermediate form that exists between the two and that describes a rather static state of being that lasts longer than the average emotion, as illustrated in FIG. 2. This state of being can be one-dimensional (being in a good or a bad mood) or perhaps multi-dimensional (feeling in love, feeling depressed).
- The possibility of having multiple mood dimensions is left open, so that how many dimensions mood actually has can be selected. A mood dimension can be defined as a value that is either negative or positive and that lies in the interval [−1, 1]. Supposing that there are k mood dimensions, the mood can be described by a vector:
m_t=(m_t,1, m_t,2, . . . , m_t,k)
- Just like for the emotional state, there is also a history of mood, σ_t, that contains the moods m_0 until m_t:
σ_t=(m_0, m_1, . . . , m_t)
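The extended PE model can be summarised in a small Python sketch: the individual I_t is the triple (p, m_t, e_t), with a constant personality, a k-dimensional mood in [−1, 1], an m-dimensional emotional state in [0, 1], and histories Ω_t and σ_t. The linear decay-and-add update is an illustrative assumption, not the update mechanism of the cited paper.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Individual:
    """I_t = (p, m_t, e_t): constant personality with time-varying mood and emotion."""
    p: Tuple[float, ...]                                   # personality, fixed at t = 0
    m_t: List[float] = field(default_factory=list)         # mood, values in [-1, 1]
    e_t: List[float] = field(default_factory=list)         # emotional state, values in [0, 1]
    omega: List[Tuple[float, ...]] = field(default_factory=list)   # emotion history, Ω_t
    sigma: List[Tuple[float, ...]] = field(default_factory=list)   # mood history, σ_t

    def step(self, emotion_input, mood_input, decay=0.9):
        """Advance one time step: record history, decay old values, add appraised input."""
        self.omega.append(tuple(self.e_t))
        self.sigma.append(tuple(self.m_t))
        self.e_t = [min(1.0, max(0.0, decay * old + new))
                    for old, new in zip(self.e_t, emotion_input)]
        self.m_t = [min(1.0, max(-1.0, decay * old + new))
                    for old, new in zip(self.m_t, mood_input)]

# Example: n = 5 personality dimensions, m = 6 emotions, k = 1 mood dimension.
ind = Individual(p=(0.9, 0.2, 0.9, 0.7, 0.1), m_t=[0.0], e_t=[0.0] * 6)
ind.step(emotion_input=[0.4, 0.0, 0.0, 0.0, 0.0, 0.0], mood_input=[0.2])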
- Systems such as that provided by Memoirs From Heaven of Temecula, Calif. provide for posthumous letter delivery for men and women of the United States armed forces and first responders. Memoirs From Heaven allows for a person's deepest thoughts and feelings, as well as the words left unsaid, to be delivered to their loved ones.
- Memoirs From Heaven checks with the Social Security Death Index (SSDI) periodically to determine when to start delivering a person's pre-written letters to pre-determined recipients at pre-determined times based on the scheduled information the user provides when they sign up. The writer may review and revise the letters as often as they would like before they pass away.
- However, using the Memoirs From Heaven system and other systems like it, once the writer-user passes away, the writer-user's letters are static. Further, there is no way for a user to impart situation-specific advice or encouragement, or to provide interaction with his or her children, relatives or friends after he or she has passed. The system and method of the present invention solves these and other problems in the prior art.
- According to one preferred embodiment, a system for an interactive query comprises a first input module capable of receiving input for creating a simulated personality for a first user; an expert system capable of creating and storing the simulated personality; an output module for presenting the simulated personality to a second user; and an interactive query module capable of allowing the second user to communicate with the simulated personality of the first user.
- According to another preferred embodiment, a method for an interactive query comprises: receiving input for creating a simulated personality for a first user; creating and storing the simulated personality; presenting the simulated personality to a second user; and allowing the second user to communicate with the simulated personality of the first user.
- FIG. 1 is a diagrammatic overview of a prior art intelligent agent framework for an emotional state and personality;
- FIG. 2 is a prior art personality, mood and emotion scale from static to dynamic according to the prior art system of FIG. 1;
- FIG. 3 is a flow diagram illustrating the steps performed by a system and method for an interactive query utilizing a simulated personality in accord with one embodiment;
- FIG. 4 is a flow diagram illustrating data flow for providing close simulation of a personality of a user by an expert algorithm according to the embodiment of FIG. 3;
- FIG. 5 is a diagrammatic representation of an interaction between a user, a knowledge bank and an interactive query with another user according to the embodiment of FIG. 3;
- FIGS. 6A-6C illustrate a dataflow diagram showing dataflow in a social relationship according to the embodiment of FIG. 3;
- FIGS. 7A-7C illustrate a dataflow diagram showing dataflow during the interaction of the image personality of a user in an employment screening according to the embodiment of FIG. 3;
- FIG. 8 is a dataflow diagram illustrating the data flow during application of the embodiment of FIG. 3 in an advertising system using social media in which a set of first users are targeted by virtue of their image personalities and genders;
- FIGS. 9A-9C illustrate a dataflow diagram showing dataflow during application of image personalities and genders of a set of first users in the context of real-time location-based social advertisement or social commerce according to the embodiment of FIGS. 3 and 8; and
- FIG. 10 is a flow diagram illustrating the steps performed by a system and method for an interactive query utilizing a simulated personality in accord with one embodiment to aid a shopper in buying a gift.
- The following detailed description is of the best currently contemplated modes of carrying out exemplary embodiments of the invention. The description is not to be taken in a limiting sense, but is made merely for the purpose of illustrating the general principles of the invention, since the scope of the invention is best defined by the appended claims.
- Various inventive features are described below that can each be used independently of one another or in combination with other features.
- Fundamental characteristics of personality of a user 100 can include: (1) consistency—essentially, a user acts in similar ways in a variety of situations, and there is a recognizable order and regularity to behaviors; (2) psychological and physiological—personality is a psychological construct, but it can be influenced by biological processes and needs; (3) behaviors and actions—personality not only influences how a user responds in a particular environment, but also causes a user to act in certain ways; and (4) multiple expressions—personality is displayed in more than just behavior; it can also be seen in thoughts, feelings, close relationships and other social interactions.
- An expert algorithm, by way of example, and not by way of limitation, an artificial intelligence algorithm and/or a neural network algorithm, can synthesize results of the above parameters and can simulate closely the personality of a user.
- Referring to
FIG. 3, a flow diagram illustrates the steps performed by a system and method for an interactive query utilizing a simulated personality in accord with one embodiment. In step 1000, a family member, friend, coworker or employer, by way of example and not by way of limitation, a son or daughter, logs into a website. In step 1002, an image of the user, in this example a virtual father, may appear on a screen. In step 1004, the son or daughter may ask questions. Such questions may include life's questions, by way of example and not by way of limitation, "Should I buy a house?", posed to the virtual father. In step 1006, a system may access the simulated personality of the virtual father, integrated with a knowledge bank 440. In step 1008, the virtual father asks questions, by way of example and not by way of limitation, "Did you find a stable job?", of the son or daughter for clarification and background on the question asked by the son or daughter. In step 1010, the virtual father responds to the son's or daughter's question. In step 1012, if the son or daughter is satisfied with the virtual father's answer, then the son or daughter stops; otherwise the process is reiterated until the son or daughter is satisfied with the answers from the virtual father.
- Referring to FIG. 4, a flow diagram illustrates data flow for providing close simulation of a personality of a first user (such as the virtual father of FIG. 3) by an expert algorithm, artificial intelligence algorithm and/or a neural network algorithm. For example, the system may store a snapshot of a personality 120 of a user at a time t=t_1, as the system may store several snapshots of the user's personality 120 at different times t_n. The closely simulated personality 120 of a user 100 may be configured to synthesize various input modules 160 to 380, as listed below.
Input Modules | Description of Input Modules for a User:
160 | Relational Psychology
180 | Personality Profiles
200 | Inner Traits
220 | Love Test
240 | Shape Test
260 | Food Test
280 | Color Test
300 | Internal Beliefs
320 | Personal Details
340 | Personal Interviews
360 | Personal Library of Audios, Videos & Wisdom
380 | What Others Think & Say
- One system that may use the above inputs to create a closely simulated personality 120 is available from Electronic Arts of Redwood City, Calif., United States, as used in The Sims® series of video games. For example, in both The Sims® and The Sims® 2, personality is split into 5 sections: niceness, neatness or cleanliness, outgoingness, activeness, and playfulness. It is normally related to a Sim's Zodiac sign, but there are exceptions.
modules personality 120 too. In The Sims® 2, low is generally less than 4, and high is generally 8 (or 8.01) and above. In The Sims® 3, the personality points system was replaced with traits, which could also be used as an alternative to store thepersonality 120. - In The Sims®, children get personality points when they grow up from being babies. Most of what they call the townies get random personality points, though they may not receive them until they first appear in-game.
- In The Sims®: Livin' Large®, the chemistry set can make a potion that reverses personality points. The only way to get them back would be to recreate the potion. A Sim abducted by aliens would return with an altered personality. Sims can change their personality one point at a time by using what is called the Crystal Ball. In The Sims® 2, aspects of a
simulated personality 120 may be changed via what is called the Encourage interaction. - These qualities in The Sims® are sometimes referred to as traits, many of which mirror the above described
input modules - Over the user's lifetime, a snapshot of the user's
simulated personality 120 may be stored. By way of example, and not by way of limitation, every year, two years, or five years, a user may decide to add another snapshot simulatedpersonality 120 to the system, along with the user's age at the time in order to organize the snapshots intoage personality simulation 120 snapshots. - Personality questions provided to the user in
input modules personality 120 at time t2, t3 . . . tn. - With each snapshot, integration of a
simulated personality 120 of auser 100 with one or more image(s) of the person may further be taken and stored. In one embodiment, a three-dimensional (3-D) telepresence-like holographic image may be captured in near-real-time to represent the user to produce a combination 3-D representation and combinedsimulated personality 120 that can be termed an image personality (420 inFIG. 5 ) of a first user for each time tn. In other embodiments, although less preferably because of lesser impact, the image personality may include only two dimensional (2-D) representations of the user at each time tn. - As explained below, this will give the user's relatives and friends the ability to converse with the user's
simulated personalities 120 based on the age of the user. In other words, a user's friends and family may be able to interact with the user'ssimulated personality 120 at the age of 10, then again at the age of 15, and then at the age of 20, 35, and so on, at their choosing. - Each image personality of a user may be stored in a cloud-based server to provide network-based access to friends or family members. By way of example, and not by way of limitation, an internet server may be connected to the internet for access through a secure connection such as secure socket layer (SSL) to the image personality snapshots. Social media connections may be used such as through Google+® or Face Book®.
- With reference to
FIG. 5 , shown is a diagramatic representation of an interaction between theimage personality 420 of a first user, aknowledge bank 440 and an interactive query with anotheruser 460 according to the embodiment ofFIG. 1 . Interaction between theimage personality 420 and thesecond user 460 may occur via voice, using voice recognition, or text, depending on the system. By way of example, and not by way of limitation, theimage personality 420 andsecond user 460 may interact through voice conversation, wherein thesecond user 460 interactively communicates with aholographic 3D representation 400 of the first user. In another embodiment, by way of example, and not by way of limitation, on the other end of the scale of interactivity, interaction between theimage personality 420 and thesecond user 460 may occur by means of e-mail or text communications. In yet another embodiment, by way of example and not by way of limitation, interaction may occur with real-time chat through a Facebook®-type interface or other social-media-type chat interface. - For any of these embodiments, communications with the
user 460 may be through a wide area network such as theinternet 10 or world-wide-web. - In some embodiments,
image 400 may comprise an animated or still 2D image for the image personality 420 at time tn. For example, if the first user first provided a personality profile and an image of himself or herself at age 15, then the second user 460 who is communicating with the image personality 420 at time t1 may see or interact with the representation 400 of the first user that the first user uploaded at age 15. If the next time the first user provided a personality profile was at age 25, then if the second user 460 communicates with the t2 image personality 420, the second user 460 would be viewing the image of the first user that he or she provided at age 25. - Arjan Egges, Sumedha Kshirsagar and Nadia Magnenat-Thalmann of MIRALab—University of Geneva describe an application with a visual front end that can be used in some embodiments of the present system in their paper titled Generic Personality and Emotion Simulation for Conversational Agents, published in Computer Animation and Virtual Worlds archive, Volume 15, Issue 1, March 2004, pages 1-13. As described therein, the graphical front-end comprises a 3D talking head capable of rendering speech and facial expressions in synchrony with synthetic speech. The facial animation system interprets the emotional tags in the responses, generates lip movements for the speech, and blends the appropriate expressions for rendering in real time with lip synchronization. Facial dynamics are considered during the expression change, and appropriate temporal transition functions are selected for facial animation.
- MPEG-4 facial animation parameters are used as low-level facial deformation parameters. The details of a deformation algorithm that can be used in one embodiment are explained in S. Kshirsagar, S. Garchery, and N. Magnenat-Thalmann, Deformable Avatars, Feature Point Based Mesh Deformation Applied to MPEG-4 Facial Animation, pages 33-43, Kluwer Academic Publishers, July 2001. For defining the visemes and expressions, however, the principal components described by Kshirsagar et al. in S. Kshirsagar, T. Molet, and N. Magnenat-Thalmann, Principal Components of Expressive Speech Animation, Proceedings Computer Graphics International, pages 59-69, 2001, are used. The principal components are derived from a statistical analysis of facial motion data and reflect independent facial movements observed during fluent speech. They are used as high-level parameters for defining the facial expressions and visemes. The use of principal components facilitates realistic speech animation, especially when blended with various facial expressions. The main steps incorporated in the visual front-end are the following:
- 1. Generation of facial animation parameters (FAPs) from text: Available text-to-speech (TTS) software that provides phonemes with temporal information may be used for this component. Co-articulation rules may be applied based on the algorithm described in M. M. Cohen and D. W. Massaro, Modelling Co-Articulation in Synthetic Visual Speech, pages 139-156, Springer-Verlag, 1993, which may be adopted for use with the principal components.
- 2. Expression blending: The dialogue system may output expression tags with the text response. Each expression is associated with an intensity value. An attack-sustain-decay-release type of envelope may be applied to each expression, and it is blended with the previously calculated co-articulated phoneme trajectories (a simplified sketch of this envelope blending appears after this list). This blending is based on observed facial dynamics, incorporating constraints on facial movements wherever necessary in order to avoid excessive or unrealistic deformations.
- 3. Periodic facial movements: Periodic eye-blinks and minor head movements may be applied to the face for increased believability.
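- As a hedged illustration of step 2 above, the following Python sketch applies a piecewise-linear attack-sustain-decay-release envelope to a single expression intensity and blends it with a co-articulated phoneme trajectory for one facial parameter. The frame counts, the half-intensity decay, and the linear blending rule are assumptions made for this sketch and are not the published algorithm.

```python
def asdr_envelope(n_frames: int, attack: int, sustain: int, decay: int, peak: float):
    """Per-frame weights rising to `peak`, holding, decaying, then releasing to zero."""
    release = max(n_frames - (attack + sustain + decay), 0)
    env = []
    for i in range(attack):                      # attack: ramp up to peak
        env.append(peak * (i + 1) / attack)
    env.extend([peak] * sustain)                 # sustain: hold the peak
    for i in range(decay):                       # decay: drop toward half intensity
        env.append(peak * (1.0 - 0.5 * (i + 1) / decay))
    for i in range(release):                     # release: fade to zero
        env.append((peak * 0.5) * (1.0 - (i + 1) / release))
    return env[:n_frames]

def blend(phoneme_track, expression_value, envelope):
    """Blend a co-articulated phoneme trajectory with an expression, frame by frame."""
    return [(1.0 - w) * p + w * expression_value
            for p, w in zip(phoneme_track, envelope)]

lip_track = [0.1, 0.3, 0.6, 0.4, 0.2, 0.1, 0.0, 0.0]   # mouth-open parameter from TTS phonemes
smile_env = asdr_envelope(len(lip_track), attack=2, sustain=3, decay=2, peak=0.8)
print(blend(lip_track, expression_value=0.9, envelope=smile_env))
```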
- In order to help the
image personality 420 to respond to the questions and interactions of the second user 460, an online knowledge bank may be accessed by the image personality 420. Although, when the first user created the image personality, he or she may have inputted as much as he or she could regarding how he or she would respond to one or more second users 460 who may ask questions of his or her image personality 420, it is unlikely that the first user could have provided all of the knowledge necessary for any conceivable present or future knowledge-based question that the second user 460 could ask. For example, a second user's grandfather, who might be the first user, may not have had much knowledge of smart phone technology when he created his image personality at time tn. - If the
second user 460 asks a question related to smart phones of the image personality 420, it would not be desirable for the image personality to decline to answer the second user 460 because of such a lack of knowledge. Thus, instead of declining to answer, the image personality 420 may access a knowledge bank 440 in order to gain knowledge regarding the subject matter of the question and provide an answer that is commensurate with the personality of the image personality 420 at time tn. Such a knowledge bank may comprise, by way of example and not by way of limitation, Wikipedia®, provided by the Wikimedia Foundation, Inc. of San Francisco, Calif., accessed through the internet 10. - A cognitive system such as the Watson® system provided by IBM Corporation of Armonk, New York, may then be used to interpret the question from the
second user 460, and access the knowledge bank 440 to understand the question and determine one or more possible responses based on the knowledge bank 440. Which of the one or more possible responses is actually provided to the second user 460 depends on the image personality 420 of the first user. The possible responses may be put into, by way of example, and not by way of limitation, the OCC model discussed by Arjan Egges, Sumedha Kshirsagar and Nadia Magnenat-Thalmann of MIRALab—University of Geneva, in their paper titled Generic Personality and Emotion Simulation for Conversational Agents, published in Computer Animation and Virtual Worlds archive, Volume 15, Issue 1, March 2004, pages 1-13. For example, if the image personality 420 of the grandfather suggests that the grandfather is economically frugal, then the OCC model may suggest to the second user 460 that the cheapest smart phone should be purchased.
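- The following Python sketch is one hedged illustration of the selection step only: it assumes a cognitive back end has already produced candidate answers and scores each candidate against the stored personality traits, so that a frugal personality favors the cheapest option. The trait names ("frugality", "openness"), weights, and candidate fields are illustrative and are not drawn from the specification or from the OCC model.

```python
from typing import Dict, List

def pick_response(candidates: List[Dict], traits: Dict[str, float]) -> Dict:
    """Choose the candidate whose attributes best agree with the personality traits."""
    def score(c: Dict) -> float:
        s = 0.0
        # A frugal personality rewards low price; an open personality rewards novelty.
        # Both terms are simple linear proxies used only for this sketch.
        s += traits.get("frugality", 0.0) * (1.0 - c["relative_price"])
        s += traits.get("openness", 0.0) * c["novelty"]
        return s
    return max(candidates, key=score)

grandfather_traits = {"frugality": 0.9, "openness": 0.2}
phone_answers = [
    {"text": "Buy the flagship model.",       "relative_price": 1.0, "novelty": 0.9},
    {"text": "Buy the cheapest smart phone.", "relative_price": 0.1, "novelty": 0.3},
]
print(pick_response(phone_answers, grandfather_traits)["text"])
# -> "Buy the cheapest smart phone."
```

With these example weights, the frugal grandfather's image personality returns the low-priced recommendation, mirroring the smart phone example above.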
- As with social media applications, such as Facebook®, certain levels of service in one embodiment may be offered for free, with banner, pop-up, impression, or other types of paid advertiser support. However, in other embodiments, instead of, or along with, advertiser support, support may be provided by charging for paid subscription services, including pay-per-question (PPQ) charging for asking questions of the image personality 420 of a user 100. In yet other embodiments, the above-described paid support mechanisms may be combined with, or replaced by, set-up fees paid by the first user whose personality is imaged. - With reference to
FIGS. 6A-6C, a dataflow diagram shows dataflow in a social media relationship according to the embodiment of FIG. 3. The dataflow diagram of these figures describes a real-life application in which a second user 460 enters into a social relationship with the image personality 420 of a first user. The embodiment of FIGS. 6A-6C might describe such a relationship in a dating or friend-finder website, although not exclusively. - With reference specifically to
FIG. 6A, in step 2100, the first user 100 logs into a social media or social networking web portal 500 (called a social media portal herein). In step 2102, the first user 100 authenticates in the social media portal 500. In step 2104, the first user 100 transmits his or her image to the social media portal 500. In step 2106, the first user 100 answers (in the form of a voice and/or short message text and/or e-mail command) all questions related to the input modules 160 to 380. In step 4004, the expert system 140, which, by way of example, and not by way of limitation, may comprise an artificial intelligence system or a neural network system, simulates the image personality 420 of the first user 100. The expert system 140 is shown as co-existing with the social media portal 500 server, but those skilled in the art would recognize that the expert system 140 may be located off site from the social media portal 500 server with a connection through the internet (10 in FIG. 5). - Continuing from the flow diagram in
FIG. 6A, moving to FIG. 6B, in step 1208, as an optional shortcut to create the first user's image personality 420, the first user 100 may search through model or previously entered image personalities that may closely match his or her own image personality. For example, text or voice-driven searches describing the basics of the first user's personality may be performed to find compatible image personalities 420 of other users in the social media portal 500. As part of this search, in step 2110, the first user 100 may ask questions of one or more compatible image personalities 420 of other users in the social media portal 500 to determine if one or more of those image personalities 420 are close enough to his or her own personality. - In
step 1214, if the first user 100 is satisfied with the answers from his or her search of similar image personalities 420, then the search of compatible image personalities 420 of other users in the social media portal 500 is stopped. Otherwise, in step 1216, the search of compatible image personalities 420 of other users continues.
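- A minimal Python sketch of the "close enough" test in steps 1208 through 1216 follows; the use of cosine similarity over five trait scores and the 0.9 threshold are assumptions made for illustration, not requirements of the system.

```python
import math
from typing import Dict

TRAITS = ["extraversion", "agreeableness", "conscientiousness",
          "emotional_stability", "openness"]

def similarity(a: Dict[str, float], b: Dict[str, float]) -> float:
    """Cosine similarity between two trait vectors (trait scores in 0..1)."""
    va = [a.get(t, 0.0) for t in TRAITS]
    vb = [b.get(t, 0.0) for t in TRAITS]
    dot = sum(x * y for x, y in zip(va, vb))
    norm = math.sqrt(sum(x * x for x in va)) * math.sqrt(sum(y * y for y in vb))
    return dot / norm if norm else 0.0

def find_compatible(own, candidates, threshold=0.9):
    """Return the first candidate similar enough to the user's own traits, else None."""
    for candidate_id, traits in candidates.items():
        if similarity(own, traits) >= threshold:
            return candidate_id
    return None

me = {"extraversion": 0.8, "agreeableness": 0.6, "conscientiousness": 0.5,
      "emotional_stability": 0.7, "openness": 0.9}
portal = {
    "user_a": {"extraversion": 0.2, "agreeableness": 0.9, "conscientiousness": 0.9,
               "emotional_stability": 0.3, "openness": 0.2},
    "user_b": {"extraversion": 0.7, "agreeableness": 0.6, "conscientiousness": 0.6,
               "emotional_stability": 0.8, "openness": 0.8},
}
print(find_compatible(me, portal))   # -> "user_b"
```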
- Those of skill in the art can readily recognize how the system described herein can be applied to an employment screening system, to the great advantage of in-house or outside employment recruiters. With reference to FIGS. 7A-7C, a dataflow diagram illustrates dataflow during an interaction of the image personality 420 of a first user 100 applying for employment in an employment system according to the embodiment of FIG. 3. With specific reference to FIG. 7A, in step 1250, a first user 100 applying for a position logs in to the social media web portal 500. In step 1252, the first user 100 authenticates in the social media web portal 500. In step 1254, the first user 100 may transmit his or her image to the social media web portal 500. In step 1256, the first user 100 may transmit images of his or her educational certificates and diplomas to the social media web portal 500. - Continuing to
FIG. 7B, in step 1258, the first user 100 transmits his or her resume to the social media web portal 500. In step 1260, the first user 100 receives recommendations from other users of the social media web portal 500 based on the first user's uploaded data. In step 1262, the second user 460, a recruiter, asks employment-related questions of the image personalities 420 of all first users 100 who uploaded their job data and created image personalities 420 to seek employment. - Continuing to
FIG. 7C, in step 1264, if the second user (recruiter) 460 is satisfied with the answers to his or her questions by one of the image personalities 420, then the search of other compatible candidates is stopped; otherwise, in step 1266, the search for other compatible candidates is iterated until the second user 460 is satisfied in finding compatible candidates. - Using the embodiment of
FIG. 3, advertisements for a product and/or service can be targeted to a set of users (men or women) belonging to different groups of image personalities 420. By way of example, and not by way of limitation, five major personality traits in the image personalities 420 can be targeted using the system in the context of social media and/or social networking. Furthermore, men and women can differ in personality. In one embodiment, those five major personality traits might be extraversion, agreeableness, conscientiousness, emotional stability and openness to experience. - With reference to
FIG. 8, a system dataflow diagram illustrates data flow during application of the embodiment of FIG. 3 in an advertising system using social media, in which a set of first users 100 are targeted by virtue of their image personalities 420 and the genders specified within those image personalities 420. An automated search agent 540 may be configured to scan the social media web portal 500 for image personalities 420 with, by way of example, and not by way of limitation, the five major personality traits referred to above, namely extraversion, agreeableness, conscientiousness, emotional stability and openness to experience. Further, the search agent 540 may also search the knowledge bank 440 to match certain key words and definitions to these five traits and other traits of the image personalities 420 to best provide a set of target users 100 for advertisements. Just as the knowledge bank 440 may comprise an online dictionary accessed through the internet 10, the search agent 540 may also access the knowledge bank 440 and the social media web portal 500 through the internet 10.
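- As a hedged illustration of the search agent's matching, the following Python sketch filters stored image-personality profiles against an advertiser's minimum trait scores and an optional gender segment. The profile fields, thresholds, and example campaign are assumptions made for this sketch.

```python
from typing import Dict, List, Optional

def match_targets(profiles: List[Dict],
                  required_traits: Dict[str, float],
                  gender: Optional[str] = None) -> List[str]:
    """Return user ids whose traits meet every advertiser threshold."""
    matched = []
    for p in profiles:
        if gender is not None and p.get("gender") != gender:
            continue
        if all(p["traits"].get(t, 0.0) >= minimum
               for t, minimum in required_traits.items()):
            matched.append(p["user_id"])
    return matched

profiles = [
    {"user_id": "u1", "gender": "female",
     "traits": {"extraversion": 0.8, "openness": 0.7, "conscientiousness": 0.4}},
    {"user_id": "u2", "gender": "male",
     "traits": {"extraversion": 0.3, "openness": 0.9, "conscientiousness": 0.8}},
]
# An advertiser of adventure travel might target extraverted, open personalities.
print(match_targets(profiles, {"extraversion": 0.6, "openness": 0.6}, gender="female"))
# -> ['u1']
```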
- With reference to FIGS. 9A-9C, a dataflow diagram illustrates dataflow during application of image personalities 420 and genders of a set of first users 100 in the context of a real-time, location-based social advertisement or social commerce application according to the embodiment of FIGS. 3 and 8. In step 1280, the first users 100 who have created image personalities 420 log into the social media web portal 500, presumably either at home or on their smart phones or notebook computers. In step 1282, the users authenticate in the social media web portal 500. In step 1284, the respective electronic devices of the first users 100 transmit their location to the social media web portal 500. Alternatively, if any of the electronic devices of the first users 100 are not able to transmit their locations in real time, then the first users 100 have the option of checking in to the social media portal 500 with their location. In step 1286, the advertising search agent 540 may scan for the image personalities 420 of the users 100 with the traits desired by one or more advertisers. - Continuing with
FIG. 9B, in step 1288, the advertising search agent 540 may further segment the image personalities 420 found in step 1286 by gender. In step 1290, the advertising search agent 540 matches products and/or services for the first users 100 with their respective image personalities and genders. In step 1292, the social media web portal 500 may then be directed by the advertising search agent 540 to transmit coupons or advertisements for the advertiser's products and/or services to the users 100 of the matched image personalities. - Advertising success may be measured as a response rate percentage. For example, advertisers may consider ten percent (10%) to be a response rate that indicates a successful advertising campaign. In one embodiment, after the coupons are sent to the
users 100 with the matched image personalities and genders, when coupons are used at point-of-sale systems or scanned by service providers, the social media portal 500 may receive the use or response percentage directly or from the advertisers. In step 1294, the social media portal 500 may determine whether a threshold desired response percentage has been met; otherwise, the advertisement is reiterated in step 1296, with, for example, adjusted traits searched.
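- The response-rate feedback loop of steps 1292 through 1296 might be sketched as follows in Python; the canned redemption rates, the 10% target, and the simple relaxation of trait thresholds are illustrative assumptions only.

```python
def run_campaign(search, send_coupons, measure_response_rate,
                 traits, target_rate=0.10, step=0.05, max_rounds=5):
    """Iterate the advertisement until the response rate meets the target."""
    for _ in range(max_rounds):
        audience = search(traits)                 # steps 1286/1288/1290
        send_coupons(audience)                    # step 1292
        rate = measure_response_rate(audience)    # reported by point of sale / advertiser
        if rate >= target_rate:                   # step 1294: threshold met
            return traits, rate
        # step 1296: reiterate with adjusted (here simply relaxed) trait thresholds
        traits = {t: max(v - step, 0.0) for t, v in traits.items()}
    return traits, rate

# Stubs standing in for the portal and the advertiser's reporting.
reported = iter([0.04, 0.07, 0.12])               # canned redemption rates per round
search = lambda traits: ["user%d" % i for i in range(20)]
send_coupons = lambda audience: None
measure_response_rate = lambda audience: next(reported)

final_traits, final_rate = run_campaign(search, send_coupons, measure_response_rate,
                                        {"extraversion": 0.8, "openness": 0.7})
print(final_rate)    # -> 0.12 after three rounds
```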
- With reference to FIG. 10, a flow diagram illustrates a method in which one embodiment may be used to provide assistance to a shopper trying to purchase a gift for a recipient. In this regard, the user 100 would be the recipient of the gift, whose image personality 420 has been stored in the social media portal 500, and the second user 460 would be the shopper. In step 3000, the shopper logs into the website of the social media portal 500. In step 3002, an image of the first user 100, in this example a virtual gift recipient, may appear on a screen. In step 3004, the shopper (second user 460) may ask questions of the virtual recipient (image personality 420). Such questions may include questions regarding likes, dislikes, and other indicators of what the recipient may like, by way of example and not by way of limitation, "do you like diamonds?" or "what is your birthstone?" In step 3006, the system may access the image personality 420 of the recipient (user 100), integrated with the knowledge bank 440. In step 3008, the virtual recipient may ask questions of the shopper, by way of example and not by way of limitation, "is there a sale on perfume at the department store?", to try to direct the shopper in his or her purchasing decision. In step 3010, the virtual recipient may respond to the shopper's questions. In step 3012, if the shopper has enough information to make a buying decision, then the shopper stops. Otherwise, the process is reiterated until the shopper is satisfied enough with the answers from the virtual recipient to make such a purchasing decision.
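- By way of illustration, a minimal Python sketch of the shopper dialogue loop of steps 3000 through 3012 follows. The stored preference keys, the keyword matching, and the two-answer stopping rule are assumptions made for this example; a knowledge bank lookup could stand behind the fallback answer.

```python
recipient_preferences = {          # distilled from the stored image personality
    "diamonds": "Yes, I love diamonds.",
    "birthstone": "My birthstone is sapphire.",
}

def ask_virtual_recipient(question: str) -> str:
    """Answer from stored preferences; a knowledge bank lookup could back this up."""
    for keyword, answer in recipient_preferences.items():
        if keyword in question.lower():
            return answer
    return "I'm not sure -- maybe ask me something else."

def shopping_session(questions, needed_signals=2):
    """Iterate questions until enough useful answers are gathered to decide."""
    useful = []
    for q in questions:
        answer = ask_virtual_recipient(q)
        if not answer.startswith("I'm not sure"):
            useful.append((q, answer))
        if len(useful) >= needed_signals:          # shopper satisfied: stop
            break
    return useful

print(shopping_session([
    "Do you like diamonds?",
    "Do you like perfume?",
    "What is your birthstone?",
]))
```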
- Finally, with reference back to FIGS. 8, 9A-9C, and 10, those of skill would recognize that such an embodiment may be used for one or more first users 100 to receive suggestions or help with decisions based on the first user's own personality rather than a virtual one. As an example, a first user 100 may enter/answer questions regarding his or her own personality 420 and, based on his or her wants, needs, likes, or dislikes, the system could recommend purchases and/or deliver answers that are more suited to the first user 100. - The above-disclosed descriptions cover only the most preferred embodiment of the present invention. However, they are not intended to limit the present invention to the most preferred embodiment. Numerous variations and/or modifications are possible within the scope of the present invention.
Claims (19)
1. A system for an interactive query, comprising:
a first input module capable of receiving input for creating a simulated personality for a first user;
an expert system capable of creating and storing the simulated personality;
an output module for presenting the simulated personality to a second user; and
an interactive query module capable of allowing the second user to communicate with the simulated personality of the first user.
2. The system of claim 1 , wherein the first input module comprises an interactive question and answer module for receiving input regarding personality traits of the first user.
3. The system of claim 2, wherein the first input module is further configured for receiving input for creating a plurality of simulated personalities for the first user, each of the plurality of simulated personalities relating to a personality of the user at a time tn.
4. The system of claim 3 , wherein the interactive query module is further configured for allowing the second user to select which of the simulated personalities of the first user with which to communicate.
5. The system of claim 4, wherein each time tn represents a different age of the first user.
6. The system of claim 1, wherein the interactive query module is e-mail based.
7. The system of claim 1, wherein the interactive query module comprises a two-dimensional animated image of the first user.
8. The system of claim 1, wherein the interactive query module comprises a holographic animated image of the first user.
9. The system of claim 1 , wherein the second user comprises a relative of the first user postmortem.
10. The system of claim 1 , wherein the second user comprises a potential dating match of the first user.
11. The system of claim 1 , wherein the second user comprises a potential employer of the first user.
12. The system of claim 1 , wherein the second user comprises a potential advertiser to the first user.
13. The system of claim 1 , wherein the second user comprises a buyer of a gift for the first user.
14. The system of claim 1, wherein the second user comprises an automobile dealer and the first user comprises a potential buyer of an automobile.
15. A method for an interactive query, comprising:
receiving input for creating a simulated personality for a first user;
creating and storing the simulated personality;
presenting the simulated personality to a second user; and
allowing the second user to communicate with the simulated personality of the first user.
16. The method of claim 15 , comprising receiving input regarding personality traits of the first user.
17. The method of claim 16, comprising receiving input for creating a plurality of simulated personalities for the first user, each of the plurality of simulated personalities relating to a personality of the user at a time tn.
18. The method of claim 17 , comprising allowing the second user to select which of the simulated personalities of the first user with which to communicate.
19. The method of claim 17, wherein each time tn represents a different age of the first user.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/345,327 US20170269946A1 (en) | 2012-10-22 | 2016-11-07 | System and method for an interactive query utilizing a simulated personality |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/656,757 US9489679B2 (en) | 2012-10-22 | 2012-10-22 | System and method for an interactive query utilizing a simulated personality |
US15/345,327 US20170269946A1 (en) | 2012-10-22 | 2016-11-07 | System and method for an interactive query utilizing a simulated personality |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/656,757 Continuation US9489679B2 (en) | 2012-10-22 | 2012-10-22 | System and method for an interactive query utilizing a simulated personality |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170269946A1 true US20170269946A1 (en) | 2017-09-21 |
Family
ID=50486256
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/656,757 Active 2033-09-02 US9489679B2 (en) | 2012-10-22 | 2012-10-22 | System and method for an interactive query utilizing a simulated personality |
US15/345,327 Abandoned US20170269946A1 (en) | 2012-10-22 | 2016-11-07 | System and method for an interactive query utilizing a simulated personality |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/656,757 Active 2033-09-02 US9489679B2 (en) | 2012-10-22 | 2012-10-22 | System and method for an interactive query utilizing a simulated personality |
Country Status (1)
Country | Link |
---|---|
US (2) | US9489679B2 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2020075647A1 (en) * | 2018-10-12 | 2021-02-15 | 株式会社豊崎会計事務所 | Information processing device |
US11341962B2 (en) | 2010-05-13 | 2022-05-24 | Poltorak Technologies Llc | Electronic personal interactive device |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102222122B1 (en) * | 2014-01-21 | 2021-03-03 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US9542648B2 (en) * | 2014-04-10 | 2017-01-10 | Palo Alto Research Center Incorporated | Intelligent contextually aware digital assistants |
US9807046B2 (en) * | 2014-09-22 | 2017-10-31 | Peter Mendez | Automated messaging system survivor |
US11188809B2 (en) * | 2017-06-27 | 2021-11-30 | International Business Machines Corporation | Optimizing personality traits of virtual agents |
US10783329B2 (en) * | 2017-12-07 | 2020-09-22 | Shanghai Xiaoi Robot Technology Co., Ltd. | Method, device and computer readable storage medium for presenting emotion |
WO2019133848A1 (en) | 2017-12-30 | 2019-07-04 | Graphen, Inc. | Persona-driven and artificially-intelligent avatar |
US11380094B2 (en) | 2019-12-12 | 2022-07-05 | At&T Intellectual Property I, L.P. | Systems and methods for applied machine cognition |
US12045639B1 (en) * | 2023-08-23 | 2024-07-23 | Bithuman Inc | System providing visual assistants with artificial intelligence |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11341962B2 (en) | 2010-05-13 | 2022-05-24 | Poltorak Technologies Llc | Electronic personal interactive device |
US11367435B2 (en) | 2010-05-13 | 2022-06-21 | Poltorak Technologies Llc | Electronic personal interactive device |
JPWO2020075647A1 (en) * | 2018-10-12 | 2021-02-15 | 株式会社豊崎会計事務所 | Information processing device |
JP7002085B2 (en) | 2018-10-12 | 2022-01-20 | 株式会社豊崎会計事務所 | Information processing equipment |
Also Published As
Publication number | Publication date |
---|---|
US20140114886A1 (en) | 2014-04-24 |
US9489679B2 (en) | 2016-11-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9489679B2 (en) | System and method for an interactive query utilizing a simulated personality | |
Juska | Integrated marketing communication: advertising and promotion in a digital world | |
Kar et al. | Unravelling the techno-functional building blocks of metaverse ecosystems–a review and research agenda | |
Pearson | Personalisation the artificial intelligence way | |
Aw et al. | Tap here to power up! Mobile augmented reality for consumer empowerment | |
Yang | Augmented reality in experiential marketing: The effects on consumer utilitarian and hedonic perceptions and behavioural responses | |
Kisiel | Working through synthetic worlds | |
Jung et al. | The effects of Experience-Technology Fit (ETF) on consumption behavior: Extended Reality (XR) visitor experience | |
Seligman | Artificial intelligence and machine learning and marketing management | |
Elsharnouby et al. | Avatar taxonomy: a new technological tool to enhance the consumer-brand relationships | |
Sweeney | Digital marketing QuickStart guide: The simplified beginner’s guide to developing a scalable online strategy, finding your customers, and profitably growing your business | |
JP2005326670A (en) | Mobile terminal device, information processing method, and service providing system | |
McMurtry | Marketing for dummies | |
Ho | 18 Emotion and communication design | |
Hinson et al. | Social Media Marketing Management: How to Penetrate Emerging Markets and Expand Your Customer Base | |
Rainsberger | Sales technology: An ocean of possibilities | |
Ishigaki et al. | A narrative review of three streams of avatar marketing with potential, examples, and challenges | |
Azman et al. | The Foresight Study of Virtual Reality as An Advertising Tool | |
Polat | Creating Loyal Customers with Digital Marketing Applications: The 5A Model | |
Ertemel | Illusional Marketing: The Use of Storytelling, User Experience and Gamification in Business | |
Guan | The Influence of Customer Three-Stage Experience on Purchase Intention of Derivative Works in Service-Intensive Industry | |
Huang | Real but Fictional: A Research Agenda of Virtual Influencers for Brand Communications in Social Media Marketing | |
Isem Cáceres | Just as Advertised: The Perception and Effectiveness of Artificial Intelligence in Digital Advertising | |
Alm et al. | The Role of AI in Modern Marketing Practices: A multi-method study of how AI can be integrated into the strategic and creative process of marketing creation and the perceptions of Swedish consumers. | |
Vrublevskaia | Effectiveness and universality of artificial intelligence implementation in modern marketing: media and cosmetics industry |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |