
CN119620865A - Notification method for guiding user's visual attention to specific interaction targets using facial tactile feedback - Google Patents


Info

Publication number
CN119620865A
CN119620865A
Authority
CN
China
Prior art keywords
user
target object
feedback
target
tactile feedback
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202411709291.1A
Other languages
Chinese (zh)
Inventor
王巍
高欣怡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan University
Original Assignee
Hunan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan University filed Critical Hunan University
Priority to CN202411709291.1A priority Critical patent/CN119620865A/en
Publication of CN119620865A publication Critical patent/CN119620865A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract


The present invention provides a notification method for guiding a user's visual attention to a specific interaction target using facial tactile feedback, belonging to the field of human-computer interaction. The method comprises: determining the distance between a target object and the user, and its azimuth relative to the user's current facing direction, by identifying the line connecting the target object to the user's eyebrow-center position; guiding the user to look in the corresponding direction by driving, in real time, the feedback unit at the corresponding position; indicating the distance of the target object through the strength, amplitude, duration, and number of driven tactile feedback units; when the user begins to move toward the target object, continuously updating the feedback according to the dynamic changes in distance and orientation, e.g., stronger as the user gets closer, or sweeping from left to right; presenting multi-dimensional notification information about the source interaction target by controlling design parameters such as the strength, amplitude, spatiotemporal pattern, and duration of the tactile feedback units; and determining from the eye-movement focus whether the user's gaze has located the target object, whereupon the tactile feedback stops.

Description

Notification method for guiding visual attention of user to specific interaction target by using facial tactile feedback
Technical Field
The invention relates to the field of human-computer interaction, in particular to a notification method for guiding visual attention of a user to a specific interaction target by using facial tactile feedback.
Background
In usage scenarios of head-mounted virtual, augmented, or mixed reality devices (the XR/VR/AR field), some tasks require a user to quickly shift their line of sight to an interaction target located at the periphery of, or outside, the field of view in a complex visual environment. These interaction targets include, but are not limited to, people, items, windows, and icons that exist in a virtual or real scene; typical task scenarios include three-dimensional spatial navigation, finding items in a virtual or real environment, perceiving social objects, and identifying virtual devices or windows. Compared with audiovisual channels, the haptic sense has the potential to provide more accurate spatial orientation information than audition while avoiding excessive visual sensory load.
The prior art has provided haptic feedback to the head and face through the wearable device itself (e.g., a head-mounted display, HMD) for spatial guidance. Related studies have demonstrated that specific haptic feedback on the head and face can be intuitively mapped to the user's body and eye movements, helping people shift attention to targets both inside and outside the line of sight. FacePush is a device that generates a normal force on the left or right side of the user's face to direct attention toward off-view objects, mapping feedback intensity to rotation angle. Masque adds six actuators to the facial area in contact with the HMD to provide lateral skin stretching; its designers devised several patterns varying both stretch direction and stretch point to guide the user to look or move in a particular direction in the room. Tseng et al. propose a skin-stroking haptic device mounted inside an HMD that produces stroking feedback around the eyes, usable as an off-screen indicator. In general, these devices use tactile stimuli to indicate coarse directions of body movement, constrained by the number and position of the actuators. Some studies have explored tactile stimuli for navigating to both in-view and out-of-view targets. VirtualWhiskers mounts two robotic arms on either side of the face and stimulates a target point calculated from the relative position of the face, covering 180° in the azimuth plane and 90° in the elevation plane. de Jesus Oliveira et al. deployed five vibration actuators on the forehead and two near the temples; their vibrotactile HMD points in the direction of an object in the azimuth plane (202.5° field of view), and the vibration frequency peaks at the correct elevation angle (−22.5° to 45°). In addition to devices attached to the HMD, tabletop ultrasound devices can also provide haptic feedback to the face.
Whiskers applies a set of tactile stimuli, spaced 3.5 mm apart, over the user's cheeks, forehead center, and eyebrows. The direction of motion of the haptic stimulus directs the user's line of sight within a 60° field of view.
These studies provide basic bearing cues for the interaction target but lack any indication of its spatial distance from the user. As a result, when multiple interaction targets lie in the same direction at different distances, partially occluding one another, the user cannot precisely shift visual attention to a particular one. In addition, the earlier approaches do not use an array of feedback units and therefore offer only low haptic resolution. Finally, they cannot deliver a variety of haptic stimuli such as pressure, vibration, and temperature, and thus fail to provide rich haptic feedback effects.
Disclosure of Invention
The present invention provides a notification method that uses facial haptic feedback to direct a user's visual attention to a particular interaction target. Through tactile feedback units that simultaneously provide force, vibration, and temperature feedback, such as a shape-memory-alloy (SMA) haptic device, the invention delivers direction-bearing tactile stimuli to the skin around the user's eyes, supporting visual-tactile fused human-computer interaction. Several haptic feedback units are integrated into an existing head-mounted display to provide feedback on the user's forehead, cheeks, temples, and the skin around the eyes, helping the user locate interaction targets at specific spatial positions and distances in virtual or augmented reality scenarios. When the system identifies an object to be indicated, it calculates the object's position and distance relative to the center of the user's field of view, helping the user's gaze lock onto the target quickly and reducing visual search time. In addition, by controlling haptic stimulus parameters such as the intensity, amplitude, and spatiotemporal pattern of the feedback units, the invention can present multi-dimensional notification information about the source interaction target, such as urgency, event type, progress, emotional state, and movement.
Technical proposal
The notification method for guiding a user's visual attention to a specific interaction target using facial tactile feedback uses tactile feedback generated by a facial tactile array to indicate an interaction target's position and distance relative to the center of the user's field of view, and presents multi-dimensional notification information through changes in tactile parameters.
The invention provides a notification method for guiding visual attention of a user to a specific interaction target by using facial tactile feedback, which comprises the following method steps:
(1) Determining the distance between the target object and the user, and its azimuth relative to the user's current facing direction, by identifying the line connecting the target object to the user's eyebrow-center position;
(2) When the user begins to move toward the target object, continuously updating the feedback in real time according to the dynamic change in orientation between the user and the target, e.g., from left to right;
(3) When the user begins to move toward the target object, continuously updating the feedback in real time according to the dynamic change in their distance, e.g., the closer the user, the stronger the feedback;
(4) Presenting multi-dimensional notifications about the source interaction target, such as urgency, information type, progress, and emotional state, by controlling design parameters of the haptic feedback units such as strength, amplitude, and spatiotemporal pattern;
(5) Determining from the eye-movement focus whether the user's line of sight has located the target object, and then stopping the tactile feedback.
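As an illustrative sketch of step (1), the distance and azimuth of the connecting line can be computed as below. The function names, the 2D world frame, and the angle convention are assumptions for illustration, not part of the patent:

```python
import math

def distance_and_azimuth(eyebrow_pos, facing_deg, target_pos):
    """Return (distance, azimuth) of a target relative to the user.

    eyebrow_pos, target_pos: (x, y) world coordinates (2D assumed for brevity)
    facing_deg: the user's facing direction in degrees (0 = +x axis)
    azimuth: degrees in (-180, 180]; the sign distinguishes left from right
    """
    dx = target_pos[0] - eyebrow_pos[0]
    dy = target_pos[1] - eyebrow_pos[1]
    distance = math.hypot(dx, dy)               # length of the connecting line
    bearing = math.degrees(math.atan2(dy, dx))  # world-frame bearing of the line
    # wrap the bearing into an angle relative to the user's facing direction
    azimuth = (bearing - facing_deg + 180.0) % 360.0 - 180.0
    return distance, azimuth
```

A full 3D implementation would additionally compute an elevation angle, but the wrapping logic for the azimuth plane is the same.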
Preferably, in the step (2), the tactile stimulus on the left side of the face directs the user to look to the left, and the tactile stimulus on the right side of the face directs the user to look to the right.
Preferably, step (3) adopts a principle analogous to the visual "near-large, far-small" effect: the closer the target is to the user, the larger the area of tactile stimulation, which reduces the user's learning cost.
Preferably, in step (4), multi-dimensional notifications about the source interaction target, including but not limited to urgency, information type, progress, emotional state, and movement, are presented by controlling design parameters of the haptic feedback units such as strength, amplitude, and spatiotemporal pattern; for example, stronger intensity represents a more urgent event, and two consecutive tactile pulses represent a social-app message.
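A minimal sketch of such a parameter mapping is given below. The thresholds, unit counts, and attribute names are hypothetical illustrations chosen for this example, not values specified by the invention:

```python
def notification_to_haptics(urgency, info_type, distance):
    """Map notification attributes to haptic design parameters (step 4).

    urgency: 0.0-1.0; info_type: e.g. "social_message"; distance: metres.
    """
    intensity = max(0.2, min(1.0, urgency))             # stronger = more urgent
    pulses = 2 if info_type == "social_message" else 1  # double pulse for social info
    # "near-large, far-small": closer targets activate more units over a wider area
    active_units = max(1, round(6 / (1.0 + distance)))
    return {"intensity": intensity, "pulses": pulses, "units": active_units}
```

In practice each (attribute, parameter) pair would be calibrated in a user study; the point of the sketch is only that the mapping is an explicit, tunable function.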
The invention has the beneficial effects that:
1. The invention provides real-time, dynamic cues of relative bearing and distance, allowing the interaction target that is the source of information to be located more accurately in complex scenes.
2. The invention fuses multi-modal haptic feedback such as force, vibration, and heat, providing a rich haptic experience.
3. The invention presents multi-dimensional notifications about source interaction targets, including but not limited to urgency, information type, progress, emotional state, and movement, by controlling design parameters of the haptic feedback units such as force, amplitude, and spatiotemporal pattern.
4. The haptic feedback unit array of the present invention provides higher-resolution haptic stimuli.
Drawings
To illustrate the technical solutions of the present disclosure more clearly, the drawings used in some embodiments are briefly described below. The drawings in the following description are only those of some embodiments of the present disclosure; those of ordinary skill in the art may derive other drawings from them. Furthermore, these drawings should be regarded as schematic: they do not limit the actual size of products, the actual flow of methods, or the actual timing of signals in the embodiments of the present disclosure.
FIG. 1 is a reference schematic diagram of facial positioning;
FIG. 2 is a schematic view of a user usage scenario 1;
FIG. 3 is a schematic view of a user usage scenario 2;
fig. 4 is a schematic view of a user usage scenario 3.
Detailed Description
To help those skilled in the art better understand the technical solution of the present application, the solution is described clearly and completely below with reference to the accompanying drawings of its embodiments. The described embodiments are only some, not all, of the embodiments of the present application; all other embodiments obtained by those skilled in the art from these embodiments without inventive effort shall fall within the scope of the present application.
The present invention will be described in further detail by way of examples.
Example 1
As shown in fig. 1, a notification method for guiding visual attention of a user to a specific interaction target by using facial tactile feedback comprises the following method steps:
(1) Determining the distance between the target object and the user, and its azimuth relative to the user's current facing direction, by identifying the line connecting the target object to the user's eyebrow-center position;
(2) Guiding the user to look toward the corresponding direction by driving the feedback unit at the corresponding position in real time (e.g., a tactile stimulus on the left side of the face guides the user to look to the left);
(3) Indicating the distance of the target object through the number, force, amplitude, and duration of the driven tactile feedback units (e.g., the closer the target is to the user, the larger the area of tactile stimulation, following a principle analogous to the visual "near-large, far-small" effect, which reduces the user's learning cost);
(4) Presenting multi-dimensional notifications about the source interaction target, including but not limited to urgency, information type, progress, and emotional state, by controlling design parameters of the haptic feedback units such as force, amplitude, and spatiotemporal pattern. For example, stronger force represents a more urgent event, and two consecutive tactile pulses represent a social-app message. When the user begins to move toward the target object, the feedback is updated continuously in real time according to the dynamic changes in distance and orientation, e.g., stronger as the user gets closer, or sweeping from left to right.
(5) Determining from the eye-movement focus whether the user's line of sight has located the target object, and then stopping the tactile feedback.
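The guidance loop of steps (2)-(5) can be sketched as follows. Every callback (target tracking, eye tracking, actuator driver) is left abstract, and all names and the distance-to-strength curve are assumptions for illustration:

```python
import time

def guide_attention(get_target_offset, gaze_on_target, drive, stop_units,
                    period_s=0.05):
    """Drive the facial unit on the target's side until the gaze lands on it.

    get_target_offset(): returns (distance, azimuth_deg) of the target
    gaze_on_target(): True once the eye-tracking focus is on the target (step 5)
    drive(side, strength): actuate the units on the "left" or "right" (step 2)
    stop_units(): stop all tactile feedback
    """
    while not gaze_on_target():
        distance, azimuth = get_target_offset()
        side = "left" if azimuth < 0 else "right"
        strength = min(1.0, 1.0 / (1.0 + distance))  # closer -> stronger (step 3)
        drive(side, strength)
        time.sleep(period_s)
    stop_units()
```

The loop polls the eye tracker each cycle, so feedback stops within one period of the gaze fixating the target, matching the stop condition in step (5).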
Example 2
As shown in FIG. 2, usage scenario 1: social-object recognition in an XR scene.
In an XR scenario, a user needs to perceive the locations of virtual and real social objects at the same time. While immersed in a virtual reality scene, the user can perceive the relative position and distance of social objects in the real scene through haptic feedback without breaking immersion, and can also perceive a social object's state and intent through the feedback. When the user exits virtual reality and sees the real social object, the haptic feedback stops.
Specifically, a camera first identifies the people in the room, and the spatial coordinates of real and virtual objects are converted into a coordinate system centered on the user's eyebrow point with the central line of sight as the X axis. From these coordinates the system outputs the target's distance from the user and its position relative to the center of the user's field of view, then derives the actuation parameters of the haptic feedback units (e.g., spatiotemporal pattern, duration, pressure, and temperature). The feedback units for that position actuate according to the returned parameters; when the user's eye-movement data indicate that the gaze has fixated on the social object, actuation stops.
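The coordinate conversion above (world coordinates into an eyebrow-centered frame whose X axis is the central line of sight) might be sketched as below. The basis construction and the z-up convention are assumptions, not details from the patent:

```python
import numpy as np

def to_user_frame(p_world, eyebrow_world, gaze_dir, up=(0.0, 0.0, 1.0)):
    """Express a world point in a frame centered on the user's eyebrow point,
    with the X axis along the central line of sight."""
    x = np.asarray(gaze_dir, float)
    x /= np.linalg.norm(x)
    z = np.asarray(up, float) - np.dot(up, x) * x   # up vector, made orthogonal to gaze
    z /= np.linalg.norm(z)
    y = np.cross(z, x)                              # completes a right-handed basis
    rot = np.stack([x, y, z])                       # rows are the user-frame axes
    return rot @ (np.asarray(p_world, float) - np.asarray(eyebrow_world, float))
```

In the resulting frame, the X component is depth along the gaze, so azimuth and elevation of a target fall out directly from the transformed coordinates.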
Example 3
As shown in FIG. 3, usage scenario 2: message tracing across multiple virtual windows.
In an XR home scenario, a user may place multiple windows at different spatial locations simultaneously; for example, a timer window in the kitchen and a television window in the living room. If a notification pops up from the kitchen timer while the user is watching the television window, the user can perceive through tactile feedback that the notifying window lies in the direction of the distant kitchen, and thus accurately localize the spatial source of the notification. The cooking progress can also be judged from the intensity of the tactile stimulus.
The specific steps are as follows: a chat window first issues an information notification, and the related information of the notification (e.g., urgency and information type) is obtained. The spatial coordinates of the virtual window are then output and converted into a coordinate system centered on the user's eyebrow point with the central line of sight as the X axis. From these coordinates the system outputs the window's distance from the user and its position relative to the center of the user's field of view, then derives the actuation parameters of the haptic feedback units (e.g., the number of driven units). The feedback units for that position actuate according to the returned parameters; when the user's eye-movement data indicate that the gaze has fixated on the window, actuation stops.
Example 4
As shown in FIG. 4, usage scenario 3: finding objects in a real scene.
In some situations, such as visiting a museum, a user may want to view a particular painting but not know its specific location and therefore need to search for it. The position of the target painting relative to the center of the user's field of view can be determined from the camera combined with indoor positioning information, and tactile feedback cues can help the user find the corresponding painting. As the user approaches the target object, the haptic feedback gradually intensifies, and navigation cues are given through dynamic changes of orientation across the haptic array, such as left to right or bottom to top, until the target object appears within the user's field of view.
Specifically, the camera first recognizes the paintings in several directions and, together with the spatial information of the museum and the user's current position, obtains the spatial coordinates of the real object. These are converted into a coordinate system centered on the user's eyebrow point with the central line of sight as the X axis. From these coordinates the system outputs the painting's distance from the user and its position relative to the center of the user's field of view, then derives the actuation parameters of the haptic feedback units (e.g., spatiotemporal pattern). The feedback units actuate according to the returned parameters; when the user's eye-movement data indicate that the gaze has fixated on the target, actuation stops.
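The left-to-right (or bottom-to-top) sweep used for navigation is a spatiotemporal pattern across the unit array; a toy schedule generator might look like this, with unit labels and timings invented for illustration:

```python
def sweep_schedule(unit_ids, direction="forward", step_s=0.1):
    """Return (unit_id, onset_time) pairs that fire the units one after
    another, suggesting a movement direction across the facial array."""
    order = list(unit_ids) if direction == "forward" else list(reversed(unit_ids))
    return [(unit, i * step_s) for i, unit in enumerate(order)]
```

Played repeatedly while the user walks, and combined with a distance-to-intensity mapping, such a schedule yields the "gradually enhanced, left-to-right" guidance described in this example.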
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.
While the fundamental principles and main features of the present invention and advantages of the present invention have been described above and illustrated, it will be apparent to those skilled in the art that the present invention is not limited to the details of the above exemplary embodiments, but may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Furthermore, it should be understood that although the present disclosure describes embodiments, not every embodiment contains only a single independent technical solution; this manner of description is adopted merely for clarity. The specification should be taken as a whole, and the technical solutions in the various embodiments may be combined as appropriate to form other embodiments that will be apparent to those skilled in the art.

Claims (5)

1. A notification method for guiding a user's visual attention to a specific interaction target using facial tactile feedback, characterized by comprising the following steps:
(1) Determining the distance between the target object and the user, and its azimuth relative to the user's current facing direction, by identifying the line connecting the target object to the user's eyebrow-center position;
(2) When the user begins to move toward the target object, continuously updating the feedback in real time according to the dynamic change in orientation between the user and the target, e.g., from left to right;
(3) When the user begins to move toward the target object, continuously updating the feedback in real time according to the dynamic change in their distance, e.g., the closer the user, the stronger the feedback;
(4) Presenting multi-dimensional notifications about the source interaction target, such as urgency, information type, progress, and emotional state, by controlling design parameters of the haptic feedback units such as strength, amplitude, and spatiotemporal pattern.
2. (5) Determining from the eye-movement focus whether the user's line of sight has located the target object, and then stopping the tactile feedback.
3. The method of claim 1, wherein in step (2) a tactile stimulus on the left side of the face directs the user to look to the left, and a tactile stimulus on the right side of the face directs the user to look to the right.
4. The method of claim 1, wherein step (3) adopts a principle analogous to the visual "near-large, far-small" effect: the closer the target is to the user, the larger the area of tactile stimulation, which reduces the user's learning cost.
5. The method of claim 1, wherein in step (4) multi-dimensional notifications about the source interaction target, including but not limited to urgency, information type, progress, emotional state, and movement, are presented by controlling design parameters of the haptic feedback units such as intensity, amplitude, and spatiotemporal pattern; for example, stronger intensity represents a more urgent event, and two consecutive tactile pulses represent a social-app message.
CN202411709291.1A 2024-11-27 2024-11-27 Notification method for guiding user's visual attention to specific interaction targets using facial tactile feedback Pending CN119620865A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202411709291.1A CN119620865A (en) 2024-11-27 2024-11-27 Notification method for guiding user's visual attention to specific interaction targets using facial tactile feedback


Publications (1)

Publication Number Publication Date
CN119620865A true CN119620865A (en) 2025-03-14

Family

ID=94886335

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202411709291.1A Pending CN119620865A (en) 2024-11-27 2024-11-27 Notification method for guiding user's visual attention to specific interaction targets using facial tactile feedback

Country Status (1)

Country Link
CN (1) CN119620865A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130218456A1 (en) * 2006-02-16 2013-08-22 John S. Zelek Wearable tactile navigation system
CN103282742A (en) * 2010-11-10 2013-09-04 高通股份有限公司 Haptic based personal navigation
CN106843475A (en) * 2017-01-03 2017-06-13 京东方科技集团股份有限公司 A kind of method and system for realizing virtual reality interaction
CN108334190A (en) * 2016-12-27 2018-07-27 意美森公司 Use the touch feedback of visual field
CN116829906A (en) * 2021-01-29 2023-09-29 圆点流明有限公司 Computer-implemented method, wearable device, non-transitory computer-readable storage medium, computer program, and system for assisting visually impaired user movement



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination