
CN113469058B - Method and mobile device for preventing myopia - Google Patents

Method and mobile device for preventing myopia

Info

Publication number
CN113469058B
Authority
CN
China
Prior art keywords
user
viewing
posture
prompt
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110751416.7A
Other languages
Chinese (zh)
Other versions
CN113469058A (en)
Inventor
贺曙
高炜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Future Technology Co ltd
Original Assignee
Guangdong Future Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Future Technology Co ltd filed Critical Guangdong Future Technology Co ltd
Priority to CN202110751416.7A priority Critical patent/CN113469058B/en
Publication of CN113469058A publication Critical patent/CN113469058A/en
Application granted granted Critical
Publication of CN113469058B publication Critical patent/CN113469058B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The present invention provides a method for preventing myopia and a mobile device, which are used to remind users to use their eyes correctly to prevent myopia. The method includes: when the screen is turned on, tracking the user's facial image through a front camera, and calibrating the coordinates of each feature point on the facial image; determining whether the user has a correct viewing posture based on the coordinates of each feature point on the facial image; if not, determining the duration of the user maintaining an incorrect viewing posture; when the duration is greater than a first threshold, sending a first prompt message to the user, the first prompt message being used to prompt the user to maintain a correct eye posture.

Description

Myopia prevention method and mobile device
[ Field of technology ]
The invention relates to the field of eye detection, and in particular to a method and a mobile device for preventing myopia.
[ Background Art ]
During children's development, children read, write, and use various electronic devices with increasing frequency. Myopia, strabismus, and hunchback are common ailments among students, and the main cause is that poor sitting posture during long periods of study gradually becomes an ingrained habit that is difficult to correct through a teacher's reminders alone. Myopia in particular has seen its incidence rise in recent years, and the age of onset has been trending younger. During study, maintaining a correct sitting posture, a proper reading and viewing angle, suitable reading light intensity, and proper eye use are all very important for preventing myopia. Various devices have been developed in the prior art to correct students' bad eye-use habits and prevent myopia, such as sitting posture correctors and various desk lamps.
It should be noted that, nowadays, more and more cases of myopia are caused by the use of mobile devices such as mobile phones and iPads, so how to correct users' bad eye-use habits while they use mobile devices is a problem that currently needs to be solved.
[ Invention ]
In order to solve the problem of myopia caused by a user's bad eye-use habits when using a mobile device, the invention provides a myopia prevention method and a mobile device.
The technical scheme is as follows: when the screen is lit, a front camera tracks the user's face image and calibrates the coordinates of the feature points on the face image; whether the user conforms to a correct viewing posture is determined from the coordinates of those feature points; if not, the duration for which the user maintains the incorrect viewing posture is determined; and when the duration is greater than a first threshold, a first prompt message is sent to the user, the first prompt message prompting the user to maintain a correct eye-use posture.
Preferably, the first prompt message prompts the user that the viewing distance is unhealthy, or that the viewing angle is unhealthy, or that the viewing posture is unhealthy.
Preferably, when the first prompt message prompts the user that the viewing distance is unhealthy, determining whether the user conforms to the correct viewing posture from the coordinates of the feature points on the face image includes: determining the comprehensive imaging size of the user's characteristic part from the coordinates of the feature points; calculating the viewing distance from the comprehensive imaging size of the characteristic part and preset standard data; and, when the viewing distance is greater than the eye-safe distance, determining that the user does not conform to the correct viewing posture.
Preferably, the characteristic part is the pupil, the comprehensive imaging size of the characteristic part is the pupil distance, and determining the comprehensive imaging size of the user's characteristic part includes calculating the pupil distance L between the user's two eyes by the following formula: L = √((x2 - x1)² + (y2 - y1)²), where the coordinates of the left pupil are (x1, y1) and the coordinates of the right pupil are (x2, y2).
Preferably, calculating the viewing distance includes: calculating a lateral viewing angle and a longitudinal viewing angle from the deviation of the left pupil and the right pupil from the picture origin, based on the formulas α = |(x2 + x1)| and β = |(y2 + y1)|, where α is related to the lateral viewing angle, β is related to the longitudinal viewing angle, and the picture center point is the pre-calibrated origin of the coordinate system; when α is less than a second threshold and β is less than a third threshold, calculating an observation angle from the lateral viewing angle and the longitudinal viewing angle; and calculating the viewing distance from the observation angle and the pupil distance of the user's two eyes by the formula M = L / cos δ, where δ denotes the observation angle and L denotes the pupil distance of the user's two eyes.
Preferably, when α is greater than the second threshold or β is greater than the third threshold, a first prompt message prompting the user that the viewing angle is unhealthy is sent to the user.
Preferably, when the first prompt message prompts the user that the viewing posture is unhealthy, determining whether the user conforms to the correct viewing posture includes: drawing a line in the face image from the pupil center point to the Renzhong acupoint (philtrum) and extending it in both directions to segment the face image; counting the sum of skin-color pixels in each segmented part to determine the left face area and the right face area respectively; calculating the degree of strabismus according to the formula d = (L - R)/(L + R), where d denotes the degree of strabismus, L denotes the left face area, and R denotes the right face area; and, when the degree of strabismus is greater than a fourth threshold, determining that the user does not conform to the correct viewing posture.
Preferably, when the first prompt message prompts the user that the viewing posture is unhealthy, determining whether the user conforms to the correct viewing posture includes: determining from the user's face image that the user is viewing the mobile device; inferring the spatial posture of the face from the relative spatial position of the phone and the face together with the horizontal attitude data of the nine-axis gyroscope in the mobile device; and, if the horizontal attitude data is within a preset range and the spatial posture of the face is within a preset range, judging whether the user is viewing in a side-lying or supine position and determining that the user does not conform to the correct viewing posture.
Preferably, the first prompt message includes a message displayed at the top of the screen of the mobile device, a message covering the screen of the mobile device, or a message sent to a preset telephone number.
The invention further provides a mobile device for preventing myopia, which includes a camera, a processor, and a display module. The camera tracks the user's face image when the screen is lit and calibrates the coordinates of each feature point on the face image; the processor determines whether the user conforms to a correct viewing posture from the coordinates of those feature points and, if not, determines the duration for which the user maintains the incorrect viewing posture; the display module sends a first prompt message to the user when the duration is greater than a first threshold, the first prompt message prompting the user to maintain a correct viewing posture.
The embodiment of the present application provides a myopia prevention method that specifically includes: when the screen is lit, tracking the user's face image through a front camera and calibrating the coordinates of each feature point on the face image; determining whether the user conforms to a correct viewing posture from the coordinates of those feature points; if not, determining the duration for which the user maintains the incorrect viewing posture; and, when the duration is greater than a first threshold, sending the user a first prompt message prompting the user to maintain a correct eye-use posture. When it is determined that the user is using the mobile device incorrectly, the user is reminded to maintain a correct eye-use posture so as to prevent myopia.
[ Description of the drawings ]
FIG. 1 is a schematic flowchart of a method for preventing myopia according to an embodiment of the present application;
FIG. 2 is a schematic flowchart of a process for determining whether a user conforms to a correct viewing posture according to an embodiment of the present application.
[ Detailed description of the invention ]
The following clearly and completely describes the technical solutions in the embodiments of the present invention. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
In order to solve the problem of myopia caused by a user's bad eye-use habits when using a mobile device, the invention provides a myopia prevention method and a mobile device, which prompt the user to correct bad eye-use habits while using the mobile device so as to prevent myopia.
Referring to FIG. 1, the myopia prevention method provided by the present invention specifically includes the following steps:
101. When the screen is lit, tracking the user's face image through a front camera, and calibrating the coordinates of each feature point on the face image;
When the user is using the mobile device and the screen is on, the user's face image is tracked through the front camera of the mobile device. After the front camera captures the face image, the coordinates of each feature point are calibrated on the face image. The feature points in the face image may be, for example, the pupils, eye-corner feature points, or eye-center feature points, and are not particularly limited here.
There are various ways to calibrate the coordinates of the feature points on the face image. For example: preprocess the face image; extract corner points from the preprocessed image; filter and merge the corner points to obtain connected corner-point regions; and extract the centroid of each connected region. Corner extraction may specifically be performed by computing the brightness difference between the current pixel and its surrounding pixels according to a predefined 3×3 template and taking pixels whose brightness difference is greater than or equal to a given threshold as corner points, where the 3×3 template is formed by the current pixel at the center together with the pixels to its left, right, top, bottom, top-left, top-right, bottom-left, and bottom-right. The extracted centroids are then matched against a face template, the matching probability between the centroids and the face template is computed, and a region formed by centroids whose matching probability is greater than or equal to a preset value is located as a candidate face region. The face template may be a rectangular template containing at least three points, where each point is represented as (P, w, h): P is the two-dimensional coordinate of the point, w is the maximum horizontal range allowed to the left and right of the point, and h is the maximum vertical range allowed above and below the point.
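For illustration only, the following minimal sketch shows the kind of 3×3 brightness-difference corner test described above on a grayscale image held in a NumPy array; the function name, the threshold value, and the rule of averaging the difference over the eight neighbours are assumptions, not details taken from the embodiment.

```python
import numpy as np

def corner_candidates(gray: np.ndarray, diff_thresh: float = 40.0) -> np.ndarray:
    """Return (row, col) coordinates of pixels whose mean absolute brightness
    difference to their 8 neighbours (the 3x3 template) reaches diff_thresh."""
    g = gray.astype(np.float32)
    h, w = g.shape
    center = g[1:-1, 1:-1]
    total_diff = np.zeros_like(center)
    # the 8 neighbours of the current pixel in the 3x3 template
    for dy, dx in [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                   (0, 1), (1, -1), (1, 0), (1, 1)]:
        total_diff += np.abs(center - g[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx])
    mask = np.zeros((h, w), dtype=bool)
    mask[1:-1, 1:-1] = (total_diff / 8.0) >= diff_thresh
    return np.argwhere(mask)
```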
Alternatively, the coordinates of the feature points on the face image may be calibrated through AI face recognition; landmark-annotation schemes with different numbers of points exist. The most critical facial points are 5 in number: the left and right mouth corners, the centers of the two eyes, and the nose. These 5 key points belong to the core facial key points, and the pose of the face can be calculated from them. There were also earlier schemes that labeled 4 or 6 points. The FRGC-V2 (Face Recognition Grand Challenge version 2.0) dataset, published in 2005, labels key points for the eyes, nose, mouth, and chin. The Caltech 10000 Web Faces dataset, published in 2007, labels 4 key points for the eyes, nose, and mouth. The 2013 AFW dataset labels 6 key points for the eyes, nose, and lips, of which 3 points are on the lips. The MTFL/MAFL dataset, published in 2014, labels 5 key points: the eyes, the nose, and the 2 mouth corners. The 68-point annotation is the most common scheme today; it was first proposed with the XM2VTSDB dataset in 1999, is also used by the 300W, XM2VTS, and other datasets, and is adopted in the Dlib algorithm used with OpenCV.
The 68-point annotation also has several different versions; the most common one, used in Dlib, is introduced here. It divides the face key points into internal key points and contour key points: the internal key points comprise 51 points covering the eyebrows, eyes, nose, and mouth, and the contour comprises 17 points. In the 68-point Dlib annotation, each eyebrow has 5 key points sampled uniformly from its left boundary to its right boundary, for 5 × 2 = 10 points. Each eye has 6 key points, consisting of the left and right boundaries and uniform samples on the upper and lower eyelids, for 6 × 2 = 12 points. The lips are divided into 20 key points: apart from the 2 mouth corners, they are split into the upper and lower lips, with 5 uniformly sampled points on each outer boundary and 3 uniformly sampled points on each inner boundary. The nose annotation adds 4 key points on the nose bridge, and 5 key points are sampled uniformly on the nose tip, for 9 points. The facial contour is sampled uniformly with 17 key points. If the forehead is included, even more points, such as 81 key points, can be obtained. The technology for calibrating the feature points on a face image is therefore prior art and is not described in detail here.
Therefore, the method for obtaining the coordinates of each feature point on the face image is not particularly limited.
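For concreteness, here is a minimal sketch of obtaining eye-related feature point coordinates with the Dlib 68-point shape predictor mentioned above; the model file path is an assumption (the model file is distributed separately by Dlib), and approximating each pupil by the centroid of the six eye landmarks is a simplification rather than the embodiment's own method.

```python
import cv2
import dlib
import numpy as np

PREDICTOR_PATH = "shape_predictor_68_face_landmarks.dat"  # assumed local path

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor(PREDICTOR_PATH)

def approximate_pupil_coordinates(image_bgr: np.ndarray):
    """Return approximate (x, y) centers of the two eyes, taken as the mean of
    the six eye landmarks on each side (indices 36-41 and 42-47 of the 68-point scheme)."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector(gray)
    if len(faces) == 0:
        return None
    shape = predictor(gray, faces[0])
    points = np.array([(shape.part(i).x, shape.part(i).y) for i in range(68)],
                      dtype=np.float32)
    eye_one = points[36:42].mean(axis=0)
    eye_two = points[42:48].mean(axis=0)
    return eye_one, eye_two
```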
102. Determining whether the user conforms to the correct viewing posture according to the coordinates of each feature point on the face image;
103. If not, determining the duration for which the user maintains the incorrect viewing posture;
After the coordinates of each feature point on the face image are determined, whether the user conforms to the correct viewing posture is determined from those coordinates. A correct viewing posture may include a correct viewing distance, a correct viewing angle, and a correct viewing posture proper. Referring to FIG. 2, the flow for determining whether the user conforms to the correct viewing posture according to the present invention includes:
1021. Determining the comprehensive imaging size of the user's characteristic part according to the coordinates of the feature points;
In practical applications, one or more groups of feature points in the face image may be the pupils, or feature points equivalent to the pupils, such as eye-corner or eye-center feature points. In this application, the characteristic part is taken to be the pupils, so the comprehensive imaging size of the characteristic part is the pupil distance.
Specifically, the pupil distance L between the user's two eyes can be calculated by the following formula:
L = √((x2 - x1)² + (y2 - y1)²)
where the coordinates of the left pupil are (x1, y1) and the coordinates of the right pupil are (x2, y2).
1022. Calculating the viewing distance according to the comprehensive imaging size of the characteristic part and preset standard data;
After the comprehensive imaging size of the user's characteristic part, i.e. the pupil distance of the two eyes, has been determined, a lateral viewing angle and a longitudinal viewing angle are calculated from the deviation of the left pupil and the right pupil from the picture origin. Specifically, they are calculated by the following formulas:
α = |(x2 + x1)|;
β = |(y2 + y1)|;
where α is related to the lateral viewing angle, β is related to the longitudinal viewing angle, and the picture center point is the pre-calibrated origin of the coordinate system.
When α is less than the second threshold and β is less than the third threshold, an observation angle is calculated from the lateral viewing angle and the longitudinal viewing angle, and the viewing distance is then calculated from the observation angle and the pupil distance of the user's two eyes by the following formula:
M = L / cos δ
where δ denotes the observation angle and L denotes the pupil distance of the user's two eyes.
1023. When the viewing distance is greater than the eye-safe distance, it is determined that the user does not conform to the correct viewing posture.
After the viewing distance is obtained, it is compared with the eye-safe distance, and when the viewing distance is greater than the eye-safe distance, it is determined that the user does not conform to the correct viewing posture.
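Purely as an illustration of steps 1021 to 1023, the sketch below strings the stated formulas together; the thresholds and the eye-safe distance are placeholder values, the rule for combining the lateral and longitudinal viewing angles into the observation angle δ is an assumption (the text only says δ is derived from them), and the conversion of the imaged pupil distance using the preset standard data is omitted.

```python
import math

def viewing_metrics(x1, y1, x2, y2):
    """Quantities of steps 1021-1022; pupil coordinates are relative to the
    pre-calibrated picture-center origin."""
    L = math.hypot(x2 - x1, y2 - y1)   # step 1021: imaged pupil distance
    alpha = abs(x2 + x1)               # related to the lateral viewing angle
    beta = abs(y2 + y1)                # related to the longitudinal viewing angle
    # combining alpha and beta into the observation angle delta is an assumption
    delta = math.atan(math.hypot(alpha, beta))
    M = L / math.cos(delta)            # step 1022: M = L / cos(delta)
    return alpha, beta, M

def posture_verdict(alpha, beta, M,
                    second_threshold=0.2, third_threshold=0.2, safe_distance=300.0):
    """Placeholder thresholds only; real values come from calibration/standard data."""
    if alpha > second_threshold or beta > third_threshold:
        return "viewing angle unhealthy"
    if M > safe_distance:              # step 1023
        return "viewing distance unhealthy"
    return "correct viewing posture"
```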
Optionally, in the embodiment of the present application, determining whether the user conforms to the correct viewing posture may further be done in the following ways:
1. When α is greater than the second threshold or β is greater than the third threshold, a first prompt message prompting the user that the viewing angle is unhealthy is sent to the user.
2. A line is drawn in the face image from the pupil center point to the Renzhong acupoint (philtrum) and extended in both directions to segment the face image; the sum of skin-color pixels in each segmented part is counted to determine the left face area and the right face area respectively; the degree of strabismus is calculated according to the formula d = (L - R)/(L + R), where d denotes the degree of strabismus, L denotes the left face area, and R denotes the right face area; and, when the degree of strabismus is greater than the fourth threshold, it is determined that the user does not conform to the correct viewing posture, as sketched below.
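A minimal sketch of the strabismus measure d = (L - R)/(L + R), assuming the face has already been segmented along the pupil-center-to-philtrum line and that a binary skin-color mask is available; the near-vertical split and the skin-color model are simplifications not taken from the embodiment.

```python
import numpy as np

def strabismus_degree(skin_mask: np.ndarray, split_col: int) -> float:
    """skin_mask: boolean array marking skin-color pixels of the face image.
    split_col: column approximating the pupil-center-to-philtrum dividing line."""
    left_area = float(skin_mask[:, :split_col].sum())    # L: left-face skin pixels
    right_area = float(skin_mask[:, split_col:].sum())   # R: right-face skin pixels
    if left_area + right_area == 0:
        return 0.0
    return (left_area - right_area) / (left_area + right_area)

# an incorrect viewing posture would be flagged when the degree exceeds the fourth threshold
```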
In addition, in a smartphone, the gyroscope is a sensor mainly used to detect the attitude of the phone: motion-sensing games depend on it, it is used for image stabilization when taking photos, and it can sometimes help navigation achieve better positioning. The earliest gyroscopes were mechanical, bulky devices that really did contain a rotor spinning at high speed; such mechanical parts demand very high machining precision and are sensitive to vibration, so navigation systems based on mechanical gyroscopes were never very accurate. In today's smartphones, the gyroscope sensor has evolved into a small chip and can be regarded as an upgrade of the acceleration sensor: whereas an acceleration sensor can only sense linear motion along a given axis, a gyroscope can sense rotation and motion in 3D space. It can therefore recognize direction, determine attitude, and calculate angular velocity.
In view of the above, the embodiment of the present application can also judge whether the user conforms to the correct viewing posture through the gyroscope. Specifically, in a third manner, it is determined from the user's face image that the user is viewing the mobile device; the spatial posture of the face is inferred from the relative spatial position of the phone and the face together with the horizontal attitude data of the nine-axis gyroscope in the mobile device; and if the horizontal attitude data is within a preset range and the spatial posture of the face is within a preset range, it is judged whether the user is viewing in a side-lying or supine position, and the user is determined not to conform to the correct viewing posture, as sketched below.
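The check can be sketched as follows, assuming the device reports horizontal attitude angles (for example pitch and roll in degrees) and that a face-pose estimate is already available; the interval bounds are illustrative placeholders, not values from the embodiment.

```python
def lying_posture_suspected(device_pitch_deg: float, device_roll_deg: float,
                            face_yaw_deg: float, face_pitch_deg: float) -> bool:
    """Return True when the device's horizontal attitude and the face's spatial
    posture both fall in ranges typical of side-lying or supine viewing."""
    # device lying roughly flat or strongly tilted sideways (placeholder bounds)
    device_in_range = abs(device_pitch_deg) < 20 or abs(device_roll_deg) > 60
    # face pose consistent with looking up at the device or lying on one side
    face_in_range = abs(face_pitch_deg) > 30 or abs(face_yaw_deg) > 30
    return device_in_range and face_in_range
```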
In summary, the embodiments of the present application provide several methods for determining whether a user conforms to a correct viewing posture; these methods can also be combined and integrated together in the mobile device so as to determine more accurately that the user's viewing posture is incorrect and prompt the user toward healthy eye use to prevent myopia.
104. And when the duration is greater than a first threshold, sending a first prompt message to the user, wherein the first prompt message is used for prompting the user to keep the correct eye-using posture.
When the duration is greater than the first threshold, a first prompt message is sent to the user to remind the user to maintain a correct eye-use posture. Specifically, the first prompt message may be displayed at the top of the screen of the mobile device; or it may cover the screen of the mobile device to prompt the user that the distance to the screen is less than the eye-safe distance; or the screen may be turned off and the first prompt message broadcast by voice; or the first prompt message may be sent to a preset telephone number as a notification. For example, when the viewing distance is too close, the first prompt message is an unhealthy-viewing-distance warning signal; when the viewing angle is greater than the second threshold, the first prompt message is an unhealthy-viewing-angle warning signal; and when the degree of strabismus is greater than the fourth threshold, the user is determined not to conform to the correct viewing posture and the first prompt message is an unhealthy-viewing-posture warning signal.
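As a rough illustration of step 104, the loop below tracks how long an incorrect posture persists and issues the first prompt message once the duration exceeds the first threshold; the threshold value, the polling interval, and the two callables are placeholders.

```python
import time

FIRST_THRESHOLD_SECONDS = 30.0  # placeholder value for the first threshold

def monitor(posture_is_correct, send_first_prompt, poll_interval=1.0):
    """posture_is_correct: callable returning True/False for the current frame.
    send_first_prompt: callable delivering the first prompt message
    (on-screen banner, full-screen cover, voice broadcast, or SMS to a preset number)."""
    incorrect_since = None
    while True:
        if posture_is_correct():
            incorrect_since = None                 # posture recovered, reset the timer
        elif incorrect_since is None:
            incorrect_since = time.monotonic()     # incorrect posture just started
        elif time.monotonic() - incorrect_since > FIRST_THRESHOLD_SECONDS:
            send_first_prompt()                    # remind the user to keep a correct eye-use posture
            incorrect_since = None                 # avoid re-prompting every poll
        time.sleep(poll_interval)
```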
In addition, an embodiment of the present application also provides a mobile device for preventing myopia, comprising a camera, a processor, and a display module. The camera tracks the user's face image when the screen is lit and calibrates the coordinates of each feature point on the face image; the processor determines whether the user conforms to a correct viewing posture from the coordinates of those feature points and, if not, determines the duration for which the user maintains the incorrect viewing posture; the display module sends a first prompt message to the user when the duration is greater than a first threshold, the first prompt message prompting the user to maintain a correct viewing posture.
The foregoing description covers only one or several embodiments of the present invention and is not intended to limit the scope of the invention. Any equivalent structure or equivalent process made using the description and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the protection scope of the present invention.

Claims (5)

1. A method for preventing myopia, the method being applied to a mobile device, characterized in that it comprises the following steps:
when the screen is on, tracking the user's face image through the front camera, and calibrating the coordinates of each feature point on the face image;
determining whether the user conforms to a correct viewing posture according to the coordinates of each feature point on the face image;
if not, determining the duration for which the user maintains the incorrect viewing posture;
when the duration is greater than a first threshold, sending a first prompt message to the user, the first prompt message being used to prompt the user to maintain a correct eye-use posture,
wherein the first prompt message is used to prompt the user that the viewing distance is unhealthy; or the first prompt message is used to prompt the user that the viewing angle is unhealthy; or the first prompt message is used to prompt the user that the viewing posture is unhealthy;
when the first prompt message is used to prompt the user that the viewing distance is unhealthy, determining whether the user conforms to the correct viewing posture according to the coordinates of each feature point on the face image comprises:
determining the comprehensive imaging size of the user's characteristic part according to the coordinates of each feature point;
calculating the viewing distance according to the comprehensive imaging size of the characteristic part and preset standard data;
when the viewing distance is greater than the eye-safe distance, determining that the user does not conform to the correct viewing posture;
the characteristic part is the pupil, the comprehensive imaging size of the characteristic part is the pupil distance, and determining the comprehensive imaging size of the user's characteristic part comprises calculating the pupil distance L between the user's two eyes by the following formula:
L = √((x2 - x1)² + (y2 - y1)²)
where the coordinates of the left pupil are (x1, y1) and the coordinates of the right pupil are (x2, y2);
calculating the viewing distance comprises:
calculating the lateral viewing angle and the longitudinal viewing angle from the deviation of the left pupil and the right pupil from the picture origin, based on the following formulas:
α = |(x2 + x1)|;
β = |(y2 + y1)|;
where α is related to the lateral viewing angle, β is related to the longitudinal viewing angle, and the picture center point is the pre-calibrated origin of the coordinate system;
when α is less than a second threshold and β is less than a third threshold, calculating the observation angle according to the lateral viewing angle and the longitudinal viewing angle;
calculating the viewing distance from the observation angle and the pupil distance of the user's two eyes by the following formula:
M = L / cos δ
where δ denotes the observation angle and L denotes the pupil distance of the user's two eyes; and
when α is greater than the second threshold or β is greater than the third threshold, sending the user a first prompt message used to prompt the user that the viewing angle is unhealthy.
2. The method according to claim 1, wherein when the first prompt message is used to prompt the user that the viewing posture is unhealthy, determining whether the user conforms to the correct viewing posture comprises:
drawing a line in the face image from the pupil center point to the Renzhong acupoint (philtrum) and extending it in both directions to segment the face image;
counting the sum of skin-color pixels in the segmented face image to determine the left face area and the right face area respectively;
calculating the degree of strabismus according to the formula d = (L - R)/(L + R), where d denotes the degree of strabismus, L denotes the left face area, and R denotes the right face area;
when the degree of strabismus is greater than a fourth threshold, determining that the user does not conform to the correct viewing posture.
3. The method according to claim 1, wherein when the first prompt message is used to prompt the user that the viewing posture is unhealthy, determining whether the user conforms to the correct viewing posture comprises:
determining, according to the user's face image, that the user is viewing the mobile device;
inferring the spatial posture of the face from the relative spatial position of the mobile phone and the face together with the horizontal attitude data of the nine-axis gyroscope in the mobile device;
if the horizontal attitude data is within a preset range and the spatial posture of the face is within a preset range, judging whether the user is viewing in a side-lying or supine position, and determining that the user does not conform to the correct viewing posture.
4. The method according to claim 1, wherein the first prompt message comprises:
a message displayed at the top of the screen of the mobile terminal; or
a message covering the screen of the mobile terminal; or
a message sent to a preset telephone number.
5. A mobile device for preventing myopia, characterized in that it comprises:
a camera, used to track the user's face image when the screen is on and to calibrate the coordinates of each feature point on the face image;
a processor, used to determine whether the user conforms to a correct viewing posture according to the coordinates of each feature point on the face image, and if not, to determine the duration for which the user maintains the incorrect viewing posture;
a display module, used to send a first prompt message to the user when the duration is greater than a first threshold, the first prompt message being used to prompt the user to maintain a correct eye-use posture,
wherein the first prompt message is used to prompt the user that the viewing distance is unhealthy; or the first prompt message is used to prompt the user that the viewing angle is unhealthy; or the first prompt message is used to prompt the user that the viewing posture is unhealthy;
when the first prompt message is used to prompt the user that the viewing distance is unhealthy, determining whether the user conforms to the correct viewing posture according to the coordinates of each feature point on the face image comprises:
determining the comprehensive imaging size of the user's characteristic part according to the coordinates of each feature point;
calculating the viewing distance according to the comprehensive imaging size of the characteristic part and preset standard data;
when the viewing distance is greater than the eye-safe distance, determining that the user does not conform to the correct viewing posture;
the characteristic part is the pupil, the comprehensive imaging size of the characteristic part is the pupil distance, and determining the comprehensive imaging size of the user's characteristic part comprises calculating the pupil distance L between the user's two eyes by the following formula:
L = √((x2 - x1)² + (y2 - y1)²)
where the coordinates of the left pupil are (x1, y1) and the coordinates of the right pupil are (x2, y2);
calculating the viewing distance comprises:
calculating the lateral viewing angle and the longitudinal viewing angle from the deviation of the left pupil and the right pupil from the picture origin, based on the following formulas:
α = |(x2 + x1)|;
β = |(y2 + y1)|;
where α is related to the lateral viewing angle, β is related to the longitudinal viewing angle, and the picture center point is the pre-calibrated origin of the coordinate system;
when α is less than a second threshold and β is less than a third threshold, calculating the observation angle according to the lateral viewing angle and the longitudinal viewing angle;
calculating the viewing distance from the observation angle and the pupil distance of the user's two eyes by the following formula:
M = L / cos δ
where δ denotes the observation angle and L denotes the pupil distance of the user's two eyes; and
when α is greater than the second threshold or β is greater than the third threshold, sending the user a first prompt message used to prompt the user that the viewing angle is unhealthy.
CN202110751416.7A 2021-07-02 2021-07-02 Method and mobile device for preventing myopia Active CN113469058B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110751416.7A CN113469058B (en) 2021-07-02 2021-07-02 Method and mobile device for preventing myopia

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110751416.7A CN113469058B (en) 2021-07-02 2021-07-02 Method and mobile device for preventing myopia

Publications (2)

Publication Number Publication Date
CN113469058A CN113469058A (en) 2021-10-01
CN113469058B true CN113469058B (en) 2024-12-24

Family

ID=77877622

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110751416.7A Active CN113469058B (en) 2021-07-02 2021-07-02 Method and mobile device for preventing myopia

Country Status (1)

Country Link
CN (1) CN113469058B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104715234A (en) * 2014-12-31 2015-06-17 湘潭大学 Side view detecting method and system
CN106919250A (en) * 2015-12-28 2017-07-04 中国移动通信集团公司 A kind of based reminding method and device
CN107066089A (en) * 2017-03-08 2017-08-18 北京互讯科技有限公司 A kind of mobile phone eye posture guard method based on computer vision technique
CN107426423A (en) * 2017-07-17 2017-12-01 深圳天珑无线科技有限公司 Reminding method, terminal and the computer-readable storage medium of posture are used based on terminal

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101112735B1 (en) * 2005-04-08 2012-03-13 삼성전자주식회사 3D display apparatus using hybrid tracking system
JP5515301B2 (en) * 2009-01-21 2014-06-11 株式会社ニコン Image processing apparatus, program, image processing method, recording method, and recording medium
CN102662476B (en) * 2012-04-20 2015-01-21 天津大学 Gaze estimation method
CN105391997B (en) * 2015-11-05 2017-12-29 广东未来科技有限公司 The 3d viewpoint bearing calibration of 3 d display device
CN109766011A (en) * 2019-01-16 2019-05-17 北京七鑫易维信息技术有限公司 A kind of image rendering method and device
KR102470341B1 (en) * 2019-06-07 2022-11-24 스펙스 리미티드 eye test
CN110381305B (en) * 2019-07-31 2021-06-01 南方医科大学南方医院 Naked eye 3D crosstalk removing method and system, storage medium and electronic equipment

Also Published As

Publication number Publication date
CN113469058A (en) 2021-10-01

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant