CN114689013B - Ranging method, ranging device, ranging equipment and storage medium - Google Patents
- Publication number
- CN114689013B CN114689013B CN202210147261.0A CN202210147261A CN114689013B CN 114689013 B CN114689013 B CN 114689013B CN 202210147261 A CN202210147261 A CN 202210147261A CN 114689013 B CN114689013 B CN 114689013B
- Authority
- CN
- China
- Prior art keywords
- image
- pupil
- determining
- ranging
- distance
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
Landscapes
- Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Studio Devices (AREA)
Abstract
The application discloses a ranging method, a ranging apparatus, ranging equipment and a storage medium, and relates to the technical field of ranging. The method comprises the following steps: acquiring a second image of the pupil when a first ranging point is viewed and a third image of the pupil when a second ranging point is viewed; determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point according to the second image and the third image; determining a first pupil aperture value according to the second image; determining a second pupil aperture value according to the third image; and determining a first distance value between the first ranging point and the second ranging point according to the rotation angle, the first pupil aperture value and the second pupil aperture value. With this method, the user is not required to be located at the position of at least one of the two points in space, which greatly reduces the cost and complexity of measuring the distance between two points in space.
Description
Technical Field
The present application relates to the field of ranging technologies, and more particularly, to a ranging method, a ranging apparatus, a ranging device, and a computer-readable storage medium.
Background
Currently, in many scenarios, it is necessary to calculate the distance between two points in space. For example, measuring space area, space volume, building AR scenes, assisting in automatic navigation, etc. in engineering cost.
In the conventional art, a ruler, a range finder, or the like is generally used to calculate the distance between two points in a space.
However, when measuring the distance between two points in space using a ruler or a range finder or the like, it is often necessary that the user be located at the position of at least one of the two points in space, which makes the measurement of the distance between the two points in space costly and complicated.
Disclosure of Invention
It is an object of the present application to provide a new solution that enables ranging.
According to a first aspect of the present application, there is provided a ranging method, the method comprising:
acquiring a second image of the pupil when a first ranging point is viewed and a third image of the pupil when a second ranging point is viewed;
determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point according to the second image and the third image;
determining a first pupil aperture value according to the second image;
determining a second pupil aperture value according to the third image;
and determining a first distance value between the first ranging point and the second ranging point according to the rotation angle, the first pupil aperture value and the second pupil aperture value.
Optionally, the determining, according to the second image and the third image, the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point comprises:
determining, according to the second image, a second distance value between the image acquisition module and the pupil when the first ranging point is viewed;
determining, according to the third image, a third distance value between the image acquisition module and the pupil when the second ranging point is viewed;
and determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point according to the second distance value and the third distance value.
Optionally, the determining, according to the second distance value and the third distance value, the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point includes:
acquiring a fourth distance value between the pupil and the image acquisition module when the image acquisition module is viewed;
and determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point according to the fourth distance value, the second distance value and the third distance value.
Optionally, before the determining, according to the second image and the third image, the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point, the method includes:
acquiring a first image, wherein the first image is an image of the pupil when the image acquisition module is viewed;
acquiring a fourth distance value between the pupil and the image acquisition module when the image acquisition module is viewed;
the determining, according to the second image and the third image, the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point includes:
determining, according to the first image and the second image, a fifth distance value between the pupil position when the image acquisition module is viewed and the pupil position when the first ranging point is viewed;
determining, according to the first image and the third image, a sixth distance value between the pupil position when the image acquisition module is viewed and the pupil position when the second ranging point is viewed;
and determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point according to the fourth distance value, the fifth distance value and the sixth distance value.
Optionally, the acquiring the second image of the pupil when the first ranging point is viewed and the third image of the pupil when the second ranging point is viewed includes:
acquiring an image set;
sequentially selecting, from the image set, a first sub-image set and a second sub-image set, each consisting of images whose content stays the same over a preset time period;
determining the second image according to any image in the first sub-image set;
and determining the third image according to any image in the second sub-image set.
Optionally, the determining the first distance value between the first ranging point and the second ranging point according to the rotation angle, the first pupil aperture value and the second pupil aperture value includes:
determining a seventh distance value between the human eye and the first ranging point according to the first pupil aperture value;
determining an eighth distance value between the human eye and the second ranging point according to the second pupil aperture value;
and determining the first distance value between the first ranging point and the second ranging point according to the seventh distance value, the eighth distance value and the rotation angle.
Optionally, the method further comprises:
and outputting the first distance value between the first ranging point and the second ranging point.
According to a second aspect of the present application, there is provided a ranging apparatus comprising:
the acquisition module is used for acquiring a second image of the pupil when the first ranging point is viewed and a third image of the pupil when the second ranging point is viewed;
the first determining module is used for determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point according to the second image and the third image;
the second determining module is used for determining a first pupil aperture value according to the second image;
the third determining module is used for determining a second pupil aperture value according to the third image;
and the fourth determining module is used for determining a first distance value between the first ranging point and the second ranging point according to the rotation angle, the first pupil aperture value and the second pupil aperture value.
According to a third aspect of the present application, there is provided an electronic device comprising the apparatus described in the second aspect above;
or comprising a memory for storing computer instructions and a processor for calling the computer instructions from the memory to perform the method according to any one of the first aspect.
According to a fourth aspect of the present application, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any of the first aspects.
In this embodiment, a ranging method is provided, including: acquiring a second image of the pupil when the first ranging point is viewed and a third image of the pupil when the second ranging point is viewed; determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point according to the second image and the third image; determining a first pupil aperture value according to the second image; determining a second pupil aperture value according to the third image; and determining a first distance value between the first ranging point and the second ranging point according to the rotation angle, the first pupil aperture value and the second pupil aperture value. In this embodiment, the distance between two points in space is measured through image processing, so the user is not required to be located at the position of at least one of the two points, which greatly reduces the cost and complexity of the measurement.
Other features of the present application and its advantages will become apparent from the following detailed description of exemplary embodiments of the application, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic flow chart of a ranging method according to an embodiment of the present application;
fig. 2 is a schematic diagram of a ranging principle according to an embodiment of the present application;
FIG. 3 is a schematic diagram of the correspondence between viewing distance and pupil aperture value according to an embodiment of the present application;
fig. 4 is a plane formed by a human eye, a first ranging point and a second ranging point according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a ranging device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Various exemplary embodiments of the present application will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present application unless it is specifically stated otherwise.
The following description of at least one exemplary embodiment is merely exemplary in nature and is in no way intended to limit the application, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of exemplary embodiments may have different values.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
< Method example >
The execution subject of the ranging method provided by the embodiment of the application is a ranging device. The apparatus may be an external electronic device connected to the image acquisition module. The apparatus may also be an electronic device comprising an image acquisition module. Or the apparatus may also be a hardware module and/or a software module in the aforementioned electronic device.
The embodiment of the application provides a ranging method, as shown in fig. 1, which comprises the following steps of S1100-S1500:
S1100, acquiring a second image of the pupil when the first ranging point is viewed and a third image of the pupil when the second ranging point is viewed.
In this embodiment, the first ranging point and the second ranging point are the two points in space whose distance is to be measured. The second image and the third image are acquired by an image acquisition module.
In one embodiment, as shown in fig. 2, the image acquisition module, such as a camera, may be disposed on a lens of a pair of glasses. When implementing the ranging method provided by the embodiment of the application, the user wears the glasses shown in fig. 2 in advance. This improves the convenience of implementing the ranging method provided by the embodiment of the application.
In the embodiment of the present application, the position of the image capturing module is not limited, as long as the image capturing module can capture the images related to the embodiment of the present application, for example, the second image and the third image.
In one example, the image acquisition module may be disposed to one side of or behind the pupil, in which case it may acquire the second image and the third image by means of light reflection.
In one embodiment, an image acquisition device typically captures images either upon a trigger action or by continuously capturing multiple frames. Based on this, the above S1100 can be achieved in the following two ways:
Mode one, the following is performed by S1110 to S1113:
s1110, acquiring an image set.
In this embodiment, when the ranging method provided in this embodiment is executed, the user views the first ranging point and then the second ranging point, each for a preset time period. Meanwhile, the image acquisition module continuously captures images until it has acquired, for the second time, a set of images whose content stays the same over a preset time period. The images acquired by the image acquisition module constitute the image set in S1110.
S1111, sequentially selecting a first sub-image set and a second sub-image set which have the same image content in a preset time period from the image set.
In combination with the above step S1110, the first sub-image set is the set of eye images captured while the first ranging point is viewed, and the second sub-image set is the set of eye images captured while the second ranging point is viewed.
It should be noted that the preset time period may be set empirically, for example to 3 s; the present embodiment is not limited in this regard.
S1112, determining a second image according to any image in the first sub-image set.
In this embodiment, since the image contents of the images in the first sub-image set are all the same, one frame of image can be optionally selected to determine the second image.
In this embodiment, the implementation of S1112 is as follows: perform image recognition on any frame in the first sub-image set to identify the partial image of the pupil contained in that frame, and take the identified partial image of the pupil as the second image.
S1113, determining a third image from any image in the second sub-image set.
Note that, the specific implementation of S1113 is similar to the specific implementation of S1112, and will not be repeated here.
Mode two: the button is arranged on the execution main body for executing the distance measuring method provided by the embodiment of the application, and the button is triggered to collect images when a user views the first distance measuring point. And carrying out image recognition on the acquired image to identify the image of the pupil contained in the acquired image, and taking the identified image of the pupil as a second image.
It should be noted that, the method for acquiring the third image in the second mode is the same as the method for acquiring the second image in the second mode, and will not be described here again.
S1200, determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point according to the second image and the third image.
In the first embodiment, the specific implementation of S1200 may be: stitch the second image and the third image into a single image, and perform image recognition on the stitched image to obtain the rotation angle described in S1212.
In the second embodiment, the specific implementation of S1200 may include the following S1210-S1212:
S1210, determining, according to the second image, a second distance value between the image acquisition module and the pupil when the first ranging point is viewed.
S1211, determining, according to the third image, a third distance value between the image acquisition module and the pupil when the second ranging point is viewed.
In this embodiment, the second distance value between the image acquisition module and the pupil when the first ranging point is viewed may be determined from the depth information of the second image. Similarly, the third distance value between the image acquisition module and the pupil when the second ranging point is viewed may be determined from the depth information of the third image.
S1212, determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point according to the second distance value and the third distance value.
In one embodiment, the implementation of S1212 is as follows: determine, according to the second image and the third image, the distance between the pupil position when the first ranging point is viewed and the pupil position when the second ranging point is viewed, and combine this distance with the second distance value and the third distance value (see the principle shown in fig. 4) to obtain the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point.
In another embodiment, in the case where the image acquisition device is located directly in front of the pupil in plan view, the implementation of S1212 may comprise the following S1212-1 and S1212-2:
S1212-1, acquiring a fourth distance value between the pupil and the image acquisition module when the image acquisition module is viewed.
In one embodiment, the fourth distance value is an empirical value or is calculated from the first image, where the first image is an image of the pupil when the image acquisition module is viewed.
S1212-2, determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point according to the fourth distance value, the second distance value and the third distance value.
In this embodiment, as shown in fig. 2, the fourth distance value is denoted as d4, the second distance value is denoted as d2, and the third distance value is denoted as d3. The specific implementation of the above S1212-2 is as follows:
Because of the structure of the human eye, the pupil remains approximately on the same horizontal line when the eye views different objects, so the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point can be calculated using trigonometric functions. Based on this:
α = arccos(d2/d4);
β = arccos(d3/d4);
γ = α + β;
where α is the rotation angle of the pupil from viewing the image acquisition module to viewing the first ranging point, β is the rotation angle of the pupil from viewing the image acquisition module to viewing the second ranging point, and γ is the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point.
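As a rough numerical illustration (not part of the patent; function name and test values are made up), the equations above can be evaluated directly:

```python
import math

def rotation_angle_arccos(d2, d3, d4):
    """Rotation angle gamma of the pupil between the two ranging points,
    per the equations of S1212-2: alpha = arccos(d2/d4), beta = arccos(d3/d4),
    gamma = alpha + beta. Assumes d2 <= d4 and d3 <= d4 so the ratios are valid."""
    alpha = math.acos(d2 / d4)
    beta = math.acos(d3 / d4)
    return alpha + beta
```

For example, with d4 = 10 and d2 = 8, d3 = 6, the two component angles sum to exactly π/2.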
In the third embodiment, in the case where the image acquisition device is located directly in front of the pupil in plan view, the specific implementation of S1200 may also be as follows:
In this embodiment, the ranging method provided in the embodiment of the present application further includes, before S1200 described above:
S1220, a first image is acquired.
The first image is an image of the pupil when the image acquisition module is viewed.
S1221, acquiring a fourth distance value between the pupil and the image acquisition module when the image acquisition module is viewed.
Based on S1220 and S1221, the implementation of S1200 is as follows:
S1222, determining, according to the first image and the second image, a fifth distance value between the pupil position when the image acquisition module is viewed and the pupil position when the first ranging point is viewed.
In this embodiment, the specific implementation of S1222 may be: compare the first image with the second image to determine the offset of the pupil, and determine the fifth distance value based on this offset.
S1223, determining, according to the first image and the third image, a sixth distance value between the pupil position when the image acquisition module is viewed and the pupil position when the second ranging point is viewed.
The sixth distance value is determined in the same way as the fifth distance value, and the description is not repeated here.
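The offset comparison of S1222 and S1223 could, for instance, be realized by comparing pupil centroids in binarized eye images; this is only one possible sketch, and the millimetres-per-pixel scale is an assumed calibration constant, not something given in the patent:

```python
import numpy as np

def pupil_offset_mm(mask_a, mask_b, mm_per_px):
    """Displacement of the pupil centre between two binary pupil masks,
    converted to millimetres with an assumed per-pixel scale (hypothetical
    calibration). mask_a / mask_b: 2-D boolean arrays, True = pupil pixel."""
    ca = np.array(np.nonzero(mask_a)).mean(axis=1)  # centroid (row, col) in mask_a
    cb = np.array(np.nonzero(mask_b)).mean(axis=1)  # centroid (row, col) in mask_b
    return float(np.linalg.norm(cb - ca)) * mm_per_px
```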
S1224, determining the rotation angle of the pupil from the first distance measurement point to the second distance measurement point according to the fourth distance value, the fifth distance value and the sixth distance value.
In this embodiment, as shown in fig. 2, the fourth distance value is denoted as d4, the fifth distance value as d5, and the sixth distance value as d6. The implementation of S1224 is as follows:
α = arctan(d5/d4);
β = arctan(d6/d4);
γ = α + β.
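The arctangent variant can be sketched the same way (illustrative Python, not from the patent):

```python
import math

def rotation_angle_arctan(d4, d5, d6):
    """Rotation angle gamma per S1224: alpha = arctan(d5/d4),
    beta = arctan(d6/d4), gamma = alpha + beta. d5 and d6 are the
    lateral pupil offsets, d4 the pupil-to-camera distance."""
    return math.atan(d5 / d4) + math.atan(d6 / d4)
```

With d5 = d6 = d4, each component angle is 45 degrees and γ is a right angle.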
S1300, determining a first pupil aperture value according to the second image.
In this embodiment, the first pupil aperture value is the aperture value of the pupil when viewing the first ranging point.
In this embodiment, a binarization method may be applied to the second image to separate the pupil region, and the first pupil aperture value can then be calculated from that region. The first pupil aperture value may also be determined from the second image in other ways.
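One possible (hypothetical) realization of the binarization step, assuming a dark pupil on a lighter background and reporting the diameter of an equivalent-area circle in pixels; the threshold value is an assumption, and converting pixels to millimetres would need a calibration the patent does not specify:

```python
import numpy as np

def pupil_aperture_px(gray, thresh=50):
    """Estimate the pupil aperture (diameter, in pixels) from a grayscale
    eye image: pixels darker than `thresh` are treated as pupil, and the
    diameter of a circle with the same area is returned."""
    pupil_mask = gray < thresh          # binarization: dark pixels = pupil
    area = pupil_mask.sum()             # pupil area in pixels
    return 2.0 * np.sqrt(area / np.pi)  # equivalent-circle diameter
```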
S1400, determining a second pupil aperture value according to the third image.
In this embodiment, the second pupil aperture value is the aperture value of the pupil when viewing the second ranging point.
Note that, the specific implementation of S1400 is similar to the specific implementation of S1300, and will not be repeated here.
S1500, determining the first distance value between the first ranging point and the second ranging point according to the rotation angle, the first pupil aperture value and the second pupil aperture value.
In one embodiment, the above S1500 may be implemented by the following S1510-S1512:
S1510, determining a seventh distance value between the human eye and the first ranging point according to the first pupil aperture value.
In this embodiment, both the amount of incoming light and changes in viewing distance cause the pupil aperture value to vary. With the amount of incoming light held constant, the pupil contracts as the viewing distance becomes shorter and dilates as the viewing distance becomes longer. Therefore, when the amount of incoming light is unchanged, there is a definite mapping between the viewing distance and the pupil aperture value.
It can be assumed that the amount of incoming light hardly changes during ranging. The applicant therefore fitted pupil aperture values against viewing distances to obtain a fitting function relating the two.
In one embodiment, the fitting process may be: record the viewing distance and the corresponding pupil aperture value while viewing each of a plurality of spatial points, and then fit the recorded pairs.
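The fitting step could, for example, be carried out by least squares on the recorded pairs; the sketch below assumes the quadratic form Y = aX² + b used in the fitted function given next, which is an assumption about the fitting model rather than a procedure stated in the patent:

```python
import numpy as np

def fit_distance_model(apertures, distances):
    """Least-squares fit of Y = a*X^2 + b to recorded (pupil aperture,
    viewing distance) pairs. Returns the coefficients (a, b)."""
    X = np.asarray(apertures, dtype=float)
    Y = np.asarray(distances, dtype=float)
    A = np.column_stack([X ** 2, np.ones_like(X)])  # design matrix [X^2, 1]
    (a, b), *_ = np.linalg.lstsq(A, Y, rcond=None)
    return a, b
```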
In this embodiment, the viewing distances and corresponding pupil aperture values recorded when viewing a plurality of spatial points are shown in fig. 3, where the X-axis represents the pupil aperture value and the Y-axis represents the viewing distance. Based on fig. 3, the fitted function is:
Y = 0.076X² + 0.136;
where X is the pupil aperture value, typically in the range 1.5 mm to 8 mm, and Y is the viewing distance, typically in the range 0.3 m to 5 m.
Substituting the first pupil aperture value for X in the fitting function yields a viewing distance Y, which is the seventh distance value between the human eye and the first ranging point.
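Evaluating the fitted function can be sketched as follows (illustrative Python; the range check simply reflects the typical value ranges stated above):

```python
def viewing_distance(aperture_mm):
    """Viewing distance (m) from pupil aperture (mm) via the fitted
    function Y = 0.076*X^2 + 0.136, valid roughly for X in [1.5, 8] mm."""
    if not 1.5 <= aperture_mm <= 8.0:
        raise ValueError("aperture outside the fitted range")
    return 0.076 * aperture_mm ** 2 + 0.136
```

Note that the function's endpoints reproduce the stated ranges: X = 1.5 mm gives about 0.31 m, and X = 8 mm gives 5.0 m.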
S1511, determining an eighth distance value between the human eye and the second ranging point according to the second pupil aperture value.
Note that, the specific implementation of S1511 and S1510 is the same, and will not be repeated here.
S1512, determining the first distance value between the first ranging point and the second ranging point according to the seventh distance value, the eighth distance value and the rotation angle.
In this embodiment, as shown in fig. 4, the human eye, the first ranging point and the second ranging point form a triangle in a plane, and the distance between the first ranging point and the second ranging point can be obtained from the following geometric relationship (the law of cosines):
Y3² = Y1² + Y2² - 2·Y1·Y2·cos γ;
where γ is the rotation angle determined in S1200, Y1 is the seventh distance value between the human eye and the first ranging point, Y2 is the eighth distance value between the human eye and the second ranging point, and Y3 is the first distance value between the first ranging point and the second ranging point.
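A minimal sketch of this law-of-cosines computation (Python; the function name is illustrative):

```python
import math

def ranging_distance(y1, y2, gamma):
    """First distance value Y3 between the two ranging points via the
    law of cosines: Y3^2 = Y1^2 + Y2^2 - 2*Y1*Y2*cos(gamma).
    y1, y2: eye-to-point distances; gamma: rotation angle in radians."""
    return math.sqrt(y1 * y1 + y2 * y2 - 2.0 * y1 * y2 * math.cos(gamma))
```

As a sanity check, a right angle with legs 3 and 4 gives the expected hypotenuse of 5.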
It should be noted that S1500 may be implemented in other ways, and any way of determining the first distance value between the first ranging point and the second ranging point according to the seventh distance value, the eighth distance value and the rotation angle is within the scope of the embodiments of the present application.
In this embodiment, a ranging method is provided, including: acquiring a second image of the pupil when the first ranging point is viewed and a third image of the pupil when the second ranging point is viewed; determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point according to the second image and the third image; determining a first pupil aperture value according to the second image; determining a second pupil aperture value according to the third image; and determining a first distance value between the first ranging point and the second ranging point according to the rotation angle, the first pupil aperture value and the second pupil aperture value. In this embodiment, the distance between two points in space is measured through image processing, so the user is not required to be located at the position of at least one of the two points, which greatly reduces the cost and complexity of the measurement.
On the basis of any one of the above embodiments, the ranging method provided in the embodiment of the present application further includes the following S1600:
S1600, outputting the first distance value between the first ranging point and the second ranging point.
In this embodiment, the first distance value between the first ranging point and the second ranging point may be output as voice, text, or the like.
< Device example >
As shown in fig. 5, an embodiment of the present application provides a ranging apparatus 500, including an acquisition module 510, a first determination module 520, a second determination module 530, a third determination module 540, and a fourth determination module 550, wherein:
the obtaining module 510 is configured to obtain a second image of the pupil when viewing the first ranging point and a third image of the pupil when viewing the second ranging point.
The first determining module 520 is configured to determine, according to the second image and the third image, a rotation angle of the pupil from when the first ranging point is viewed to when the second ranging point is viewed.
A second determining module 530, configured to determine a first pupil aperture value according to the second image.
And a third determining module 540, configured to determine a second pupil aperture value according to the third image.
A fourth determining module 550, configured to determine a first distance value between the first ranging point and the second ranging point according to the rotation angle, the first pupil aperture value, and the second pupil aperture value.
In one embodiment, the first determining module 520 is specifically configured to determine, according to the second image, a second distance value between the image acquisition module and the pupil when viewing the first ranging point.
And determining a third distance value between the image acquisition module and the pupil when the second distance measurement point is viewed according to the third image.
And determining the rotation angle of the pupil from the first distance measurement point to the second distance measurement point according to the second distance value and the third distance value.
In one embodiment, the first determining module 520 includes an obtaining unit and a first determining unit, where:
the acquisition unit is used for acquiring a fourth distance value between the pupil and the image acquisition module when the image acquisition module is viewed.
The first determining unit is used for determining the rotation angle of the pupil from the first distance measurement point to the second distance measurement point according to the fourth distance value, the second distance value and the third distance value.
In one embodiment, the obtaining module 510 is further configured to obtain a first image, where the first image is an image of a pupil when the image capturing module is viewed;
and acquiring a fourth distance value between the pupil and the image acquisition module when the image acquisition module is viewed.
In this embodiment, the first determining module 520 is specifically configured to: determine, according to the first image and the second image, a fifth distance value between the pupil when the image acquisition module is viewed and the pupil when the first distance measurement point is viewed.
And determine, according to the first image and the third image, a sixth distance value between the pupil when the image acquisition module is viewed and the pupil when the second distance measurement point is viewed.
And determining the rotation angle of the pupil from the first distance measurement point to the second distance measurement point according to the fourth distance value, the fifth distance value and the sixth distance value.
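The excerpt does not state how the fourth, fifth and sixth distance values combine into the rotation angle. If — purely as an illustrative assumption, since the patent gives no formula here — the three values are read as the three sides of a triangle whose apex angle is the sought rotation angle, the angle follows from the inverse law of cosines. The side-to-value mapping below is hypothetical:

```python
import math

def angle_from_sides(a: float, b: float, c: float) -> float:
    """Angle (radians) opposite side c in a triangle with sides a, b, c,
    via the inverse law of cosines. How the patent maps the fourth,
    fifth and sixth distance values onto a, b and c is not stated in
    this excerpt; this helper only illustrates the underlying geometry.
    """
    cos_c = (a ** 2 + b ** 2 - c ** 2) / (2.0 * a * b)
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, cos_c)))
```

For instance, a 3-4-5 triangle yields a right angle opposite the longest side.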
In one embodiment, the acquisition module 510 comprises an acquisition unit, a selection unit, a second determining unit, and a third determining unit, wherein:
The acquisition unit is used for acquiring an image set.
The selection unit is used for sequentially selecting, from the image set, a first sub-image set and a second sub-image set, each consisting of images having the same image content within a preset time period.
The second determining unit is used for determining the second image according to any image in the first sub-image set.
The third determining unit is used for determining the third image according to any image in the second sub-image set.
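The selection step above — partitioning a capture stream into runs of frames with the same image content over a preset period — can be sketched as follows. The frame-similarity predicate and the minimum run length are assumptions for illustration; the patent does not specify them in this excerpt:

```python
def split_stable_runs(frames, same_content, min_run=5):
    """Partition a frame sequence into consecutive runs whose frames all
    match the first frame of the run under same_content(a, b) -> bool.
    Runs shorter than min_run (e.g. gaze still settling) are discarded.
    Both the predicate and min_run are illustrative assumptions.
    """
    runs, current = [], []
    for frame in frames:
        if not current or same_content(current[0], frame):
            current.append(frame)
        else:
            if len(current) >= min_run:
                runs.append(current)
            current = [frame]
    if len(current) >= min_run:
        runs.append(current)
    return runs

# The second and third images of the method could then be taken from the
# first two stable runs, e.g. runs[0][0] and runs[1][0].
```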
In one embodiment, the fourth determining module 550 includes a fourth determining unit, a fifth determining unit, and a sixth determining unit, wherein:
The fourth determining unit is used for determining a seventh distance value between the human eye and the first distance measurement point according to the first pupil aperture value.
The fifth determining unit is used for determining an eighth distance value between the human eye and the second distance measurement point according to the second pupil aperture value.
The sixth determining unit is configured to determine a first distance value between the first ranging point and the second ranging point according to the seventh distance value, the eighth distance value, and the rotation angle.
In one embodiment, the ranging apparatus 500 provided in the embodiment of the present application further includes an output module, where:
the output module is used for outputting a first distance value between the first distance measurement point and the second distance measurement point.
< Equipment example >
An embodiment of the present application provides an electronic device 600, where the electronic device 600 includes any of the ranging apparatuses 500 provided in the apparatus embodiments described above.
Alternatively, as shown in fig. 6, the electronic device 600 comprises a memory 610 and a processor 620, where the memory 610 is used for storing computer instructions, and the processor 620 is used for calling the computer instructions from the memory 610 to execute the ranging method according to any one of the above method embodiments.
< Storage Medium embodiment >
An embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements a ranging method according to any of the above-described method embodiments.
The present application may be a system, method, and/or computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement aspects of the present application.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include the following: a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disk read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanical encoding device such as a punch card or a raised structure in a groove having instructions stored thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., a light pulse through a fiber optic cable), or an electrical signal transmitted through a wire.
The computer readable program instructions described herein may be downloaded from a computer readable storage medium to a respective computing/processing device or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmissions, wireless transmissions, routers, firewalls, switches, gateway computers and/or edge servers. The network interface card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in the respective computing/processing device.
Computer program instructions for carrying out operations of the present application may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer readable program instructions may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present application are implemented by personalizing electronic circuitry, such as programmable logic circuitry, field programmable gate arrays (FPGAs), or programmable logic arrays (PLAs), with state information of the computer readable program instructions, the electronic circuitry executing the computer readable program instructions.
Various aspects of the present application are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
The foregoing description of embodiments of the application has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the technical improvement of the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the application is defined by the appended claims.
Claims (9)
1. A ranging method, the method comprising:
acquiring a second image of the pupil when the first ranging point is viewed and a third image of the pupil when the second ranging point is viewed;
Determining the rotation angle of the pupil from the first ranging point to the second ranging point according to the second image and the third image;
determining a first pupil aperture value according to the second image;
determining a second pupil aperture value according to the third image;
Determining a first distance value between the first ranging point and the second ranging point according to the rotation angle, the first pupil aperture value and the second pupil aperture value;
wherein the determining a first distance value between the first ranging point and the second ranging point according to the rotation angle, the first pupil aperture value and the second pupil aperture value includes:
determining a seventh distance value between the human eye and the first distance measurement point according to the first pupil aperture value;
determining an eighth distance value between the human eye and the second distance measurement point according to the second pupil aperture value;
And determining a first distance value between the first ranging point and the second ranging point according to the seventh distance value, the eighth distance value and the rotation angle.
2. The method of claim 1, wherein determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point from the second image and the third image comprises:
determining a second distance value between the image acquisition module and the pupil when the first distance measurement point is viewed according to the second image;
determining a third distance value between the image acquisition module and the pupil when the second distance measurement point is viewed according to the third image;
And determining the rotation angle of the pupil from the first distance measurement point to the second distance measurement point according to the second distance value and the third distance value.
3. The method of claim 2, wherein determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point based on the second distance value and the third distance value comprises:
acquiring a fourth distance value between a pupil and an image acquisition module when the image acquisition module is viewed;
And determining the rotation angle of the pupil from the first distance measurement point to the second distance measurement point according to the fourth distance value, the second distance value and the third distance value.
4. The method of claim 1, wherein the method comprises, prior to determining the rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point, from the second image and the third image:
Acquiring a first image, wherein the first image is an image of a pupil when an image acquisition module is viewed;
acquiring a fourth distance value between a pupil and an image acquisition module when the image acquisition module is viewed;
the determining, according to the second image and the third image, a rotation angle of the pupil from viewing the first ranging point to viewing the second ranging point includes:
determining a fifth distance value between the pupil when the image acquisition module is viewed and the pupil when the first distance measurement point is viewed according to the first image and the second image;
determining a sixth distance value between the pupil when the image acquisition module is viewed and the pupil when the second distance measurement point is viewed according to the first image and the third image;
And determining the rotation angle of the pupil from the first distance measurement point to the second distance measurement point according to the fourth distance value, the fifth distance value and the sixth distance value.
5. The method of claim 1, wherein the acquiring a second image of the pupil while viewing the first ranging point and a third image of the pupil while viewing the second ranging point comprises:
Acquiring an image set;
Sequentially selecting a first sub-image set and a second sub-image set which have the same image content in a preset time period from the image set;
determining a second image according to any image in the first sub-image set;
a third image is determined from any of the images in the second set of sub-images.
6. The method according to claim 1, wherein the method further comprises:
and outputting a first distance value between the first distance measurement point and the second distance measurement point.
7. A ranging apparatus, comprising:
The acquisition module is used for acquiring a second image of the pupil when the first ranging point is viewed and a third image of the pupil when the second ranging point is viewed;
The first determining module is used for determining the rotation angle of the pupil from the first ranging point to the second ranging point according to the second image and the third image;
the second determining module is used for determining a first pupil aperture value according to the second image;
the third determining module is used for determining a second pupil aperture value according to the third image;
A fourth determining module, configured to determine a first distance value between the first ranging point and the second ranging point according to the rotation angle, the first pupil aperture value, and the second pupil aperture value;
wherein the fourth determination module includes: a fourth determination unit, a fifth determination unit, and a sixth determination unit, wherein:
The fourth determining unit is used for determining a seventh distance value between the human eye and the first distance measurement point according to the first pupil aperture value;
the fifth determining unit is used for determining an eighth distance value between the human eye and the second distance measurement point according to the second pupil aperture value;
the sixth determining unit is configured to determine a first distance value between the first ranging point and the second ranging point according to the seventh distance value, the eighth distance value, and the rotation angle.
8. An electronic device comprising the apparatus of claim 7;
Or comprises a memory for storing computer instructions and a processor for invoking the computer instructions from the memory to perform the method of any of claims 1-6.
9. A computer readable storage medium, characterized in that a computer program is stored thereon, which, when being executed by a processor, implements the method according to any of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210147261.0A CN114689013B (en) | 2022-02-17 | 2022-02-17 | Ranging method, ranging device, ranging equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114689013A CN114689013A (en) | 2022-07-01 |
CN114689013B true CN114689013B (en) | 2024-05-14 |
Family
ID=82137025
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210147261.0A Active CN114689013B (en) | 2022-02-17 | 2022-02-17 | Ranging method, ranging device, ranging equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114689013B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115442525A (en) * | 2022-08-31 | 2022-12-06 | 丁非 | A method, system, terminal and storage medium for measuring pupillary distance |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104484649A (en) * | 2014-11-27 | 2015-04-01 | 北京天诚盛业科技有限公司 | Method and device for identifying irises |
CN109443303A (en) * | 2018-09-14 | 2019-03-08 | 杭州宇泛智能科技有限公司 | The method and system of detection face and camera distance based on Image Acquisition |
CN109696158A (en) * | 2018-05-05 | 2019-04-30 | 福建汇川物联网技术科技股份有限公司 | Distance measurement method, distance-measuring device and electronic equipment |
CN113920195A (en) * | 2021-10-08 | 2022-01-11 | Oppo广东移动通信有限公司 | Distance detection method, control method, device, storage medium and electronic equipment |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7964833B2 (en) * | 2007-08-02 | 2011-06-21 | Elenza, Inc. | Multi-focal intraocular lens system and methods |
Non-Patent Citations (1)
Title |
---|
Method for improving eye-control accuracy based on digital image processing; Yan Desai; Zeng Cheng; Journal of Computer Applications; 2018-07-25 (No. 10); full text * |
Also Published As
Publication number | Publication date |
---|---|
CN114689013A (en) | 2022-07-01 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||