CN116886881B - Projector based on omnidirectional trapezoidal technology - Google Patents
- Publication number
- CN116886881B CN116886881B CN202310928829.7A CN202310928829A CN116886881B CN 116886881 B CN116886881 B CN 116886881B CN 202310928829 A CN202310928829 A CN 202310928829A CN 116886881 B CN116886881 B CN 116886881B
- Authority
- CN
- China
- Prior art keywords
- projector
- angle
- camera
- module
- feature
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
-
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
Abstract
The invention provides a projector based on an omnidirectional trapezoidal (keystone-correction) technology, comprising: an optocoupler feedback module comprising a first optocoupler feedback unit and a second optocoupler feedback unit, connected respectively to the inner and outer ends of the telescopic travel of the projector's optical lens; and a stepping motor control module whose driving chip is connected to its stepping motor through a driving interface. In the projection control process, a preset feature map is projected by the projector, feature points are detected, the attitude corresponding to each feature point is obtained through attitude measurement, and calculation, parameter setting and corresponding adjustment are carried out according to the attitude matrix. The invention realizes keystone correction of the projection picture at low cost and with a highly reliable correction effect, and effectively reduces restrictions on product use.
Description
Technical Field
The present invention relates to projectors, and in particular to a projector based on an omnidirectional trapezoidal technology.
Background
With the development of technology, projectors have been widely used as a high-tech product in the fields of entertainment, education, business, and the like. In daily life, we can see various types of projection devices, such as home theatres, school classrooms, conference rooms, and the like. However, although the projector plays an important role in facilitating information acquisition and improving work efficiency, there are some problems in practical application.
Distortion and deformation of the projected picture are among the most common problems encountered during projector use. This not only affects the user's viewing experience, but may also lead to inaccurate information transfer. To solve this problem, various projector keystone-correction methods and techniques have emerged on the market.
Currently, projector trapezoidal-correction technology mainly comprises TOF (time-of-flight) ranging, gyroscope-based methods, camera-based methods, projection-picture feature-point detection, triangulation, and the like, selected according to the actual application scene and requirements. TOF ranging offers good real-time performance but is costly. Gyroscope-based methods are real-time and need no additional hardware support, but perform poorly in scenes whose attitude changes frequently. Camera-based methods are convenient to implement but are affected by factors such as ambient light, so the correction effect is unreliable. Feature-point detection on the projected picture needs no additional hardware, but its effect is not ideal in complex scenes. Triangulation is accurate, but installation is complex and costly. Therefore, no prior-art projector satisfies both low cost and a highly reliable correction effect through an efficient fusion design between hardware and software.
Disclosure of Invention
The invention aims to provide a projector based on an omnidirectional trapezoidal technology which satisfies both low cost and a highly reliable correction effect through an efficient fusion design between hardware and software, and provides a complete hardware scheme together with a matching software control process.
To this end, the present invention provides a projector based on an omnidirectional trapezoidal technology, comprising: an optocoupler feedback module comprising a first optocoupler feedback unit and a second optocoupler feedback unit, connected respectively to the inner and outer ends of the telescopic travel of the projector's optical lens; a stepping motor control module whose driving chip is connected to its stepping motor through a driving interface; and a gyroscope module comprising a six-axis gyroscope chip U3, a resistor R892, a capacitor C501, a resistor R893 and a capacitor C502, wherein the first input pin and the second input pin of the six-axis gyroscope chip U3 are connected to the input end of the motion sensor through the resistor R892 and the resistor R893 respectively; the power pin, the enable pin, the data pin and the clock pin of the six-axis gyroscope chip U3 are all connected to one end of the capacitor C501 and to the power supply end, and are each also connected to one end of the capacitor C502; the other end of the capacitor C501 and the other end of the capacitor C502 are grounded, and the capacitance values of the capacitor C501 and the capacitor C502 are equal;
the projection control process of the projector based on the omnidirectional trapezoidal technology comprises the following steps:
Step S1, projecting a preset feature map through the optical lens of the projector, identifying the parameter information of the feature points in the feature map, and storing it in the storage module, wherein the preset feature map comprises a middle checkerboard and four corner points, and the corner checkerboards are smaller than the middle checkerboard;
step S2, obtaining rotation angle information of a projector by reading data of the gyroscope module, and triggering an omnidirectional trapezoidal function when a data jitter variation value exceeds a preset threshold value;
step S3, after the projector automatically focuses, starting an omnidirectional trapezoidal function, and projecting a preset characteristic diagram through the optical lens;
Step S4, firstly obtaining a calibration image, carrying out corner detection and line-length calculation on the calibration image, and identifying all feature points of the feature map;
Step S5, obtaining the coordinates of each corner point in the camera coordinate system of the camera module according to the relation between the feature map and the calibration image;
Step S6, obtaining the attitude matrix of the projector in the camera coordinate system according to the relation between the feature points and the camera coordinate system;
Step S7, calculating, from the attitude matrix, the included angles between each of the x-axis, y-axis and z-axis components and the vector modulus, and setting the omnidirectional trapezoidal parameters of the projector image;
Step S8, adjusting the projector image through the stepping motor control module according to the omnidirectional trapezoidal parameters.
In a further development of the invention, the field-of-view of the projector's optical lens is 40°, the field-of-view of the camera module is 59°, and the FOV of the camera module covers the projected image of the optical lens within a range of 2 meters.
In a further development of the invention, the distance between the optical lens of the projector and the camera of the camera module is 135.74mm, and the camera is offset in the direction of the optical lens during installation by an angle of 3 °.
In a further development of the invention, the camera module comprises a camera connector J8, a magnetic bead FB2, a capacitor C948, a resistor R5, a resistor R6, a protection tube ESD3 and a protection tube ESD4, wherein the power end is connected through the magnetic bead FB2 to one end of the capacitor C948 and to the camera connector J8, and the other end of the capacitor C948 is grounded; the data pins of the camera connector J8 are connected to the USB pins of the main control module through the resistor R5 and the resistor R6 respectively, and are grounded through the protection tube ESD3 and the protection tube ESD4 respectively.
A further development of the invention is that said step S1 comprises the sub-steps of:
Step S101, projecting the preset feature map through the optical lens of the projector, and then photographing a plurality of feature maps at different positions and angles through the camera module to serve as calibration images;
Step S102, performing corner detection on each calibration image through the findChessboardCorners() function of the vision software (the function names here correspond to OpenCV), and returning the detection result;
Step S103, refining the detected corners to sub-pixel accuracy through the cornerSubPix() function of the vision software to obtain the corner coordinates;
Step S104, calculating the line lengths of the four corner points of each calibration image, identifying the middle checkerboard, establishing an image-space coordinate system and a camera coordinate system, and matching the corner coordinates detected in each calibration image with the three-dimensional coordinates of the camera coordinate system, so as to determine the distortion of the calibration image and the projection matrix of the camera;
Step S105, evaluating the precision of the transformation matrix (the transformation matrix between the image-space coordinate system and the camera coordinate system) through the calibrationMatrixValues() function of the vision software, thereby obtaining the offset values between the projector's optical lens and the central axis of the camera, and forming, through the relation between the transformation matrix and the five feature maps, a device attitude parameter whose front-projection left-right angle is 0, to serve as the parameter information of the feature points in the feature map;
and Step S106, saving the parameter information to the storage module.
In a further development of the invention, in step S2, after the data of the gyroscope module are read, the angle information of the projector's rotation around the X, Y and Z axes is obtained, and the angle information of the three axes is weighted and summed to obtain the rotation angle information of the projector, the weights of the three axes summing to 1. The rotation angle information within 3 seconds is accumulated and used as the data jitter variation value, and the omnidirectional trapezoidal function is triggered when this value exceeds the preset threshold.
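The weighted-overlay trigger of step S2 can be sketched as follows. The axis weights, the 2° threshold, the 10 Hz sample rate, and the max-minus-min reading of "jitter variation" are illustrative assumptions; the patent fixes only the 3-second window and that the three axis weights sum to 1.

```python
from collections import deque

class KeystoneTrigger:
    """Step S2 sketch: weighted three-axis rotation angle plus a jitter
    check over a 3 s window.  Weights, threshold and sample rate below
    are illustrative assumptions, not values from the patent."""

    def __init__(self, weights=(0.4, 0.4, 0.2), threshold_deg=2.0,
                 window_s=3.0, sample_rate_hz=10):
        assert abs(sum(weights) - 1.0) < 1e-9  # patent: weights sum to 1
        self.weights = weights
        self.threshold_deg = threshold_deg
        self.history = deque(maxlen=int(window_s * sample_rate_hz))

    def update(self, roll_deg, pitch_deg, yaw_deg):
        """Feed one gyro sample; return True when the omnidirectional
        trapezoidal (keystone) function should be triggered."""
        angle = sum(w * a for w, a in
                    zip(self.weights, (roll_deg, pitch_deg, yaw_deg)))
        self.history.append(angle)
        # "data jitter variation value": spread of the weighted angle
        # over the window (max minus min), an assumed interpretation
        jitter = max(self.history) - min(self.history)
        return (len(self.history) == self.history.maxlen
                and jitter > self.threshold_deg)
```

A stationary projector keeps the jitter near zero and never triggers; a sudden rotation above the threshold within the window does.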
In a further development of the invention, after the omnidirectional trapezoidal function is triggered, the method comprises the following steps:
Step A1, invoking the gyroscope module, measuring the rotation speed and angle of the projector, and parsing the raw data read out into the angle values of the three axes Roll, Pitch and Yaw according to the data format and rules of the gyroscope module;
Step A2, determining the attitude of the projector according to the angle values of the three axes Roll, Pitch and Yaw;
Step A3, determining the rotation and pitch angles of the projector according to its attitude, wherein Roll represents the rotation angle of the projector around the X axis, Pitch represents the rotation angle around the Y axis, and Yaw represents the rotation angle around the Z axis; then jumping to step S3.
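The raw-data parsing of step A1 might look like the following. The schematic net name LSM6DSL_INT suggests an ST LSM6DSL-class six-axis IMU; the little-endian int16 register layout and the 8.75 mdps/LSB sensitivity (±250 dps range) are datasheet-style assumptions, not values stated in the patent.

```python
import struct

# Assumed sensitivity for an LSM6DSL-class gyro at +/-250 dps full scale
GYRO_SENS_MDPS_PER_LSB = 8.75

def parse_gyro_frame(frame: bytes):
    """Step A1 sketch.  frame: 6 bytes X_L X_H Y_L Y_H Z_L Z_H ->
    (roll, pitch, yaw) angular rates in degrees per second."""
    x, y, z = struct.unpack('<hhh', frame)  # little-endian signed 16-bit
    scale = GYRO_SENS_MDPS_PER_LSB / 1000.0
    return (x * scale, y * scale, z * scale)
```

Integrating these rates over time (step A2) then yields the Roll/Pitch/Yaw angle values used to determine the attitude.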
A further development of the invention is that said step S4 comprises the sub-steps of:
Step S401, first obtaining a calibration image by photographing the preset feature map through the camera module;
Step S402, performing corner detection on the image through the findChessboardCorners() function of the vision software;
Step S403, refining the detected corners to sub-pixel accuracy through the cornerSubPix() function of the vision software;
Step S404, calculating the four line lengths of each feature map from the feature-point coordinates in the calibration image, by computing the distance between every two adjacent feature points;
Step S405, calculating the side length of the middle checkerboard from the detected feature-point coordinates.
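The line-length computation of step S404 is elementary; a minimal sketch, assuming the four outer feature points are already ordered around the perimeter (the ordering is an assumption, not stated in the patent):

```python
import math

def line_lengths(corners):
    """Step S404 sketch: Euclidean side lengths between consecutive
    outer feature points; corners are (x, y) pixel coordinates ordered
    around the perimeter, e.g. TL, TR, BR, BL."""
    return [math.hypot(corners[(i + 1) % len(corners)][0] - x,
                       corners[(i + 1) % len(corners)][1] - y)
            for i, (x, y) in enumerate(corners)]
```

The same pairwise-distance helper serves step S405 for the sides of the middle checkerboard.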
A further development of the invention is that said step S6 comprises the sub-steps of:
Step S601, setting the coordinate information of the four corner points of the feature map as (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) and (x4, y4, z4) respectively, and setting a relation matrix R between the four corner points and the camera space coordinate system;
Step S602, calculating the coordinates d of the first corner point restored to a left-right angle of 0 by the formula d = R × [x1 y1 z1]', where [x y z]' denotes the column vector multiplied by the relation matrix R;
Step S603, returning to step S602 with the coordinates of each remaining corner point in turn, and calculating the coordinates e, f and g of the remaining three corner points in the camera space coordinate system;
Step S604, from the four corner coordinates d, e, f and g calculated in steps S602 and S603, calculating the angle and direction relationships between the five feature maps using triangle relations and line-length comparison;
Step S605, determining the attitude between the feature maps according to the angle and direction relationships obtained in step S604, and storing the attitude information as a one-dimensional matrix R_hat.
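Steps S602 and S603 reduce to one matrix-vector product per corner; a minimal numpy sketch (the matrices and corner values used here are illustrative, not values from the patent):

```python
import numpy as np

def restore_corners(R, corners):
    """Steps S602-S603 sketch: apply d = R x [x y z]' to each corner to
    restore its coordinates to the zero left-right-angle pose.  R is the
    3x3 relation matrix between the corner points and the camera space
    coordinate system; corners is an iterable of (x, y, z) triples."""
    R = np.asarray(R, dtype=float)
    return [R @ np.asarray(c, dtype=float) for c in corners]
```

With the identity as R the corners are unchanged; with a rotation matrix the corners are rotated into the restored pose, after which the triangle and line-length comparisons of step S604 apply.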
A further development of the invention is that said step S7 comprises the sub-steps of:
Step S701, converting the attitude matrix R_hat into rotation-vector form through the cv::Rodrigues function;
Step S702, normalizing the result to obtain the normalized vector V = (Vx, Vy, Vz);
Step S703, calculating the modulus |V| of the vector V by the formula |V| = sqrt(Vx² + Vy² + Vz²);
Step S704, calculating the angle angle_x between the x-axis component and the modulus of the vector V by the formula angle_x = arccos(Vx/|V|);
Step S705, calculating the angle angle_y between the y-axis component and the modulus of the vector V by the formula angle_y = arccos(Vy/|V|);
Step S706, calculating the angle angle_z between the z-axis component and the modulus of the vector V by the formula angle_z = arccos(Vz/|V|);
Step S707, taking angle_z as the left-right angle of the projector, and setting the angles angle_x, angle_y and angle_z as the omnidirectional trapezoidal parameters of the projector image.
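Steps S703 to S706 can be written directly from the formulas; a minimal sketch returning the three direction angles in degrees:

```python
import math

def direction_angles(v):
    """Steps S703-S706 sketch: modulus |V| = sqrt(Vx^2 + Vy^2 + Vz^2),
    then angle_i = arccos(V_i / |V|) for each axis component, returned
    in degrees.  Per step S707, the third result (angle_z) serves as
    the projector's left-right angle."""
    vx, vy, vz = v
    mod = math.sqrt(vx ** 2 + vy ** 2 + vz ** 2)
    return tuple(math.degrees(math.acos(c / mod)) for c in (vx, vy, vz))
```

For example, a vector along the z axis yields angle_z = 0, i.e. no left-right correction is needed.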
Compared with the prior art, the beneficial effects of the invention are as follows: the hardware modules and circuit design of the projector based on the omnidirectional trapezoidal technology are optimized and perfected; in hardware, the cooperative work of the main control module, the gyroscope module, the stepping motor control module, the camera module, the optocoupler feedback module and the storage module provides a good hardware foundation for the projection control process and effectively avoids the defect of high cost. On this basis, a matching projection control process is further provided: a preset feature map is projected through the projector, the feature points of the feature map are detected, the attitude corresponding to each feature point is obtained through attitude measurement, and calculation, parameter setting and corresponding adjustment are carried out according to the attitude matrix, so that feature-point detection and attitude detection are organically fused in a reasonable and efficient way, the attitude of the projector is estimated automatically and accurately, and an efficient fusion design between hardware and software is achieved. The invention can adapt to different environments to realize high-precision keystone correction, is not affected by external factors such as complex scenes and ambient light, realizes keystone correction of the projection picture at low cost and with a highly reliable correction effect, and effectively reduces restrictions on the use of the product.
Drawings
FIG. 1 is a schematic block diagram of one embodiment of the present invention;
FIG. 2 is a schematic circuit diagram of a gyroscope module according to one embodiment of the invention;
FIG. 3 is a schematic circuit diagram of a stepper motor control module according to one embodiment of the invention;
FIG. 4 is a schematic circuit diagram of a camera module according to an embodiment of the present invention;
FIG. 5 is a schematic circuit diagram of an optocoupler feedback module according to an embodiment of the invention;
FIG. 6 is a schematic view of the relationship between the FOV field angle of a camera module and the projection angle of a projector optical lens according to one embodiment of the present invention;
FIG. 7 is a schematic diagram of the distance and angular relationship between a camera and an optical lens according to one embodiment of the present invention;
FIG. 8 is a schematic workflow diagram of one embodiment of the present invention;
fig. 9 is a schematic diagram of a feature map of an embodiment of the present invention.
Reference numerals in the drawings: 1-main control module; 2-gyroscope module; 3-stepping motor control module; 4-camera module; 401-camera; 5-optocoupler feedback module; 501-first optocoupler feedback unit; 502-second optocoupler feedback unit; 6-storage module; 7-optical lens.
Detailed Description
In the description of the present invention, if an orientation description such as "upper", "lower", "front", "rear", "left", "right", etc. is referred to, it is merely for convenience of description and simplification of the description, and does not indicate or imply that the apparatus or element referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the invention. If a feature is referred to as being "disposed," "secured," "connected," or "mounted" on another feature, it can be directly disposed, secured, or connected to the other feature or be indirectly disposed, secured, connected, or mounted on the other feature.
In the description of the invention, "a number of" means one or more, and "a plurality of" means two or more; "greater than", "less than" and "exceeding" are understood to exclude the stated number, while "above", "below" and "within" are understood to include it. "First", "second" and the like are used only to distinguish technical features with the same or similar names, and are not to be understood as indicating the relative importance, quantity, or precedence of the technical features.
Preferred embodiments of the present invention will be described in further detail below with reference to the accompanying drawings.
As shown in fig. 1 to 9, the present embodiment provides a projector based on an omnidirectional trapezoidal technology, comprising: a main control module 1, a gyroscope module 2, a stepping motor control module 3, a camera module 4, an optocoupler feedback module 5 and a storage module 6, wherein the gyroscope module 2, the stepping motor control module 3, the camera module 4, the optocoupler feedback module 5 and the storage module 6 are respectively connected with the main control module 1; the optocoupler feedback module 5 comprises a first optocoupler feedback unit 501 and a second optocoupler feedback unit 502, connected respectively to the inner and outer ends of the telescopic travel of the optical lens 7 of the projector; the driving chip of the stepping motor control module 3 is connected to its stepping motor through a driving interface; the gyroscope module 2 comprises a six-axis gyroscope chip U3, a resistor R892, a capacitor C501, a resistor R893 and a capacitor C502, wherein the first input pin (INT pin) and the second input pin (INT1 pin) of the six-axis gyroscope chip U3 are connected to the input end LSM6DSL_INT of the motion sensor through the resistor R892 and the resistor R893 respectively; the power supply pin (VDD pin), the enable pin (SENB pin), the data pin (MSDA pin) and the clock pin (MCLK pin) of the six-axis gyroscope chip U3 are all connected to one end of the capacitor C501 and to the power supply end, and are each also connected to one end of the capacitor C502; the other end of the capacitor C501 and the other end of the capacitor C502 are grounded, and the capacitance values of the capacitor C501 and the capacitor C502 are equal. Unlike the prior art, which grounds through only one capacitor, this embodiment grounds the power supply end through the capacitor C501 and, in parallel, through the capacitor C502, which better realizes the filtering and protection functions of the power supply end and the ground end and improves the stability and reliability of the circuit.
As shown in fig. 8, the projection control process of the projector based on the omnidirectional trapezoidal technology includes the following steps:
Step S1, projecting a preset feature map through the optical lens 7 of the projector, identifying the parameter information of the feature points in the feature map, and storing it in the storage module 6, wherein the preset feature map comprises a middle checkerboard and four corner points, and the corner checkerboards are smaller than the middle checkerboard;
Step S2, obtaining rotation angle information of a projector by reading data of the gyroscope module 2, and triggering an omnidirectional trapezoidal function when a data jitter variation value exceeds a preset threshold value;
step S3, after the projector automatically focuses, starting an omnidirectional trapezoidal function, and projecting a preset characteristic diagram through the optical lens 7;
Step S4, firstly obtaining a calibration image, carrying out corner detection and line-length calculation on the calibration image, and identifying all feature points of the feature map;
Step S5, obtaining the coordinates of each corner point in the camera coordinate system of the camera module 4 according to the relation between the feature map and the calibration image;
Step S6, obtaining the attitude matrix of the projector in the camera coordinate system according to the relation between the feature points and the camera coordinate system;
Step S7, calculating, from the attitude matrix, the included angles between each of the x-axis, y-axis and z-axis components and the vector modulus, and setting the omnidirectional trapezoidal parameters of the projector image;
Step S8, adjusting the projector image through the stepping motor control module 3 according to the omnidirectional trapezoidal parameters.
In this embodiment, the main control module 1 is preferably implemented with an HI3751 integrated chip; the HI3751 chip is connected to the driving chip U56 of the stepping motor control module 3 through 4 output pins, the driving chip U56 preferably being an AT8833 or DRC8833C integrated chip, and the 4 output pins of the driving chip U56 are connected to the stepping motor M so as to control it. The first optocoupler feedback unit 501 and the second optocoupler feedback unit 502 are connected respectively to the inner and outer ends of the telescopic travel of the optical lens 7 of the projector, i.e. the two optocouplers sit at the innermost and outermost positions of the travel; the output and feedback of the optocoupler signals are realized through the first interface J15 and the second interface J10 of the first optocoupler feedback unit 501 and the second optocoupler feedback unit 502 shown in fig. 5, so as to identify whether the current state of the optical lens 7 is innermost or outermost. When the optical lens 7 is located at the innermost or outermost position, the corresponding level is triggered, and the feedback signals are sent through the PI_DET1 and PI_DET2 feedback pins of the optocoupler feedback module shown in fig. 5 to the HI3751 integrated chip shown in fig. 6, so as to realize programmed control.
As shown in fig. 2, the gyroscope module 2 is preferably connected to the HI3751 integrated chip via an I²C controller, wherein the first input pin LSM6DSL_INT of the six-axis gyroscope chip U3 is an interrupt pin used to trigger data updates. The gyroscope module 2 is used to calculate the rotation angle and to trigger the calculation of the left-right angle of the projector. Fig. 4 is a schematic circuit diagram of the camera module 4; the camera 401 preferably adopts a 300,000-pixel camera with a USB interface, and the USB pins of the HI3751 integrated chip are connected to HUB_DM2 and HUB_DP2 shown in fig. 4, so that the main control module 1 accesses the camera module 4 through the USB interface to obtain images, providing a better basis for controlling focusing definition by moving the stepping motor.
As shown in fig. 6, the view angle range of the optical lens 7 of the projector in this embodiment is 40 °, the view angle range of the camera 401 of the camera module 4 is 59 °, and the FOV angle of the camera 401 of the camera module 4 covers the projected image of the optical lens 7 within 2 meters. FOV refers to the Field of View, the angular range over which an image can be received. By such arrangement and control of the distance and angle, the camera 401 can completely cover the projected image of the optical lens 7, so as to ensure that the image shot by the camera 401 completely contains the image of the projector, and avoid correction failure.
As shown in fig. 6 and 7, the distance between the optical lens 7 of the projector and the camera 401 of the camera module 4 is 135.74 mm, and the camera 401 is offset by an angle of 3° toward the optical lens when installed. This special design was arrived at through practical development and testing: if the distance between the optical lens 7 and the camera 401 is unsuitable, or the camera 401 and the optical lens 7 are exactly parallel, real-time following correction is impaired in scenes whose attitude changes frequently. In this embodiment, through the special design of a 135.74 ± 0.1 mm distance between the optical lens 7 and the camera 401 and a 3° included angle between their central axes, matched with the integrated software-and-hardware scheme described above, real-time following correction can be achieved even in scenes with frequent attitude changes, realizing more accurate angle calculation and correction.
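The geometric claim that a 59° camera FOV with a 135.74 mm offset and 3° toe-in covers a 40° projection at 2 m can be checked with elementary trigonometry. The side-by-side layout, the flat-screen intercept model, and the sign of the toe-in are assumptions about the mechanical arrangement, not details from the patent:

```python
import math

def camera_covers_projection(distance_mm=2000.0, offset_mm=135.74,
                             proj_fov_deg=40.0, cam_fov_deg=59.0,
                             toe_in_deg=3.0):
    """Check whether the camera FOV spans the projected image at the
    given throw distance, measuring everything from the lens axis on a
    screen plane perpendicular to it."""
    half_proj = distance_mm * math.tan(math.radians(proj_fov_deg / 2))
    # where the toed-in camera axis intersects the screen plane
    cam_axis = -offset_mm + distance_mm * math.tan(math.radians(toe_in_deg))
    half_cam = distance_mm * math.tan(math.radians(cam_fov_deg / 2))
    lo, hi = cam_axis - half_cam, cam_axis + half_cam
    return lo <= -half_proj and half_proj <= hi
```

Under this model, the stated 59° FOV covers the roughly 1.46 m wide projected image at 2 m with margin to spare, while a much narrower camera FOV would not.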
As shown in fig. 4, the camera module 4 of the present embodiment includes a camera connector J8, a magnetic bead FB2, a capacitor C948, a resistor R5, a resistor R6, a protection tube ESD3, and a protection tube ESD4, wherein a power supply terminal is connected to one end of the capacitor C948 and the camera connector J8 through the magnetic bead FB2, and the other end of the capacitor C948 is grounded; the data pins (namely hub_dm2 and hub_dp2) of the camera connector J8 are respectively connected to the USB pin of the main control module 1 through the resistor R5 and the resistor R6, and are respectively grounded through the protection tube ESD3 and the protection tube ESD4, so as to realize an ESD protection function.
In hardware, the gyroscope module 2 obtains the rotation angle information of the projector through a gyroscope; the stepping motor control module 3 adopts an existing stepping motor and, in combination with the optimized circuit shown in fig. 3, can realize good correction control; the camera module 4 adopts an existing camera to shoot the feature map, requires neither TOF nor a special high-definition camera, and can achieve the same effect even with a 300,000-pixel camera, which makes it highly competitive in cost. It should be noted that the circuit designs shown in this embodiment are not integrated circuit modules taken directly from the prior art, but are specifically designed in combination with the projection control process of the projector based on the omnidirectional trapezoidal technology of this embodiment. This gives the advantage of low cost, and through the coordinated design of software and hardware, the method can adapt to different environments to realize high-precision trapezoidal correction without being influenced by complex scenes, ambient light and other external factors; therefore, trapezoidal correction of the projection picture can be realized at lower cost and with a highly reliable correction effect, and the cooperative software-hardware scheme effectively reduces the use limitations of the product. The storage module 6 may also be a storage module of the projector itself.
Before entering the main control process from step S2 to step S8, the present embodiment first corrects the projector in step S1, and the specific correction method includes the following sub-steps:
Step S101, a preset feature map as shown in FIG. 9 is projected through the optical lens 7 of the projector, wherein the preset feature map comprises a middle checkerboard and four corner points; the checkerboards of the corner points are smaller than the middle checkerboard, i.e., they have fewer rows and columns than the middle checkerboard, and the corner points are symmetrically arranged around the middle checkerboard. Then, a plurality of feature maps at different positions and angles are shot through the camera module 4 to serve as calibration images. Preferably, the feature points of the feature map shown in fig. 9 are identified through the vision library OpenCV; the feature points comprise four 2×2 corner checkerboards and one 5×9 middle checkerboard. This design makes shooting and feature-point identification easier to realize, facilitates the subsequent calculation and control process, and further reduces the requirements on the camera;
Step S102, performing corner detection on each calibration image through findChessboardCorners () function of visual learning software so as to identify coordinate positions of four checkerboard corner points in the feature map shown in FIG. 9, and returning a detection result;
Step S103, performing sub-pixel level processing on the detected corner points through cornerSubPix () functions of vision learning software, and obtaining corner point coordinates after improving the precision of the corner points through the sub-pixel level processing;
Step S104, calculating the line length of the four corner points of each calibration image, namely the distance between every two adjacent corner points; then identifying the middle checkerboard and recording the positional relationship between the middle checkerboard and the corner points; calculating the relationship between the five feature maps using the triangular relationship; and determining the distortion condition of the image and the projection matrix of the camera. The specific method is as follows: first establish an image space coordinate system and a camera coordinate system, where the image space coordinate system refers to a space coordinate system established for the feature map, and the camera coordinate system (also called the camera space coordinate system) refers to a calibration coordinate system established for the camera image; then map the corner coordinates detected in each calibration image to the three-dimensional coordinates of the camera coordinate system, so as to determine the distortion condition of the calibration image and the projection matrix of the camera. The five feature maps refer to the middle checkerboard and the four corner checkerboards, five in total;
Step S105, performing precision evaluation on a transformation matrix through calibrationMatrixValues () function of vision learning software, wherein the transformation matrix refers to a transformation matrix between an image space coordinate system and a camera coordinate system, so as to obtain offset values between the projector optical lens 7 and a central axis of the camera 401, and forming a device posture parameter with a front projection left-right angle of 0 through a relation between the transformation matrix and five feature images, so as to serve as parameter information of feature points in the feature images;
step S106, saving the parameter information to the storage module 6.
Further, in step S105 of the present embodiment, after the device posture parameter is obtained, the device posture parameter and the corresponding feature point information are matched and stored, and a one-to-one index record is established, which is used as the parameter information of the feature point in the feature map.
In step S2 of the present embodiment, after the data of the gyroscope module 2 is read, the angle information of the rotation of the projector around the X axis, the Y axis and the Z axis is obtained; the angle information of the three axes is weighted and superimposed to obtain the rotation angle information of the projector, with the weights of the three axes summing to 1. The change in rotation angle within 3 seconds is calculated and used as the data jitter variation value, and the omnidirectional trapezoidal function is triggered when the data jitter variation value exceeds a preset threshold. The weights of the X, Y and Z axes can be set and adjusted according to the actual situation. In practical application, the magnitude of each weight is preferably determined from the angle information of the corresponding axis, i.e., the larger the rotation angle of an axis, the larger its weight, with the three weights still summing to 1, so that the calculated jitter data is more accurate when deciding whether to trigger the omnidirectional trapezoidal function.
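The weighting and trigger logic of step S2 can be sketched in Python. The weight values, the representation of the 3-second window as a simple list of (X, Y, Z) readings, and the threshold value are illustrative assumptions, not values fixed by this embodiment:

```python
def weighted_angle(axes, weights=(0.2, 0.3, 0.5)):
    """Weighted superposition of the X/Y/Z rotation angles (degrees).

    The weight triple is an assumption; the embodiment only requires
    that the three weights sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * a for w, a in zip(weights, axes))


def jitter_variation(samples, weights=(0.2, 0.3, 0.5)):
    """Data jitter variation over the sampling window, taken here as the
    spread of the weighted rotation angle across the readings collected
    within the 3-second window."""
    combined = [weighted_angle(s, weights) for s in samples]
    return max(combined) - min(combined)


def should_trigger(samples, threshold=2.0):
    """Trigger the omnidirectional trapezoidal function when the jitter
    variation exceeds the preset threshold (threshold value assumed)."""
    return jitter_variation(samples) > threshold
```

A stationary projector yields a small variation and no trigger, while a large posture change between readings exceeds the threshold.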
After triggering the omni-directional trapezoidal function, the embodiment comprises the following triggering steps:
Step A1, calling the gyroscope module 2, measuring the rotation speed and angle of the projector, and parsing the read raw data into angle values of the three axes Roll, Pitch and Yaw according to the data format and rules of the gyroscope module 2; the parsing process can be implemented according to the preset data format and rules of the gyroscope module 2, and can also be adjusted according to actual requirements;
Step A2, determining the posture of the projector according to the angle values of the three axes Roll, Pitch and Yaw; that is, by determining the rotation angle of the projector relative to the initial position, the attitude calculation can be performed based on the angle values of the three axes Roll, Pitch and Yaw;
Step A3, determining the rotation angle and the pitch angle of the projector according to the posture of the projector, wherein Roll represents the rotation angle of the projector around the X axis, Pitch represents the rotation angle around the Y axis, and Yaw represents the rotation angle around the Z axis; then jump to step S3.
The specific method for determining the rotation angle and the pitch angle of the projector in this embodiment is as follows: acceleration data Ax, Ay and Az are read from the gyroscope module 2. Assuming Az points to the earth's center, Ax is the acceleration value reflecting the pitch state of the projector, and Ay is the acceleration value reflecting the roll state of the projector. When the device is placed flat, Az points to the earth's center and equals one gravitational acceleration, i.e., 1 g. When the projector is tilted upward by an elevation angle of n degrees, the acceleration data on the Az and Ax axes change, and from the spatial three-axis and trigonometric relations, Az = 1 g × cos(n) and Ax = 1 g × sin(n). Since Az and Ax are known data read from the gyroscope, n can be solved from these formulas, giving the pitch angle Pitch of the projector. By the same principle, when the roll angle of the projector is n′ degrees, the data on the Az and Ay axes change, with Az = 1 g × cos(n′) and Ay = 1 g × sin(n′), from which n′ can be obtained, giving the roll angle Roll of the projector. At this point, only one more left-right angle is needed to determine the final posture of the projector; this embodiment obtains the left-right angle by jumping to step S3 and the subsequent steps.
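Solving the relations above for n and n′ can be written out directly; a minimal sketch, assuming the accelerometer readings are normalised to units of g, where atan2 combines the two readings (equivalent to arcsin/arccos in range but robust to sign):

```python
import math


def pitch_deg(ax, az):
    """Pitch n recovered from Az = 1g*cos(n) and Ax = 1g*sin(n):
    n = atan2(Ax, Az). Inputs are accelerometer readings in units of g."""
    return math.degrees(math.atan2(ax, az))


def roll_deg(ay, az):
    """Roll n' recovered from Az = 1g*cos(n') and Ay = 1g*sin(n')."""
    return math.degrees(math.atan2(ay, az))
```

For a 30° elevation, Ax = sin(30°) = 0.5 and Az = cos(30°) ≈ 0.866, and the function returns 30°.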
In this embodiment, in step S3, the projector is focused first; after the focus is clear, the feature map shown in fig. 9 is projected again, and the process jumps to step S4 to start the trapezoidal correction program.
In this embodiment, step S4 is used to identify the feature points in the feature map again and calculate the line length, and determine the distortion condition of the image according to the calibration parameters. The method specifically comprises the following substeps:
step S401, firstly, a calibration image is obtained, and a preset feature map is shot through a camera module 4;
Step S402, performing corner detection on the image through findChessboardCorners () functions of visual learning software to obtain feature points on the checkerboard corner;
Step S403, processing the detected corner points at the sub-pixel level through the cornerSubPix() function of the vision software, i.e., refining each corner point by sub-pixel interpolation calculated from the pixels around it;
Step S404, calculating to obtain four line lengths of each feature image by calculating the distance between every two adjacent feature points according to the feature point coordinates in the calibration image; the lines are formed by characteristic point connecting lines of four angular points, and are obtained by calculating Euclidean distance between two adjacent characteristic points;
Step S405, identifying an intermediate checkerboard in the image, and calculating the edge length of the intermediate checkerboard from the detected feature point coordinates.
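The line-length computation of step S404 is a plain Euclidean distance between adjacent feature points; a minimal sketch (the adjacency ordering of the corners is an assumption for illustration):

```python
import math


def line_lengths(corners):
    """Distances between every two adjacent corner points (step S404).

    `corners` lists the four corner coordinates in adjacency order,
    e.g. top-left, top-right, bottom-right, bottom-left; each corner
    may be 2-D (image coordinates) or 3-D (camera coordinates)."""
    n = len(corners)
    return [math.dist(corners[i], corners[(i + 1) % n]) for i in range(n)]
```

For a unit square the four line lengths are all 1.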
In this embodiment, step S5 is used to calculate the relationship between the feature maps: the transformation matrix obtained by calibration, i.e., the transformation matrix between the image space coordinate system and the camera coordinate system, is read, the coordinates of the detected corner points in the image are calibrated, and the image coordinates are matched with the corresponding three-dimensional coordinates in the camera coordinate system to obtain the coordinates of each corner point in the camera coordinate system.
Step S6 in this embodiment calculates the relationship between the five feature maps using the comparison of the triangular relationship and the line length. The step S6 specifically comprises the following substeps:
Step S601, setting four corner coordinate information of a characteristic diagram as (x 1, y1, z 1), (x 2, y2, z 2), (x 3, y3, z 3) and (x 4, y4, z 4) respectively, and setting a relation matrix R between the four corner points and a camera 401 space coordinate system;
step S602, calculating the coordinate d of the first corner restored to a left-right angle of 0 by the formula d = R × [x y z]′, where [x y z]′ represents the column vector corresponding to the relation matrix R;
step S603, returning to step S602, replacing the coordinates with the coordinates of the remaining corner points respectively, and calculating the coordinates e, f and g of the coordinates of the remaining three corner points under the spatial coordinate system of the camera 401; after the coordinate d is obtained, calculating the coordinates of the other three corner points continuously to obtain an e coordinate, an f coordinate and a g coordinate under a camera coordinate system, wherein d, e, f and g are respectively used for representing the coordinates of the four corner points;
The posture matrix R_hat is then obtained by calculating the minimum of min ‖R − R_hat‖²; at this point the included angle between the posture matrix R_hat and the direction of the projector is minimized. The posture matrix is expressed as R_hat = [a b c d], where a, b, c and d describe the rotation angles of the projector in the x, y and z directions.
Step S604, calculating the angle and direction relationship between the five feature graphs by using the comparison of the triangle relationship and the line length through the four corner coordinates d, e, f and g calculated in step S602 and step S603;
Step S605, determining the gesture between the feature graphs according to the angle and direction relation obtained in the step S604, and storing the gesture information as an R_hat one-dimensional matrix.
In step S604 of the present embodiment, the specific method for calculating the angle and direction relationship between the five feature maps by comparing the triangular relationship and the line lengths is as follows. First, the line lengths between the feature maps are calculated by measuring the distances between the corner points: record the length L1 from coordinate d to e, L2 from d to f, L3 from d to g, L4 from e to f, L5 from e to g, and L6 from f to g. The angle and direction relationship between the feature maps can then be judged from the triangular relationship: comparing the lengths of L1 and L2 determines, with the coordinates known, the angular relationship of the three corner points d, e and f, with assumed angle a1; similarly, comparing L1 and L3 determines the angular relationship of the three corner points d, e and g, with assumed angle b1; comparing L4 and L5 determines the angular relationship of the three corner points e, f and g, with assumed angle c1; and comparing L4 and L6 determines the directional relationship of the three corner points e, f and g, with assumed angle d1.
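One concrete way to turn the recorded line lengths into such an angle is the law of cosines over a corner triplet; a minimal sketch (the function name and the choice of which corner the angle sits at are illustrative):

```python
import math


def corner_angle_deg(p, q, r):
    """Angle at corner p (degrees) in the triangle formed by corners
    p, q and r, recovered purely from the three measured line lengths
    via the law of cosines:
        cos A = (|pq|^2 + |pr|^2 - |qr|^2) / (2 * |pq| * |pr|)."""
    l_pq, l_pr, l_qr = math.dist(p, q), math.dist(p, r), math.dist(q, r)
    cos_a = (l_pq**2 + l_pr**2 - l_qr**2) / (2 * l_pq * l_pr)
    # Clamp against floating-point drift before taking the arccosine.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
```

For the triplet (0, 0), (1, 0), (0, 1) the angle at the first corner is 90°.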
Step S605 of the present embodiment determines the pose between the feature maps based on the angle and direction relationship. According to the angle and direction relationship obtained in step S604, the pose between the feature maps is determined and stored as a one-dimensional matrix R_hat = [a1, b1, c1, d1], where a1, b1, c1 and d1 are the angles determined in step S604; this R_hat one-dimensional matrix is the attitude data of the projector in the three directions x, y and z.
Step S7 of this embodiment includes the following substeps:
Step S701, converting the posture matrix R_hat into quaternion form through the cv::Rodrigues function, thereby obtaining a format matched with the omnidirectional trapezoidal function module, and returning a three-dimensional vector A representing the quaternion corresponding to the rotation matrix;
Step S702, normalizing the quaternion to obtain a normalized vector V = (Vx, Vy, Vz), so as to ensure that its length is 1;
Step S703, calculating the module length |V| of the vector V by the formula |V| = sqrt(Vx² + Vy² + Vz²); sqrt denotes the square-root function;
Step S704, calculating the included angle angle_x between the x-axis component and the module length of the vector V by the formula angle_x = arccos(Vx/|V|); arccos denotes the inverse cosine function;
Step S705, calculating the included angle angle_y between the y-axis component and the module length of the vector V by the formula angle_y = arccos(Vy/|V|);
Step S706, calculating the included angle angle_z between the z-axis component and the module length of the vector V by the formula angle_z = arccos(Vz/|V|);
Step S707, angle_z is used as the left-right angle of the projector, and the included angles angle_x, angle_y and angle_z are set as the omnidirectional trapezoidal parameters of the projector image.
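Steps S702 through S706 amount to computing the direction cosines of the vector; a minimal sketch:

```python
import math


def direction_angles_deg(v):
    """Normalise v (step S702) and return the included angles between
    each axis component and the vector (steps S703-S706), each obtained
    as arccos(component / |V|), in degrees."""
    norm = math.sqrt(sum(c * c for c in v))  # step S703: module length |V|
    return tuple(math.degrees(math.acos(c / norm)) for c in v)
```

For a vector aligned with the x axis, the three angles are 0°, 90° and 90°, and the third value would serve as the left-right angle of step S707.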
It should be noted that the included angles angle_x and angle_y are further used for comparison with the angles obtained from the actual gyroscope, confirming the precision change in the conversion process, so that a final fine adjustment of the projector can be performed, increasing the correction precision.
In step S8 of the present embodiment, the included angles angle_x, angle_y and angle_z are input into the omnidirectional trapezoidal function module, so that the three-axis angles of the projector image are adjusted according to the omnidirectional trapezoidal parameters.
In summary, this embodiment optimizes and perfects the hardware modules and circuit design of the projector based on the omnidirectional trapezoidal technology. In hardware, the cooperative work of the main control module 1, the gyroscope module 2, the stepping motor control module 3, the camera module 4, the optocoupler feedback module 5 and the storage module 6 provides a good hardware basis for the projection control process of the projector while effectively avoiding high cost. On this basis, a matched projection control process is further provided: a preset feature map is projected through the projector, the feature points of the feature map are detected, the posture matrix is calculated from them, and the omnidirectional trapezoidal parameters are set and adjusted accordingly, so that feature-point detection and posture detection are reasonably and efficiently fused, the posture of the projector is estimated automatically and accurately, and an efficient fusion design between hardware and software is achieved. This embodiment can adapt to different environments to realize high-precision trapezoidal correction, is not influenced by complex scenes, ambient light and other external factors, realizes trapezoidal correction of the projection picture at lower cost and with a highly reliable correction effect, and effectively reduces the use limitations of the product.
The foregoing is a further detailed description of the invention in connection with the preferred embodiments, and it is not intended that the invention be limited to the specific embodiments described. It will be apparent to those skilled in the art that several simple deductions or substitutions may be made without departing from the spirit of the invention, and these should be considered to be within the scope of the invention.
Claims (8)
1. A projector based on an omnidirectional trapezoidal technique, comprising: the optical coupling feedback module comprises a first optical coupling feedback unit and a second optical coupling feedback unit, and the first optical coupling feedback unit and the second optical coupling feedback unit are respectively connected to the inner side and the outer side of the telescopic travel of the optical lens of the projector; the driving chip of the stepping motor control module is connected to the stepping motor of the stepping motor control module through a driving interface; the gyroscope module comprises a six-axis gyroscope chip U3, a resistor R892, a capacitor C501, a resistor R893 and a capacitor C502, wherein a first input pin and a second input pin of the six-axis gyroscope chip U3 are connected to the input end of the motion sensor through the resistor R892 and the resistor R893 respectively; the power pin, the enable pin, the data pin and the clock pin of the six-axis gyroscope chip U3 are connected to the power supply terminal and one end of the capacitor C501, and to one end of the capacitor C502 respectively; the other end of the capacitor C501 and the other end of the capacitor C502 are grounded, and the capacitance values of the capacitor C501 and the capacitor C502 are equal;
the projection control process of the projector based on the omnidirectional trapezoidal technology comprises the following steps of:
step S1, a preset feature map is projected through an optical lens of a projector, after parameter information of feature points in the feature map is identified, the parameter information is stored in a storage module, the preset feature map comprises a middle checkerboard and four corner points, and the checkerboard of the corner points is smaller than the middle checkerboard;
step S2, obtaining rotation angle information of a projector by reading data of the gyroscope module, and triggering an omnidirectional trapezoidal function when a data jitter variation value exceeds a preset threshold value;
step S3, after the projector automatically focuses, starting an omnidirectional trapezoidal function, and projecting a preset characteristic diagram through the optical lens;
Step S4, firstly, obtaining a calibration image, carrying out corner detection and line length calculation on the calibration image, and identifying all feature points of the feature map;
S5, obtaining coordinates of each corner point in a camera coordinate system corresponding to the camera module according to the relation between the feature map and the calibration image;
S6, obtaining an attitude matrix of the projector in the camera coordinate system according to the relation between the feature points and the camera coordinate system;
S7, calculating included angles between the x-axis component, the y-axis component and the z-axis component and the vector module length respectively by using a gesture matrix, and setting omnidirectional trapezoidal parameters of the projector image;
step S8, adjusting a projector image through the stepping motor control module according to the omnidirectional trapezoidal parameters;
the visual angle range of the optical lens of the projector is 40 degrees, the visual angle range of the camera module is 59 degrees, and the visual angle of the FOV of the camera module covers the projection image of the optical lens within the range of 2 meters;
the distance between the optical lens of the projector and the camera of the camera module is 135.74mm, and the camera is offset towards the direction of the optical lens when being installed, and the offset angle is 3 degrees.
2. The projector according to claim 1, wherein the camera module comprises a camera connector J8, a magnetic bead FB2, a capacitor C948, a resistor R5, a resistor R6, a protection tube ESD3, and a protection tube ESD4, and a power supply terminal is connected to one end of the capacitor C948 and the camera connector J8 through the magnetic bead FB2, and the other end of the capacitor C948 is grounded; the data pins of the camera connector J8 are respectively connected to the USB pins of the main control module through the resistor R5 and the resistor R6, and are respectively grounded through the protection tube ESD3 and the protection tube ESD 4.
3. Projector based on the omnidirectional trapezoidal technique according to claim 1 or 2, characterized in that said step S1 comprises the following sub-steps:
Step S101, a preset characteristic diagram is projected through an optical lens of a projector, and then a plurality of characteristic diagrams with different positions and angles are shot through a camera module to serve as calibration images;
Step S102, performing corner detection on each calibration image through findChessboardCorners () function of visual learning software, and returning a detection result;
step S103, carrying out sub-pixel level processing on the detected corner points through cornerSubPix () functions of vision learning software to obtain corner point coordinates;
Step S104, calculating the line length of four corner points of each calibration image, identifying the middle checkerboard, establishing an image space coordinate system and a camera coordinate system, and corresponding the corner point coordinates detected in each calibration image with the three-dimensional coordinates of the camera coordinate system so as to determine the distortion condition of the calibration image and the projection matrix of the camera;
Step S105, performing precision evaluation on a transformation matrix through calibrationMatrixValues () function of vision learning software, wherein the transformation matrix refers to a transformation matrix between an image space coordinate system and a camera coordinate system, so as to obtain offset values between a projector optical lens and a central axis of the camera, and forming a device posture parameter with a front projection left-right angle of 0 through a relation between the transformation matrix and five feature images, so as to serve as parameter information of feature points in the feature images;
and step S106, saving the parameter information to a storage module.
4. The projector according to claim 1 or 2, wherein in the step S2, after reading the data of the gyroscope module, angle information of rotation of the projector around the X axis, the Y axis and the Z axis is obtained, the angle information of the three axes is weighted and superimposed to obtain rotation angle information of the projector, a sum of weights of the three axes is 1, rotation angle information within 3 seconds is calculated, the rotation angle information is used as a data jitter variation value, and an omni-directional trapezoid function is triggered when the data jitter variation value exceeds a preset threshold.
5. Projector based on the omnidirectional trapezoidal technique according to claim 1 or 2, characterized in that it comprises the following triggering steps after triggering the omnidirectional trapezoidal function:
Step A1, the gyroscope module is called, the rotation speed and angle of the projector are measured, and the read raw data are parsed into angle values of the three axes Roll, Pitch and Yaw according to the data format and rules of the gyroscope module;
Step A2, determining the posture of the projector according to the angle values of the three axes Roll, Pitch and Yaw;
Step A3, determining the rotation angle and the pitch angle of the projector according to the posture of the projector, wherein Roll represents the rotation angle of the projector around the X axis, Pitch represents the rotation angle around the Y axis, and Yaw represents the rotation angle around the Z axis; then jump to step S3.
6. Projector based on the omnidirectional trapezoidal technique according to claim 1 or 2, characterized in that said step S4 comprises the following sub-steps:
step S401, firstly, a calibration image is obtained, and a preset feature map is shot through a camera module;
step S402, performing corner detection on the image through findChessboardCorners () function of visual learning software;
Step S403, processing the detected corner points at sub-pixel level through cornerSubPix () function of visual learning software;
step S404, calculating to obtain four line lengths of each feature image by calculating the distance between every two adjacent feature points according to the feature point coordinates in the calibration image;
step S405, calculating the side length of the middle checkerboard by the detected feature point coordinates.
7. Projector based on the omnidirectional trapezoidal technique according to claim 1 or 2, characterized in that said step S6 comprises the following sub-steps:
step S601, setting four corner coordinate information of a characteristic diagram as (x 1, y1, z 1), (x 2, y2, z 2), (x 3, y3, z 3) and (x 4, y4, z 4) respectively, and setting a relation matrix R between the four corner points and a camera space coordinate system;
step S602, calculating the coordinate d of the first corner restored to a left-right angle of 0 by the formula d = R × [x y z]′, where [x y z]′ represents the column vector corresponding to the relation matrix R;
step S603, returning to step S602, replacing the coordinates with the coordinates of the remaining corner points respectively, and calculating the coordinates e, f and g of the coordinates of the remaining three corner points under the spatial coordinate system of the camera;
Step S604, calculating the angle and direction relationship between the five feature graphs by using the comparison of the triangle relationship and the line length through the four corner coordinates d, e, f and g calculated in step S602 and step S603;
Step S605, determining the gesture between the feature graphs according to the angle and direction relation obtained in the step S604, and storing the gesture information as an R_hat one-dimensional matrix.
8. The projector according to claim 7, characterized in that said step S7 comprises the following sub-steps:
step S701, converting the posture matrix R_hat into quaternion form through the cv::Rodrigues function;
step S702, normalizing the quaternion to obtain a normalized vector V = (Vx, Vy, Vz);
step S703, calculating the module length |V| of the vector V by the formula |V| = sqrt(Vx² + Vy² + Vz²);
step S704, calculating the included angle angle_x between the x-axis component and the module length of the vector V by the formula angle_x = arccos(Vx/|V|);
step S705, calculating the included angle angle_y between the y-axis component and the module length of the vector V by the formula angle_y = arccos(Vy/|V|);
step S706, calculating the included angle angle_z between the z-axis component and the module length of the vector V by the formula angle_z = arccos(Vz/|V|);
step S707, angle_z is used as the left-right angle of the projector, and the included angles angle_x, angle_y and angle_z are set as the omnidirectional trapezoidal parameters of the projector image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310928829.7A CN116886881B (en) | 2023-07-26 | 2023-07-26 | Projector based on omnidirectional trapezoidal technology |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116886881A CN116886881A (en) | 2023-10-13 |
CN116886881B true CN116886881B (en) | 2024-09-24 |
Family
ID=88269726
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310928829.7A Active CN116886881B (en) | 2023-07-26 | 2023-07-26 | Projector based on omnidirectional trapezoidal technology |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116886881B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116886882B (en) * | 2023-07-26 | 2024-07-19 | 深圳市极鑫科技有限公司 | Projection control method and system based on omnidirectional trapezoidal technology |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114401390A (en) * | 2021-11-16 | 2022-04-26 | 海信视像科技股份有限公司 | Projection equipment and projection image correction method based on optical machine camera calibration |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6221287B2 (en) * | 2013-03-22 | 2017-11-01 | セイコーエプソン株式会社 | Projector and projector control method |
JP2015007866A (en) * | 2013-06-25 | 2015-01-15 | ローランドディー.ジー.株式会社 | Projection image correction system, projection image correction method, projection image correction program, and computer-readable recording medium |
WO2019056219A1 (en) * | 2017-09-20 | 2019-03-28 | 神画科技(深圳)有限公司 | Method for horizontal keystone correction of projector |
CN110677634B (en) * | 2019-11-27 | 2021-06-29 | 成都极米科技股份有限公司 | Trapezoidal correction method, device and system for projector and readable storage medium |
CN115086622B (en) * | 2021-03-12 | 2024-05-24 | 中强光电股份有限公司 | Projector and correction method thereof |
CN114071103A (en) * | 2021-11-15 | 2022-02-18 | 四川长虹电器股份有限公司 | Adaptive left-right trapezoidal correction method for projector |
CN116405645A (en) * | 2023-02-09 | 2023-07-07 | 江西星驰电子科技有限公司 | Projector graph correction focusing method and device and readable medium |
- 2023-07-26: CN CN202310928829.7A patent granted as CN116886881B/en (status: Active)
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114401390A (en) * | 2021-11-16 | 2022-04-26 | 海信视像科技股份有限公司 | Projection equipment and projection image correction method based on optical machine camera calibration |
Also Published As
Publication number | Publication date |
---|---|
CN116886881A (en) | 2023-10-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108765498B (en) | Monocular vision tracking method, device and storage medium | |
US10645284B2 (en) | Image processing device, image processing method, and recording medium storing program | |
US11748906B2 (en) | Gaze point calculation method, apparatus and device | |
JP6859442B2 (en) | Calibration equipment, calibration system, and calibration method | |
KR20160116075A (en) | Image processing apparatus having a function for automatically correcting image acquired from the camera and method therefor | |
KR20140122126A (en) | Device and method for implementing augmented reality using transparent display | |
CN207854012U (en) | Depth camera based on structure light | |
CN114299156A (en) | Method for calibrating and unifying coordinates of multiple cameras in non-overlapping area | |
CN110490943B (en) | Rapid and accurate calibration method and system of 4D holographic capture system and storage medium | |
JP4974765B2 (en) | Image processing method and apparatus | |
CN111351468A (en) | Electronic device and image ranging method thereof | |
CN116886881B (en) | Projector based on omnidirectional trapezoidal technology | |
CN107560637A (en) | Wear display device calibration result verification method and wear display device | |
CN118411420A (en) | Calibration method, device, equipment and medium for camera on augmented reality equipment | |
CN108205236B (en) | Panoramic camera and lens thereof | |
CN115375773A (en) | External parameter calibration method and related device for monocular laser speckle projection system | |
CN114926538A (en) | External parameter calibration method and device for monocular laser speckle projection system | |
Sahin | Comparison and calibration of mobile phone fisheye lens and regular fisheye lens via equidistant model | |
WO2021134715A1 (en) | Control method and device, unmanned aerial vehicle and storage medium | |
CN107527323B (en) | Calibration method and device for lens distortion | |
CN116886882B (en) | Projection control method and system based on omnidirectional trapezoidal technology | |
CN111649716A (en) | Space point-to-point distance measuring and calculating method based on panoramic image | |
CN117523003A (en) | Camera calibration method and device in multi-camera system and electronic equipment | |
CN111260729B (en) | Method and device for calibrating fisheye lens in vehicle-mounted all-round system | |
KR101882977B1 (en) | Lens Module for Forming 360 Degree Image and Application for Forming 360 Degree Image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |