
CN118897401B - An optical system and electronic device - Google Patents

An optical system and electronic device

Info

Publication number
CN118897401B
CN118897401B (application CN202310498410.2A)
Authority
CN
China
Prior art keywords
distortion
camera
image
optical
optical system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310498410.2A
Other languages
Chinese (zh)
Other versions
CN118897401A (en)
Inventor
李先明
石佳
朱帅帅
王久兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202310498410.2A priority Critical patent/CN118897401B/en
Publication of CN118897401A publication Critical patent/CN118897401A/en
Application granted granted Critical
Publication of CN118897401B publication Critical patent/CN118897401B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G02B27/0101: Head-up displays characterised by optical features
    • G03B17/17: Bodies with reflectors arranged in beam forming the photographic image, e.g. for reducing dimensions of camera
    • G02B2027/011: Head-up displays comprising a device for correcting geometrical aberrations, distortion
    • G02B2027/014: Head-up displays comprising information/image processing systems

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)

Abstract


An optical system and electronic device are disclosed to simplify the complexity of software algorithm pathways in the imaging process of electronic devices. The optical system includes a camera, a display screen, and an imaging lens. The camera has a first optical distortion, and the imaging lens has a second optical distortion. The first and second optical distortions have opposite signs, and the absolute value of the sum of the first and second optical distortions is less than a distortion threshold. By configuring the distortions of the camera and imaging lens to have opposite signs and similar magnitudes, the distortions produced by light passing through the camera and imaging lens can cancel each other out. This eliminates the need for anti-distortion processing algorithms, thereby simplifying the complexity of the software algorithm pathways.

Description

Optical system and electronic device
Technical Field
The present application relates to the field of optical technologies, and in particular, to an optical system and an electronic device.
Background
Virtual reality (VR), augmented reality (AR), mixed reality (MR), and extended reality (XR) are common technologies for presenting a virtual world to a user, and they have developed rapidly in recent years; AR, MR, and XR all build on VR. At present, VR and related technologies are often combined with video see-through (VST) technology. For example, in MR scenes, a VR device can capture real-scene information in real time through a camera, so that a user wearing the VR device retains the ability to perceive the surrounding real world and can interact with it through the screen, improving the user's virtual reality experience.
However, after acquiring a live-action image captured by a camera, an existing VR device generally needs to send the image to an upper-layer algorithm module for a series of processes, including anti-distortion and re-projection, before presenting the processed image to the user. The software algorithm path involved in this approach is long, resulting in relatively high motion-to-photon (MTP) latency (i.e., the time from when the user turns to when the user sees the turned picture), power consumption, and load, which is detrimental to improving the user's VST experience.
Thus, further research into the imaging process of electronic devices, such as VR devices, is still needed.
Disclosure of Invention
The application provides an optical system and an electronic device, which are used to simplify the software algorithm path of the electronic device in the imaging process, reduce MTP latency, load, and power consumption, and improve the user's VST experience.
In a first aspect, the present application provides an optical system comprising a camera, a display screen, and an imaging lens, the camera having a first optical distortion, the imaging lens having a second optical distortion, the first optical distortion being opposite in sign to the second optical distortion, and the absolute value of the sum of the first optical distortion and the second optical distortion being less than a distortion threshold. The camera is configured to capture a target object to obtain a first image, the display screen is configured to display the first image, and the imaging lens is configured to magnify the first image displayed on the display screen.
In this scheme, by configuring the distortions of the camera and the imaging lens to have opposite signs and similar magnitudes, the distortions produced by light passing through the camera and the imaging lens cancel each other. An anti-distortion processing algorithm therefore no longer needs to be introduced, which reduces the complexity of the software algorithm path; the path can even be removed entirely, with the live-action image captured by the camera transmitted directly to the display screen for display. This effectively reduces MTP latency, load, and power consumption, and improves the user's VST experience.
In one possible design, the first optical distortion is negative and the second optical distortion is positive. For example, when designing the optical system, the lens of the camera can be configured as a concave lens and the lens of the imaging lens as a convex lens, with the two lenses having similar distortion magnitudes at the same field of view. In this way, the positive distortion inherent in the imaging lens's convex lens can be exploited directly by pairing it with an opposing concave camera lens, so that as few additional lens elements as possible are introduced, reducing the cost and structural complexity of the optical system.
In one possible design, the distortion threshold may be a value less than 15%. For example, when the distortion threshold is set to 10%, the sum of the first optical distortion and the second optical distortion falls in the range [-10%, 10%], balancing cost and imaging quality.
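The sign-and-threshold condition above can be sketched as a simple check (a hedged illustration, not part of the patent; the 10% bound is the example value from this design, and the function name is ours):

```python
def distortions_cancel(d_camera, d_lens, threshold=0.10):
    """Check the cancellation condition described above: the two relative
    distortions have opposite signs and their sum's absolute value stays
    below the distortion threshold. Values are fractions, e.g. -0.08 for
    -8% (barrel) distortion. Illustrative helper, not from the patent."""
    opposite_signs = d_camera * d_lens < 0
    residual_ok = abs(d_camera + d_lens) < threshold
    return opposite_signs and residual_ok

# A -8% (barrel) camera paired with a +6% (pincushion) imaging lens leaves
# a -2% residual, well within the example +/-10% threshold:
print(distortions_cancel(-0.08, 0.06))  # True
```

With same-sign distortions, or a residual beyond the threshold, the check fails, reflecting the claim's two conditions.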
In one possible design, the optical system may further include a reflection assembly for reflecting the emitted light from the camera toward the target object and reflecting the light returned by the target object back to the camera, where the optical path length of the camera's emitted light to the target object after reflection by the reflection assembly is equal to the line-of-sight distance from the user's eye to the target object.
In this design, adding the reflection assembly to the optical system makes the camera-to-object distance after reflection equal to the eye-to-object distance, eliminating the axial offset between the camera and the human eye in conventional optical systems. The viewing-angle range of the object captured by the camera thus matches the range seen by the human eye, which ensures the accuracy of the depth information in the captured image. Moreover, because the depth information of the image captured directly by the camera is accurate, no re-projection algorithm needs to be introduced before the image is transmitted to the display screen, further reducing the complexity of the software algorithm path.
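As a hypothetical sketch of the design condition just described, the folded optical path from camera to object via the reflection assembly should equal the eye's direct line-of-sight distance (all names and values here are illustrative, not from the patent):

```python
def folded_path_matches_eye(camera_to_reflector, reflector_to_object,
                            eye_to_object, tol=1e-6):
    """True if the camera's folded path length via the reflection assembly
    equals the eye-to-object distance (within tol), the condition under
    which captured depth matches perceived depth and re-projection can be
    skipped. Distances in meters; names are illustrative."""
    folded = camera_to_reflector + reflector_to_object
    return abs(folded - eye_to_object) <= tol

# A camera 3 cm from the reflector, the reflector 97 cm from the object,
# and the eye 1 m from the object: the folded path matches the line of sight.
print(folded_path_matches_eye(0.03, 0.97, 1.00))  # True
```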
In one possible design, the reflection assembly may include one or more reflective elements. For example, where system cost and structural complexity are to be kept low, the reflection assembly may be configured with only one reflective element. Where the overall size of the electronic device is to be reduced, the reflection assembly may be configured with at least two reflective elements, so that reflections between them shrink the optical system in one or more directions and enable a miniaturized design.
In one possible design, the reflective element may be an optical element having at least one reflective surface, such as a mirror or a reflective prism. Using a relatively common, low-cost mirror or prism as the reflective element reduces the cost and complexity of the optical system.
In a second aspect, the present application provides an optical system including a camera, a reflection assembly, a display screen, and an imaging lens. The camera is configured to emit first light, receive second light reflected back by the reflection assembly, and process the second light to obtain a first image. The reflection assembly is configured to reflect the first light from the camera toward a target object and to reflect the second light from the target object back to the camera, where the optical path length of the first light from the camera to the target object after reflection by the reflection assembly is equal to the line-of-sight distance from the user's eye to the target object. The display screen is configured to display the first image, and the imaging lens is configured to magnify the first image displayed on the display screen to obtain a second image.
In this scheme, adding the reflection assembly to the optical system makes the camera-to-object distance after reflection equal to the eye-to-object distance, eliminating the axial offset between the camera and the human eye in conventional optical systems. The viewing-angle range of the object captured by the camera thus matches the range seen by the human eye, ensuring the accuracy of the depth information in the captured image. Moreover, because the depth information of the image captured directly by the camera is accurate, no re-projection algorithm needs to be introduced before the image is transmitted to the display screen. This reduces the complexity of the software algorithm path, which can even be removed entirely so that the live-action image captured by the camera is transmitted directly to the display screen for display, effectively reducing MTP latency, load, and power consumption and improving the user's VST experience.
In one possible design, the camera may have a first optical distortion, the imaging lens may have a second optical distortion, the first optical distortion and the second optical distortion are opposite in sign, and an absolute value of a sum of the first optical distortion and the second optical distortion is less than a distortion threshold.
In one possible design, the first optical distortion may be negative and the second optical distortion may be positive.
In one possible design, the distortion threshold may be less than 15%.
In one possible design, the reflective assembly may include one or more reflective elements.
In one possible design, the reflective element may be a mirror or a prism.
In a third aspect, the present application provides an electronic device, including the optical system according to any one of the first or second aspects.
In one possible design, the electronic device may be a near-eye display device, such as VR glasses or a VR helmet, or a terminal device with a display, such as a mobile phone, a monitor, a television, or a HUD; this is not particularly limited.
For the beneficial effects of the second and third aspects, refer to the technical effects achievable by the corresponding designs in the first aspect; the details are not repeated here.
Drawings
FIG. 1 schematically illustrates field-of-view distortion curves for different wavelengths according to an embodiment of the present application;
FIG. 2 schematically illustrates image distortion patterns for negative and positive distortion according to an embodiment of the present application;
FIG. 3 schematically illustrates a structural diagram of an electronic device according to an embodiment of the present application;
FIG. 4 schematically illustrates a system architecture diagram of an electronic device according to an embodiment of the present application;
FIG. 5 schematically illustrates an imaging scenario of an optical system provided in the industry;
FIG. 6 schematically illustrates an imaging scenario of an optical system according to an embodiment of the present application;
FIG. 7 schematically illustrates the structure of another optical system according to an embodiment of the present application;
FIG. 8 schematically illustrates the structure of still another optical system according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present application more apparent, the embodiments of the present application are described in further detail below with reference to the accompanying drawings. However, the example embodiments may be implemented in many different forms and should not be construed as limited to the embodiments set forth herein. The same reference numerals in the drawings denote the same or similar structures, and repeated descriptions are omitted. Terms expressing position and direction in the embodiments are described with reference to the drawings as examples; they may be changed as required, and all such changes fall within the protection scope of the application. The drawings of the embodiments are merely for illustrating relative positional relationships and are not drawn to scale.
It is noted that in the following description, specific details are set forth to provide an understanding of the application. The embodiments may be practiced in many ways other than those described herein, and those skilled in the art will readily appreciate that many modifications are possible without materially departing from the spirit of the application. Therefore, the application is not limited to the specific embodiments disclosed below.
In the following, some terms in the embodiments of the present application are explained for easy understanding by those skilled in the art.
1. Near-eye display (NED).
An NED, also known as a head-mounted or wearable display, is a display mode of augmented reality (AR), virtual reality (VR), mixed reality (MR), or extended reality (XR) display devices. An NED displays near the eyes, creating virtual images in the monocular or binocular field of view: it renders light-field information to the human eye through a display device placed closer than the eye's normal distance of distinct vision (typically less than 25 cm), reconstructing a virtual scene in front of the eye.
2. Field of view (FOV).
In an optical instrument, the FOV is the angle, with the lens of the instrument as the vertex, subtended by the maximum extent over which the image of the measured object can pass through the lens. In brief, the FOV can also be understood as the angle between the edges of the display screen and the eye; the larger the display screen, the larger the FOV.
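As an illustrative aside (this formula is not stated in the patent), the diagonal FOV of a simple camera is commonly approximated from the sensor diagonal d and focal length f as FOV = 2·atan(d / (2f)):

```python
import math

def diagonal_fov_deg(sensor_diag_mm, focal_mm):
    """Thin-lens approximation of the diagonal field of view:
    FOV = 2 * atan(d / (2 * f)). Illustrative only; real NED optics are
    folded systems whose FOV comes from the full lens prescription."""
    return math.degrees(2.0 * math.atan(sensor_diag_mm / (2.0 * focal_mm)))

# A full-frame sensor (43.27 mm diagonal) behind a 50 mm lens:
print(round(diagonal_fov_deg(43.27, 50.0), 1))  # ~46.8 degrees
```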
3. Video see-through (VST).
VST is a see-through technology applicable to AR, VR, MR, or XR scenarios. It captures a real-time view of the surrounding environment through a camera and displays it on a display screen, so that the user's eyes experience seeing the surrounding real world directly through the display device. In this way, the user can interact with the surrounding real world while being presented with virtual imagery, and VST has gradually become a mainstream research direction in the NED field in recent years.
4. Dispersion.
Dispersion is the phenomenon in which polychromatic light is decomposed into monochromatic components to form a spectrum. For example, polychromatic light may be decomposed into red (R), green (G), and blue (B) light, which have different wavelengths.
5. Distortion.
Distortion is a monochromatic aberration that characterizes the change in magnification across the field of view at a fixed working distance. Distortion is determined by the optical design of the lens and typically occurs alongside chromatic dispersion. For example, FIG. 1 shows field-of-view distortion curves for different wavelengths, where curve L_R is the distortion curve for red (R) light, curve L_G for green (G) light, and curve L_B for blue (B) light. As can be seen from FIG. 1, the distortion curves of the R, G, and B light do not coincide, and for any monochromatic light the distortion increases with the field of view; that is, a larger field of view is more prone to distortion than a smaller one.
Distortion is generally classified into negative and positive distortion. FIG. 2 shows the image distortion patterns of negative and positive distortion. In FIG. 2 (a), the negative-distortion pattern pulls points in the field of view closer to the center, so the image resembles a barrel; this is also called barrel distortion. In contrast, FIG. 2 (b) shows the positive-distortion pattern, in which points in the field of view are pushed farther from the center, shaped like a pincushion; this is also called pincushion distortion.
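A minimal numerical sketch (not the patent's lens model) of how barrel and pincushion distortion act on image points uses a single-coefficient radial model, where k1 is a hypothetical distortion coefficient:

```python
def radial_distort(x, y, k1):
    """Single-coefficient radial distortion model (illustrative, not the
    patent's prescription): k1 < 0 pulls points toward the center (barrel,
    negative distortion), k1 > 0 pushes them outward (pincushion, positive
    distortion). (x, y) are normalized ideal image coordinates."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale

# A point at the edge of the field moves inward under barrel distortion
# and outward under pincushion distortion:
print(radial_distort(1.0, 0.0, -0.1))  # barrel: x shrinks toward 0.9
print(radial_distort(1.0, 0.0, 0.1))   # pincushion: x grows toward 1.1
```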
The degree of distortion can be measured by the relative distortion calculated by the following equation (1):

Dist = (y' - y0') / y0' × 100%    (1)

where Dist is the relative distortion, y' is the actual image height, defined as the height of the actual ray on the image plane, and y0' is the ideal image height, defined as the height of the reference ray on the image plane after scaling by the field of view.
In equation (1), the ideal image height y0' is determined by the optical design of the lens; once the lens is manufactured, the ideal image height is fixed. The actual image height y' depends on the FOV of the lens and on optical design and manufacturing deviations: the smaller the FOV, the smaller the actual image height and the milder the relative distortion; conversely, the larger the FOV, the larger the actual image height and the more severe the relative distortion.
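The relationship in equation (1) can be sketched directly (the heights below are hypothetical example values, not from the patent):

```python
def relative_distortion(y_actual, y_ideal):
    """Relative distortion per equation (1): Dist = (y' - y0') / y0' * 100%,
    where y' is the actual image height and y0' the ideal image height."""
    return (y_actual - y_ideal) / y_ideal * 100.0

# A ray landing at 4.75 mm where the ideal height is 5.0 mm has been pulled
# toward the center: -5% (barrel-type, negative) distortion.
print(relative_distortion(4.75, 5.0))  # -5.0
```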
In the following, some technical features related to the embodiments of the present application are described.
FIG. 3 schematically illustrates a structural diagram of an electronic device according to an embodiment of the present application. The electronic device may be an NED device, such as VR glasses, a VR helmet, AR glasses, an AR helmet, MR glasses, an MR helmet, XR glasses, or an XR helmet. The user may wear the NED device to play games, read, watch movies (or television shows), participate in virtual meetings, video education, or video shopping, and so on. In some embodiments, the electronic device may also be a terminal device with a display screen, such as a mobile phone, a monitor, a television, or a head-up display (HUD) system. The embodiment shown in FIG. 3 takes VR glasses as an example.
Referring to FIG. 3, the electronic device may include a display module 100 and a fixing assembly 200. The display module 100 is used to display an image, and the fixing assembly 200 supports the display module 100 and fixes it in front of the user's eyes when the electronic device is worn. For example, when the electronic device is AR glasses or VR glasses, the fixing assembly 200 may be temples and a frame, where the frame is connected between the two temples and the display module 100 is fixed on the frame. For another example, when the electronic device is an AR helmet or a VR helmet, the fixing assembly 200 may be a helmet shell. The fixing assembly 200 may be made of metal or plastic, which is not limited in the present application. In addition, the image displayed by the display module 100 may be projected onto the display module 100 by a terminal device (for example, a mobile phone or a tablet computer) or formed by the display module 100 itself, which is also not limited.
Of course, in other embodiments, when the electronic device is a mobile phone, the fixing assembly 200 may also be a housing of the mobile phone, and the housing may include a middle frame and a rear cover, wherein the rear cover may be fixed on one side of the middle frame, and the display module 100 is fixed on the other side of the middle frame opposite to the rear cover. For example, the display module 100 may be designed with a curved surface, i.e., the display module 100 is a curved screen. At this time, the edge regions of the opposite sides of the display module 100 may be bent toward the rear cover.
Referring also to FIG. 4, FIG. 4 is a system architecture diagram of an electronic device according to an embodiment of the present application.
It should be understood that the illustrated electronic device is only one example, and that the electronic device may have more or fewer components than shown in the figures, may combine two or more components, or may have different configurations of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
As shown in FIG. 4, the electronic device may include a processor, a memory, a battery, a sensor module, a communication module, a camera, an eye-tracking module, a microphone, keys, and the like. The components of the electronic device are described in detail below with reference to FIG. 4.
The processor is typically used to control the overall operation of the electronic device. The processor may include one or more processing units; for example, the processor may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be separate devices or may be integrated in one or more processors. The controller can be the neural center and command center of the electronic device: it generates operation control signals according to instruction operation codes and timing signals to control instruction fetching and execution.
The memory is used for storing instructions and data. The memory may exist alone or may be configured in a processor. In some embodiments, the memory in the electronic device may be a cache memory. The memory may hold instructions or data that the processor has just used or recycled. If the processor needs to use the instruction or the data again, the instruction or the data can be directly called from the memory, so that repeated access is avoided, the waiting time of the processor is reduced, and the processing efficiency is improved.
In some embodiments, the processor may include one or more input/output interfaces. For example, the input/output interfaces may include an inter-integrated circuit (I2C) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, a universal serial bus (USB) interface, a serial peripheral interface (SPI), and the like.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor may contain multiple sets of I2C buses.
The UART interface is a universal serial data bus for asynchronous communication. The bus may be a bidirectional communication bus, and it converts data between serial and parallel forms. In some embodiments, a UART interface is typically used to connect the processor with the communication module; for example, the processor communicates with a Bluetooth module in the communication module through a UART interface to implement the Bluetooth function.
The MIPI interface may be used to connect a processor with peripheral devices such as a display module or camera.
The GPIO interface may be configured by software as a control signal or a data signal. In some embodiments, a GPIO interface may be used to connect the processor with the camera, the display module, the communication module, the sensor module, the microphone, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, or the like.
The USB interface conforms to the USB standard specification and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface can be used to connect a charger to charge the battery in the electronic device, to transfer data between the electronic device and peripheral devices, or to connect a headset and play audio through it. The interface may also be used to connect other electronic devices, such as mobile phones. The USB interface may be USB 3.0, compatible with high-speed DisplayPort (DP) signal transmission, and may carry high-speed video and audio data.
It should be understood that the connection relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device. In other embodiments of the present application, the electronic device may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The communication module may provide solutions for wireless communication applied to the electronic device, including wireless local area networks (WLAN) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The communication module may be one or more devices integrating at least one communication processing module. The communication module may receive electromagnetic waves via the antenna, frequency-modulate and filter the electromagnetic wave signals, and send the processed signals to the processor. It can also receive a signal to be transmitted from the processor, frequency-modulate and amplify it, and convert it into electromagnetic waves radiated through the antenna.
In some embodiments, the electronic device may communicate with the network and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
The camera is used to capture still images or video. An object passes through the lens of the camera to generate an optical image, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device may include one or more cameras.
In some embodiments, the camera collects an image including a real object, and the processor may fuse the image collected by the camera with the virtual object and display the fused image through the display module.
In some embodiments, a camera may also be disposed on the lens to capture images including the human eye, forming an eye tracking module. The processor performs eye tracking based on these images.
The optical display module is used for displaying the image shot by the camera. The optical display module comprises a display screen, and the display screen comprises a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. The display screen may be a rigid display screen, a flexible display screen, or a spliced display screen formed by combining a rigid screen with a flexible screen, which is not particularly limited.
In some embodiments, the optical display module may further include an imaging lens disposed outside the display screen. After the user wears the electronic equipment, the imaging lens is positioned between the display screen and eyes of the user and is used for amplifying the image displayed by the display screen, so that the user can watch larger and clearer images, and the watching experience of the user is improved.
The microphone, also known as a "mic" or "sound transducer", is used to convert a sound signal into an electrical signal. The user may speak close to the microphone, inputting a sound signal into the microphone. The electronic device may be provided with at least one microphone. For example, in some embodiments, the electronic device may be provided with two microphones, which can implement a noise reduction function in addition to collecting sound signals. In some other embodiments, the electronic device may be provided with three, four, or more microphones, which can identify a sound source and implement a directional recording function, etc., in addition to collecting sound signals and reducing noise.
The keys are used for a user to input instructions or information. The keys may include a power-on key, a volume key, etc. The keys may be mechanical keys or touch keys. The electronic device may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device.
The sensor module may include one or more of a pressure sensor, a gyroscope sensor, an acceleration sensor, a distance sensor, a temperature sensor, a touch sensor, a bone conduction sensor. Each sensor is described in detail below.
The pressure sensor is used for sensing a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor may be provided on the display screen. There are many kinds of pressure sensors, such as resistive, inductive, or capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates made of conductive material. When a force is applied to the pressure sensor, the capacitance between the electrodes changes, and the electronic device determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen, the electronic device detects the intensity of the touch operation through the pressure sensor. The electronic device may also calculate the location of the touch based on the detection signal of the pressure sensor. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions.
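The capacitive sensing logic described above can be sketched numerically. The parallel-plate model, plate dimensions, and classification thresholds below are illustrative assumptions, not values from the source: pressure narrows the plate gap, capacitance rises, and the relative change is mapped to a touch-strength level that could select different operation instructions.

```python
# Hypothetical sketch: parallel-plate capacitance rises as pressure narrows
# the gap between the plates; the relative change classifies touch strength.
# All numeric values (plate area, gaps, thresholds) are illustrative.

def capacitance(plate_area_mm2: float, gap_mm: float, eps: float = 8.85e-3) -> float:
    """Parallel-plate capacitance in pF: C = eps * A / d (eps in pF/mm)."""
    return eps * plate_area_mm2 / gap_mm

def touch_strength(c_rest: float, c_pressed: float) -> str:
    """Classify touch strength from the relative capacitance change."""
    change = (c_pressed - c_rest) / c_rest
    if change < 0.05:
        return "light_touch"      # e.g., preview an item
    elif change < 0.20:
        return "normal_press"     # e.g., open an item
    return "deep_press"           # e.g., show a context menu

c0 = capacitance(100.0, 0.50)     # plates at rest
c1 = capacitance(100.0, 0.40)     # gap narrowed by pressure -> larger C
print(touch_strength(c0, c1))
```

The thresholds here stand in for whatever mapping a real device firmware would calibrate; only the direction of the effect (smaller gap, larger capacitance) is physics.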
The gyroscopic sensor may be used to determine a motion pose of the electronic device. In some embodiments, the angular velocity of the electronic device about three axes (i.e., x, y, and z axes) may be determined by a gyroscopic sensor. The gyro sensor may be used for photographing anti-shake. The gyro sensor can also be used for somatosensory game scenes and the like.
The acceleration sensor may detect the magnitude of acceleration of the electronic device in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the electronic device is stationary. It can also be used to recognize the posture of the electronic device.
The distance sensor is used for measuring distance. The electronic device may measure distance by infrared or laser. In some embodiments, in a shooting scene, the electronic device may use the distance sensor to measure distance so as to achieve quick focusing.
The temperature sensor is used for detecting temperature. In some embodiments, the electronic device executes a temperature processing strategy using the temperature detected by the temperature sensor. For example, when the temperature reported by the temperature sensor exceeds a threshold, the electronic device reduces the performance of a processor located near the temperature sensor, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device heats the battery to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature is below a further threshold, the electronic device boosts the output voltage of the battery to avoid an abnormal shutdown caused by low temperature.
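The layered temperature-processing strategy above can be sketched as a simple decision function. All threshold values below are hypothetical, chosen only to illustrate the three tiers (throttle when hot, heat the battery when cold, boost the battery voltage when even colder):

```python
# Illustrative sketch of the temperature strategy; thresholds are assumptions,
# not values from the source. Note the coldest tier is checked first so that
# it is not shadowed by the milder "heat the battery" tier.

def thermal_action(temp_c: float) -> str:
    """Map a temperature reading to a protective action."""
    HIGH = 45.0        # above this: throttle a nearby processor
    LOW_HEAT = 0.0     # below this: heat the battery
    LOW_BOOST = -10.0  # below this: also boost the battery output voltage
    if temp_c > HIGH:
        return "reduce_processor_performance"
    if temp_c < LOW_BOOST:
        return "boost_battery_voltage"
    if temp_c < LOW_HEAT:
        return "heat_battery"
    return "normal"

for t in (50.0, 25.0, -5.0, -15.0):
    print(t, thermal_action(t))
```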
The touch sensor is also known as a "touch panel". The touch sensor may be disposed on the display screen, and the touch sensor and the display screen form a touch screen, also called a "touchscreen". The touch sensor is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen. In other embodiments, the touch sensor may also be disposed on a surface of the electronic device, at a location different from that of the display screen.
The bone conduction sensor may acquire a vibration signal. In some embodiments, the bone conduction sensor may also be provided in a headset to form a bone conduction headset. The audio module may parse out a voice signal based on the vibration signal, obtained by the bone conduction sensor, of the vibrating bone mass of the vocal part, so as to implement a voice function. The application processor may parse out heart rate information based on the blood pressure pulse signal acquired by the bone conduction sensor, so as to implement a heart rate detection function.
Although not shown in fig. 4, the electronic device may further include other devices, such as a speaker, a bluetooth device, a positioning device, a flash, a micro-projection device, a Near Field Communication (NFC) device, etc., which are not described herein.
In an electronic device, an optical display module and a camera together constitute an optical system of the electronic device. In an optical system, a video camera is also called a camera. Referring to fig. 5, a schematic diagram of an imaging scheme of an optical system provided in the industry is shown. Wherein:
Fig. 5 (a) shows the layout positional relationship of the components in the optical system. Referring to fig. 5 (a), the optical system includes a camera 501, a display screen 502, and an imaging lens 503, and the optical axes of the camera 501, the display screen 502, and the imaging lens 503 coincide. The camera 501 is disposed directly in front of the human eye, the display screen 502 and the imaging lens 503 are disposed between the camera 501 and the human eye, and the imaging lens 503 is closer to the human eye than the display screen 502;
Fig. 5 (B) shows an imaging process of the optical system. Referring to fig. 5 (B), the process includes: the camera 501 shoots a target object to obtain a live-action image and sends the live-action image to an upper-layer processor 510; the upper-layer processor 510 processes the live-action image using a preset image processing algorithm and sends the processed image to the display screen 502 for display; the image displayed on the display screen 502 is then amplified by the imaging lens 503, and the amplified image is presented to the user.
In the imaging procedure described above, the image processing algorithm is preconfigured in the upper-layer processor 510 and, in a VST scene, generally includes an anti-distortion processing algorithm and a re-projection algorithm. It may also include a fusion algorithm, for example to fuse the live-action image and a virtual image to obtain a mixed virtual-reality image. It may also include preprocessing algorithms, such as blurring the live-action image, or other algorithms. The following mainly describes the anti-distortion processing algorithm and the re-projection algorithm.
Anti-distortion processing algorithm
In optical system designs, the display screen 502 is typically not very large due to product size and cost constraints. Accordingly, in order to provide a larger FOV to enhance the user's immersion, an imaging lens 503 may be placed in front of the small display screen 502 to achieve the effect of a large FOV by magnifying the image displayed by the display screen 502. However, this also causes the image to be distorted after passing through the lens of the imaging lens 503, because the imaging lens 503 magnifies the image with a convex lens, and the convex lens itself introduces pincushion distortion. Therefore, the live-action image captured by the camera 501 also needs to be subjected to anti-distortion processing before being transferred to the display screen 502.
The specific flow of the anti-distortion processing can be seen in fig. 5 (B). After the live-action image captured by the camera 501 is transferred to the processor 510, the processor 510 performs two processing operations: in the primary processing, the processor 510 performs distortion compensation on the live-action image, removing the distortion introduced by the camera 501 to obtain an undistorted image; in the secondary processing, the processor 510 adds, to the undistorted image, a barrel distortion opposite to the pincushion distortion of the imaging lens 503. In this way, the image that has undergone the two processing operations of the processor 510 is displayed on the display screen 502 as an image with barrel distortion, and the barrel distortion of the image "counteracts" the pincushion distortion generated by the imaging lens 503, so that the human eye sees a normal image from the electronic device.
In addition, distortion is a monochromatic optical aberration, that is, each monochromatic light corresponds to its own distortion. Therefore, when the inverse distortion is added to the live-action image, it needs to be added separately for each monochromatic light (such as R light, G light, and B light), so that the distortion and dispersion added to the live-action image "cancel" the distortion and dispersion of the imaging lens 503 itself, solving the distortion and dispersion problems at the same time. Since distortion and dispersion occur simultaneously, in some scenarios the anti-distortion processing algorithm is also referred to as an anti-distortion-dispersion processing algorithm.
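The two processing operations and the per-channel handling described above can be sketched numerically. This is not the patent's algorithm: the single-coefficient radial model r' = r·(1 + k·r²) and all coefficient values are illustrative assumptions, with a negative k standing in for the camera's distortion and a positive k for the lens's pincushion distortion, slightly different per colour channel to mimic dispersion.

```python
# Minimal sketch of the two-step anti-distortion flow on one sample image
# radius: (1) undo the camera's distortion, (2) pre-apply the inverse of the
# imaging lens's pincushion distortion, separately per R/G/B channel.
# Radial model and coefficients are illustrative, not from the source.

def apply_radial(r: float, k: float) -> float:
    """Map an ideal radius to a distorted radius: r' = r * (1 + k * r**2)."""
    return r * (1.0 + k * r * r)

def undo_radial(r_dist: float, k: float, iters: int = 20) -> float:
    """Invert apply_radial by fixed-point iteration r <- r' / (1 + k * r**2)."""
    r = r_dist
    for _ in range(iters):
        r = r_dist / (1.0 + k * r * r)
    return r

# Hypothetical per-channel coefficients: camera (barrel, k<0), lens (pincushion, k>0).
K_CAM = {"R": -0.10, "G": -0.11, "B": -0.12}
K_LENS = {"R": 0.10, "G": 0.11, "B": 0.12}

def predistort(r_from_camera: float, ch: str) -> float:
    ideal = undo_radial(r_from_camera, K_CAM[ch])   # primary processing
    return undo_radial(ideal, K_LENS[ch])           # secondary processing

for ch in ("R", "G", "B"):
    captured = apply_radial(0.8, K_CAM[ch])         # what the camera records
    on_screen = predistort(captured, ch)            # what the display shows
    seen = apply_radial(on_screen, K_LENS[ch])      # after the imaging lens
    print(ch, round(seen, 6))                       # recovers the ideal radius
```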
Reprojection algorithm
In the optical system design, the camera 501 is disposed directly in front of the human eye, so there is an axial distance between the camera 501 and the human eye, such as Δh illustrated in fig. 5 (a). Due to the axial distance Δh, the viewing depth of the camera 501 (i.e., h1 illustrated in fig. 5 (a)) is smaller than the viewing depth of the human eye (i.e., h2 illustrated in fig. 5 (a)), so that the FOV of the image captured by the camera 501 (i.e., V1 illustrated in fig. 5 (a)) is smaller than the FOV of the human eye (i.e., V2 illustrated in fig. 5 (a)), and the object captured by the camera 501 is presented larger in the live-action image than the same object appears to the human eye. Consequently, when the user turns his or her head, this depth difference causes the live-action image captured by the camera 501 to appear to turn faster than the user, giving the user a poor experience such as dizziness and a wrong perception of depth. Therefore, the live-action image captured by the camera 501 needs to be re-projected before being transferred to the display screen 502. Specifically, referring to fig. 5 (B), the live-action image captured by the camera 501 is first transferred to the processor 510; under the action of the re-projection algorithm of the processor 510, the outside physical world is projected to the position of the human eye with the help of other cameras or sensors, so as to perform depth reconstruction of the physical world and obtain a depth map; the depth information of the live-action image is then adjusted in combination with the depth map, displayed on the display screen 502, and finally amplified by the imaging lens 503 and presented to the user.
In this way, by compensating for the depth difference between the live-action image captured by the camera 501 and the real scene seen by the user's eye, the human eye is helped to see, from the electronic device, a normal image consistent in depth with the picture the human eye would see by itself.
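The effect of the axial distance Δh can be illustrated with simple pinhole geometry (an assumption for illustration; the numeric values below are not from the source): a target of fixed width subtends a larger angle at the closer camera than at the eye, which is why the target is rendered larger in the live-action image and sweeps faster across it when the head turns.

```python
# Illustrative geometry of the axial offset between camera and eye.
# All distances are hypothetical example values.
import math

def angular_size_deg(width_m: float, distance_m: float) -> float:
    """Full angle (degrees) subtended by an object of the given width."""
    return math.degrees(2.0 * math.atan(width_m / (2.0 * distance_m)))

W = 0.5          # target width (m)
H2 = 2.0         # eye-to-target distance (m)
DELTA_H = 0.05   # camera sits 5 cm in front of the eye
H1 = H2 - DELTA_H

cam = angular_size_deg(W, H1)
eye = angular_size_deg(W, H2)
print(f"camera sees {cam:.2f} deg, eye sees {eye:.2f} deg")  # camera > eye
```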
With the above optical system design scheme, the live-action image shot by the camera is first transmitted to the processor for anti-distortion processing and re-projection processing, so that the distortion problem and the depth difference problem existing in the optical system design can be compensated and the user can see a normal image. However, the software algorithm path involved in this scheme is long, and the live-action image shot by the camera has to go through a lengthy algorithm processing flow before being displayed to the user. As a result, the delay from the moment the user starts turning his or her head to the moment the user sees the changed picture, known as the motion-to-photon (MTP) latency, as well as the power consumption and load, are relatively high, which is unfavorable for improving the user's VST experience.
In addition, the currently common reconstruction algorithms need to reconstruct the depth of the real physical world, but the reconstruction accuracy at the present stage is limited, and to improve reconstruction efficiency the whole scene is often simply treated as a single depth or represented by a small number of depths. In reality, however, the photographed object is three-dimensional and may have a unique depth at each position. A depth reconstruction method that uses a single depth or a small number of depths may therefore make the reconstructed depth differ considerably from the actual depth, so that the re-projected image exhibits a certain local distortion, which is further unfavorable for improving the user's VST experience.
In view of this, an embodiment of the present application provides an optical system. By designing the devices included in the optical system, the image captured by the camera can be sent directly to the display screen for display, without prior processing by an anti-distortion algorithm and/or a re-projection algorithm, so as to simplify the complexity of the software algorithm path, reduce the delay, load, and power consumption of the MTP, and improve the user's VST experience.
In view of the above technical problems, the following describes a technical solution provided by an embodiment of the present application with reference to the accompanying drawings.
In the following description of the present application, "plurality" may be understood as "at least two". "And/or" describes an association relationship of associated objects, indicating that three relationships may exist. For example, A and/or B may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. "At least one of the following items" or similar expressions refer to any combination of these items, including any combination of a single item or plural items. For example, at least one of a, b, or c may represent a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be single or plural.
And, unless specifically stated otherwise, references to "first," "second," etc. ordinal numbers in the embodiments of the present application are used for descriptive purposes only and are not to be construed as indicating or implying any particular importance or order. For example, the "first image" indicated below is an image obtained by capturing a real scene with a camera, and the "second image" is an image enlarged by an imaging lens, and does not represent that the two images have a difference in order of precedence, priority, or importance.
Embodiment 1
Fig. 6 is a schematic diagram of an imaging scheme of an optical system according to an embodiment of the application. The layout positional relationship of the respective components in the optical system may be the same as that of the optical system illustrated in fig. 5 (a) described above, except that in the optical system, the camera 501 has a first optical distortion, the imaging lens 503 has a second optical distortion, the signs of the first optical distortion and the second optical distortion are opposite, and the absolute value of the sum of the first optical distortion and the second optical distortion is smaller than the distortion threshold. In this way, the camera 501 shoots a target object to obtain a first image with first optical distortion, the first image can be directly transmitted to the display screen 502 for display, then the first image displayed on the display screen 502 is amplified by the imaging lens 503, and the first optical distortion in the first image is compensated under the action of the second optical distortion of the imaging lens 503 to obtain a second image.
Wherein when the sign of the first optical distortion is opposite to the sign of the second optical distortion and the sum value is 0, the second image has no optical distortion, i.e. the user sees a normal undistorted image. Conversely, when the sign of the first optical distortion is opposite to the sign of the second optical distortion but the sum is not 0, the second image has a third optical distortion, and the absolute value of the third optical distortion is smaller than the absolute value of the second optical distortion. That is, although the user sees an image with distortion, the degree of distortion of the image is smaller than that which would be introduced by the imaging lens 503 itself.
The optical distortion can be characterized, for example, by the relative distortion Dist (see equation (1) above); the absolute value of the optical distortion is the absolute value of the relative distortion, i.e., |Dist|.
The distortion threshold may be configured empirically by a person skilled in the art, or may be a threshold obtained through experimental verification that enables the presented effect of the second image to meet requirements. For example, when the optical distortion is characterized by the relative distortion Dist, the distortion threshold may be configured to a value less than 15%. For instance, experimental verification shows that when the distortion threshold is configured to be 10%, i.e., the difference between the absolute value of the first optical distortion and the absolute value of the second optical distortion falls within the range [-10%, 10%], both cost and imaging effect can be taken into account.
For example, taking the optical distortion characterized by the relative distortion Dist and the distortion threshold as 10%, the correlation between the first optical distortion and the second optical distortion can be seen as shown in the following table 1:
TABLE 1

First optical distortion Dist 1      Second optical distortion Dist 2
[-a-10%, -a+10%]                     a
[a-10%, a+10%]                       -a
Where a is a non-negative real number.
As shown in table 1, when designing the optical system:
If the second optical distortion Dist 2 introduced by the imaging lens 503 is a positive distortion a, the camera 501 may be configured with a negative distortion whose value remains within ±10% of -a. For example, if the imaging lens 503 introduces a second optical distortion Dist 2 of 30%, the first optical distortion Dist 1 introduced by the camera 501 may be configured as any negative distortion within the range [-40%, -20%], such as -25%, so that after the camera 501 and the imaging lens 503 compensate each other, the second image finally presented to the user has only a 5% relative distortion;
conversely, if the second optical distortion Dist 2 introduced by the imaging lens 503 is a negative distortion -a, the camera 501 may be configured with a positive distortion whose value remains within ±10% of a. For example, if the imaging lens 503 introduces a second optical distortion Dist 2 of -30%, the first optical distortion Dist 1 introduced by the camera 501 may be configured as any positive distortion within the range [20%, 40%], such as 30%, so that after the camera 501 and the imaging lens 503 compensate each other, the second image finally presented to the user has no relative distortion.
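The design rule in Table 1 and the worked examples above reduce to a simple check: the residual distortion presented to the user is the sum Dist 1 + Dist 2, and the design is acceptable when its absolute value stays below the distortion threshold (10% here, as in the text). A minimal sketch:

```python
# Residual-distortion check for the camera/lens pairing described above.
# Distortions are expressed as fractions (0.30 = 30%).

THRESHOLD = 0.10

def residual(dist1: float, dist2: float) -> float:
    """Relative distortion left over after camera and lens compensate each other."""
    return dist1 + dist2

def design_ok(dist1: float, dist2: float, threshold: float = THRESHOLD) -> bool:
    return abs(residual(dist1, dist2)) < threshold

print(residual(-0.25, 0.30))   # camera -25%, lens +30%: 5% residual
print(design_ok(-0.25, 0.30))  # within the threshold
print(residual(0.30, -0.30))   # camera +30%, lens -30%: no residual distortion
print(design_ok(-0.45, 0.30))  # outside the [-40%, -20%] window: rejected
```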
It should be noted that, the distortion values referred to herein are all described in an ideal state, and some process errors may exist in actual operation, but it should be understood that, all the schemes in which the distortion values are within a certain error range with the distortion values given herein are within the protection scope of the embodiments of the present application, which are not limited in particular.
Illustratively, considering that the imaging lens 503 implements the image magnifying function using a convex lens, which has pincushion distortion (i.e., positive distortion), when designing the optical system, the first optical distortion may be configured as barrel distortion (i.e., negative distortion) and the second optical distortion as pincushion distortion. For example, the lens of the camera 501 may be configured as a concave lens, the lens of the imaging lens 503 may be configured as a convex lens, and the relative distortion magnitudes of the two lenses under the same field of view may be made similar (the difference in absolute value controlled within 10%). In this way, the first image captured by the camera 501 has barrel distortion; the first image is displayed on the display screen 502 and then transmitted to the imaging lens 503, where it is amplified by the convex lens, and meanwhile the barrel distortion introduced by the camera 501 is counteracted by the pincushion distortion of the convex lens, so that when the amplified image reaches the human eye, it is restored to a picture with no or little distortion.
It should be noted that the above example directly uses the positive distortion inherent in the convex lens of the imaging lens 503 and configures the lens of the camera 501 as an opposite concave lens, so as to introduce as few lenses as possible into the optical system. However, in other embodiments, more lenses may additionally be introduced. For example, lenses other than the convex lens may be provided in the imaging lens 503 so that the combined distortion of the lenses in the imaging lens 503 is negative, with the lens of the camera 501 configured for positive distortion; or the lenses in the camera 501 may be configured so that their combined distortion is positive. There are many possible implementations, but any solution that enables the camera 501 to directly cancel the distortion introduced by the imaging lens 503 by arranging the lenses in the imaging lens 503 and the camera 501 falls within the scope of the embodiments of the present application.
In addition, since distortion is a monochromatic optical aberration, the configuration also needs to be performed separately for each monochromatic light when configuring the camera 501 and the imaging lens 503. For example, the relative distortion of the lens of the camera 501 for R light and the relative distortion of the lens of the imaging lens 503 for R light are configured to be opposite in sign and similar in magnitude, and the same is done for G light and for B light. In this way, the dispersions of the camera 501 and the imaging lens 503 cancel each other at the same time as their relative distortions cancel each other.
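The per-channel configuration rule can be expressed as a small validation sketch: for each monochromatic light (R, G, B), the camera's relative distortion and the imaging lens's relative distortion should be opposite in sign and nearly cancel. The channel values below are hypothetical.

```python
# Validation sketch for the per-channel distortion pairing described above.
# Values are fractions (0.30 = 30%) and are illustrative assumptions.

CAMERA = {"R": -0.29, "G": -0.30, "B": -0.31}
LENS = {"R": 0.30, "G": 0.30, "B": 0.30}

def channels_matched(camera: dict, lens: dict, tol: float = 0.10) -> bool:
    """True if every channel pair is opposite in sign and cancels to within tol."""
    for ch in ("R", "G", "B"):
        opposite = camera[ch] * lens[ch] < 0        # strictly opposite signs
        similar = abs(camera[ch] + lens[ch]) < tol  # magnitudes nearly cancel
        if not (opposite and similar):
            return False
    return True

print(channels_matched(CAMERA, LENS))  # distortion and dispersion both cancel
```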
In the first embodiment, by configuring the distortions of the camera and the imaging lens under the same field of view to be opposite in sign and similar in magnitude, the distortions generated by light passing through the camera and the imaging lens can essentially cancel out. Consequently, no anti-distortion processing algorithm needs to be introduced before the image shot by the camera is transmitted to the display screen, which simplifies, or even eliminates, the software algorithm path: the live-action image shot by the camera can be transmitted directly to the display screen for display, effectively reducing the delay, load, and power consumption of the MTP and improving the user's VST experience.
Embodiment 2
Fig. 7 is a schematic structural diagram of another optical system according to an embodiment of the application. Fig. 7 (a) shows a top view of the optical system and fig. 7 (B) shows a right side view. Referring to fig. 7 (a) and fig. 7 (B), the optical system may further include a reflection assembly 504 in addition to the camera 501, the display screen 502, and the imaging lens 503. When shooting a target object, the camera 501 emits a first light ray, which is reflected by the reflection assembly 504 and then transmitted to the target object; a second light ray is then reflected by the target object and returns to the camera 501. The camera 501 generates a first image according to the received second light ray and sends the first image to the display screen 502 for display, and the image displayed on the display screen 502 is amplified by the imaging lens 503 to obtain a second image, which is presented to the user.
The distance traveled by the light emitted by the camera 501 to the target object after being reflected by the reflection assembly 504 is equal to the length of the user's line of sight from the eye to the target object. That is, for any light ray emitted by the camera, the sum of the distance it travels to the reflection assembly 504 and the distance it travels from the reflection assembly 504 to the target object is equal to the distance from the human eye to the target object along the line of sight. For example, taking the lowest marginal ray illustrated in fig. 7 (a), the distance from the camera 501 to the reflection assembly 504 along the lowest marginal ray is L1, the distance from the reflection assembly 504 to the target object after reflection is L2, and the distance from the human eye to the target object along the lowest marginal line of sight is L3; then L1, L2, and L3 satisfy the following formula (2):
L3=L1+L2......(2)
With continued reference to fig. 7 (a), since the distance traveled by the light emitted by the camera 501 to the target object after being reflected by the reflection assembly 504 is equal to the length of the user's line of sight to the target object, the depth at which the camera 501 captures the target object is identical to the depth at which the user's eye views the target object, and therefore the FOV with which the camera 501 captures the target object is identical to the FOV with which the user's eye views the target object, for example both being V as illustrated in fig. 7 (a). In this way, the image obtained by photographing the target object with the camera 501 is the same as the image seen by the user looking at the target object, and the image can be transmitted directly to the display screen 502 for display without re-projection processing by the processor.
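Formula (2) can be checked with a simple 2-D sketch of the folded light path (the coordinates below are hypothetical, not taken from fig. 7): placing the reflective element on the eye-target axis and folding the camera below it by exactly L1 = L3 - L2 makes the camera's folded path equal the eye's straight line of sight.

```python
# 2-D sketch (hypothetical coordinates) verifying L3 = L1 + L2 for a folded
# camera path: eye and target lie on the x-axis, the reflective element sits
# between them, and the camera hangs below the axis by the required L1.
import math

def dist(p, q):
    """Euclidean distance between two 2-D points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

eye = (0.0, 0.0)
target = (3.0, 0.0)            # 3 m straight ahead of the eye (illustrative)
mirror = (1.0, 0.0)            # reflective element on the eye-target axis
L3 = dist(eye, target)         # eye's line of sight: 3.0 m
L2 = dist(mirror, target)      # mirror to target: 2.0 m
L1 = L3 - L2                   # required camera-to-mirror distance: 1.0 m
camera = (mirror[0], -L1)      # camera folded below the axis by L1

total = dist(camera, mirror) + dist(mirror, target)
print(total == L3)             # folded path matches the eye's sight line
```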
Illustratively, the reflective assembly 504 may include one or more reflective elements. Such as:
In some embodiments, as shown in fig. 7 (a), the reflection assembly 504 may include only one reflective element, with its center point on the optical axis of the display screen 502 and the imaging lens 503, and the camera 501 disposed inside the device housing below the illustration. In this way, the light emitted by the camera 501 is directly reflected by the reflective element to the target object, and the reflection direction coincides with the human eye's line of sight to the target object. Making the shooting depth of the camera 501 consistent with the viewing depth of the human eye through a single reflective element introduces as few additional parts as possible, reducing the complexity of the optical system and saving cost;
In some embodiments, the reflection assembly 504 may include at least two reflective elements. For example, in the optical system illustrated in fig. 8, the reflection assembly 504 includes two reflective elements, namely a reflective element 5041 and a reflective element 5042; the center point of the reflective element 5042 is located on the optical axis of the display screen 502 and the imaging lens 503, the camera 501 is disposed inside the device housing at the lower right of the illustration, and the center point of the reflective element 5041 is located on the optical axis of the camera 501. In this way, the light emitted by the camera 501 is first reflected by the reflective element 5041 to the reflective element 5042, and then reflected by the reflective element 5042 to the target object 90, with the reflection direction of the reflective element 5042 coinciding with the human eye's line of sight to the target object. Compared with the optical system illustrated in fig. 7 (a), the optical system in fig. 8 has a smaller length in the vertical direction of the drawing; that is, making the shooting depth of the camera consistent with the viewing depth of the human eye through at least two reflective elements also helps reduce the overall size of the electronic device, makes the arrangement of its internal components more compact, and facilitates the miniaturized design of the electronic device.
The reflective element may be any component capable of reflecting incident light, for example an optical element having at least one reflective surface, such as a mirror or a reflecting prism. The mirror may be a plane mirror, a spherical mirror, or an aspherical mirror. The reflecting prism (also referred to as a polygon mirror) may be, for example, a triangular prism, a right-angle prism, or a pentaprism.
In the second embodiment, a reflective assembly is added to the optical system so that the distance from the camera to the target object, measured along the path reflected by the reflective assembly, is the same as the distance from the human eye to the target object. This eliminates the axial offset between the camera and the human eye found in existing optical systems, keeps the camera's viewing-angle range of the target object consistent with that of the human eye, and ensures the accuracy of the depth information in the image obtained by photographing the target object. Furthermore, because the depth information of the image directly captured by the camera is already accurate, no re-projection algorithm needs to be introduced before the image is transmitted to the display screen. This reduces the complexity of the software algorithm path, or even eliminates it entirely, so that the live-action image captured by the camera is transmitted directly to the display screen for display, effectively reducing motion-to-photon (MTP) latency, load, and power consumption, and improving the user's VST experience.
It should be noted that the first embodiment simplifies away the anti-distortion processing algorithm in the original algorithm path, while the second embodiment simplifies away the re-projection algorithm. The two embodiments are described separately, each from the perspective of removing one algorithm, but in practice they may also be combined. In one possible combined scheme, the camera in the optical system is configured with a first optical distortion opposite to the second optical distortion of the imaging lens, and the reflective assembly is also added to the optical system. Both the anti-distortion processing algorithm and the re-projection algorithm in the original algorithm path are thereby removed, further reducing the complexity of the software algorithm path or even eliminating it, with the design of the optical system itself ensuring that the displayed image has little or no distortion while its depth information remains accurate.
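The distortion-compensation condition of the first embodiment (stated in claim 1, with the threshold bound of claim 4) can be sketched as a simple numeric check. The percentage values below are illustrative examples only, not figures from the patent:

```python
def distortion_compensated(first_pct, second_pct, threshold_pct=15.0):
    """Check the condition of claim 1: the two optical distortions have
    opposite signs, and the absolute value of their sum is below the
    distortion threshold (claim 4: less than 15%)."""
    opposite_signs = first_pct * second_pct < 0
    within_threshold = abs(first_pct + second_pct) < threshold_pct
    return opposite_signs and within_threshold

# Hypothetical pairing: a camera with -30% (negative, barrel) distortion and
# an imaging lens with +25% (positive, pincushion) distortion; the net
# residual distortion is -5%, within the 15% threshold.
print(distortion_compensated(-30.0, 25.0))   # True
print(distortion_compensated(-30.0, -25.0))  # False: same sign, no cancellation
```

The point of the check is that the camera's distortion is chosen to cancel the imaging lens's distortion optically, so the residual seen by the user stays under the threshold without any anti-distortion processing in software.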
In addition, the application further provides an electronic device comprising the optical system described above and a housing, with the optical system packaged in the housing.
The electronic device may be, for example, a NED device such as VR glasses or a VR helmet, or a terminal device with a display, such as a mobile phone, a monitor, a television, or a HUD; this is not specifically limited here.
The foregoing describes merely illustrative embodiments of the present application, but the protection scope of the present application is not limited thereto. Any variation or substitution that a person skilled in the art could readily conceive within the technical scope disclosed by the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (7)

1. An optical system, characterized in that it comprises a camera, a display screen, and an imaging lens, wherein the camera has a first optical distortion, the imaging lens has a second optical distortion, the first optical distortion and the second optical distortion have opposite signs, and the absolute value of the sum of the first optical distortion and the second optical distortion is less than a distortion threshold;
the camera is configured to photograph a target object to obtain a first image;
the display screen is configured to display the first image;
the imaging lens is configured to magnify the first image displayed on the display screen.
2. The optical system as claimed in claim 1, characterized in that it further comprises a reflective assembly;
the reflective assembly is configured to reflect light emitted from the camera to the target object, and to reflect light reflected from the target object back to the camera;
wherein the distance from the camera to the target object, along the emitted light path reflected by the reflective assembly, is equal to the distance of the user's line of sight to the target object.
3. The optical system as claimed in claim 1 or 2, characterized in that the first optical distortion is negative distortion and the second optical distortion is positive distortion.
4. The optical system as claimed in claim 1 or 2, characterized in that the distortion threshold is less than 15%.
5. The optical system as claimed in claim 2, characterized in that the reflective assembly comprises one or more reflective elements.
6. The optical system as claimed in claim 5, characterized in that the reflective element is a mirror or a prism.
7. An electronic device, characterized in that it comprises the optical system as claimed in any one of claims 1 to 6.
CN202310498410.2A 2023-05-05 2023-05-05 An optical system and electronic device Active CN118897401B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310498410.2A CN118897401B (en) 2023-05-05 2023-05-05 An optical system and electronic device


Publications (2)

Publication Number Publication Date
CN118897401A CN118897401A (en) 2024-11-05
CN118897401B true CN118897401B (en) 2025-12-02

Family

ID=93261878

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310498410.2A Active CN118897401B (en) 2023-05-05 2023-05-05 An optical system and electronic device

Country Status (1)

Country Link
CN (1) CN118897401B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117608093B (en) * 2023-12-26 2025-11-21 华勤技术股份有限公司 Virtual reality equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110335200A (en) * 2018-03-29 2019-10-15 腾讯科技(深圳)有限公司 A virtual reality anti-distortion method, device and related equipment
CN110490820A (en) * 2019-08-07 2019-11-22 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment, storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4889036B2 (en) * 2007-07-17 2012-02-29 キヤノン株式会社 Image processing apparatus and image processing method
CN103792674B (en) * 2014-01-21 2016-11-23 浙江大学 A kind of apparatus and method measured and correct virtual reality display distortion
US10262400B2 (en) * 2014-10-31 2019-04-16 Huawei Technologies Co., Ltd. Image processing method and device using reprojection error values
CN113160067A (en) * 2021-01-26 2021-07-23 睿爱智能科技(上海)有限责任公司 Method for correcting VR (virtual reality) large-field-angle distortion


Also Published As

Publication number Publication date
CN118897401A (en) 2024-11-05

Similar Documents

Publication Publication Date Title
JP7408678B2 (en) Image processing method and head mounted display device
US20250125657A1 (en) Nfc communication and qi wireless charging of eyewear
US12015842B2 (en) Multi-purpose cameras for simultaneous capture and CV on wearable AR devices
US12007569B2 (en) Compact catadioptric projector
US20220103752A1 (en) Ultra low power camera pipeline for cv in ar systems
US12386176B2 (en) Thermal architecture for smart glasses
US11921286B2 (en) Lens array for shifting perspective of an imaging system
CN114624875B (en) Image calibration method and device
CN118897401B (en) An optical system and electronic device
US20250310507A1 (en) Image processing method, head-mounted display device, and medium
EP4449185A1 (en) Eyewear including a non-uniform push-pull lens set
CN118176471A (en) Goggles that allow current consumption optimization of wireless system interfaces
US12498583B2 (en) Eyewear having a projector with heat sink shields
US12300877B2 (en) Projector with integrated antenna

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant