CN117880623B - Synchronization method of binocular lens and method for receiving end to acquire synchronous image - Google Patents
- Publication number
- CN117880623B CN117880623B CN202410269530.XA CN202410269530A CN117880623B CN 117880623 B CN117880623 B CN 117880623B CN 202410269530 A CN202410269530 A CN 202410269530A CN 117880623 B CN117880623 B CN 117880623B
- Authority
- CN
- China
- Prior art keywords
- isp
- sensor
- ahd
- image
- time point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/665—Control of cameras or camera modules involving internal camera communication with the image sensor, e.g. synchronising or multiplexing SSIS control signals
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
The invention discloses a binocular-lens synchronization method and a method for a receiving end to acquire synchronized images. The binocular lens has two sensors of the same model, two ISPs of the same model, and an AHD decoder, all arranged on a binocular module board, and the power-on timing of the two sensors is identical. Each ISP performs image optimization on the fixed-frame-rate Raw-format image transmitted by its sensor, converts it into a YUV-format image, performs AHD encoding, and outputs a fixed-frame-rate AHD signal to the AHD decoder. The AHD decoder receives and decodes the AHD signals input by ISP-A and ISP-B and outputs a single-channel MIPI signal. The binocular lens not only guarantees accurate synchronization of the binocular images but is also easy for a receiving end to use.
Description
Technical Field
The invention relates to the technical field of cameras, and in particular to a binocular-lens synchronization method and a method for a receiving end to acquire synchronized images.
Background
Binocular cameras are often used in applications that measure the distance to moving objects, and the images they output must be as synchronized as possible. Here, synchronization means that the difference in exposure time between the pictures captured by the two cameras is as small as possible, so that the two pictures obtained by the back-end application were taken at essentially the same moment. A common requirement is a time difference within 10 milliseconds; the smaller the difference, the smaller the measurement error caused by the object moving between the two exposures.
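The effect of the 10-millisecond requirement can be made concrete with a back-of-the-envelope sketch (the function name and figures below are illustrative, not from the patent): the apparent displacement of a moving object between the two exposures is simply speed times time skew.

```python
def apparent_displacement_mm(speed_m_per_s: float, skew_ms: float) -> float:
    """Apparent shift of a moving object between the two cameras'
    exposures, in millimetres: speed [m/s] * skew [s] * 1000 [mm/m]."""
    return speed_m_per_s * (skew_ms / 1000.0) * 1000.0

# An object moving at 10 m/s with the commonly required 10 ms skew
# appears to shift 100 mm between the two views; at 1 ms skew, 10 mm.
print(apparent_displacement_mm(10.0, 10.0))  # 100.0
```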
In some products, the binocular lens contains only two sensors. The Raw data of both sensors is transmitted to a chip (SoC), which performs the image processing and hands the optimized images to the back end for the ranging application. This approach places two basic requirements on the chip: 1. the chip must support two sensor inputs, ideally processing the Raw data of both inputs simultaneously; 2. to ensure that the two sensors on the binocular module board expose as simultaneously as possible, the chip must periodically output an Fsync pulse signal to both sensors at the same time. Sensors generally provide an Fsync pin and, when configured accordingly, start an exposure on receiving the Fsync pulse. To keep the frame rate of the video stream fixed, this periodic pulse must not be delayed by other tasks running on the chip.
In addition, the chip must keep the latency of processing the sensors' Raw data as small as possible while it handles other tasks, which is especially important in applications that require fast measurements. This approach therefore constrains the selection of the back-end chip, and the chip additionally needs software and hardware support for synchronizing the binocular images. The functional block diagram is shown in Fig. 1.
In many application scenarios that require a camera, analog high-definition cameras are a very mature technology. An analog high-definition camera generally consists of a sensor and an image signal processor (AHD ISP) that outputs an analog high-definition video stream. The AHD ISP performs a series of optimizations on the Raw data acquired by the sensor, including automatic exposure, automatic white balance, color-matrix conversion, and color optimization, and finally encodes the optimized image into the AHD (analog high-definition) format. The result can be transmitted to the back-end receiving device over a single signal wire, and the back-end device does not need to process the image effects itself.
Although analog high-definition camera technology is mature, simply using two analog high-definition cameras as the binocular image input of a ranging system cannot guarantee the synchronization of the binocular images, because the two modules are isolated and have no communication with each other. Such a binocular system can generally only be used for image-stitching applications and cannot be applied to scenarios that range moving objects.
To receive the AHD-format signals, a two-channel AHD decoding chip must also be added at the receiving end. The decoding chip transmits the images of both cameras to the back-end chip using the MIPI virtual-channel interleaved transmission mode; the back-end chip separates the two cameras' images from the MIPI signal and outputs them to the back-end application. The principle is shown in Fig. 2.
In view of this, and in response to the shortcomings and inconveniences of the existing binocular-camera synchronization methods and of the existing methods by which a receiving end obtains the binocular images, the present invention was actively researched and developed.
Disclosure of Invention
The invention aims to provide a binocular-lens synchronization method that allows a receiving end to easily obtain synchronized binocular images.
In order to achieve the above object, the solution of the present invention is:
One camera has a sensor-A and an ISP-A connected to sensor-A; the other camera has a sensor-B and an ISP-B connected to sensor-B. Sensor-A and sensor-B are of the same model, ISP-A and ISP-B are of the same model, the flash memories used by ISP-A and ISP-B are of the same model, and the flash memories are burned with the same firmware. The power supplies required by ISP-A and ISP-B are generated by the same power chip, the clock source provided to ISP-A and ISP-B is the same active crystal oscillator, and the reset signals of ISP-A and ISP-B are generated by the same reset chip. Sensor-A, sensor-B, ISP-A, ISP-B, and the AHD decoder are arranged on a binocular module board, and the power-on timing of sensor-A is consistent with that of sensor-B. ISP-A performs image optimization on the fixed-frame-rate Raw-format image transmitted by sensor-A, converts it into a YUV-format image, performs AHD encoding, and outputs a fixed-frame-rate AHD signal to the AHD decoder; ISP-B does the same for the image transmitted by sensor-B. The AHD decoder receives and decodes the AHD signal input by ISP-A and the AHD signal input by ISP-B and outputs a single-channel MIPI signal;
The binocular lens synchronization method comprises the following steps:
Step A1, ISP-A and ISP-B are powered on simultaneously;
Step A2, at a first time point, ISP-A provides sensor-A with a clock source, and ISP-B provides sensor-B with a clock source;
Step A3, at a second time point, ISP-A resets sensor-A, and ISP-B resets sensor-B;
Step A4, at a third time point, ISP-A configures sensor-A, and ISP-B configures sensor-B;
Step A5, at a fourth time point, ISP-A enables sensor-A to start exposing, and ISP-B enables sensor-B to start exposing;
Step A6, sensor-A outputs Raw-format images to ISP-A at a fixed frame rate, and sensor-B outputs Raw-format images to ISP-B at the same fixed frame rate.
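The steps above can be sketched as a toy timeline model (hypothetical, not part of the patent): because both ISPs run identical firmware on identical hardware, they issue the same control events at the same time points, so the two sensors' timelines coincide.

```python
def power_up_events(t1_ms: int, t2_ms: int, t3_ms: int, t4_ms: int):
    """Control events one ISP applies to its sensor (steps A2-A5),
    each tagged with the time point at which it occurs."""
    return [
        (t1_ms, "clock_provided"),    # step A2
        (t2_ms, "reset_released"),    # step A3
        (t3_ms, "i2c_configured"),    # step A4
        (t4_ms, "exposure_started"),  # step A5
    ]

# Same firmware, same clock source, same reset chip: both ISPs use the
# same time points, so the two sensors' event lists are identical.
events_a = power_up_events(1, 3, 5, 8)  # ISP-A driving sensor-A
events_b = power_up_events(1, 3, 5, 8)  # ISP-B driving sensor-B
assert events_a == events_b
```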
After the binocular lens is synchronized in this way, the receiving end does not need to perform any control operations for binocular synchronization.
Another object of the present invention is to provide a method for a receiving end to acquire a synchronization image.
In order to achieve the above object, the solution of the present invention is:
A method for a receiving end to acquire synchronized images, comprising a binocular lens and a receiving end. One camera of the binocular lens has a sensor-A and an ISP-A connected to sensor-A; the other camera has a sensor-B and an ISP-B connected to sensor-B. Sensor-A and sensor-B are of the same model, ISP-A and ISP-B are of the same model, the flash memories used by ISP-A and ISP-B are of the same model, and the flash memories are burned with the same firmware. The power supplies required by ISP-A and ISP-B are generated by the same power chip, the clock source provided to ISP-A and ISP-B is the same active crystal oscillator, and the reset signals of ISP-A and ISP-B are generated by the same reset chip. Sensor-A, sensor-B, ISP-A, ISP-B, and the AHD decoder are arranged on the binocular module board, and the power-on timing of sensor-A is consistent with that of sensor-B. ISP-A performs image optimization on the fixed-frame-rate Raw-format image transmitted by sensor-A, converts it into a YUV-format image, performs AHD encoding, and outputs a fixed-frame-rate AHD signal to the AHD decoder; ISP-B does the same for the image transmitted by sensor-B. The AHD decoder receives and decodes the AHD signal input by ISP-A and the AHD signal input by ISP-B and outputs a single-channel MIPI signal. The receiving end is provided with a chip;
the method for acquiring the synchronous image by the receiving end comprises the following steps:
Step S1, ISP-A and ISP-B of the binocular lens are powered on simultaneously;
Step S2, at a first time point, ISP-A provides sensor-A with a clock source, and ISP-B provides sensor-B with a clock source;
Step S3, at a second time point, ISP-A resets sensor-A, and ISP-B resets sensor-B;
Step S4, at a third time point, ISP-A configures sensor-A, and ISP-B configures sensor-B;
Step S5, at a fourth time point, ISP-A enables sensor-A to start exposing, and ISP-B enables sensor-B to start exposing;
Step S6, sensor-A outputs Raw-format images to ISP-A at a fixed frame rate, and sensor-B outputs Raw-format images to ISP-B at the same fixed frame rate;
Step S7, ISP-A and ISP-B perform image optimization and AHD encoding simultaneously;
Step S8, ISP-A and ISP-B simultaneously output AHD signals to the AHD decoder at the fixed frame rate;
Step S9, the AHD decoder receives and decodes the two AHD signals from ISP-A and ISP-B and outputs a single-channel MIPI signal;
Step S10, the chip of the receiving end receives the MIPI signal output by the AHD decoder, distinguishes the two AHD video streams by the virtual-channel identifiers in the MIPI data packets, and obtains two synchronized, stable video streams according to the line/field synchronization information of each AHD video stream in the data packets.
Receiving a MIPI signal and separating multi-channel images via MIPI virtual channels is easy to implement on a common chip. The receiving end obtains synchronized, optimized binocular images without performing any image optimization itself.
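The virtual-channel separation in step S10 can be illustrated with a minimal sketch (the packet layout is simplified to `(vc_id, payload)` tuples; in a real MIPI CSI-2 stream the 2-bit VC id is carried in the packet header's Data Identifier byte):

```python
def demux_by_virtual_channel(packets):
    """Split one interleaved MIPI stream into per-camera streams,
    keyed by the virtual-channel id carried in each packet."""
    streams = {}
    for vc_id, payload in packets:
        streams.setdefault(vc_id, []).append(payload)
    return streams

# Interleaved stream from the AHD decoder: VC0 = camera A, VC1 = camera B.
mipi = [(0, "A-frame0"), (1, "B-frame0"), (0, "A-frame1"), (1, "B-frame1")]
print(demux_by_virtual_channel(mipi))
# {0: ['A-frame0', 'A-frame1'], 1: ['B-frame0', 'B-frame1']}
```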
With the above scheme, the binocular-lens synchronization method and the method for a receiving end to acquire synchronized images have the following advantages over the prior art:
1. The image-optimization work of the binocular lens is completed at the camera end; the receiving end does not need to handle any work related to image optimization.
2. The binocular-image synchronization work is completed at the camera end; the receiving end does not need to handle any control work related to image synchronization.
3. The binocular images output by the binocular lens achieve accurate synchronization and can be applied to scenarios that require ranging fast-moving objects.
4. The ISP-A and ISP-B used by the binocular lens of the invention are single chips that only process image effects; the software architecture is simple, and frame dropping does not occur.
5. The firmware burned into ISP-A and ISP-B is just the general firmware of a conventional analog camera; no special operations need to be added for binocular-image synchronization, and the synchronization effect is obtained through suitable hardware design.
6. The binocular lens performs two-channel AHD decoding at the camera end and finally transmits the images to the back end (receiving end) through a MIPI interface; any receiving-end chip with a standard MIPI input interface will do, so there is no chip-selection problem.
The binocular lens not only guarantees accurate synchronization of the binocular images but is also easy for a receiving end to use. The binocular lens transmits the images to the receiving end; the receiving end neither participates in image optimization nor handles any control work related to image synchronization. The images acquired from the camera end are precisely synchronized and already optimized, and can be output directly to the back-end ranging application.
Drawings
Fig. 1 is a schematic block diagram of a conventional binocular lens interacting with a chip.
Fig. 2 is a schematic block diagram of interaction between a conventional binocular lens and a receiving end.
Fig. 3 is an overall flow diagram of the present invention.
Fig. 4 is a schematic diagram of a hardware framework of the present invention.
Fig. 5 is a power-up timing diagram of an OmniVision sensor.
Fig. 6 is a power-up timing diagram of an ON Semiconductor sensor.
Fig. 7 is a power-up timing diagram of a Sony sensor.
Fig. 8 is a power-on timing diagram of an ISP.
Detailed Description
In order to further explain the technical scheme of the invention, the invention is explained in detail by specific examples.
As shown in Figs. 3 and 4, the present invention discloses a binocular lens that is synchronized for a receiving end. One camera has a sensor-A and an ISP-A; the other camera has a sensor-B and an ISP-B. Sensor-A and sensor-B are of the same model, ISP-A and ISP-B are of the same model, and the flash memories used by ISP-A and ISP-B are of the same model. Because the design, manufacture, and testing of chips of the same model are highly accurate and consistent, the two cameras are guaranteed, under the same temperature and power-supply environment, to start working simultaneously after power-up and to exhibit the same timing.
In chip design, designers use computer-aided design tools to create the functional and timing specifications of a chip and to ensure that it operates correctly at a given clock frequency. During manufacturing, a strictly consistent process ensures that the chip's structure and performance match the design specification. During testing, precise and consistent test procedures screen out chips that do not meet the specification, guaranteeing that chips leaving the factory have the same performance and timing characteristics. When such chips are powered up, because chips of the same model are produced from the same design and manufacturing process, their circuit structures and component parameters are identical, which gives them the same timing behavior under the same clock signal.
After ensuring that the chips of the two cameras are of the same model, the binocular lens synchronization method is realized according to the following thought:
The first frames of the two video streams of the binocular lens must be synchronized after power-up. The primary condition is that sensor-A and sensor-B expose simultaneously and output their first frame of data simultaneously after power-up; that is, the power-on timing of sensor-A and sensor-B must be identical. The power-up timing descriptions in the datasheets of several mainstream sensor vendors are taken as examples below:
OmniVision:
As shown in Fig. 5, after the resets of sensor-A and sensor-B are released, the sensors enter a stable state awaiting operation. The ISP writes the correct configuration to sensor-A and sensor-B over I2C, and after the fourth time point plus a duration t4, sensor-A and sensor-B can output the first frame of the video stream. Because the reset durations of sensor-A and sensor-B and their configuration are controlled by the ISPs, the time points at which the resets are released and at which the I2C configuration ends can be made identical.
The fourth time point reflects a fixed internal delay of the sensor, which differs very little between sensors of the same model. t4 depends on the sensor's exposure time; since the external brightness conditions of sensor-A and sensor-B on the binocular lens are identical and the exposure time is defined by the ISP, t4 is the same for sensor-A and sensor-B.
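This reasoning can be restated as a small model (the function name and the numbers are hypothetical): the time from power-up to the first frame is a sum of ISP-controlled durations plus the sensor's fixed internal delay and exposure time, and identical inputs give identical sums.

```python
def first_frame_time_ms(reset_release_ms: float, config_ms: float,
                        fixed_delay_ms: float, exposure_t4_ms: float) -> float:
    """Power-up-to-first-frame interval: ISP-controlled reset release and
    I2C configuration, plus the sensor's fixed internal delay and t4."""
    return reset_release_ms + config_ms + fixed_delay_ms + exposure_t4_ms

# Same-model sensors driven by identical ISP firmware see identical
# terms, so both sides compute the same first-frame time.
side_a = first_frame_time_ms(5, 2, 1, 33)
side_b = first_frame_time_ms(5, 2, 1, 33)
assert side_a == side_b
```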
With these durations guaranteed equal, sensor-A and sensor-B expose simultaneously and output their first frame of data simultaneously after power-up; that is, the first frames of the video streams output by the binocular lens remain synchronized.
ON Semiconductor:
As shown in Fig. 6, after the sensor is powered on, the reset duration t4 is controlled by the ISP, so the reset-release time points of sensor-A and sensor-B can likewise be made identical.
t5 and t6 are determined internally by sensor-A and sensor-B and do not differ significantly for sensors of the same model. Software standby is also controlled by the ISP, so sensor-A and sensor-B can be given the same standby duration. With these durations guaranteed equal, sensor-A and sensor-B expose simultaneously and output their first frame of data simultaneously after power-up.
Sony:
As shown in Fig. 7, the XVS and XHS trigger signals are not required, because the sensors of the binocular lens only operate in Master mode. As with the two sensors discussed above, the AHD ISP controls the reset, the I2C configuration, and the standby duration of the sensor after power-up, so the two sensors can output their first frame of data simultaneously after power-up.
From the timing information of these three vendors it can be concluded that the sensor is a passive device in the camera: the interval from power-up to the output of the first frame of data (Tpower_up_to_out) depends on the sensor's inherent internal delay and on the ISP's control, which comprises the sensor's reset duration, the time point at which the sensor leaves standby, the time point at which I2C configuration starts, and the parameters configured over I2C. In the invention, sensor-A and sensor-B of the binocular lens are of the same model, so their inherent internal delays are essentially identical. Tpower_up_to_out then depends only on the control exercised by ISP-A and ISP-B. For ISP-A and ISP-B to give their respective sensors the same reset-release time point, the same standby-exit time point, and the same I2C-configuration start time point, ISP-A and ISP-B must be of the same model, and the flash memories storing their firmware must also be of the same model and be burned with the same firmware.
Meanwhile, the power supplies required by ISP-A and ISP-B are generated by the same power chip, ensuring that their power-supply environments are identical; the clock source (EXTCLK) provided to ISP-A and ISP-B must be the same active crystal oscillator, ensuring that their clock frequencies are identical; and the reset signals (RESETB) of ISP-A and ISP-B must be generated by the same reset chip, ensuring that they enter the operating state simultaneously after power-up.
Taking a power-on sequence of a common ISP as an example:
As shown in Fig. 8, because ISP-A and ISP-B share the same power-supply conditions and use the same clock source and reset chip, their durations from power-up to reset release are identical (the second time point in the example). Because ISP-A and ISP-B use flash memories of the same model burned with the same firmware, the interval from reading the firmware from flash over the SPI interface to starting the control and configuration of their respective sensors over the I2C interface is also identical (T4 in the example).
It follows that ISP-A and ISP-B operate their respective sensors simultaneously after power-up.
According to the above description of the sensors and ISPs of the binocular lens, after the binocular module board is powered up, ISP-A and ISP-B operate and configure their respective sensors at the same time, and the operation and configuration contents (i.e., the data written to sensor-A and sensor-B over I2C) are completely identical. Sensor-A and sensor-B therefore expose simultaneously after power-up and output the first frame of Raw data of the effective image at the same time.
The Raw data output by sensor-A and sensor-B to ISP-A and ISP-B is unprocessed brightness data; it must be optimized by the ISP and converted into YUV format before the back end can use it. Taking the AHD ISP used in the invention as an example, it optimizes the received Raw data, converts it into YUV data, encodes it into the AHD format, and outputs it to the back end.
Because ISP-A and ISP-B are of the same model and run the same firmware, because their Raw-data processing and format conversion are performed by hardware inside the ISP chip, and because chips of the same model are guaranteed identical circuit characteristics and timing behavior when they leave the factory, the interval from receiving the first frame of effective Raw data to outputting the AHD signal of the first effective frame is the same for ISP-A and ISP-B. Both ISPs therefore output the AHD signal of the first effective frame at the same time after power-up.
According to the above, the AHD signals of the first effective frame after the binocular lens is powered up are already synchronized. To keep the binocular images synchronized from then on in a simple way, the ISP firmware can set the camera's output to a fixed-frame-rate video stream; this is a conventional setting of an ordinary analog camera and requires no special software handling.
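Why a fixed frame rate preserves synchronization indefinitely follows from the frame-timestamp formula t_k = t0 + k/fps (the sketch below is illustrative, not from the patent): if both streams share the start time t0 and the frame rate, every subsequent pair of frames coincides.

```python
def frame_timestamps_ms(t0_ms: float, fps: float, n: int):
    """Timestamps of the first n frames of a fixed-frame-rate stream:
    frame k is emitted at t0 + k * (1000 / fps) milliseconds."""
    period_ms = 1000.0 / fps
    return [t0_ms + k * period_ms for k in range(n)]

# Both streams start at the same instant (synchronized first frame) and
# run at the same fixed rate, so frame k of stream A always coincides
# with frame k of stream B -- no drift ever accumulates.
stream_a = frame_timestamps_ms(0.0, 25.0, 4)
stream_b = frame_timestamps_ms(0.0, 25.0, 4)
assert stream_a == stream_b == [0.0, 40.0, 80.0, 120.0]
```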
In the invention, the decoder for the AHD signals is also arranged on the binocular module board; it is an AHD decoder with two-channel AHD input and single-channel MIPI output, a very mature part on the market. After receiving the MIPI signal output by the decoder, a chip on the back-end device can distinguish the two AHD video streams by the virtual-channel (VC) identifiers in the MIPI data packets and obtain two stable video streams according to the synchronization information of each AHD stream in the packets. Chips with this capability are common on the market, so there is essentially no chip-selection problem: receiving the MIPI signal and separating the multi-channel images via MIPI virtual channels is easy for a general-purpose chip to implement.
The above embodiments and drawings do not limit the product form or style of the present invention; any appropriate variation or modification by a person of ordinary skill in the art should be regarded as not departing from the scope of the invention.
Claims (2)
1. A binocular-lens synchronization method, characterized in that: the binocular lens comprises two cameras, one camera having a sensor-A and an ISP-A connected to sensor-A, the other camera having a sensor-B and an ISP-B connected to sensor-B; sensor-A and sensor-B are of the same model; ISP-A and ISP-B are of the same model; the flash memories used by ISP-A and ISP-B are of the same model and are burned with the same firmware; the power supplies required by ISP-A and ISP-B are generated by the same power chip; the clock source provided to ISP-A and ISP-B is the same active crystal oscillator; the reset signals of ISP-A and ISP-B are generated by the same reset chip; sensor-A, sensor-B, ISP-A, ISP-B, and an AHD decoder are arranged on a binocular module board; the power-on timing of sensor-A is consistent with that of sensor-B; ISP-A performs image optimization on the fixed-frame-rate Raw-format image transmitted by sensor-A, converts it into a YUV-format image, performs AHD encoding, and outputs a fixed-frame-rate AHD signal to the AHD decoder; ISP-B performs image optimization on the fixed-frame-rate Raw-format image transmitted by sensor-B, converts it into a YUV-format image, performs AHD encoding, and outputs a fixed-frame-rate AHD signal to the AHD decoder; and the AHD decoder receives and decodes the AHD signal input by ISP-A and the AHD signal input by ISP-B and outputs a single-channel MIPI signal;
The binocular lens synchronizing method comprises the following steps:
Step A1, ISP-A and ISP-B are powered on simultaneously;
Step A2, at a first time point, ISP-A provides sensor-A with a clock source, and ISP-B provides sensor-B with a clock source;
Step A3, at a second time point, ISP-A resets sensor-A, and ISP-B resets sensor-B;
Step A4, at a third time point, ISP-A configures sensor-A, and ISP-B configures sensor-B;
Step A5, at a fourth time point, ISP-A enables sensor-A to start exposing, and ISP-B enables sensor-B to start exposing;
Step A6, sensor-A outputs Raw-format images to ISP-A at a fixed frame rate, and sensor-B outputs Raw-format images to ISP-B at the same fixed frame rate.
2. A method for a receiving end to acquire synchronized images, characterized by comprising a binocular lens and a receiving end, wherein one camera of the binocular lens has a sensor-A and an ISP-A connected to sensor-A, and the other camera has a sensor-B and an ISP-B connected to sensor-B; sensor-A and sensor-B are of the same model; ISP-A and ISP-B are of the same model; the flash memories used by ISP-A and ISP-B are of the same model and are burned with the same firmware; the power supplies required by ISP-A and ISP-B are generated by the same power chip; the clock source provided to ISP-A and ISP-B is the same active crystal oscillator; the reset signals of ISP-A and ISP-B are generated by the same reset chip; sensor-A, sensor-B, ISP-A, ISP-B, and the AHD decoder are arranged on the binocular module board; the power-on timing of sensor-A is consistent with that of sensor-B; ISP-A performs image optimization on the fixed-frame-rate Raw-format image transmitted by sensor-A, converts it into a YUV-format image, performs AHD encoding, and outputs a fixed-frame-rate AHD signal to the AHD decoder; ISP-B performs image optimization on the fixed-frame-rate Raw-format image transmitted by sensor-B, converts it into a YUV-format image, performs AHD encoding, and outputs a fixed-frame-rate AHD signal to the AHD decoder; the AHD decoder receives and decodes the AHD signal input by ISP-A and the AHD signal input by ISP-B and outputs a single-channel MIPI signal; and the receiving end is provided with a chip;
the method by which the receiving end acquires the synchronized image comprises the following steps:
Step S1, powering on the ISP-A and the ISP-B of the binocular lens simultaneously;
Step S2, the ISP-A provides a clock source to the sensor-A at a first time point, and the ISP-B provides a clock source to the sensor-B at the same first time point;
Step S3, the ISP-A resets the sensor-A at a second time point, and the ISP-B resets the sensor-B at the same second time point;
Step S4, the ISP-A configures the sensor-A at a third time point, and the ISP-B configures the sensor-B at the same third time point;
Step S5, the ISP-A causes the sensor-A to start exposure at a fourth time point, and the ISP-B causes the sensor-B to start exposure at the same fourth time point;
Step S6, the sensor-A outputs Raw images to the ISP-A at a fixed frame rate, and the sensor-B outputs Raw images to the ISP-B at the same fixed frame rate;
Step S7, the ISP-A and the ISP-B simultaneously perform image optimization and AHD encoding;
Step S8, the ISP-A and the ISP-B simultaneously output AHD signals to the AHD decoder at the fixed frame rate;
Step S9, the AHD decoder receives and decodes the two AHD signals from the ISP-A and the ISP-B and outputs a single-channel MIPI signal;
Step S10, the chip of the receiving end receives the MIPI signal output by the AHD decoder, distinguishes the 2 AHD video streams by the virtual-channel identifiers in the MIPI data packets, and obtains 2 synchronized, stable video streams from the line/field synchronization information carried in the data packets for each of the 2 AHD video streams.
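The virtual-channel demultiplexing in Step S10 can be sketched as a minimal Python model. It assumes standard MIPI CSI-2 packet framing, where the first header byte is the Data Identifier (2-bit virtual channel in bits 7-6, 6-bit data type in bits 5-0) and frame boundaries arrive as Frame Start / Frame End short packets; the packet tuples and the `split_by_virtual_channel` helper are illustrative assumptions, not part of the patent's disclosure.

```python
# Hypothetical sketch of Step S10: splitting one MIPI CSI-2 link into
# per-virtual-channel video streams. DI byte: VC = bits 7-6, DT = bits 5-0.

FRAME_START = 0x00  # CSI-2 short-packet data type marking start of frame
FRAME_END = 0x01    # CSI-2 short-packet data type marking end of frame

def split_by_virtual_channel(packets):
    """Route CSI-2 packets into per-channel frame lists.

    `packets` is an iterable of (di_byte, payload) tuples; payload is
    b'' for short packets. Returns {vc: [frame_bytes, ...]}.
    """
    streams = {}   # vc -> list of completed frames
    current = {}   # vc -> bytearray of the frame being assembled
    for di, payload in packets:
        vc = (di >> 6) & 0x3   # virtual channel, bits 7-6
        dt = di & 0x3F         # data type, bits 5-0
        if dt == FRAME_START:
            current[vc] = bytearray()
        elif dt == FRAME_END:
            if vc in current:
                streams.setdefault(vc, []).append(bytes(current.pop(vc)))
        elif vc in current:    # long packet: one line of pixel data
            current[vc] += payload
    return streams

# Example: two interleaved single-line frames on VC0 and VC1
pkts = [
    (0x00, b""), (0x40, b""),           # frame start on VC0, then VC1
    (0x1E, b"left"), (0x5E, b"right"),  # YUV422 8-bit line data (DT 0x1E)
    (0x01, b""), (0x41, b""),           # frame end on VC0, then VC1
]
out = split_by_virtual_channel(pkts)
print(out)  # {0: [b'left'], 1: [b'right']}
```

Because the two AHD streams are frame-locked by the preceding steps, frames popped from the two channels in arrival order form synchronized left/right pairs.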
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410269530.XA CN117880623B (en) | 2024-03-11 | 2024-03-11 | Synchronization method of binocular lens and method for receiving end to acquire synchronous image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117880623A CN117880623A (en) | 2024-04-12 |
CN117880623B true CN117880623B (en) | 2024-05-28 |
Family
ID=90583326
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410269530.XA Active CN117880623B (en) | 2024-03-11 | 2024-03-11 | Synchronization method of binocular lens and method for receiving end to acquire synchronous image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117880623B (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2000951A1 (en) * | 2007-06-04 | 2008-12-10 | Hand Held Products, Inc. | Indicia reading terminal processing plurality of frames of image data responsively to trigger signal activation |
CN104717480A (en) * | 2014-01-28 | 2015-06-17 | 杭州海康威视数字技术股份有限公司 | Binocular camera pixel-level synchronous image acquisition device and method thereof |
CN111064890A (en) * | 2019-12-25 | 2020-04-24 | 安凯(广州)微电子技术有限公司 | Multi-view circuit equipment and multi-view circuit control method |
CN111107247A (en) * | 2020-02-26 | 2020-05-05 | 上海富瀚微电子股份有限公司 | Exposure method, image system and method for cooperative work of image system |
CN113132552A (en) * | 2019-12-31 | 2021-07-16 | 成都鼎桥通信技术有限公司 | Video stream processing method and device |
CN113329174A (en) * | 2021-05-21 | 2021-08-31 | 浙江大华技术股份有限公司 | Control method, device and system of multi-view camera and electronic device |
CN113497883A (en) * | 2020-04-01 | 2021-10-12 | 纳恩博(北京)科技有限公司 | Image processing method and system, camera module and image acquisition system |
CN115484409A (en) * | 2022-09-09 | 2022-12-16 | 成都微光集电科技有限公司 | Multi-image sensor cooperative working method and system |
CN115914604A (en) * | 2022-12-09 | 2023-04-04 | 厦门瑞为信息技术有限公司 | Image processing method, device, equipment and medium based on binocular camera |
TWI804368B (en) * | 2022-06-28 | 2023-06-01 | 躍訊實業有限公司 | Image converter and method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102679016B1 (en) * | 2019-01-23 | 2024-06-28 | 에스케이하이닉스 주식회사 | Image sensor chip, electronic device, and method for operating the image sensor chip |
- 2024-03-11: CN application CN202410269530.XA filed; granted as patent CN117880623B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN117880623A (en) | 2024-04-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9654672B1 (en) | Synchronized capture of image and non-image sensor data | |
JP6465946B2 (en) | Distributed video display system, control device, and control method | |
US8205017B2 (en) | Systems and methods for addressing and synchronizing multiple devices | |
US9813783B2 (en) | Multi-camera dataset assembly and management with high precision timestamp requirements | |
US9253389B2 (en) | Image pickup apparatus, image pickup system, image pickup method and computer readable recording medium implementing synchronization for image pickup operations | |
US9154696B2 (en) | Imaging device for synchronized imaging | |
CN102186065B (en) | Monitoring camera with 360-degree field angle | |
JP2015099346A (en) | Multi-screen display system, and image signal correcting method for the same | |
CN107948463B (en) | A camera synchronization method, device and system | |
CN108432228A (en) | Frame synchornization method, image signal processing apparatus and the terminal of image data | |
CN117880623B (en) | Synchronization method of binocular lens and method for receiving end to acquire synchronous image | |
US20220094513A1 (en) | Communication apparatus, communications system, and communication method | |
CN221900961U (en) | Device for receiving end to obtain synchronous image | |
CN221900960U (en) | Binocular synchronized camera for easy use on the receiving end | |
US7017065B2 (en) | System and method for processing information, and recording medium | |
CN111147689B (en) | Method for generating a trigger signal for controlling a multimedia interface | |
CN107690053A (en) | A kind of method and system of the time shaft for determining video flowing | |
WO2024060763A1 (en) | Wireless smart wearable device and image acquisition method thereof | |
CN118796730A (en) | Chip, chip system and time stamp synchronization method | |
Sousa et al. | NanEye-An Endoscopy Sensor With 3-D Image Synchronization | |
CN210518588U (en) | Expansion module control circuit and projection device | |
CN102487444A (en) | Stereo imaging system using complementary metal oxide semiconductor (CMOS) image sensor | |
JP2004040185A (en) | Signal processor, signal processing system, signal processing method, storage medium, and program | |
CN207218701U (en) | Smartphone Image Receiving System for Visible Light Communication | |
TWI748892B (en) | Clock synchronization system and method for operating a clock synchronization system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||