This application is a divisional application of the application with application number 201810402999.0, filed on April 28, 2018, and entitled "Data Processing Method and Apparatus, Electronic Device, and Computer-Readable Storage Medium".
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like used herein may be used to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
Fig. 1 is a diagram illustrating an application scenario of a data processing method according to an embodiment. As shown in Fig. 1, the electronic device includes a laser camera 102, a floodlight 104, a laser lamp 106, a first processing unit 110, a second processing unit 120, and a controller 130. The first processing unit 110 may be an MCU (Microcontroller Unit) module or the like, and the second processing unit 120 may be a CPU (Central Processing Unit) module or the like. The first processing unit 110 may be connected to the laser camera 102 and the second processing unit 120, and the first processing unit 110 may be connected to the controller 130 through an I2C bus. The first processing unit 110 may include a PWM (Pulse Width Modulation) module 112 and is connected to the controller 130 through the PWM module 112, and the controller 130 may be connected to the floodlight 104 and the laser lamp 106, respectively.
When the first processing unit 110 receives an image acquisition instruction sent by the second processing unit 120, it sends a control instruction to the controller 130 through the I2C bus, and the control instruction can be used to control at least one of the floodlight 104 and the laser lamp 106 to be turned on. The first processing unit 110 may send pulses to the controller 130 through the PWM module 112 to light whichever of the floodlight 104 and the laser lamp 106 has been turned on, and may collect a target image through the laser camera 102. The first processing unit 110 may process the target image and send the processed target image to the second processing unit 120.
Fig. 2 is an application scenario diagram of a data processing method in another embodiment. As shown in fig. 2, the electronic device 200 may include a camera module 210, a second processing unit 220, and a first processing unit 230. The second processing unit 220 may be a CPU module. The first processing unit 230 may be an MCU module. The first processing unit 230 is connected between the second processing unit 220 and the camera module 210; the first processing unit 230 can control the laser camera 212, the floodlight 214, and the laser light 218 in the camera module 210, and the second processing unit 220 can control the RGB camera 216 in the camera module 210.
The camera module 210 includes a laser camera 212, a floodlight 214, an RGB camera 216, and a laser lamp 218. The laser camera 212 may be an infrared camera for acquiring infrared images. The floodlight 214 is a surface light source capable of emitting infrared light, and the laser lamp 218 is a point light source that emits patterned laser light. When the floodlight 214 emits light, the laser camera 212 can obtain an infrared image from the reflected light. When the laser lamp 218 emits light, the laser camera 212 can obtain a speckle image from the reflected light. The speckle image is an image in which the pattern emitted by the laser lamp 218 is deformed after being reflected.
The second processing unit 220 may include a CPU core operating in a TEE (Trusted Execution Environment) and a CPU core operating in a REE (Rich Execution Environment). The TEE and the REE are both operating modes of an ARM (Advanced RISC Machines) module. The TEE has a higher security level, and only one CPU core in the second processing unit 220 can operate in the TEE at a time. Generally, operations with a higher security requirement in the electronic device 200 need to be executed on the CPU core in the TEE, while operations with a lower security requirement can be executed on the CPU core in the REE.
The first processing unit 230 includes a PWM module 232, an SPI/I2C (Serial Peripheral Interface/Inter-Integrated Circuit) Interface 234, a RAM (Random Access Memory) module 236, and a depth engine 238. The first processing unit 230 may be connected via the PWM module 232 to a controller for the floodlight 214 and the laser light 218, which controller may be connected to the floodlight 214 and the laser light 218, respectively, for controlling the floodlight 214 and the laser light 218. The first processing unit 230 is also connected to the controller via an I2C bus, and the floodlight 214 or the laser light 218 can be controlled to be turned on via the connected I2C bus, and the PWM module 232 can transmit a pulse to the camera module to light the turned-on floodlight 214 or laser light 218. The first processing unit 230 may collect an infrared image or a speckle image through the laser camera 212. The SPI/I2C interface 234 is used for receiving the image capturing instruction sent by the second processing unit 220. The depth engine 238 may process the speckle images to obtain a depth disparity map.
When the second processing unit 220 receives a data acquisition request from an application program, for example when the application program needs to perform face unlocking or face payment, it may send an image acquisition instruction to the first processing unit 230 through the CPU core operating in the TEE. After the first processing unit 230 receives the image acquisition instruction, it can send a control instruction to the controller through the I2C bus to turn on the floodlight 214 in the camera module 210, send a pulse wave to the controller through the PWM module 232 to light the floodlight 214, and control the laser camera 212 through the I2C bus to collect an infrared image; it can likewise send a control instruction to the controller through the I2C bus to turn on the laser lamp 218 in the camera module 210, send a pulse wave to the controller through the PWM module 232 to light the laser lamp 218, and control the laser camera 212 through the I2C bus to collect a speckle image. The camera module 210 may send the collected infrared image and speckle image to the first processing unit 230. The first processing unit 230 may process the received infrared image to obtain an infrared disparity map, and process the received speckle image to obtain a speckle disparity map or a depth disparity map. The processing of the infrared image and the speckle image by the first processing unit 230 refers to correcting the infrared image or the speckle image to remove the influence of internal and external parameters of the camera module 210 on the images. The first processing unit 230 can be set to different modes, and different modes output different images. When the first processing unit 230 is set to a speckle map mode, it processes the speckle image to obtain a speckle disparity map, from which a target speckle image can be obtained; when the first processing unit 230 is set to a depth map mode, it processes the speckle image to obtain a depth disparity map, from which a depth image can be obtained, the depth image being an image with depth information. The first processing unit 230 may send the infrared disparity map and the speckle disparity map to the second processing unit 220, or may send the infrared disparity map and the depth disparity map to the second processing unit 220. The second processing unit 220 may obtain a target infrared image according to the infrared disparity map and obtain a depth image according to the depth disparity map. Further, the second processing unit 220 may perform face recognition, face matching, living body detection, and depth information acquisition on the detected face according to the target infrared image and the depth image.
The first processing unit 230 and the second processing unit 220 communicate through fixed security interfaces to ensure the security of the transmitted data. As shown in fig. 2, the data sent by the second processing unit 220 to the first processing unit 230 passes through a SECURE SPI/I2C 240, and the data sent by the first processing unit 230 to the second processing unit 220 passes through a SECURE MIPI (Mobile Industry Processor Interface) 250.
In an embodiment, the first processing unit 230 may also obtain a target infrared image according to the infrared disparity map, calculate and obtain a depth image according to the depth disparity map, and then send the target infrared image and the depth image to the second processing unit 220.
FIG. 3 is a block diagram of an electronic device in one embodiment. As shown in FIG. 3, the electronic device includes a processor, a memory, a display screen, and an input device connected through a system bus. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium of the electronic device stores an operating system and a computer program, and the computer program is executed by the processor to implement a data processing method provided in the embodiments of the present application. The processor is used to provide computing and control capabilities and to support the operation of the entire electronic device. The internal memory of the electronic device provides an environment for running the computer program stored in the non-volatile storage medium. The display screen of the electronic device may be a liquid crystal display screen or an electronic ink display screen, and the input device may be a touch layer covering the display screen, a key, a trackball or a touchpad arranged on a housing of the electronic device, or an external keyboard, touchpad or mouse. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like. Those skilled in the art will appreciate that the structure shown in FIG. 3 is a block diagram of only a part of the structure related to the present application and does not constitute a limitation on the electronic device to which the present application is applied; a particular electronic device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
As shown in fig. 4, in one embodiment, there is provided a data processing method comprising the steps of:
step 410, when the first processing unit receives the image acquisition instruction sent by the second processing unit, determining the type of the acquired image according to the image acquisition instruction.
Step 420, sending a control instruction corresponding to the image type to the controller through a bidirectional two-wire synchronous serial I2C bus, wherein the control instruction is used to control at least one of the floodlight and the laser lamp to be turned on.
When an application program in the electronic device needs to acquire face data, the application program may send a data acquisition request to the second processing unit, where the face data may include, but is not limited to, data that needs to be subjected to face verification in scenes such as face unlocking and face payment, face depth information, and the like. After receiving the data acquisition request, the second processing unit may send an image acquisition instruction to the first processing unit, where the first processing unit may be an MCU module, and the second processing unit may be a CPU module.
The electronic device may include a controller, the controller may be connected to the floodlight and the laser lamp respectively, and the floodlight and the laser lamp may be controlled by the same controller. The controller's control over the floodlight and the laser lamp may include turning on the floodlight or the laser lamp, switching between the floodlight and the laser lamp, controlling the emission power of the floodlight and the laser lamp, and the like. The first processing unit may be connected to the controller through the I2C bus, and the I2C bus implements data transmission between the devices connected to it through a data line and a clock line. When the first processing unit receives the image acquisition instruction sent by the second processing unit, it can send a control instruction to the controller through the I2C bus, and after receiving the control instruction, the controller can turn on at least one of the floodlight and the laser lamp according to the control instruction.
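The application does not specify bus numbers, device addresses, or command formats. Purely as an illustration, a control instruction like the one described above could be written to the controller through a Linux i2c-dev interface roughly as in the following sketch; the bus path, the 0x40 slave address, and the command byte values are assumed values, not details defined by this application.

```c
/* Minimal sketch: sending a control instruction to the light controller
 * over an I2C bus via the Linux i2c-dev interface. The bus path, slave
 * address, and command bytes below are illustrative assumptions. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

#define LIGHT_CTRL_ADDR 0x40  /* assumed 7-bit address of the controller   */
#define CMD_OPEN_FLOOD  0x01  /* assumed command: turn on the floodlight   */
#define CMD_OPEN_LASER  0x02  /* assumed command: turn on the laser lamp   */

static int send_control_instruction(const char *bus, unsigned char cmd)
{
    int fd = open(bus, O_RDWR);
    if (fd < 0) {
        perror("open i2c bus");
        return -1;
    }
    /* Select the controller as the target slave on this bus. */
    if (ioctl(fd, I2C_SLAVE, LIGHT_CTRL_ADDR) < 0) {
        perror("select controller");
        close(fd);
        return -1;
    }
    /* Write the one-byte control instruction. */
    if (write(fd, &cmd, 1) != 1) {
        perror("write control instruction");
        close(fd);
        return -1;
    }
    close(fd);
    return 0;
}
```

A call such as send_control_instruction("/dev/i2c-1", CMD_OPEN_FLOOD) would then correspond to the first processing unit asking the controller to turn on the floodlight.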
Step 430, sending pulses to the controller through the pulse width modulation (PWM) module, lighting at least one of the turned-on floodlight and laser lamp, and collecting a target image through the laser camera.
The first processing unit can be connected to the controller through the PWM module. When at least one of the floodlight and the laser lamp is controlled to be turned on, the first processing unit can send pulses to the controller through the PWM module to light whichever of the floodlight and the laser lamp has been turned on. Optionally, the PWM module may light the turned-on floodlight or laser lamp by continuously sending a pulse signal to the controller at a certain voltage amplitude and a certain time interval.
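The application does not tie the PWM module to any particular software interface. Only as an illustration, a continuous pulse train such as the one described above could be produced through a Linux sysfs PWM channel as sketched below; the chip and channel numbers, the 1 ms period, and the 0.5 ms duty cycle are assumed example values, and on an actual MCU the PWM peripheral registers would typically be programmed directly.

```c
/* Illustrative sketch: driving the controller's trigger line with a PWM
 * pulse train through the Linux sysfs PWM interface. Chip/channel numbers,
 * period, and duty cycle are assumed example values. */
#include <stdio.h>

static int write_sysfs(const char *path, const char *value)
{
    FILE *f = fopen(path, "w");
    if (!f)
        return -1;
    fputs(value, f);
    return fclose(f);
}

static int start_pulse_train(void)
{
    /* Export channel 0 of pwmchip0 (failure is ignored if already exported). */
    write_sysfs("/sys/class/pwm/pwmchip0/export", "0");
    /* 1 ms period, 0.5 ms high time: a pulse signal sent continuously at a
     * fixed amplitude and time interval, as described above. */
    if (write_sysfs("/sys/class/pwm/pwmchip0/pwm0/period", "1000000") < 0)
        return -1;
    if (write_sysfs("/sys/class/pwm/pwmchip0/pwm0/duty_cycle", "500000") < 0)
        return -1;
    return write_sysfs("/sys/class/pwm/pwmchip0/pwm0/enable", "1");
}
```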
The first processing unit may collect a target image through the laser camera, and the target image may include an infrared image, a speckle image, and the like. If the floodlight is turned on, the PWM module can send pulses to the controller to light the floodlight; the floodlight can be a point light source that irradiates uniformly in all directions, and when it is lit it emits infrared light, so that the laser camera can collect an infrared image of a face. If the laser lamp is turned on, the PWM module can send pulses to the controller to light the laser lamp. When the laser lamp is lit, the emitted laser light is diffracted by a lens and a DOE (Diffractive Optical Element) to generate a pattern with speckle particles, which is projected onto a target object; because the distances between the points of the target object and the electronic device differ, the speckle pattern is shifted, and the laser camera captures the target object to obtain a speckle image.
Step 440, processing the target image by the first processing unit, and sending the processed target image to the second processing unit.
The laser camera can send the collected target image to the first processing unit, and the first processing unit can process the target image, where the target image can include an infrared image, a speckle image, and the like. After the first processing unit determines the image type according to the image acquisition instruction, it can collect a target image corresponding to the determined image type and perform corresponding processing on the target image. When the image type is an infrared image, the first processing unit can send pulses to the first controller through the first PWM module to light the floodlight, collect an infrared image through the laser camera, and process the infrared image to obtain an infrared disparity map. When the image type is a speckle image, the first processing unit can send pulses to the second controller through the second PWM module to light the laser lamp, collect a speckle image through the laser camera, and process the speckle image to obtain a speckle disparity map. When the image type is a depth image, the first processing unit can collect a speckle image and process the collected speckle image to obtain a depth disparity map.
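The mode-dependent behaviour described above amounts to a dispatch on the requested image type. The sketch below only illustrates that branching; the enum names and helper functions are hypothetical placeholders for the steps described in the text, not an API defined by this application.

```c
/* Hypothetical dispatch on the image type carried by the image acquisition
 * instruction; the helpers are placeholders for the steps in the text. */
typedef enum {
    IMAGE_TYPE_INFRARED,  /* first type: floodlight + infrared image       */
    IMAGE_TYPE_SPECKLE,   /* second type: laser lamp + speckle image       */
    IMAGE_TYPE_DEPTH      /* second type: speckle image -> depth disparity */
} image_type_t;

/* Placeholder operations assumed to be implemented elsewhere. */
void turn_on_floodlight(void);
void turn_on_laser_lamp(void);
void pulse_controller(void);
void capture_infrared_image(void);
void capture_speckle_image(void);
void build_infrared_disparity_map(void);
void build_speckle_disparity_map(void);
void build_depth_disparity_map(void);

void handle_acquisition(image_type_t type)
{
    switch (type) {
    case IMAGE_TYPE_INFRARED:
        turn_on_floodlight();            /* control instruction over I2C     */
        pulse_controller();              /* PWM pulses light the floodlight  */
        capture_infrared_image();        /* laser camera, controlled via I2C */
        build_infrared_disparity_map();
        break;
    case IMAGE_TYPE_SPECKLE:
        turn_on_laser_lamp();
        pulse_controller();
        capture_speckle_image();
        build_speckle_disparity_map();
        break;
    case IMAGE_TYPE_DEPTH:
        turn_on_laser_lamp();
        pulse_controller();
        capture_speckle_image();
        build_depth_disparity_map();
        break;
    }
}
```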
In one embodiment, the first processing unit may perform correction processing on the target image. The correction processing corrects the image content offset of the target image caused by internal and external parameters of the laser camera and the RGB camera, for example, an offset caused by the deflection angle of the laser camera or by the relative placement of the laser camera and the RGB camera. After the target image is corrected, a disparity map of the target image can be obtained; for example, correcting the infrared image yields an infrared disparity map, and correcting the speckle image yields a speckle disparity map or a depth disparity map. Performing correction processing on the target image can prevent ghosting in the image finally presented on the screen of the electronic device.
The first processing unit processes the target image, and can send the processed target image to the second processing unit. The second processing unit can obtain required images such as infrared images, speckle images, depth images and the like according to the processed target images. The second processing unit can process the required image according to the requirement of the application program.
For example, when the application program needs to perform face verification, the second processing unit may perform face detection on the obtained required image, where the face detection may include face recognition, face matching, and living body detection. Face recognition refers to recognizing whether a face exists in the required image, face matching refers to matching the face in the required image against a pre-stored face, and living body detection refers to detecting whether the face in the required image has biological activity. If the application program needs to acquire depth information of the face, the generated depth image can be uploaded to the application program, and the application program can perform beautification processing, three-dimensional modeling, and the like according to the received depth image.
In this embodiment, when the first processing unit receives the image acquisition instruction sent by the second processing unit, it sends a control instruction to the controller through the I2C bus to control at least one of the floodlight and the laser lamp to be turned on, sends pulses to the controller through the PWM module to light whichever of the floodlight and the laser lamp has been turned on, and processes the target image after the target image is collected. Control of both the floodlight and the laser lamp can thus be realized through one controller, which reduces the complexity of controlling the floodlight and the laser lamp and saves cost.
In one embodiment, the step of collecting a target image through the laser camera includes: controlling, through the I2C bus, the laser camera to collect the target image.
The first processing unit can be connected to the laser camera through the I2C bus and controls the laser camera through the connected I2C bus to collect a target image. In one embodiment, the first processing unit, the laser camera, and the controller may be connected to the same I2C bus. After the first processing unit receives the image acquisition instruction sent by the second processing unit, it can control the floodlight or the laser lamp to be turned on through the I2C bus, send pulses to the controller through the PWM module to light the turned-on floodlight or laser lamp, and control the laser camera through the connected I2C bus to collect a target image such as an infrared image or a speckle image.
In one embodiment, the first processing unit can address the controller through the connected I2C bus to send a control instruction to the controller and control the floodlight or the laser lamp to be turned on, and can address the laser camera through the same connected I2C bus to control the laser camera to collect a target image, so that the same I2C bus is multiplexed at different times and resources are saved.
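As a sketch of the time multiplexing described above, the same bus file descriptor can simply be re-addressed between the two devices. The Linux i2c-dev interface is used as in the earlier example; the slave addresses and command bytes are assumptions for illustration only.

```c
/* Sketch of multiplexing one I2C bus between the light controller and the
 * laser camera by re-selecting the slave address; addresses and command
 * bytes are illustrative assumptions. */
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

#define LIGHT_CTRL_ADDR 0x40 /* assumed controller address   */
#define LASER_CAM_ADDR  0x36 /* assumed laser camera address */

int acquire_on_shared_bus(const char *bus)
{
    unsigned char open_flood = 0x01; /* assumed "turn on floodlight" command */
    unsigned char trigger    = 0x10; /* assumed "start capture" command      */
    int fd = open(bus, O_RDWR);
    if (fd < 0)
        return -1;

    /* First address the controller and turn on the floodlight. */
    if (ioctl(fd, I2C_SLAVE, LIGHT_CTRL_ADDR) < 0 ||
        write(fd, &open_flood, 1) != 1)
        goto fail;

    /* Then re-address the laser camera on the same bus and trigger capture. */
    if (ioctl(fd, I2C_SLAVE, LASER_CAM_ADDR) < 0 ||
        write(fd, &trigger, 1) != 1)
        goto fail;

    close(fd);
    return 0;
fail:
    close(fd);
    return -1;
}
```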
FIG. 5 is a schematic diagram of the first processing unit, the laser camera, and the controller connected to the same I2C bus in one embodiment. As shown in fig. 5, the electronic device includes a laser camera 102, a floodlight 104, a laser lamp 106, a first processing unit 110, a second processing unit 120, and a controller 130. The first processing unit 110 may be connected with the second processing unit 120. The first processing unit 110 may include a PWM module 112 and is connected to the controller 130 through the PWM module 112, and the controller 130 may be connected to the floodlight 104 and the laser light 106 respectively. The laser camera 102, the first processing unit 110, and the controller 130 may be connected to the same I2C bus.
After receiving the image acquisition command sent by the second processing unit 120, the first processing unit 110 can send a control command to the controller 130 through the I2C bus to control the floodlight 104 or the laser light 106 to be turned on, send a pulse to the controller 130 through the PWM module 112 to light the turned-on floodlight 104 or the laser light 106, and control the laser camera 102 to acquire target images such as infrared images or speckle images through the connected I2C bus.
In this embodiment, the floodlight, the laser light and the laser camera are controlled by the same I2C bus, and the I2C bus is multiplexed, so that the complexity of the control circuit can be reduced, and the cost can be reduced.
As shown in FIG. 6, in one embodiment, the step of sending a control instruction corresponding to the image type to the controller via the I2C bus comprises the steps of:
step 602, if the image type is the first type, the first processing unit sends a first control instruction to the controller through the I2C bus, where the first control instruction is used to instruct the controller to turn on the floodlight.
The first processing unit receives the image acquisition instruction sent by the second processing unit and can determine the type of image to be collected according to the image acquisition instruction, where the image type can be one or more of an infrared image, a speckle image, a depth image, and the like. The image type can be determined according to the face data required by the application program: after the second processing unit receives the data acquisition request, it can determine the image type according to the data acquisition request and send an image acquisition instruction containing the image type to the first processing unit. For example, if face unlocking requires an infrared image and a speckle image, the image types may be determined as the infrared image and the speckle image; if face depth information is required, the image type may be determined as the depth image; the present application is not limited thereto.
If the image type is the first type, which in this embodiment may be an infrared image, the first processing unit may send a first control instruction to the controller through the connected I2C bus, and the controller may switch to the floodlight according to the first control instruction and turn on the floodlight. The first processing unit may then send pulses to the controller through the PWM module to light the floodlight. Optionally, the first processing unit may address the controller through the I2C bus and send the first control instruction to the controller.
And step 604, if the image type is the second type, the first processing unit sends a second control instruction to the controller through the I2C bus, and the second control instruction is used for instructing the controller to turn on the laser lamp.
If the image type is the second type, which in this embodiment may be a speckle image, a depth image, or the like, the first processing unit may send a second control instruction to the controller through the connected I2C bus, and the controller may switch to the laser lamp according to the second control instruction and turn on the laser lamp. The first processing unit can then send pulses to the controller through the PWM module to light the laser lamp.
The image types that the first processing unit determines according to the image acquisition instruction can include at least two types; for example, the image types can include the first type and the second type at the same time. When the image types include an infrared image and a speckle image, or include an infrared image and a depth image, both the infrared image and the speckle image need to be acquired. The first processing unit can collect the infrared image first or the speckle image first; the collection order is not limited. The first processing unit can send a first control instruction to the controller through the I2C bus to turn on the floodlight, send pulses to the controller through the PWM module to light the floodlight, and then control the laser camera through the I2C bus to collect the infrared image. After the first processing unit controls the laser camera to collect the target image corresponding to the first type, it can send a second control instruction to the controller through the I2C bus to turn on the laser lamp, send pulses to the controller through the PWM module to light the laser lamp, and then control the laser camera through the I2C bus to collect the speckle image.
In one embodiment, when the image types include both the first type and the second type, the first processing unit may instead first send the second control instruction to the controller through the I2C bus to turn on the laser lamp, send pulses to the controller through the PWM module to light the laser lamp, and control the laser camera through the I2C bus to collect the speckle image. After the first processing unit controls the laser camera to collect the target image corresponding to the second type, it can send the first control instruction to the controller through the I2C bus to turn on the floodlight, send pulses to the controller through the PWM module to light the floodlight, and then control the laser camera through the I2C bus to collect the infrared image.
Optionally, the first processing unit may send the first control instruction and the second control instruction to the controller at different times, and the time interval between sending the first control instruction and sending the second control instruction may be smaller than a time threshold. The laser camera can thus collect the speckle image within the time threshold after collecting the infrared image, so that the image contents of the collected infrared image and speckle image remain relatively consistent, which facilitates subsequent processing such as face detection. The time threshold may be set according to actual requirements, for example, 20 milliseconds, 30 milliseconds, and the like. Keeping the image content of the collected infrared image consistent with that of the speckle image can improve the accuracy of subsequent face detection.
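When both types are requested, the ordering and the time-interval constraint described above could be expressed roughly as in the following sketch; the 30 ms threshold and the two helper functions are illustrative assumptions standing in for the I2C/PWM/capture sequences already described.

```c
/* Sketch of acquiring the infrared image and then the speckle image while
 * keeping the interval between the two control instructions below a
 * threshold. The 30 ms value and the helpers are illustrative assumptions. */
#include <time.h>

#define INTERVAL_THRESHOLD_MS 30

void acquire_infrared(void);  /* floodlight on, PWM pulses, IR capture      */
void acquire_speckle(void);   /* laser lamp on, PWM pulses, speckle capture */

static long elapsed_ms(const struct timespec *a, const struct timespec *b)
{
    return (b->tv_sec - a->tv_sec) * 1000L +
           (b->tv_nsec - a->tv_nsec) / 1000000L;
}

int acquire_both_types(void)
{
    struct timespec t0, t1;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    acquire_infrared();                    /* first control instruction  */

    clock_gettime(CLOCK_MONOTONIC, &t1);
    if (elapsed_ms(&t0, &t1) >= INTERVAL_THRESHOLD_MS)
        return -1;  /* image contents of the two frames may no longer match */

    acquire_speckle();                     /* second control instruction */
    return 0;
}
```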
In this embodiment, the floodlight and the laser light can be switched and controlled by one controller, so that the complexity of the control circuit can be reduced, and the cost can be reduced.
As shown in FIG. 7, in one embodiment, step 440 of processing the target image by the first processing unit and sending the processed target image to the second processing unit includes the following steps:
step 702, acquiring a stored reference speckle image, wherein the reference speckle image has reference depth information.
In the camera coordinate system, the straight line perpendicular to the imaging plane and passing through the center of the lens is taken as the Z axis; if the coordinates of an object in the camera coordinate system are (X, Y, Z), the Z value is the depth information of the object in the camera's imaging plane. If the application program needs to acquire depth information of a face, a depth image containing the face depth information needs to be collected. The first processing unit can control the laser lamp to be turned on through the I2C bus and control the laser camera through the I2C bus to collect a speckle image. A reference speckle image can be stored in the first processing unit in advance, the reference speckle image can carry reference depth information, and the depth information of each pixel point contained in the collected speckle image can be acquired according to the collected speckle image and the reference speckle image.
And step 704, matching the reference speckle image with the speckle image to obtain a matching result.
The first processing unit may take each pixel point contained in the collected speckle image in turn as a center, select a pixel block of a preset size, for example 31 pixels by 31 pixels, and search the reference speckle image for a block that matches the selected pixel block. From the pixel block selected in the collected speckle image and the matched block in the reference speckle image, the first processing unit can find, in the speckle image and the reference speckle image respectively, two points that lie on the same laser light path; the speckle information of two points on the same laser light path is consistent, and such two points can be identified as corresponding pixel points. The depth information of the points on each laser light path in the reference speckle image is known. The first processing unit can calculate the offset between the two corresponding pixel points of the collected speckle image and the reference speckle image on the same laser light path, and calculate the depth information of each pixel point contained in the collected speckle image according to the offset.
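A minimal, unoptimised sketch of the block matching described above is given below. The sum of absolute differences (SAD) as the similarity measure, the ±64 pixel horizontal search range, and the 8-bit row-major image layout are assumptions for illustration; the application itself only specifies the 31 × 31 block size.

```c
/* Unoptimised sketch: match a 31x31 block around (cx, cy) in the collected
 * speckle image against the reference speckle image along the same row and
 * return the horizontal offset (in pixels) of the best match. SAD and the
 * +/-64 pixel search range are illustrative assumptions. */
#include <limits.h>
#include <stdint.h>
#include <stdlib.h>

#define BLOCK_RADIUS 15   /* 31 x 31 block                      */
#define SEARCH_RANGE 64   /* assumed maximum horizontal offset  */

/* Both images are 8-bit grayscale, row-major, width x height pixels. */
int best_offset(const uint8_t *speckle, const uint8_t *reference,
                int width, int height, int cx, int cy)
{
    long best_cost = LONG_MAX;
    int best = 0;

    for (int d = -SEARCH_RANGE; d <= SEARCH_RANGE; d++) {
        long cost = 0;
        for (int dy = -BLOCK_RADIUS; dy <= BLOCK_RADIUS; dy++) {
            for (int dx = -BLOCK_RADIUS; dx <= BLOCK_RADIUS; dx++) {
                int x = cx + dx, y = cy + dy;
                int rx = x + d;
                if (x < 0 || x >= width || y < 0 || y >= height ||
                    rx < 0 || rx >= width)
                    cost += 255;  /* penalise out-of-bounds pixels */
                else
                    cost += labs((long)speckle[y * width + x] -
                                 (long)reference[y * width + rx]);
            }
        }
        if (cost < best_cost) {
            best_cost = cost;
            best = d;
        }
    }
    /* Pixel offset; multiply by the physical pixel pitch to obtain P below. */
    return best;
}
```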
In one embodiment, the first processing unit calculates the offset between the collected speckle image and the reference speckle image, and calculates the depth information of each pixel point contained in the speckle image according to the offset, where the calculation may be as shown in formula (1):

Z_D = (L × f × Z_0) / (L × f + Z_0 × P)    (1)

where Z_D represents the depth information of a pixel point, namely the depth value of the pixel point; L is the distance between the laser camera and the laser; f is the focal length of the lens in the laser camera; Z_0 is the depth value of the reference plane relative to the laser camera of the electronic device when the reference speckle image is collected; and P is the offset between corresponding pixel points in the collected speckle image and the reference speckle image. P can be obtained by multiplying the pixel offset between the collected speckle image and the reference speckle image by the actual distance represented by one pixel. When the distance between the target object and the laser camera is greater than the distance between the reference plane and the laser camera, P is a negative value; when the distance between the target object and the laser camera is smaller than the distance between the reference plane and the laser camera, P is a positive value.
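Based on the definitions above, formula (1) can be evaluated per pixel as in the short sketch below; the function name and the assumption that L, f, Z_0, and P are expressed in consistent physical units are illustrative, not part of the application.

```c
/* Depth of a pixel from its matched offset, per formula (1).
 * L  : distance between the laser camera and the laser
 * f  : focal length of the laser camera lens
 * z0 : depth of the reference plane when the reference image was collected
 * p  : signed offset between corresponding points (pixel offset multiplied
 *      by the physical size of one pixel), as defined above.
 * Consistent units for all parameters are an assumption of this sketch. */
static double pixel_depth(double L, double f, double z0, double p)
{
    return (L * f * z0) / (L * f + z0 * p);
}
```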
And step 706, generating a depth disparity map according to the reference depth information and the matching result, sending the depth disparity map to the second processing unit, and processing the depth disparity map through the second processing unit to obtain the depth map.
After obtaining the depth information of each pixel point contained in the collected speckle image, the first processing unit can correct the collected speckle image, that is, correct the image content offset of the collected speckle image caused by the internal and external parameters of the laser camera and the RGB camera. The first processing unit can generate a depth disparity map according to the corrected speckle image and the depth values of the pixel points in the speckle image, and send the depth disparity map to the second processing unit. The second processing unit may obtain a depth map according to the depth disparity map, and the depth map may include the depth information of each pixel point. The second processing unit can upload the depth map to the application program, and the application program can perform beautification, three-dimensional modeling, and the like according to the depth information of the face in the depth map. The second processing unit can also perform living body detection according to the depth information of the face in the depth map, which can prevent the collected face from being, for example, a two-dimensional planar face.
Optionally, the second processing unit in the electronic device may include two operation modes, where the first operation mode may be the TEE, which is a trusted execution environment with a high security level, and the second operation mode may be the REE, which is a rich execution environment with a low security level. After receiving a data acquisition request sent by an application program, the second processing unit can send an image acquisition instruction to the first processing unit in the first operation mode. When the second processing unit is a single-core CPU, the single core can be switched directly from the second operation mode to the first operation mode; when the second processing unit has multiple cores, one core can be switched from the second operation mode to the first operation mode while the other cores continue to operate in the second operation mode, and the image acquisition instruction is sent to the first processing unit by the core operating in the first operation mode.
After the first processing unit processes the collected target image, it can send the processed target image to the core operating in the first operation mode, which ensures that the first processing unit always operates in a trusted execution environment and improves security. The second processing unit may, in the core operating in the first operation mode, obtain the required image according to the processed target image and process the required image according to the requirements of the application program. For example, the second processing unit may perform face detection on the required image in the core operating in the first operation mode. Sending the image acquisition instruction to the first processing unit through the high-security core of the second processing unit ensures that the first processing unit operates in a high-security environment and improves data security.
In one embodiment, since the core operating in the first operation mode is unique, the second processing unit performs face detection on the target image in the TEE, and can perform face recognition, face matching, living body detection, and the like on the target image one by one in a serial manner. The second processing unit may first perform face recognition on the required image; when a face is recognized, it matches the face contained in the required image against a pre-stored face to determine whether they are the same face. If they are the same face, living body detection is then performed on the face according to the required image to prevent the collected face from being, for example, a two-dimensional planar face. When no face is recognized, face matching and living body detection need not be performed, which can reduce the processing pressure on the second processing unit.
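As a sketch only, the serial flow described above could be organised as below; the image type and the helper functions are hypothetical placeholders for the recognition, matching, and living body detection steps, not an API provided by this application.

```c
/* Sketch of the serial face-processing flow in the TEE core: recognition,
 * then matching, then living body detection, with early exit when a stage
 * fails. The image type and helpers are hypothetical placeholders. */
typedef struct image image_t;

int detect_face(const image_t *infrared);                 /* face present?   */
int match_face(const image_t *infrared);                  /* same as stored? */
int is_live_face(const image_t *infrared, const image_t *depth);

int verify_face(const image_t *infrared, const image_t *depth)
{
    if (!detect_face(infrared))
        return 0;   /* no face recognized: skip matching and liveness checks */
    if (!match_face(infrared))
        return 0;   /* not the pre-stored face */
    return is_live_face(infrared, depth);
}
```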
In this embodiment, the depth information of the acquired image can be accurately obtained through the first processing unit, the data processing efficiency is high, and the accuracy of image processing is improved.
In one embodiment, a data processing method is provided, comprising the steps of:
and (1) when the first processing unit receives an image acquisition instruction sent by the second processing unit, sending a control instruction to the controller through a bidirectional two-wire system synchronous serial I2C bus, wherein the control instruction is used for controlling to start at least one of the floodlight and the laser lamp.
In one embodiment, step (1) comprises: determining the type of the acquired image according to the image acquisition instruction; if the image type is a first type, the first processing unit sends a first control instruction to the controller through the I2C bus, and the first control instruction is used for instructing the controller to turn on the floodlight; if the image type is the second type, the first processing unit sends a second control instruction to the controller through the I2C bus, and the second control instruction is used for instructing the controller to start the laser lamp.
In one embodiment, after the step of determining the type of the captured image according to the image capturing instruction, the method further comprises: when the image types comprise a first type and a second type, the first processing unit sends a first control instruction to the controller through the I2C bus, and the floodlight is turned on; after the target image corresponding to the first type is collected through the laser camera, a second control instruction is sent to the controller through an I2C bus, and the laser lamp is started.
In one embodiment, after the step of determining the type of the captured image according to the image capturing instruction, the method further comprises: when the image types comprise a first type and a second type, the first processing unit sends a second control instruction to the controller through the I2C bus, and the laser lamp is started; after the target image of the second type is collected through the laser camera, a first control instruction is sent to the controller through an I2C bus, and the floodlight is turned on.
In one embodiment, a time interval between a time when the first processing unit transmits the first control instruction and a time when the second control instruction is transmitted is less than a time threshold.
Step (2): sending pulses to the controller through the pulse width modulation (PWM) module, lighting at least one of the turned-on floodlight and laser lamp, and collecting a target image through the laser camera.
In one embodiment, the first processing unit, the controller, and the laser camera are connected to the same I2C bus; the step of collecting a target image through the laser camera includes: controlling, through the I2C bus, the laser camera to collect the target image.
Step (3): processing the target image through the first processing unit, and sending the processed target image to the second processing unit.
In one embodiment, the target image includes a speckle image, and step (3) includes: acquiring a stored reference speckle image, wherein the reference speckle image has reference depth information; matching the reference speckle image with the speckle image to obtain a matching result; and generating a depth disparity map according to the reference depth information and the matching result, sending the depth disparity map to the second processing unit, and processing the depth disparity map through the second processing unit to obtain a depth map.
In this embodiment, when the first processing unit receives the image acquisition instruction sent by the second processing unit, it sends a control instruction to the controller through the I2C bus to control at least one of the floodlight and the laser lamp to be turned on, sends pulses to the controller through the PWM module to light whichever of the floodlight and the laser lamp has been turned on, and processes the target image after the target image is collected. Control of both the floodlight and the laser lamp can thus be realized through one controller, which reduces the complexity of controlling the floodlight and the laser lamp and saves cost.
It should be understood that, although the steps in the flowcharts described above are shown in sequence as indicated by the arrows, these steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated otherwise herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in the flowcharts described above may include multiple sub-steps or multiple stages, which are not necessarily performed at the same time but may be performed at different times, and which are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, an electronic device is provided and includes a camera module, a first processing unit, a second processing unit, and a controller, wherein the first processing unit is connected to the second processing unit and the camera module, respectively. The first processing unit is connected to the controller via an I2C bus. The camera module comprises a laser camera, a floodlight and a laser lamp, and the floodlight and the laser lamp are respectively connected with the controller. The first processing unit comprises a PWM module, and the first processing unit is connected with the controller through the PWM module.
The first processing unit is used for, when receiving an image acquisition instruction sent by the second processing unit, determining the type of image to be collected according to the image acquisition instruction; sending a control instruction corresponding to the image type to the controller through the I2C bus, the control instruction being used to control at least one of the floodlight and the laser lamp to be turned on; sending pulses to the controller through the pulse width modulation (PWM) module to light whichever of the floodlight and the laser lamp has been turned on; collecting a target image through the laser camera; and processing the target image and sending the processed target image to the second processing unit.
In this embodiment, when the first processing unit receives the image acquisition instruction sent by the second processing unit, it sends a control instruction to the controller through the I2C bus to control at least one of the floodlight and the laser lamp to be turned on, sends pulses to the controller through the PWM module to light whichever of the floodlight and the laser lamp has been turned on, and processes the target image after the target image is collected. Control of both the floodlight and the laser lamp can thus be realized through one controller, which reduces the complexity of controlling the floodlight and the laser lamp and saves cost.
In one embodiment, the first processing unit, the controller, and the laser camera are connected to the same I2C bus.
The first processing unit is also used for controlling the laser camera to acquire a target image through an I2C bus.
In this embodiment, the floodlight, the laser light and the laser camera are controlled by the same I2C bus, and the I2C bus is multiplexed, so that the complexity of the control circuit can be reduced, and the cost can be reduced.
In one embodiment, the first processing unit is further configured to determine a type of the captured image according to the image capturing instruction, and if the type of the image is a first type, the first processing unit sends a first control instruction to the controller through an I2C bus, where the first control instruction is used to instruct the controller to turn on the floodlight, and if the type of the image is a second type, the first processing unit sends a second control instruction to the controller through an I2C bus, and the second control instruction is used to instruct the controller to turn on the laser light.
In one embodiment, the first processing unit is further configured to, when the image types include a first type and a second type, send a first control instruction to the controller through an I2C bus, turn on the floodlight, and after a target image corresponding to the first type is acquired by the laser camera, send a second control instruction to the controller through an I2C bus, and turn on the laser light.
In one embodiment, the first processing unit is further configured to, when the image types include a first type and a second type, send a second control instruction to the controller through the I2C bus, turn on the laser light, and after acquiring a target image of the second type through the laser camera, send a first control instruction to the controller through the I2C bus, and turn on the floodlight.
In one embodiment, a time interval between a time when the first processing unit transmits the first control instruction and a time when the second control instruction is transmitted is less than a time threshold.
In this embodiment, the floodlight and the laser light can be switched and controlled by one controller, so that the complexity of the control circuit can be reduced, and the cost can be reduced.
In one embodiment, the target image includes a speckle image.
The first processing unit is further used for acquiring a stored reference speckle image, the reference speckle image having reference depth information, matching the reference speckle image with the speckle image to obtain a matching result, generating a depth disparity map according to the reference depth information and the matching result, and sending the depth disparity map to the second processing unit.
The second processing unit is further used for processing the depth disparity map to obtain a depth map.
In this embodiment, the depth information of the acquired image can be accurately obtained through the first processing unit, the data processing efficiency is high, and the accuracy of image processing is improved.
In one embodiment, a data processing apparatus 800 is provided that includes an instruction sending module 810, a pulse sending module 820, and a processing module 830.
The instruction sending module 810 is configured to, when the first processing unit receives an image acquisition instruction sent by the second processing unit, determine the type of image to be collected according to the image acquisition instruction, and send a control instruction corresponding to the image type to the controller through a bidirectional two-wire synchronous serial I2C bus, where the control instruction is used to control at least one of the floodlight and the laser lamp to be turned on.
The pulse sending module 820 is configured to send pulses to the controller through the pulse width modulation (PWM) module to light at least one of the turned-on floodlight and laser lamp, and to collect a target image through the laser camera.
The processing module 830 is configured to process the target image through the first processing unit, and send the processed target image to the second processing unit.
In this embodiment, when the first processing unit receives the image acquisition instruction sent by the second processing unit, it sends a control instruction to the controller through the I2C bus to control at least one of the floodlight and the laser lamp to be turned on, sends pulses to the controller through the PWM module to light whichever of the floodlight and the laser lamp has been turned on, and processes the target image after the target image is collected. Control of both the floodlight and the laser lamp can thus be realized through one controller, which reduces the complexity of controlling the floodlight and the laser lamp and saves cost.
In one embodiment, the first processing unit, the controller, and the laser camera are connected to the same I2C bus.
And the pulse sending module 820 is further used for controlling the laser camera to acquire a target image through the I2C bus.
In this embodiment, the floodlight, the laser light and the laser camera are controlled by the same I2C bus, and the I2C bus is multiplexed, so that the complexity of the control circuit can be reduced, and the cost can be reduced.
In one embodiment, the instruction sending module 810 includes a type determining unit, a first sending unit, and a second sending unit.
And the type determining unit is used for determining the type of the acquired image according to the image acquisition instruction.
And the first sending unit is used for sending a first control instruction to the controller through the I2C bus if the image type is the first type, and the first control instruction is used for instructing the controller to turn on the floodlight.
And the second sending unit is used for sending a second control instruction to the controller through the I2C bus if the image type is the second type, and the second control instruction is used for instructing the controller to turn on the laser lamp.
In one embodiment, the first sending unit is further configured to send a first control instruction to the controller through the I2C bus to turn on the floodlight when the image types include a first type and a second type.
And the second sending unit is also used for sending a second control instruction to the controller through an I2C bus after the target image corresponding to the first type is collected through the laser camera, and starting the laser lamp.
In one embodiment, the second sending unit is further configured to send a second control instruction to the controller through the I2C bus to turn on the laser lamp when the image type includes the first type and the second type.
And the first sending unit is also used for sending a first control instruction to the controller through the I2C bus and starting the floodlight after the target image of the second type is collected through the laser camera.
In one embodiment, a time interval between a time when the first processing unit transmits the first control instruction and a time when the second control instruction is transmitted is less than a time threshold.
In this embodiment, the floodlight and the laser light can be switched and controlled by one controller, so that the complexity of the control circuit can be reduced, and the cost can be reduced.
In one embodiment, the processing module 830 includes an image acquisition unit, a matching unit, and a generation unit.
And the image acquisition unit is used for acquiring the stored reference speckle images, and the reference speckle images are provided with reference depth information.
And the matching unit is used for matching the reference speckle image with the speckle image to obtain a matching result.
And the generating unit is used for generating a depth disparity map according to the reference depth information and the matching result, sending the depth disparity map to the second processing unit, and processing the depth disparity map through the second processing unit to obtain the depth map.
In this embodiment, the depth information of the acquired image can be accurately obtained through the first processing unit, the data processing efficiency is high, and the accuracy of image processing is improved.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, carries out the above-mentioned data processing method.
In one embodiment, a computer program product is provided, including a computer program that, when run on a computer device, causes the computer device to implement the data processing method described above.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the program is executed. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), or the like.
Any reference to memory, storage, database, or other medium as used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-described embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.