
CN118214946A - Image processing method of image acquisition device, electronic device and storage medium - Google Patents

Image processing method of image acquisition device, electronic device and storage medium Download PDF

Info

Publication number
CN118214946A
CN118214946A (application CN202410632121.1A)
Authority
CN
China
Prior art keywords
image
image data
data
detection result
motion detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202410632121.1A
Other languages
Chinese (zh)
Other versions
CN118214946B (en)
Inventor
王正学
演鑫
万国挺
郑攀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Huacheng Software Technology Co Ltd
Original Assignee
Hangzhou Huacheng Software Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Huacheng Software Technology Co Ltd filed Critical Hangzhou Huacheng Software Technology Co Ltd
Priority to CN202410632121.1A priority Critical patent/CN118214946B/en
Publication of CN118214946A publication Critical patent/CN118214946A/en
Application granted granted Critical
Publication of CN118214946B publication Critical patent/CN118214946B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/65 Control of camera operation in relation to power supply
    • H04N23/651 Control of camera operation in relation to power supply for reducing power consumption by affecting camera operations, e.g. sleep mode, hibernation mode or power off of selective parts of the camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses an image processing method of an image acquisition device, an electronic device and a storage medium. The image processing method of the image acquisition device comprises the following steps: the processor receives image data from the image sensor in a low frequency mode, wherein at least part of the functional modules in the low frequency mode are in an inactive state; in response to the received image data satisfying a first image processing condition, the processor is adjusted to a high frequency mode to start each functional module, wherein the first image processing condition includes that the image data contains related information of a target object and/or that the number of image data reaches a preset number; and each functional module is controlled to perform image processing on the image data. According to this scheme, the wake-up time of each functional module in the image processing process of the image acquisition device can be reduced, thereby reducing the consumption of the battery power of the image acquisition device.

Description

Image processing method of image acquisition device, electronic device and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image processing method of an image acquisition device, an electronic device, and a storage medium.
Background
With the rapid development of image acquisition technology, image acquisition devices capable of acquiring images in real time are increasingly widely used. When an existing image acquisition device is used, the whole machine is kept in a started state so that each functional module can perform corresponding image processing after receiving corresponding image data. However, each functional module is then always in an awake state and is essentially in a high power consumption mode even when no corresponding image processing is required, so that the power consumption of the image acquisition device is severe.
In view of the above defects of the prior art, how to provide an effective image processing scheme for an image acquisition device is a technical problem that urgently needs to be solved by those skilled in the art.
Disclosure of Invention
The application provides at least an image processing method of an image acquisition device, an electronic device and a storage medium.
The application provides an image processing method of an image acquisition device, which comprises the following steps: the processor receives image data from the image sensor in a low frequency mode, wherein at least part of the functional modules in the low frequency mode are in an inactive state; in response to the received image data satisfying a first image processing condition, the processor is adjusted to a high frequency mode to start each functional module, wherein the first image processing condition includes that the image data contains related information of a target object and/or that the number of image data reaches a preset number; and each functional module is controlled to perform image processing on the image data.
The application provides an image processing apparatus of an image acquisition device, comprising a receiving module, an adjusting module and a control module. The receiving module is used for the processor to receive image data from the image sensor in a low frequency mode, wherein at least part of the functional modules in the low frequency mode are in an inactive state. The adjusting module is used for adjusting the processor to a high frequency mode to start each functional module in response to the received image data satisfying a first image processing condition, wherein the first image processing condition includes that the image data contains related information of a target object and/or that the number of image data reaches a preset number. The control module is used for controlling each functional module to perform image processing on the image data.
The application provides an electronic device, which comprises a memory and a processor, wherein the processor is used for executing program instructions stored in the memory so as to realize the image processing method of the image acquisition device.
The present application provides a computer-readable storage medium having stored thereon program instructions which, when executed by a processor, implement an image processing method of the above-described image capturing apparatus.
According to the above scheme, in response to the image data received by the processor in the low frequency mode satisfying the first image processing condition, the processor is adjusted to the high frequency mode to start each functional module, wherein the first image processing condition includes that the image data contains related information of a target object and/or that the number of image data reaches a preset number, and each functional module is then controlled to perform image processing on the image data. In this way, the wake-up time of each functional module in the image processing process of the image acquisition device can be reduced, thereby reducing the consumption of the battery power of the image acquisition device.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a flow chart of an embodiment of an image processing method of an image acquisition device of the present application;
FIG. 2 is another flow chart of an embodiment of an image processing method of the image capturing device according to the present application;
FIG. 3 is a flow chart of an embodiment of an image processing method of the image capturing device according to the present application;
FIG. 4 is a schematic view of an image processing apparatus of an image capturing device according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an embodiment of an electronic device of the present application;
FIG. 6 is a schematic diagram of a computer-readable storage medium according to an embodiment of the present application.
Detailed Description
The following describes embodiments of the present application in detail with reference to the drawings.
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, interfaces, techniques, etc., in order to provide a thorough understanding of the present application.
The term "and/or" is herein merely an association relationship describing an associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship. Further, "a plurality" herein means two or more than two. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, may mean including any one or more elements selected from the group consisting of A, B and C.
The application provides an image processing method of an image acquisition device and an image processing apparatus of the image acquisition device. Application scenarios of the image processing method of the image acquisition device include, but are not limited to, a startup procedure for a multi-camera device. The execution subject of the image processing method of the image acquisition device may be the image processing apparatus of the image acquisition device or the main control of the image acquisition device. For example, the image processing apparatus of the image acquisition device may be provided in a terminal device, a server, or other processing device, wherein the terminal device may be a device for image processing of the image acquisition device, a user equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a personal digital assistant (PDA), a handheld device, a computing device, an in-vehicle device, or the like. In some possible implementations, the image processing method of the image acquisition device may be implemented by a processor calling computer-readable instructions stored in a memory.
Referring to fig. 1, fig. 1 is a flowchart illustrating an embodiment of an image processing method of an image capturing device according to the present application. Specifically, the image capturing apparatus includes an image sensor, a processor, and a plurality of functional modules, and the image processing method of the image capturing apparatus may include the steps of:
Step S11: the processor receives image data from the image sensor in a low frequency mode.
Wherein at least part of the functional modules in the low frequency mode are in an inactive state.
The processor may be a central processing unit (CPU) of the image acquisition device. The processor may be a processor with operation and control capabilities and may operate at different frequencies.
The processor may have a low frequency mode. The processor in the low frequency mode has weak processing power and low power consumption. In some application scenarios, a processor in the low frequency mode may have the ability to receive image data as well as transmit image data. In some application scenarios, at least some of the functional modules in the image acquisition device are in an inactive state while the processor is in the low frequency mode. Each functional module in the image acquisition device may be a related module capable of performing image processing on image data. In some application scenarios, each functional module in the image acquisition device may be a hardware encoder. The hardware encoder may encode the received data into video files of different formats for storage in the memory. In particular, the hardware encoder may be a video encoder (Video Encoder, ENC). The video encoder may be a video encoder of a different format. Illustratively, the video encoder may be a first encoder (Video Encoder H264), a second encoder (Video Encoder H265), or a third encoder (Video Encoder MotionJpeg). The functional modules in the image acquisition device may also include an image signal processor (Image Signal Processor, ISP). The image sensor (Sensor) may send image data to the processor. The image sensor may have a motion detection mode. Specifically, the motion detection mode may be a motion detection function of the image sensor. In some application scenarios, the motion detection function may be an adaptive motion detection function of the image sensor, which outputs a control signal when the frame changes. In some application scenarios, the processor may refer to a main control chip connected to the image sensor.
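For orientation only, the following Python sketch models the components and modes named above (processor frequency modes, image sensor states, memory power modes, and the functional modules such as the ISP and video encoders). The names are illustrative assumptions and are not the actual firmware or driver interfaces of the disclosed device.

```python
from enum import Enum, auto

class ProcessorMode(Enum):
    LOW_FREQUENCY = auto()   # can receive and forward image data only
    HIGH_FREQUENCY = auto()  # can activate functional modules and drive encoding

class SensorState(Enum):
    SLEEP = auto()
    MOTION_DETECTION = auto()  # MD mode: outputs a control signal when the frame changes
    IMAGE_OUTPUT = auto()      # sends acquired image data to the processor

class MemoryMode(Enum):
    HIGH_POWER = auto()  # normal mode: read and write allowed
    LOW_POWER = auto()   # self-refresh: contents retained, no read or write

# Functional modules mentioned in the description (illustrative list only).
FUNCTIONAL_MODULES = ["ISP", "VideoEncoderH264", "VideoEncoderH265", "VideoEncoderMotionJpeg"]
```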
Step S12: responsive to the received image data satisfying the first image processing condition, the processor is adjusted to a high frequency mode to activate the functional modules.
The first image processing condition includes that the image data contains related information of the target object and/or that the number of image data reaches a preset number.
The processor may have a high frequency mode. The processor in the high frequency mode has strong processing power and high power consumption. In some application scenarios, a processor in the high frequency mode may have other processing capabilities in addition to the capability to receive and transmit image data. These other processing capabilities may include sending corresponding data to each functional module so as to activate each functional module to complete the encoding of the image data. In some application scenarios, all functional modules in the image acquisition device are in a started state while the processor is in the high frequency mode. It will be appreciated that, after it is determined that the received image data satisfies the first image processing condition, the processor in the low frequency mode may be mode-switched and adjusted to the high frequency mode. After the processor switches from the low frequency mode to the high frequency mode, the processor in the high frequency mode can activate each functional module. In some application scenarios, the main control chip enters the high frequency mode and synchronously starts the target software operating system, so as to meet individualized setting requirements related to image processing. Specifically, the process of synchronously starting the target software operating system may be restoring the operating system from the power management mode. The power management mode may be a target power mode STR (Suspend to RAM or Sleep, STR). The target power mode STR manages the operating system according to different demand stages. In some application scenarios, the first demand stage may be one in which all operations of the operating system are stopped, but the memory still maintains its power supply and its contents. In the first demand stage, the operating system saves energy when inactive and reduces the battery power consumption of the image acquisition device. In other application scenarios, the second demand stage may be an operation that allows the system to quickly return to a full-power operating state when a demand condition is met. The demand condition may be that the received image data satisfies the first image processing condition, or that the processor is adjusted to the high frequency mode. Dividing the first demand stage and the second demand stage can improve the usage efficiency of the battery power.
In some application scenarios, it is determined whether the image data contains related information of the target object. The first image processing condition may be that the image data contains related information of the target object. The target object may be set according to requirements. In particular, the target object may be a predetermined biological type, a predetermined object type, or a predetermined vehicle type. In some application scenarios, it is determined whether the number of images corresponding to the image data reaches a preset number. For example, the preset number of images corresponding to the image data may be 10. The first image processing condition may be that the number of images corresponding to the image data reaches the preset number. The images corresponding to the image data may be images acquired by the image acquisition device of the target area. In other application scenarios, it is synchronously determined whether the image data contains related information of the target object and whether the number of images corresponding to the image data reaches the preset number. The first image processing condition may then be that the image data contains related information of the target object and that the number of images corresponding to the target object in the image data reaches the preset number.
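As a purely illustrative aid, a minimal Python sketch of how the first image processing condition could be evaluated is given below; the preset number of 10 is the example value mentioned above, and the boolean and count arguments are assumed inputs rather than part of the disclosed implementation.

```python
PRESET_NUMBER = 10  # example value taken from the description above

def meets_first_condition(contains_target_info: bool, image_count: int) -> bool:
    # First image processing condition: the image data contains related
    # information of the target object and/or the number of images reaches
    # the preset number.
    return contains_target_info or image_count >= PRESET_NUMBER
```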
Step S13: and controlling each functional module to perform image processing on the image data.
After each functional module is started, each functional module is controlled to perform image processing on the image data. In some application scenarios, the image processing may be performed by sending the image data to each functional module in one pass, so as to obtain video data. The image data may, for example, be in the form of ImageRaw data. The image processing may consist of sending the image data to the hardware encoder for processing in one pass and waiting for the encoding to complete to obtain the video data. Specifically, the image data is sent to the image signal processor and the video encoder for processing in one pass, and the video data is obtained after the encoding is completed. After the video data is obtained, the video data is taken as first target storage data, and the first target storage data is stored. After the first target storage data is stored, the processor is controlled to power down. In some application scenarios, controlling the processor to power down may be controlling the main control chip to power down.
In some embodiments, the image acquisition device further comprises a memory. In some application scenarios, the video data may be stored by writing the video data to the memory in the high power consumption mode. In response to completion of the video data storage, the memory is adjusted to a low power consumption mode, and the image sensor is controlled to enter a dormant state. After the memory is in the low power consumption mode, the processor is controlled to power down. Controlling the image sensor to enter the dormant state may consist of the processor notifying the image sensor to enter the dormant state.
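The processing path for the case where the first condition is met (switch to the high frequency mode, process through the ISP and video encoder, store the result, then power down) can be summarized in the following sketch. All device handles and method names are hypothetical stand-ins under stated assumptions, not the actual hardware API.

```python
def process_when_first_condition_met(processor, isp, encoder, memory, image_sensor, image_data):
    processor.set_mode("high_frequency")   # activate each functional module
    frames = isp.process(image_data)        # image signal processing
    video_data = encoder.encode(frames)     # wait for encoding to complete
    memory.set_mode("high_power")           # memory must be writable
    memory.write(video_data)                # first target storage data
    memory.set_mode("low_power")            # self-refresh, contents retained
    image_sensor.enter_sleep()              # processor notifies sensor to sleep
    processor.power_down()                  # main control chip powers down
    return video_data
```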
According to the above scheme, in response to the image data received by the processor in the low frequency mode satisfying the first image processing condition, the processor is adjusted to the high frequency mode to start each functional module, wherein the first image processing condition includes that the image data contains related information of a target object and/or that the number of image data reaches a preset number, and each functional module is then controlled to perform image processing on the image data. In this way, the wake-up time of each functional module in the image processing process of the image acquisition device can be reduced, thereby reducing the consumption of the battery power of the image acquisition device.
In some embodiments, the image capturing device further comprises a memory, and the image processing method of the image capturing device may further comprise the steps of: first, in response to the received image data satisfying the second image processing condition, the image data is written to the memory in the high power consumption mode. The second image processing condition includes that the image data does not contain the related information of the target object and the number of the image data does not reach the preset number. And then, in response to the completion of the storage of the image data, adjusting the memory to a low power consumption mode and controlling the image sensor to enter a dormant state.
The memory in the image acquisition device may be a random access memory. The memory may be, for example, a double data rate (DDR) memory. In particular, the random access memory DDR may be a DDR SDRAM (double data rate synchronous dynamic random access memory). DDR is a memory technology mainly used for the random access memory of a computer. The memory has a low power consumption mode and a high power consumption mode. In some application scenarios, the memory in the high power consumption mode can execute read-write commands immediately after receiving them and is therefore faster, but the memory in the high power consumption mode has higher power consumption. The memory in the high power consumption mode may be a memory in an operating state or normal mode, which is capable of reading and writing the related data to be stored. In other application scenarios, the memory in the low power consumption mode may automatically refresh the related data to be stored; in this mode the memory is slower but can operate at a low voltage, so the memory in the low power consumption mode has low power consumption. The memory in the low power consumption mode may be a memory in a self-refresh mode, which is not capable of reading and writing the related data to be stored. It can be understood that the memory in the low power consumption mode retains the historically stored data but cannot read or write related data to be stored. The retained historical storage data may be video data stored in the memory in the high power consumption mode before the mode adjustment, or image data stored in the memory in the high power consumption mode before the mode adjustment.
The second image processing condition includes that the image data does not contain related information of the target object and that the number of image data does not reach the preset number. In response to the received image data satisfying the second image processing condition, the image data is written to the memory in the high power consumption mode. At this time, the image data may be original image data that has not been processed by the image signal processor and the video encoder. In response to the received image data satisfying the second image processing condition, the image data is written as second target storage data to the memory in the high power consumption mode. After the second target storage data is stored, the memory is adjusted to the low power consumption mode, and the image sensor is controlled to enter a dormant state. After the memory is in the low power consumption mode, the processor is controlled to power down. Controlling the image sensor to enter the dormant state may consist of the processor notifying the image sensor to enter the dormant state.
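For symmetry with the earlier sketch, the caching path for the second condition (no target information and the image count has not reached the preset number) can be sketched as follows; the preset number is the example value from the description, and all handles are assumptions.

```python
PRESET_NUMBER = 10  # example value from the description

def meets_second_condition(contains_target_info: bool, image_count: int) -> bool:
    # Second image processing condition: no target-object information and
    # the number of images has not reached the preset number.
    return (not contains_target_info) and image_count < PRESET_NUMBER

def cache_when_second_condition_met(memory, image_sensor, processor, raw_image_data):
    memory.set_mode("high_power")   # normal mode, memory writable
    memory.write(raw_image_data)    # second target storage data (unprocessed)
    memory.set_mode("low_power")    # self-refresh, contents retained
    image_sensor.enter_sleep()      # sensor enters the dormant state
    processor.power_down()          # main control chip powers down
```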
It can thus be seen that, in the case where the received image data satisfies the second image processing condition, the image data is written into the memory in the high power consumption mode, the memory is then adjusted to the low power consumption mode, the image sensor is controlled to sleep, and the processor is powered down. In this way, the image data can be cached in the case where there is motion but no target object, and the power consumption of each module in the image acquisition device is saved, thereby improving the usage efficiency of the battery of the image acquisition device.
In some embodiments, before the step of adjusting the processor to the high frequency mode in response to the received image data satisfying the first image processing condition, the image processing method of the image acquisition device may further comprise the following steps. First, the image data is cropped to obtain data to be processed. Then, object detection processing and/or quantity statistics processing is performed on the data to be processed to obtain a data processing result, wherein the data processing result comprises an object detection result and/or a quantity statistics result, the object detection result indicates that the data to be processed contains related information of the target object or that the data to be processed does not contain related information of the target object, and the quantity statistics result comprises the total number of images in the data to be processed. Next, it is determined, based on the data processing result, whether the received image data satisfies the first image processing condition.
The image acquisition device further comprises a video cropping module. The video cropping module may be a video input interface (Video Input Interface, VIF).
In some application scenarios, the video cropping module may transmit the received image data to the main control. Object detection processing and/or quantity statistics processing is then performed on the image data to obtain a data processing result. In some application scenarios, the image data is subjected to object detection processing, and the obtained data processing result may be an object detection result. In some application scenarios, the image data is subjected to quantity statistics processing, and the obtained data processing result may be a quantity statistics result. In other application scenarios, the step of performing object detection processing on the image data and the step of performing quantity statistics processing on the image data are performed simultaneously, and the obtained data processing result may comprise an object detection result and a quantity statistics result.
In other application scenarios, the video cropping module may crop the image data to obtain the data to be processed. The video cropping module can transmit the data to be processed to the main control. Object detection processing and/or quantity statistics processing is then performed on the data to be processed to obtain a data processing result. In some application scenarios, object detection processing is performed on the data to be processed, and the obtained data processing result may be an object detection result. In some application scenarios, quantity statistics processing is performed on the data to be processed, and the obtained data processing result may be a quantity statistics result. In other application scenarios, the step of performing object detection processing on the data to be processed and the step of performing quantity statistics processing on the data to be processed are performed synchronously, and the obtained data processing result may comprise an object detection result and a quantity statistics result. The relationship between the data to be processed and the image data may be that the data to be processed is the image data after cropping, wherein the number of images in the data to be processed is the same as the number of images in the image data.
In some application scenarios, the step of performing quantity statistics processing on the data to be processed, or on the image data, may consist of performing frame counting on the data to be processed or on the image data. The frame count may be obtained by adding the number of images in the image data to the historical statistical quantity to obtain a quantity statistics result, or by adding the number of images in the data to be processed to the historical statistical quantity to obtain the quantity statistics result.
Specifically, the process of frame rate counting may refer to formula (1):
RawCount1 = RawCount0 + N formula (1);
Wherein RawCount0 may represent the above-mentioned historical statistics. The historical statistics may be the number of images of historical image data retained at a historical time in memory. N may represent the number of images in the data to be processed or in the image data at the current time. Illustratively, N may be 1. RawCount1 may represent the quantity statistics result.
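Formula (1) amounts to a simple running frame count; the sketch below is a direct transcription using the symbols of the formula, offered only as an illustration.

```python
def update_frame_count(raw_count_0: int, n: int) -> int:
    """Formula (1): RawCount1 = RawCount0 + N, where RawCount0 is the
    historical count retained in memory and N is the number of images in the
    current image data or data to be processed (illustratively, N = 1)."""
    raw_count_1 = raw_count_0 + n
    return raw_count_1
```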
The data processing result comprises an object detection result and/or a quantity statistics result, wherein the object detection result indicates that the data to be processed contains related information of the target object or does not contain related information of the target object, and the quantity statistics result comprises the total number of images in the data to be processed. Whether the received image data satisfies the first image processing condition is determined based on the data processing result.
In some application scenarios, the step of performing object detection processing on the data to be processed is performed first, and the data processing result may be an object detection result. In the case where the object detection result indicates that the data to be processed does not contain related information of the target object, the step of performing quantity statistics processing on the image data is performed. The quantity statistics result is the total number of images in the data to be processed. If the total number of images in the data to be processed corresponding to the quantity statistics result does not reach the preset number, it is determined from the data processing result that the received image data satisfies the second image processing condition. The data to be processed is then taken as second target storage data, and the second target storage data is written to the memory in the high power consumption mode. After the second target storage data is stored, the memory is adjusted to the low power consumption mode, and the image sensor is controlled to enter a dormant state.
It is to be understood that the step of performing the object detection processing on the data to be processed and the step of performing the quantity statistics processing on the data to be processed may be performed in a serial manner or may be performed in a parallel manner. The step of performing the object detection processing on the image data may refer to the step of performing the object detection processing on the data to be processed, which is not described herein. The step of performing the number statistics processing on the image data may refer to the step of performing the number statistics processing on the data to be processed, which is not described herein.
In some application scenarios, in the serial execution mode, the step of performing object detection processing on the data to be processed is performed first, and the data processing result may be an object detection result. In the case where the object detection result indicates that the data to be processed does not contain related information of the target object, the step of performing quantity statistics processing on the data to be processed is performed. If the total number of images in the data to be processed corresponding to the quantity statistics result reaches the preset number, it is determined from the data processing result that the received data to be processed satisfies the first image processing condition.
In some application scenarios, in the serial execution mode, the step of performing object detection processing on the data to be processed is performed first, and the data processing result may be an object detection result. In the case where the object detection result indicates that the data to be processed does not contain related information of the target object, the step of performing quantity statistics processing on the data to be processed is performed. If the total number of images in the data to be processed corresponding to the quantity statistics result does not reach the preset number, it is determined from the data processing result that the received data to be processed satisfies the second image processing condition.
In some application scenarios, in the serial execution mode, the step of performing object detection processing on the data to be processed is performed first, and the data processing result may be an object detection result. In the case where the object detection result indicates that the data to be processed contains related information of the target object, the step of performing quantity statistics processing on the data to be processed is performed. If the total number of images in the data to be processed corresponding to the quantity statistics result reaches the preset number, it is determined from the data processing result that the received data to be processed satisfies the first image processing condition.
In some application scenarios, in the serial execution mode, the step of performing object detection processing on the data to be processed is performed first, and the data processing result may be an object detection result. In the case where the object detection result indicates that the data to be processed contains related information of the target object, the step of performing quantity statistics processing on the data to be processed does not need to be performed. It is determined from the data processing result that the received data to be processed satisfies the first image processing condition.
In other application scenarios, in the parallel execution mode, the step of performing object detection processing on the data to be processed and the step of performing quantity statistics processing on the data to be processed are performed in parallel. The obtained data processing result comprises an object detection result and a quantity statistics result. In response to the case where the object detection result indicates that the data to be processed does not contain related information of the target object and the total number of images in the data to be processed corresponding to the quantity statistics result does not reach the preset number, it is determined that the data to be processed satisfies the second image processing condition. In response to the case where the object detection result indicates that the data to be processed does not contain related information of the target object and the total number of images in the data to be processed corresponding to the quantity statistics result reaches the preset number, it is determined that the data to be processed satisfies the first image processing condition. In the case where the object detection result indicates that the data to be processed contains related information of the target object, regardless of whether the total number of images in the data to be processed corresponding to the quantity statistics result reaches the preset number, it is determined that the data to be processed satisfies the first image processing condition.
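The serial and parallel determinations enumerated above reduce to the same decision; the following condensed sketch, with assumed boolean and count inputs and the example preset number, illustrates that decision and is not the claimed implementation.

```python
PRESET_NUMBER = 10  # example value

def classify_data(data_contains_target: bool, total_image_count: int) -> str:
    """Returns which image processing condition the data to be processed meets,
    covering the serial and parallel cases enumerated above."""
    if data_contains_target:
        # Target information present: first condition, regardless of the count.
        return "first_condition"
    if total_image_count >= PRESET_NUMBER:
        # No target information, but the count reached the preset number.
        return "first_condition"
    # No target information and the count has not reached the preset number.
    return "second_condition"
```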
In some application scenarios, the processor is adjusted to the high frequency mode to activate each functional module in response to the received data to be processed satisfying the first image processing condition. Each functional module is controlled to perform image processing on the data to be processed to obtain video data. After the video data is obtained, the video data is taken as first target storage data, and the first target storage data is stored. After the first target storage data is stored, the memory is adjusted to the low power consumption mode, and the image sensor is controlled to enter a dormant state. After the first target storage data is stored, the processor is controlled to power down.
In some application scenarios, in response to the received data to be processed satisfying the second image processing condition, the data to be processed is taken as second target storage data, and the second target storage data is written into the memory in the high power consumption mode. After the second target storage data is stored, the memory is adjusted to the low power consumption mode, and the image sensor is controlled to enter a dormant state. After the memory is in the low power consumption mode, the processor is controlled to power down.
It can thus be seen that, in the case where the received image data satisfies the second image processing condition, the image data is written into the memory in the high power consumption mode, the memory is then adjusted to the low power consumption mode, the image sensor is controlled to sleep, and the processor is powered down. In this way, the image data can be cached in the case where there is motion but no target object, and the power consumption of each module in the image acquisition device is saved, thereby improving the usage efficiency of the battery of the image acquisition device.
It can likewise be seen that, in the case where the received image data satisfies the first image processing condition, the video data is written into the memory in the high power consumption mode, the memory is then adjusted to the low power consumption mode, the image sensor is controlled to sleep, and the processor is powered down. In this way, the video data is stored in the case where a target object exists and/or the total number of images in the data to be processed corresponding to the quantity statistics result reaches the preset number, and the power consumption of each module in the image acquisition device is saved, thereby improving the usage efficiency of the battery of the image acquisition device.
In some embodiments, the operating state of the image sensor includes a motion detection mode and an image output mode, and the image processing method of the image acquisition device may further comprise the following steps. First, the image sensor in the motion detection mode is controlled to detect the image data to obtain a motion detection result, wherein the motion detection result includes a first motion detection result indicating that the image data contains related information of a moving object. Then, in response to the motion detection result being the first motion detection result, the image sensor is adjusted to the image output mode, so that the processor receives the image data from the image sensor in the low frequency mode.
The image sensor has a working state and a dormant state. The image sensor in the working state may have a motion detection mode and an image output mode. Specifically, the motion detection mode may be a motion detection function of the image sensor. In some application scenarios, the motion detection function may be an adaptive motion detection function of the image sensor, which outputs a control signal when the frame changes. The motion detection mode may, for example, be Motion Detection (MD). Specifically, the image output mode may be a mode in which the image sensor is able to send the acquired image data to the processor.
The moving object may be any object to be confirmed, including the target object. The object to be confirmed may be a predetermined biological type, a predetermined object type, or a predetermined vehicle type. Specifically, the object to be confirmed may be an object type capable of changing the picture corresponding to the target area in the acquired image data. The image sensor in the motion detection mode is controlled to detect the image data to obtain a motion detection result. The first motion detection result indicates that the image data contains related information of a moving object. In the case where the motion detection result is the first motion detection result, the image sensor is adjusted to the image output mode so that the processor receives the image data from the image sensor in the low frequency mode.
In some embodiments, the motion detection result includes a second motion detection result indicating that the image data does not contain related information of a moving object, and the image processing method of the image acquisition device may further comprise the following step: in response to the motion detection result being the second motion detection result, the image sensor is controlled to enter a dormant state.
The image sensor in the motion detection mode is controlled to detect the image data to obtain a motion detection result. The second motion detection result indicates that the image data does not contain related information of a moving object. In the case where the motion detection result is the second motion detection result, the image sensor is controlled to enter the dormant state.
In some embodiments, the first motion detection result includes a plurality of current motion detection events, the plurality of current motion detection events indicating that a plurality of objects to be confirmed are in a motion state, and the step of adjusting the image sensor to the image output mode in response to the motion detection result being the first motion detection result may include the following steps. In response to the first motion detection result satisfying a first preset condition, the image sensor is adjusted to the image output mode, wherein the first preset condition includes that there is no historical motion detection event before the current timestamp corresponding to the current motion detection event in the first motion detection result, and the historical motion detection event indicates that an object to be confirmed was in a motion state before the current timestamp. Alternatively, in response to the first motion detection result satisfying a second preset condition, the image sensor is adjusted to the image output mode, wherein the second preset condition includes that, in the case where there is a historical motion detection event before the current timestamp corresponding to the current motion detection event in the first motion detection result, the interval time between the current timestamp corresponding to at least one current motion detection event in the first motion detection result and the historical timestamp of the historical motion detection event is greater than or equal to a second preset time.
The first motion detection result comprises a plurality of current motion detection events. The current motion detection events can be one current motion detection event or a plurality of current motion detection events. The current motion detection events represent that a plurality of objects to be confirmed are in a motion state. The current motion detection event characterizes that an object to be confirmed is in a motion state. The objects to be confirmed among the current motion detection events can be different objects or the same object.
Illustratively, when the image sensor acquires image data for the first time after the image sensor is powered up, there is no historical motion detection event before the first acquisition of image data. Alternatively, in the case where the image sensor has acquired image data a plurality of times, if no historical motion detection event is detected in the other image data before the last image data and a motion detection event is detected in the last image data, then the image sensor has no historical motion detection event before the current motion detection event corresponding to the last image data. The absence of a historical motion detection event may mean that there is no historical image data or no historical data to be processed in the memory.
The historical motion detection event characterizes that the object to be confirmed is in motion prior to the current timestamp. Each current or historical motion detection event corresponds to a time stamp. That is, the current timestamp may be a starting time of the current motion detection event when the object to be confirmed is in a motion state. The current timestamp may be an end time of the current motion detection event when the object to be confirmed is in a motion state. The current timestamp may be a current time period in which the object to be confirmed is in a motion state in the current motion detection event. The current time period includes a start time and an end time at which the object to be confirmed is in a motion state. The historical timestamp may be a starting time of the historical motion detection event when the object to be confirmed is in a motion state. The historical timestamp may be an end time of the historical motion detection event when the object to be confirmed is in a motion state. The historical timestamp may be a historical period of time in which the object to be confirmed is in motion in the historical motion detection event. The historical time period includes a start time and an end time at which the object to be confirmed is in a motion state.
In some application scenarios, the first preset condition includes that there is no historical motion detection event before the current timestamps corresponding to the current motion detection events in the first motion detection result. It can be understood that the image sensor is directly adjusted to the image output mode in the case where there is no historical motion detection event before the current timestamps of the plurality of current motion detection events corresponding to the image data acquired by the image sensor.
In other application scenarios, the second preset condition includes that, in the case where there is a historical motion detection event before the current timestamps corresponding to the current motion detection events in the first motion detection result, the interval time between the current timestamp corresponding to at least one current motion detection event in the first motion detection result and the historical timestamp of the historical motion detection event is greater than or equal to the second preset time. It can be understood that, in the case where there is a historical motion detection event before the current timestamps of the plurality of current motion detection events corresponding to the image data acquired by the image sensor, the relationship between the current timestamps of the plurality of current motion detection events and the historical timestamp of the historical motion detection event is determined. In response to the relationship between the current timestamps of the current motion detection events and the historical timestamp of the historical motion detection event satisfying the second preset time, the image sensor is adjusted to the image output mode. In some application scenarios, the interval time between the current timestamp of at least one current motion detection event and the historical timestamp of the historical motion detection event is greater than or equal to the second preset time. In other application scenarios, the interval time between the current timestamp of the first current motion detection event and the historical timestamp of the historical motion detection event is greater than or equal to the second preset time. The current timestamp of the first current motion detection event may be the time of the first motion detection event among the current timestamps corresponding to the plurality of current motion detection events. The historical motion detection event may be a motion detection event that is adjacent to the first current motion detection event among the plurality of historical motion detection events and precedes the current timestamp corresponding to the first current motion detection event.
It is understood that the first preset time and the second preset time may be the same time or different times.
It can thus be seen that, in the case where no current motion detection event exists in the image data, the image sensor directly enters the dormant state and motion detection is performed again after the timed wake-up. In this way, the wake-up time of the image sensor can be reduced, the image sensor is not always kept in the working state, the power consumption of each module in the image acquisition device can be saved, and the usage efficiency of the battery of the image acquisition device is improved.
In some embodiments, the image processing method of the image acquisition device may further comprise the following step: in response to the first motion detection result satisfying a third preset condition, the image sensor is controlled to enter a dormant state, wherein the third preset condition includes that, in the case where there is a historical motion detection event before the current timestamp corresponding to the current motion detection event in the first motion detection result, the interval time between the current timestamps corresponding to all current motion detection events in the first motion detection result and the historical timestamp of the historical motion detection event is less than a third preset time.
In some application scenarios, the third preset condition includes that, in the case where there is a historical motion detection event before the current timestamps corresponding to the current motion detection events in the first motion detection result, the interval time between the current timestamps corresponding to all current motion detection events in the first motion detection result and the historical timestamp of the historical motion detection event is less than the third preset time. It can be understood that, in the case where there is a historical motion detection event before the current timestamps of the plurality of current motion detection events corresponding to the image data acquired by the image sensor, the relationship between the current timestamps of the plurality of current motion detection events and the historical timestamp of the historical motion detection event is determined. In response to the relationship between the current timestamps of the current motion detection events and the historical timestamp of the historical motion detection event satisfying the third preset time, the image sensor is controlled to enter the dormant state. In some application scenarios, the interval time between the current timestamp of each current motion detection event and the historical timestamp of the historical motion detection event is less than the third preset time. In other application scenarios, the interval time between the current timestamp of the last current motion detection event and the historical timestamp of the historical motion detection event is less than the third preset time. The current timestamp of the last current motion detection event may be the time of the last motion detection event among the current timestamps corresponding to the plurality of current motion detection events. The historical motion detection event may be a motion detection event that is adjacent to the first current motion detection event among the plurality of historical motion detection events and precedes the current timestamp corresponding to the first current motion detection event.
It is understood that the third preset time, the first preset time and the second preset time may be the same time or different times.
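A compact sketch of the three preset conditions described above is given below. It assumes timestamps expressed as plain numbers, a single most recent historical timestamp, and interval times measured as simple differences; these simplifications and all names are illustrative assumptions rather than the disclosed logic.

```python
def first_preset_condition(history_timestamps: list[float]) -> bool:
    # No historical motion detection event before the current events:
    # the sensor may be adjusted directly to the image output mode.
    return len(history_timestamps) == 0

def second_preset_condition(current_timestamps: list[float],
                            last_history_timestamp: float,
                            second_preset_time: float) -> bool:
    # A historical event exists, and at least one current event is separated
    # from it by at least the second preset time: switch to image output mode.
    return any(t - last_history_timestamp >= second_preset_time for t in current_timestamps)

def third_preset_condition(current_timestamps: list[float],
                           last_history_timestamp: float,
                           third_preset_time: float) -> bool:
    # A historical event exists, and every current event lies within the third
    # preset time of it: the sensor is controlled to enter the dormant state.
    return all(t - last_history_timestamp < third_preset_time for t in current_timestamps)
```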
In the case where a current motion detection event exists in the image data, the relationship between the plurality of current motion detection events and the historical motion detection event, or the relationship among the plurality of current motion detection events, is judged so as to determine whether the image sensor enters the image output mode or the dormant state. In this way, the image sensor is not always kept in the awake state or always in the motion detection mode, the power consumption of the image sensor in the image acquisition device can be saved, and the usage efficiency of the battery of the image acquisition device is improved.
In some embodiments, before the above step S11, the image processing method of the image capturing apparatus may further include the steps of: first, in response to the image sensor powering up, a wake-up condition is set to the image sensor. Subsequently, the image sensor is controlled to perform state switching at a predetermined frequency. The wake-up condition includes that the image sensor in the dormant state is adjusted to the working state after the target wake-up time, wherein the image sensor in the working state can be used for collecting image data.
In response to the image sensor powering up, a wake-up condition is set for the image sensor. Subsequently, the image sensor is controlled to perform state switching at a predetermined frequency. Setting the wake-up condition for the image sensor may mean that the image sensor will switch from the dormant state to the working state within the target wake-up time. Specifically, the image sensor enters the motion detection mode after the image sensor is switched from the dormant state to the working state within the target wake-up time. The wake-up condition includes adjusting the image sensor in the dormant state to the working state after the target wake-up time, wherein the image sensor in the working state can be used to acquire image data. The target wake-up time may be dynamically set according to the requirements of image processing.
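A minimal sketch of the timed wake-up behaviour, assuming a hypothetical sensor handle and a target wake-up time expressed in seconds, is shown below; real devices would rely on hardware timers rather than a software loop.

```python
import time

def run_timed_wakeup(image_sensor, target_wake_time_s: float, cycles: int = 3):
    """Sketch of the wake-up condition: the sensor sleeps and is switched back
    to the working (motion detection) state after the target wake-up time,
    repeating the state switch at a predetermined frequency."""
    for _ in range(cycles):
        image_sensor.enter_sleep()              # dormant state
        time.sleep(target_wake_time_s)          # wait for the target wake-up time
        image_sensor.enter_motion_detection()   # working state: can acquire image data
```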
After the wake-up condition is set, the image sensor can be woken up at regular intervals, so that the image sensor is not always in the dormant state, and the efficiency of the image sensor in acquiring image data and performing motion detection can be improved, thereby improving the efficiency of image processing of the image acquisition device.
In some embodiments, before the processor receives the image data from the image sensor in the low frequency mode, the image processing method of the image acquisition device may further comprise the following steps: adjusting the exposure parameters of the image acquisition device, and setting the wake-up condition for the adjusted image sensor so that the adjusted image sensor performs state switching at the predetermined frequency.
The exposure parameter adjustment may be exposure convergence FastAe (Fast Auto Exposure). FastAe is a technique for quickly adjusting the exposure settings in a camera system. The adjusted image sensor can find the optimal exposure setting as quickly as possible, so that it can adapt rapidly when the ambient light conditions change. This ensures that the exposure of the images acquired by the image sensor is stable within a short time.
It can be considered that the adjusted image sensor performs state switching at the predetermined frequency, and the acquired image data is stable output data.
According to the above scheme, in response to the image data received by the processor in the low frequency mode satisfying the first image processing condition, the processor is adjusted to the high frequency mode to start each functional module, wherein the first image processing condition includes that the image data contains related information of a target object and/or that the number of image data reaches a preset number, and each functional module is then controlled to perform image processing on the image data. In this way, the wake-up time of each functional module in the image processing process of the image acquisition device can be reduced, thereby reducing the consumption of the battery power of the image acquisition device.
Referring to fig. 2, fig. 2 is another flow chart of an embodiment of an image processing method of the image capturing device according to the present application.
Step S201, step S202, step S203 and step S204 are executed in sequence. Step S201: the image sensor is powered up. Step S202: the exposure parameters of the image acquisition device are adjusted. Step S203: a wake-up condition is set for the image sensor. Step S204: the image sensor enters the motion detection mode and detects the acquired image data to obtain a motion detection result. It is judged whether the motion detection result is the first motion detection result or the second motion detection result. In the case where the motion detection result is the first motion detection result, step S205 is performed. Step S205: in response to the first motion detection result satisfying the first preset condition, the image sensor is adjusted to the image output mode. If the motion detection result is the second motion detection result, step S206, step S207 and step S208 are executed in sequence. Step S206: in response to the motion detection result being the second motion detection result, the image sensor is controlled to enter the dormant state. Step S207: it is judged whether the image sensor satisfies the wake-up condition. Step S208: the image sensor remains dormant.
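The branching of this flow can be condensed into the sketch below. All handles, attributes and predicates are illustrative assumptions standing in for the hardware behaviour shown in Fig. 2, not the device firmware API.

```python
def sensor_startup_flow(image_sensor, wake_condition):
    """Condensed sketch of the Fig. 2 flow (S201 to S208) under assumed handles."""
    image_sensor.power_up()                          # S201
    image_sensor.adjust_exposure()                   # S202: exposure convergence
    image_sensor.set_wake_condition(wake_condition)  # S203
    result = image_sensor.run_motion_detection()     # S204: obtain detection result
    if result.is_first_result and result.meets_first_preset_condition:
        image_sensor.enter_image_output_mode()       # S205
    else:
        image_sensor.enter_sleep()                   # S206: second detection result
        if image_sensor.wake_condition_met():        # S207
            image_sensor.enter_motion_detection()    # woken: back to MD mode
        # otherwise S208: the sensor remains dormant
```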
It can be considered that controlling the image sensor to enter the sleep state or the working state under different conditions reduces the number of false wake-ups of the image sensor, thereby improving the use efficiency of the battery of the image acquisition device.
Referring to fig. 3, fig. 3 is a flow chart of an embodiment of the image processing method of the image acquisition device according to the present application. As shown in fig. 3, the flow of image processing of the image acquisition device may include the following steps:
Step S301 and step S302 are performed in sequence. Step S301: the processor receives, in the low frequency mode, image data from the image sensor in the map mode. Step S302: the video clipping module clips the image data to obtain data to be processed. It is then determined whether the data to be processed satisfies the first image processing condition or the second image processing condition. If the data to be processed satisfies the first image processing condition, step S303, step S304 and step S305 are performed in sequence. Step S303: in response to the received data to be processed satisfying the first image processing condition, the processor is adjusted to the high frequency mode. Step S304: each functional module is controlled to perform image processing on the data to be processed to obtain video data. Step S305: in response to the video data being stored, the memory is adjusted to the low power consumption mode and the image sensor is controlled to enter the sleep state. If the data to be processed satisfies the second image processing condition, step S306 and step S307 are performed in sequence. Step S306: in response to the received image data satisfying the second image processing condition, the data to be processed is written to the memory in the high power consumption mode. Step S307: in response to the data to be processed being stored, the memory is adjusted to the low power consumption mode and the image sensor is controlled to enter the sleep state.
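As an illustration of steps S301 to S307, the following sketch shows how the two image processing conditions select between full image processing in the high frequency mode and a plain write to the memory. The processor, clipper, memory and sensor objects, their method names, and the preset frame count are assumptions made for this sketch only.

```python
def process_frames(processor, clipper, memory, sensor, image_data, preset_count=5):
    pending = clipper.clip(image_data)                   # S302: data to be processed
    detections = processor.detect_target(pending)        # assumed target-object detector
    first_condition = bool(detections) or len(pending) >= preset_count

    if first_condition:
        processor.set_high_frequency_mode()              # S303: wake all functional modules
        video = processor.run_all_modules(pending)       # S304: full image processing
        memory.write(video)
    else:                                                # second condition: no target object
        memory.set_high_power_mode()                     # and fewer than preset_count frames
        memory.write(pending)                            # S306: just persist the data

    memory.set_low_power_mode()                          # S305 / S307: storage finished
    sensor.enter_sleep_state()
```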
It can be considered that controlling the starting of each functional module in the image acquisition device under different image processing conditions reduces the number of times non-target objects trigger the functional modules, thereby improving the use efficiency of the battery of the image acquisition device. In addition, switching the modules in the image acquisition device between different power consumption modes under different usage requirements can further improve the use efficiency of the battery of the image acquisition device.
According to the above scheme, in response to the image data received by the processor in the low frequency mode satisfying the first image processing condition, the processor is adjusted to the high frequency mode to start each functional module, and each functional module is controlled to perform image processing on the image data, where the first image processing condition includes that the image data contains related information of a target object and/or that the number of pieces of image data reaches a preset number. In this way, unnecessary starting of the functional modules can be reduced, thereby improving the use efficiency of the battery of the image acquisition device.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an image processing apparatus of an image acquisition device according to an embodiment of the present application. The image processing apparatus 40 of the image acquisition device includes a receiving module 41, an adjustment module 42 and a control module 43. The receiving module 41 is configured to receive image data from the image sensor in the low frequency mode, where at least part of the functional modules are in an inactive state in the low frequency mode. The adjustment module 42 is configured to adjust the processor to the high frequency mode to start each functional module in response to the received image data satisfying the first image processing condition, where the first image processing condition includes that the image data contains related information of a target object and/or that the number of pieces of image data reaches a preset number. The control module 43 is configured to control each functional module to perform image processing on the image data.
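A minimal sketch of this module split is shown below; the class and method names mirror the receiving module 41, the adjustment module 42 and the control module 43, but are otherwise illustrative assumptions rather than a concrete API.

```python
class ReceivingModule:
    def receive(self, sensor):
        # Receive image data from the sensor while the processor is in low frequency mode.
        return sensor.read_frames()

class AdjustmentModule:
    def maybe_raise_frequency(self, processor, image_data, has_target, preset_count):
        # First image processing condition: target information present and/or enough frames.
        if has_target or len(image_data) >= preset_count:
            processor.set_high_frequency_mode()
            return True
        return False

class ControlModule:
    def run(self, processor, image_data):
        # Drive each functional module to perform image processing on the image data.
        return processor.run_all_modules(image_data)
```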
According to the above scheme, in response to the image data received by the processor in the low frequency mode satisfying the first image processing condition, the processor is adjusted to the high frequency mode to start each functional module, and each functional module is controlled to perform image processing on the image data, where the first image processing condition includes that the image data contains related information of a target object and/or that the number of pieces of image data reaches a preset number.
For the functions performed by each module, reference is made to the foregoing embodiments of the image processing method of the image acquisition device; details are not repeated herein.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the application. The electronic device 50 includes a memory 51 and a processor 52, and the processor 52 is configured to execute program instructions stored in the memory 51 to implement the steps of the above embodiments of the image processing method of the image acquisition device. In a specific implementation scenario, the electronic device 50 may include, but is not limited to, mobile devices such as a notebook computer and a tablet computer.
In particular, the processor 52 is configured to control itself and the memory 51 to implement the steps of the above embodiments of the image processing method of the image acquisition device. The processor 52 may also be referred to as a CPU (Central Processing Unit). The processor 52 may be an integrated circuit chip having signal processing capabilities. The processor 52 may also be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. In addition, the processor 52 may be implemented jointly by integrated circuit chips.
According to the above scheme, in response to the image data received by the processor in the low frequency mode satisfying the first image processing condition, the processor is adjusted to the high frequency mode to start each functional module, and each functional module is controlled to perform image processing on the image data, where the first image processing condition includes that the image data contains related information of a target object and/or that the number of pieces of image data reaches a preset number.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an embodiment of a computer readable storage medium according to the present application. Program instructions 601 are stored on the computer readable storage medium 60, and the program instructions 601, when executed by a processor, implement the steps of any of the above embodiments of the image processing method of the image acquisition device.
According to the above scheme, in response to the image data received by the processor in the low frequency mode satisfying the first image processing condition, the processor is adjusted to the high frequency mode to start each functional module, and each functional module is controlled to perform image processing on the image data, where the first image processing condition includes that the image data contains related information of a target object and/or that the number of pieces of image data reaches a preset number.
In some embodiments, functions or modules included in an apparatus provided by the embodiments of the present disclosure may be used to perform a method described in the foregoing method embodiments, and specific implementations thereof may refer to descriptions of the foregoing method embodiments, which are not repeated herein for brevity.
The foregoing descriptions of the various embodiments tend to emphasize the differences between the embodiments; for the parts that are the same or similar, reference may be made between the embodiments, and they are not repeated herein for the sake of brevity.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for instance, the division of modules or units is merely a logical functional division, and there may be other division manners in actual implementation; for example, units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Claims (10)

1. An image processing method of an image acquisition device, wherein the image acquisition device comprises an image sensor, a processor and a plurality of functional modules, the method comprising:
the processor receives image data from the image sensor in a low frequency mode, wherein at least part of functional modules in the low frequency mode are in an inactive state;
In response to the received image data meeting a first image processing condition, adjusting the processor to a high-frequency mode to start each functional module, wherein the first image processing condition comprises that the image data contains relevant information of a target object and/or the number of the image data reaches a preset number;
And controlling each functional module to perform image processing on the image data.
2. The method of claim 1, wherein the image acquisition device further comprises a memory, the method further comprising:
Writing the image data to the memory in a high power consumption mode in response to the received image data satisfying a second image processing condition, the second image processing condition including that the image data does not contain related information of the target object and the number of the image data does not reach the preset number;
And adjusting the memory to a low power consumption mode in response to the completion of the storage of the image data, and controlling the image sensor to enter a dormant state.
3. The method of claim 2, wherein before said adjusting said processor to a high frequency mode in response to said received image data meeting a first image processing condition, said method further comprises:
Clipping the image data to obtain data to be processed;
Performing object detection processing and/or quantity statistical processing on the data to be processed to obtain a data processing result, wherein the data processing result comprises an object detection result and/or a quantity statistical result, the object detection result comprises detection of related information of the target object contained in the image data or related information of the target object not contained in the image data, and the quantity statistical result comprises the total quantity corresponding to each image in the data to be processed;
Determining whether the received image data satisfies the first image processing condition based on the data processing result.
4. A method according to any one of claims 1 to 3, wherein the operating state of the image sensor includes a motion detection mode and a map mode, the method further comprising, before the processor receives image data from the image sensor in a low frequency mode:
Controlling the image sensor in the motion detection mode to detect the image data to obtain a motion detection result, wherein the motion detection result comprises a first motion detection result indicating that the image data contains related information of a moving object;
And in response to the motion detection result being the first motion detection result, adjusting the image sensor to the map mode, so that the processor receives image data from the image sensor in a low frequency mode.
5. The method of claim 4, wherein the first motion detection result comprises a number of current motion detection events, each current motion detection event characterizing that an object to be confirmed is in a motion state, and wherein the adjusting the image sensor to the map mode in response to the motion detection result being the first motion detection result comprises:
in response to the first motion detection result satisfying a first preset condition, adjusting the image sensor to the map mode, wherein the first preset condition comprises that there is no historical motion detection event before a current timestamp corresponding to the current motion detection event in the first motion detection result, and the historical motion detection event represents that the object to be confirmed is in a motion state before the current timestamp; or,
And in response to the first motion detection result satisfying a second preset condition, adjusting the image sensor to the map mode, wherein the second preset condition comprises that, in a case where a historical motion detection event exists before the current timestamp corresponding to the current motion detection event in the first motion detection result, an interval time between the current timestamp corresponding to at least one current motion detection event in the first motion detection result and a historical timestamp of the historical motion detection event is greater than or equal to a second preset time.
6. The method of claim 5, wherein the method further comprises:
And in response to the first motion detection result satisfying a third preset condition, controlling the image sensor to enter a dormant state, wherein the third preset condition comprises that, in a case where a historical motion detection event exists before the current timestamp corresponding to the current motion detection event in the first motion detection result, the interval time between each current timestamp corresponding to the current motion detection events in the first motion detection result and the historical timestamp of the historical motion detection event is smaller than a third preset time.
7. The method of claim 4, wherein the motion detection result includes a second motion detection result indicating that the image data does not contain related information of a moving object, the method further comprising:
And in response to the motion detection result being the second motion detection result, controlling the image sensor to enter the dormant state.
8. A method according to any one of claims 1 to 3, wherein before the processor receives image data from the image sensor in a low frequency mode, the method further comprises:
setting a wake-up condition to the image sensor in response to the image sensor powering up;
and controlling the image sensor to perform state switching at a predetermined frequency, wherein the wake-up condition comprises adjusting the image sensor in a dormant state to a working state after a target wake-up time, and the image sensor in the working state can be used for acquiring the image data.
9. An electronic device, comprising: a memory and a processor, wherein the memory stores program instructions, the processor retrieving the program instructions from the memory to perform the method of any of claims 1-8.
10. A computer-readable storage medium, wherein a program file is stored on the computer-readable storage medium, and the program file, when executed by a processor, implements the method according to any one of claims 1-8.
CN202410632121.1A 2024-05-21 2024-05-21 Image processing method of image acquisition device, electronic device and storage medium Active CN118214946B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410632121.1A CN118214946B (en) 2024-05-21 2024-05-21 Image processing method of image acquisition device, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410632121.1A CN118214946B (en) 2024-05-21 2024-05-21 Image processing method of image acquisition device, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN118214946A true CN118214946A (en) 2024-06-18
CN118214946B CN118214946B (en) 2024-08-30

Family

ID=91446605

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410632121.1A Active CN118214946B (en) 2024-05-21 2024-05-21 Image processing method of image acquisition device, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN118214946B (en)


Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101489292A (en) * 2009-02-25 2009-07-22 南京邮电大学 Target tracking method in wireless multimedia sensor network
CN105472318A (en) * 2015-11-17 2016-04-06 深圳市共进电子股份有限公司 Method and system for starting low-power network camera
CN105812734A (en) * 2016-03-17 2016-07-27 深圳市哈工大交通电子技术有限公司 Low power consumption video sensor and control method of the low power consumption video sensor
WO2019154241A1 (en) * 2018-02-08 2019-08-15 云丁网络技术(北京)有限公司 Power consumption control method and camera device
CN109873952A (en) * 2018-06-20 2019-06-11 成都市喜爱科技有限公司 A kind of method, apparatus of shooting, equipment and medium
CN111327821A (en) * 2020-02-24 2020-06-23 珠海格力电器股份有限公司 Control method and device of intelligent camera device, computer equipment and storage medium
CN113938598A (en) * 2020-07-14 2022-01-14 浙江宇视科技有限公司 Surveillance camera wake-up method, device, device and medium
US20230269462A1 (en) * 2020-07-14 2023-08-24 Zhejiang Uniview Technologies Co., Ltd. Wake-up method for surveillance camera, surveillance camera, and non-transitory computer-readable storage medium
WO2023160000A1 (en) * 2022-02-25 2023-08-31 上海商汤智能科技有限公司 Sleep wakeup method and related product
CN114785954A (en) * 2022-04-27 2022-07-22 深圳影目科技有限公司 Processor wake-up method, device, system, storage medium and AR glasses
CN117992128A (en) * 2022-11-03 2024-05-07 华为技术有限公司 A wake-up method and device
CN116458849A (en) * 2023-04-23 2023-07-21 石家庄铁道大学 A progressively triggered human body abnormal state recognition system and application method
CN116962877A (en) * 2023-08-17 2023-10-27 武汉中原电子信息有限公司 Quick starting method and related device for low-power-consumption scattering type monitoring equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118764709A (en) * 2024-09-05 2024-10-11 浙江宇视科技有限公司 A method, device, equipment and medium for switching working modes of image collector
CN119559563A (en) * 2025-02-06 2025-03-04 浙江大华技术股份有限公司 Image data processing method, electronic device and storage medium

Also Published As

Publication number Publication date
CN118214946B (en) 2024-08-30

Similar Documents

Publication Publication Date Title
CN118214946B (en) Image processing method of image acquisition device, electronic device and storage medium
CN108345524B (en) Application monitoring method and application monitoring device
CN109167931B (en) Image processing method, device, storage medium and mobile terminal
US20130222629A1 (en) Methods, apparatuses, and computer program products for facilitating concurrent video recording and still image capture
WO2014177049A1 (en) Method and device for adjusting frame rate of video recording
EP3382527B1 (en) Method and apparatus for managing a shared storage system
CN109788138A (en) Screen control method, device, terminal and storage medium
CN107729889A (en) Image processing method and device, electronic device, computer-readable storage medium
US20190364221A1 (en) Camera Control Method and Terminal
CN105100730A (en) Monitoring method and camera device
CN110636598B (en) Terminal power saving control method and device, mobile terminal and storage medium
CN107544842A (en) Application program processing method and device, computer equipment, storage medium
US20230262331A1 (en) Image capture method and apparatus, mobile terminal, and storage medium
CN105323484B (en) Rapid photographing method and electronic equipment
EP2738712A1 (en) Embedded multimedia card partitioned storage space adjustment method and terminal
CN103885568B (en) A kind of method and device reducing electric current when taking pictures
WO2019034107A1 (en) Power saving method and smart glasses
WO2025119132A1 (en) Photographing method and apparatus, and electronic device
US20150116523A1 (en) Image signal processor and method for generating image statistics
CN117793405A (en) Video playing method, device, equipment and storage medium
WO2024222606A1 (en) Cache resource adjustment method and apparatus, and electronic device and readable storage medium
CN107491349A (en) Application program processing method and device, computer equipment, storage medium
CN116610445A (en) Queue depth adjustment method, device, electronic device and readable storage medium
CN115543500B (en) Window processing method and electronic equipment
CN115587049A (en) Memory reclamation method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant