CN112486684B - Driving image display method, device and platform, storage medium and embedded equipment - Google Patents
- Publication number: CN112486684B (application CN202011373475.7A)
- Authority
- CN
- China
- Legal status: Active
Classifications
- G06F9/505 — Allocation of resources (e.g. of the CPU) to service a request, the resource being a machine, considering the load
- G06F9/4418 — Suspend and resume; hibernate and awake
- G06F9/4881 — Scheduling strategies for dispatcher, e.g. round robin, multi-level priority queues
- G06T1/20 — Processor architectures; processor configuration, e.g. pipelining
- G07C5/008 — Registering or indicating the working of vehicles, communicating information to a remotely located station
- G07C5/0866 — Registering performance data using electronic data carriers, the carrier being a digital video recorder in combination with a video camera
- H04N5/04 — Details of television systems; synchronising
- Y02D10/00 — Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
A driving image display method, device and platform, storage medium and embedded device, the method includes: when 2D environment information and 3D environment information are synchronously displayed, 2D display parameters and 3D display parameters are obtained, wherein the 2D display parameters are parameters when the 2D environment information is displayed, and the 3D display parameters are parameters when the 3D environment information is displayed; acquiring running information of a CPU (central processing unit), wherein the CPU comprises a plurality of cores; adjusting the frequency of the CPU and/or changing an execution core according to the 2D display parameters, the 3D display parameters and/or the running information of the CPU, wherein the execution core is a CPU core for running a target thread, and the target thread is a thread for processing 2D environment information and/or 3D environment information. The method and the device can solve the problems of reduction of the display frame rate and display delay when the existing panoramic driving image system simultaneously displays 2D and 3D pictures.
Description
Technical Field
The invention relates to the technical field of video display, and in particular to a driving image display method, device and platform, a storage medium, and an embedded device.
Background
With the rapid development of image and vision technologies, more and more related techniques are being applied in the field of vehicle-mounted electronics. A traditional driving image system uses a single camera installed at the tail of the vehicle and can only cover a limited viewing angle around the rear, so the driver cannot see the surroundings of the whole vehicle, which greatly increases driving safety hazards.
Current panoramic driving image systems sense the vehicle's surrounding environment with multiple cameras and present the driving information in a 360-degree (also written 360°) planar two-dimensional (2D) mode or a 360-degree three-dimensional (3D) mode. Camera capture devices commonly available on the market output at a frame rate of 30 frames per second (fps), i.e. a frame interval of about 33 milliseconds (ms), in YUV format. YUV is a color encoding method in which Y represents brightness (luma, a gray-scale value) and U and V represent chrominance. However, current panoramic driving image systems either support only the 360° planar 2D mode and cannot present 360° 3D environment information, or display only the 360° 3D mode and are not compatible with the 2D planar mode.
If 2D and 3D pictures are displayed simultaneously, the embedded device, constrained by performance and power consumption, cannot run at full power on the vehicle platform for long periods, so an unoptimized 2D/3D algorithm running directly on the embedded platform may fail to reach 30 fps. Displaying 2D and 3D pictures simultaneously then reduces the 2D/3D display frame rate and causes display delay, failing to meet real-time display requirements. When a driver handles situations such as vehicle starting, turning while driving, parking, narrow-road meeting, and obstacle avoidance, the distance deviation caused by this delay can affect the driver's judgment and compromise driving safety.
Disclosure of Invention
The invention solves the technical problems of reduction of the display frame rate and display delay when the existing panoramic driving image system simultaneously displays 2D and 3D pictures.
In order to solve the above problem, an embodiment of the present invention provides a driving image display method, including: when 2D environment information and 3D environment information are synchronously displayed, 2D display parameters and 3D display parameters are obtained, wherein the 2D display parameters are parameters when the 2D environment information is displayed, and the 3D display parameters are parameters when the 3D environment information is displayed; acquiring running information of a CPU (central processing unit), wherein the CPU comprises a plurality of cores; adjusting the frequency of the CPU and/or changing an execution core according to the 2D display parameters, the 3D display parameters and/or the running information of the CPU, wherein the execution core is a CPU core for running a target thread, and the target thread is a thread for processing 2D environment information and/or 3D environment information.
Optionally, the operation information of the CPU includes a temperature of the CPU and/or a frequency of the CPU.
Optionally, the 2D display parameters include a display frame rate for displaying the 2D environment information, and the 3D display parameters include a display frame rate for displaying the 3D environment information.
Optionally, the CPU core includes at least one large core and a plurality of small cores, and the adjusting the execution core includes: and if the CPU frequency value needing to be set is adjusted to the maximum frequency value and the temperature of the CPU is increased within first preset time, changing the execution core into a large core.
Optionally, after changing the execution core to a big core, the method further includes: and if the temperature of the CPU is reduced within a second preset time, changing the execution core into a small core.
Optionally, the CPU frequency value to be set, freq_new, is adjusted by a linear formula of the following quantities: freq_max, the maximum frequency; K, a transition factor; load, the current load of the CPU; load_max, the maximum load of the CPU; P, a compensation factor; CPU_temp, the temperature of the CPU; and fps_2D/3D, the 2D/3D display frame rate.
Optionally, the method further includes: and controlling the CPU core without executing the task to enter a shallow sleep mode.
Optionally, before adjusting the execution core according to the 2D display parameter, the 3D display parameter, and/or the operation information of the CPU, the method further includes: acquiring 2D display parameters and 3D display parameters from an application layer; and acquiring the temperature of the CPU and the frequency of the CPU from a CPU kernel.
Optionally, the target thread at least includes: the method comprises a shot image acquisition thread, a 2D algorithm processing thread and a 3D algorithm processing thread.
Optionally, the method further includes: the shot image acquisition thread acquiring shot images from a plurality of cameras of the vehicle and storing them in a buffer area, wherein the shot images include 2D images and 3D images; intercepting at least one frame of 2D image from the buffer area each time and sending it to the 2D algorithm processing thread, which processes the 2D image to obtain displayable 2D environment information; and intercepting at least one frame of 3D image from the buffer area each time and sending it to the 3D algorithm processing thread, which processes the 3D image to obtain displayable 3D environment information.
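The three-thread decoupling described above can be sketched with standard threading primitives. The queues and stub frames below are illustrative simplifications (real frames would come from the vehicle's cameras), and all names are assumptions, not identifiers from the patent:

```python
import queue
import threading

# Illustrative sketch of the three independent threads named in the text:
# a capture thread plus separate 2D and 3D algorithm threads.

frames_2d: queue.Queue = queue.Queue()
frames_3d: queue.Queue = queue.Queue()
results: queue.Queue = queue.Queue()

def capture_thread(n_frames: int) -> None:
    for i in range(n_frames):
        frame = bytes([i])       # stand-in for one captured YUV frame
        frames_2d.put(frame)     # the same frame fans out to both
        frames_3d.put(frame)     # algorithm threads
    frames_2d.put(None)          # sentinel: no more frames
    frames_3d.put(None)

def algo_thread(src: queue.Queue, tag: str) -> None:
    while (frame := src.get()) is not None:
        results.put((tag, frame))  # stand-in for 2D/3D processing

threads = [
    threading.Thread(target=capture_thread, args=(3,)),
    threading.Thread(target=algo_thread, args=(frames_2d, "2D")),
    threading.Thread(target=algo_thread, args=(frames_3d, "3D")),
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Because each thread blocks only on its own queue, capture is never stalled by a slow algorithm thread, which is the point of splitting the pipeline into three threads.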
Optionally, the method further includes: defining at least 5 pointers, the pointers including a first pointer, a second pointer, a third pointer, a fourth pointer, and a fifth pointer; when at least one frame of 2D image is intercepted from the cache region each time, the first pointer and the second pointer respectively point to the starting position and the stopping position of the current interception; when at least one frame of 3D image is intercepted from the cache region each time, the third pointer and the fourth pointer respectively point to the starting position and the stopping position of the current interception; the fifth pointer points to the maximum capacity position of the cache region, and the positions pointed by the first pointer, the second pointer, the third pointer and the fourth pointer cannot exceed the position pointed by the fifth pointer.
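A minimal sketch of this five-pointer scheme over a shared byte buffer follows; the class name, attribute names, and frame layout are illustrative assumptions, not from the patent:

```python
class FrameBuffer:
    """Sketch of the five-pointer scheme: p1/p2 bound the current 2D
    read, p3/p4 bound the current 3D read, and p5 marks the buffer's
    maximum capacity. No read pointer may pass p5."""

    def __init__(self, capacity_frames: int, frame_size: int):
        self.buf = bytearray(capacity_frames * frame_size)
        self.frame_size = frame_size
        self.p5 = capacity_frames * frame_size  # fifth (capacity) pointer
        self.p1 = self.p2 = 0                   # 2D read start/stop
        self.p3 = self.p4 = 0                   # 3D read start/stop

    def read_2d(self, n_frames: int = 1) -> memoryview:
        self.p1 = self.p2                       # start where last read stopped
        self.p2 = min(self.p1 + n_frames * self.frame_size, self.p5)
        return memoryview(self.buf)[self.p1:self.p2]  # zero-copy view

    def read_3d(self, n_frames: int = 1) -> memoryview:
        self.p3 = self.p4
        self.p4 = min(self.p3 + n_frames * self.frame_size, self.p5)
        return memoryview(self.buf)[self.p3:self.p4]
```

The two independent pointer pairs let the 2D and 3D threads consume frames at their own pace, and returning `memoryview` slices (rather than copies) mirrors the zero-copy idea the text mentions.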
Optionally, Zero-Copy technology is used in at least one of the following operations: acquiring a shot image, intercepting a 2D image and intercepting a 3D image.
Optionally, the method further includes: receiving control information sent by a vehicle-mounted central control and/or gear controller; and synchronously displaying the 2D environment information and the 3D environment information according to the control information.
Optionally, when the 2D environment information and the 3D environment information are displayed synchronously, the size of the area for displaying the 2D environment information is smaller than the size of the area for displaying the 3D environment information.
The embodiment of the invention also provides a driving image display device, which comprises: the device comprises a parameter acquisition module, a parameter display module and a parameter display module, wherein the parameter acquisition module is used for acquiring 2D display parameters and 3D display parameters when synchronously displaying 2D environment information and 3D environment information, the 2D display parameters are parameters when displaying the 2D environment information, and the 3D display parameters are parameters when displaying the 3D environment information; the CPU operation information acquisition module is used for acquiring the operation information of the CPU, and the CPU comprises a plurality of cores; and the execution core adjusting module is used for adjusting the frequency of the CPU and/or changing the execution core according to the 2D display parameters, the 3D display parameters and/or the running information of the CPU, wherein the execution core is a CPU core for running a target thread, and the target thread is a thread for processing the 2D environment information and/or the 3D environment information.
Embodiments of the present invention further provide a storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform any of the steps of the method.
The embodiment of the present invention further provides an embedded device, where a CPU of the embedded device is multi-core, the embedded device may include a memory and a processor, the memory stores a computer program that can be run on the processor, and the processor executes any of the steps of the method when running the computer program.
The embodiment of the invention also provides a driving image display platform which comprises a plurality of cameras for acquiring the environmental information of the vehicle and acquiring the shot image, an embedded device and a vehicle-mounted display terminal for displaying the 2D environmental information and/or the 3D environmental information.
Compared with the prior art, the technical scheme of the embodiment of the invention has the following beneficial effects:
the embodiment of the invention provides a driving image display method, which comprises the following steps: when 2D environment information and 3D environment information are synchronously displayed, 2D display parameters and 3D display parameters are obtained, wherein the 2D display parameters are parameters when the 2D environment information is displayed, and the 3D display parameters are parameters when the 3D environment information is displayed; acquiring running information of a CPU (central processing unit), wherein the CPU comprises a plurality of cores; adjusting the frequency of the CPU and/or changing an execution core according to the 2D display parameters, the 3D display parameters and/or the running information of the CPU, wherein the execution core is a CPU core for running a target thread, and the target thread is a thread for processing 2D environment information and/or 3D environment information. Compared with the prior art, the invention provides a scheme for simultaneously displaying 2D and 3D pictures, can monitor the execution conditions of the 2D algorithm and the 3D algorithm and the operation information of a CPU (central processing unit) in real time in order to avoid the situation that the display frame rate is reduced due to the fact that the 2D/3D algorithm is directly operated on an embedded device, and can adjust the execution core to ensure the hardware support when the synchronous real-time display of the 2D and 3D environmental information cannot be supported, thereby ensuring the frame rate and the real-time performance of the 2D/3D display, and solving the problems of reduction of the display frame rate and the display delay when the existing panoramic driving image system simultaneously displays the 2D and 3D pictures.
Furthermore, the scheduling strategy allows the CPU only to enter the light sleep mode and forbids it from entering the deep sleep mode, which reduces wake-up delay.
Furthermore, shot image acquisition, 2D algorithm processing, and 3D algorithm processing are split into three independent threads, avoiding mutual interference between them and improving processing efficiency.
Further, at least 5 pointers are defined in the Buffer method for the 2D/3D algorithm processing thread. The 2D/3D algorithm processing threads capable of running simultaneously can read data in the Buffer simultaneously through two sets of data reading pointers (one set is a first pointer and a second pointer, and the other set is a third pointer and a fourth pointer).
Furthermore, zero-copy technology is used throughout: a multi-pointer scheme over the Buffer data lets multiple threads access the Buffer's data concurrently, and only pointers (not the underlying data) are copied during parsing, encapsulation, and transfer, reducing the delay that copying real data would cause. Camera data acquisition is decoupled from the 2D/3D algorithm processing, and multithreaded acceleration plus zero-copy together reduce data and algorithm processing delay.
Drawings
Fig. 1 is a schematic flow chart of a driving image display method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a scheduling policy according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of 3 threads according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating the execution of one embodiment of thread 1 of FIG. 3;
FIG. 5 is a diagram illustrating the pointing positions of 5 defined pointers according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a driving image display device according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a driving image display platform according to an embodiment of the present invention.
Detailed Description
As described in the background art, there are problems of a display frame rate reduction and a display delay when a panoramic driving video system simultaneously displays 2D and 3D pictures in the prior art.
Specifically, the output frame rate of embedded video capture devices commonly used in the market at present is 30 fps, i.e. about 33 ms per frame. If the 2D/3D display is delayed by some number of frames, the relationship between delayed frames and delay time can be expressed by formula (1):
T=0.033×n (1);
where n is the number of delayed frames and T is the total delay time in seconds.
If the driver is travelling at 10 kilometers per hour (km/h) while dealing with situations such as vehicle start, turning while driving, parking, narrow-road meeting, and obstacle avoidance, the distance deviation due to the delay can be expressed by equation (2):
S = T × 2.8 (2);
where S is the delay offset distance in meters and 2.8 meters per second (m/s) is the speed corresponding to 10 km/h.
If the video capture device outputs images at 30 fps but the 2D/3D display output frame rate is only 25 fps, the result is a delay of 5 frames, and the distance offset computed with the formulas above is about 0.46 m. The faster the vehicle travels, the greater the offset. The root cause is that the embedded device, constrained by performance and power consumption, cannot run at full power on the vehicle platform for long periods, so an unoptimized 2D/3D algorithm running directly on the embedded platform may fail to reach 30 fps. For the 2D/3D display to meet the real-time frame rate requirement, the 2D/3D algorithms must complete within the 33 ms interval between frames.
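The delay arithmetic of formulas (1) and (2) can be sketched as a small helper; the function names are illustrative only:

```python
FRAME_INTERVAL_S = 0.033  # ~1/30 s per frame at 30 fps

def delay_time(delayed_frames: int) -> float:
    """Formula (1): total delay time T for n delayed frames."""
    return FRAME_INTERVAL_S * delayed_frames

def offset_distance(delay_s: float, speed_mps: float = 2.8) -> float:
    """Formula (2): distance offset S at the given speed
    (2.8 m/s is roughly 10 km/h)."""
    return delay_s * speed_mps

# 30 fps capture but only 25 fps display -> 5 delayed frames
t = delay_time(5)        # 0.165 s
s = offset_distance(t)   # ~0.46 m, matching the text's example
```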
Based on the above technical problem, an embodiment of the present invention provides a driving image display method, including: when 2D environment information and 3D environment information are synchronously displayed, 2D display parameters and 3D display parameters are obtained, wherein the 2D display parameters are parameters when the 2D environment information is displayed, and the 3D display parameters are parameters when the 3D environment information is displayed; acquiring running information of a CPU (central processing unit), wherein the CPU comprises a plurality of cores; and adjusting an execution core according to the 2D display parameters, the 3D display parameters and/or the running information of the CPU, wherein the execution core is a CPU core for running a target thread, and the target thread is a thread for processing 2D environment information and/or 3D environment information.
By the scheme, the problems of reduction of the display frame rate and display delay when the panoramic driving image system simultaneously displays 2D and 3D pictures can be solved, so that the driving safety is improved.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
Referring to fig. 1, fig. 1 is a driving image display method according to an embodiment of the present invention, including the following steps:
step S101, when synchronously displaying 2D environment information and 3D environment information, obtaining 2D display parameters and 3D display parameters, wherein the 2D display parameters are parameters when displaying the 2D environment information, and the 3D display parameters are parameters when displaying the 3D environment information;
among them, the 2D environment information is an image of the vehicle surrounding environment information displayed on a display device (such as a screen, etc.). The 3D environment information is an image of the vehicle surrounding environment information displayed on the display device.
The 2D/3D display parameters are attribute parameters representing how the algorithm executes when the 2D/3D algorithm runs, for example the state of the thread executing the 2D/3D algorithm, or attribute parameters of the output 2D/3D environment information to be displayed, such as its display frame rate. The display frame rate, measured in frames per second (fps), is the number of pictures of an animation or video displayed per second, and fps is a measure of the amount of information used to store and display motion video. The more frames per second, the smoother the displayed video motion.
In a specific embodiment, the 2D display parameter includes a display frame rate for displaying the 2D environment information, and the 3D display parameter includes a display frame rate for displaying the 3D environment information. Therefore, whether display delay is possible to occur in the current display can be monitored in real time through the display frame rate of the 2D/3D environment information.
Step S102, obtaining running information of a CPU, wherein the CPU comprises a plurality of cores;
A Central Processing Unit (CPU) is the final execution unit for information processing and program operation, serving as the computing and control core of a computer system. Optionally, the CPU may have a multi-core structure, either Symmetric Multiprocessing (SMP) or Heterogeneous Multiprocessing (HMP). In SMP, all processors are peers sharing the same block of physical memory over a bus connection, so all resources in the system (CPU, memory, input/output (I/O) interfaces, etc.) are shared. HMP allows dynamic configuration of different types of CPUs, Graphics Processing Units (GPUs), and other processing engines in the system, ensuring the most appropriate task allocation, optimal performance, and lowest power consumption during actual operation. Such multiprocessor technology can fully unlock the complex computing capability of a System-on-a-Chip (SoC).
The operation information of the CPU is related information for indicating a state of the CPU at the time of operation, such as an occupancy (or occupancy) of the CPU, a frequency of the CPU, a temperature of the CPU, and the like. In a specific embodiment, the operation information of the CPU includes a temperature of the CPU and/or a frequency of the CPU.
Step S103, adjusting the frequency of the CPU and/or replacing an execution core according to the 2D display parameters, the 3D display parameters and/or the running information of the CPU, wherein the execution core is a CPU core for running a target thread, and the target thread is a thread for processing 2D environment information and/or 3D environment information.
When the 2D environment information and the 3D environment information are synchronously displayed, the execution conditions of the 2D algorithm and the 3D algorithm are monitored according to the 2D display parameters and the 3D display parameters, the operation condition of hardware equipment for realizing the method is monitored through the operation information of the CPU, and when the condition that the execution core of the CPU cannot support synchronous real-time display of the 2D environment information and the 3D environment information is detected, the frequency of the CPU can be adjusted and/or the execution core can be replaced.
Optionally, the CPU core includes at least one large core and a plurality of small cores, the large core has a stronger computational capability than the small cores, and an existing multi-core CPU may include 2 large cores and 6 small cores, or 4 large cores and 4 small cores, and the like. Adjusting the execution core may also include moving the target thread from the large core to the small core, or from the small core to the large core.
Step S103 sets the scheduling policy of the CPU via a CPU frequency modulation module (the CPU Freq module shown in fig. 2). Taking 8 CPU cores as an example (2 big cores, Big Core1 and Big Core2 in fig. 2, and 6 little cores, Little Core1 through Little Core6 in fig. 2), a schematic diagram of the scheduling policy is shown in fig. 2. The scheduling policy essentially allocates the CPU's computing resources on demand. The frequency modulation operation is integrated into the scheduling policy, and the CPU frequency it manages must be updated in real time because the CPU's running information and the 2D/3D display parameters change.
Optionally, the CPU Freq module may also migrate the target thread to run on another CPU core. Because the scheduling policy manages multiple CPU cores, it can schedule computing resources onto different cores by recognizing the performance and power-consumption differences between CPU cores of different architectures.
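On a Linux-based embedded platform, migrating a thread between core clusters can be expressed through CPU affinity. The sketch below uses Python's `os.sched_setaffinity`; the core IDs chosen for the big and little clusters are illustrative assumptions, not values from the patent:

```python
import os

# Illustrative core numbering for a 2-big + 6-little CPU (assumption).
BIG_CORES = {6, 7}
LITTLE_CORES = {0, 1, 2, 3, 4, 5}

def move_to_big_cores(pid: int = 0) -> None:
    """Pin the calling process/thread (pid 0 = self) to the big cores."""
    os.sched_setaffinity(pid, BIG_CORES)

def move_to_little_cores(pid: int = 0) -> None:
    """Pin the calling process/thread back to the little cores."""
    os.sched_setaffinity(pid, LITTLE_CORES)
```

In a real system the target thread's ID, rather than pid 0, would be passed, and the core numbering would be read from the platform's topology description.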
Within the scheduling policy, the CPU Freq module may apply Dynamic Voltage and Frequency Scaling (DVFS), i.e. dynamically scaling voltage and frequency to balance performance and power consumption. Specifically, the chip's operating frequency and voltage are adjusted dynamically according to the computing demands the running application places on the CPU; for a given chip, a higher frequency requires a higher voltage.
In addition, when the 2D environment information and the 3D environment information are displayed in synchronization, the size of the area where the 2D environment information is displayed is smaller than the size of the area where the 3D environment information is displayed.
Alternatively, the 2D environment information and the 3D environment information may be displayed on the same screen or may be displayed on different screens. Optionally, if the two-dimensional images are displayed on the same screen, the size of the area for displaying the 2D environment information accounts for one third of the total size of the screen, and the size of the area for displaying the 3D environment information accounts for two thirds of the total size of the screen.
The embodiment of fig. 1 provides a scheme for simultaneously displaying 2D and 3D pictures, which can monitor the execution conditions of the 2D algorithm and the 3D algorithm and the operation information of a CPU in real time in order to avoid the situation that the display frame rate is reduced due to the direct operation of the 2D/3D algorithm on an embedded device, and adjust an execution kernel to ensure hardware support when the synchronous real-time display of the 2D and 3D environmental information cannot be supported, thereby ensuring the frame rate and real-time performance of 2D/3D display, and solving the problems of the display frame rate reduction and display delay when the existing panoramic driving image system simultaneously displays 2D and 3D pictures.
In one embodiment, the CPU core of the hardware device executing the method includes at least one big core and several small cores, and the adjusting the execution core in step S103 may be: and if the CPU frequency value needing to be set is adjusted to the maximum frequency value and the temperature of the CPU is increased within first preset time, changing the execution core into a large core.
The CPU frequency value to be set is a frequency value capable of meeting the 2D/3D display requirement. The maximum frequency value is a set CPU threshold; when the CPU frequency reaches this threshold, core switching is performed in combination with the CPU temperature, and the core with stronger computing power is taken as the execution core to ensure synchronous display.
Optionally, after the changing the execution core into the big core, the method further includes: and if the temperature of the CPU is reduced within a second preset time, changing the execution core into a small core.
When the temperature of the CPU drops within the second preset time, the 2D/3D algorithm is scheduled onto a little core in combination with the scheduling policy. The first preset time may be the same as or different from the second preset time; both are windows for monitoring CPU operation and can be set as required.
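The switching rule of this embodiment can be sketched as follows. This is an illustrative simplification: the function name, the two-sample temperature window, and the string core labels are assumptions, not the patent's implementation.

```python
def choose_execution_core(freq_to_set, freq_max, temp_samples, current_core):
    """Big/little switching sketch: move to the big core when the required
    frequency has reached the maximum and the CPU temperature still rises
    over the first monitoring window; move back to a little core when the
    temperature falls over the second window."""
    temp_rising = temp_samples[-1] > temp_samples[0]
    if freq_to_set >= freq_max and temp_rising:
        return "big"          # frequency alone no longer suffices
    if current_core == "big" and not temp_rising:
        return "little"       # thermal headroom recovered, save power
    return current_core
```

The design intent is that frequency scaling is exhausted first; only when the maximum frequency still cannot keep the display synchronized (indicated here by a still-rising temperature) is the work migrated to the stronger core.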
In one embodiment, the CPU frequency linear adjustment and the big/little core switching judgment in the scheduling policy may refer to the following formula (3) and formula (4).
The value of P satisfies the formula:
wherein freq_new is the CPU frequency to be set, freq_max is the maximum frequency, K is a transition factor, load is the current load of the CPU, load_max is the maximum load of the CPU, P is a compensation factor, cpu_temp is the temperature of the CPU, and fps_2D/3D is the 2D/3D display frame rate.
In one embodiment, referring to fig. 1 and fig. 2, the method further includes: controlling any CPU core that is not executing a task to enter a shallow sleep mode.
Optionally, such a core is controlled to enter the sleep mode by the CPU Idle (as shown in fig. 2) module in the scheduling policy of the CPU.
The sleep modes of a CPU core may include a deep sleep mode and a shallow sleep mode. Although a core in deep sleep consumes the least power, its wake-up delay is significantly longer; therefore, to reduce wake-up latency, the scheduling policy only allows the CPU to enter the shallow sleep mode and prohibits it from entering the deep sleep mode.
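The restriction to shallow sleep can be pictured as a filter over the available idle states. The state list and latency figures below are assumptions for illustration, not values from the patent.

```python
# Hypothetical idle-state table: deeper states save more power but wake slower.
IDLE_STATES = [
    {"name": "WFI",      "exit_latency_us": 1,   "deep": False},  # shallow sleep
    {"name": "core-off", "exit_latency_us": 500, "deep": True},   # deep sleep
]

def allowed_idle_states(states, allow_deep=False):
    """Keep only shallow states unless deep sleep is explicitly allowed,
    mirroring the policy of prohibiting deep sleep to bound wake-up latency."""
    return [s for s in states if allow_deep or not s["deep"]]
```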
In one embodiment, before the adjusting the execution core according to the 2D display parameter, the 3D display parameter, and/or the operation information of the CPU, the method further includes: acquiring 2D display parameters and 3D display parameters from an Application layer (Application layer in FIG. 2); the 2D/3D display parameter may be a display frame rate (abbreviated as display frame rate in fig. 2) for displaying the 2D/3D environment information. And acquiring the temperature of the CPU and the frequency of the CPU from a CPU Kernel (Kernel).
In one embodiment, the target threads in FIG. 1 include at least 3 threads in FIG. 3: a captured image acquisition thread (e.g., thread 1 in fig. 3), a 2D algorithm processing thread (e.g., thread 2 in fig. 3), and a 3D algorithm processing thread (e.g., thread 3 in fig. 3).
The captured image acquisition thread acquires the data captured by one or more cameras (Camera) that photograph the surroundings of the vehicle, and performs data parsing and data encapsulation (shown in fig. 3). Generally, the cameras cover the four directions of the front, rear, left, and right of the vehicle. The data parsing decomposes the data of the multiple cameras as required: if cameras are installed in the front, rear, left, and right directions of the vehicle, the single group of acquired data needs to be parsed into four independent data groups, namely front, rear, left, and right. Further, the data collected by the Camera is in YUV format, and each of the four independent data groups can be further parsed into data sets of the Y component and the UV component. The data encapsulation expresses the separated data in a Buffer method.
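The parsing step can be pictured with the sketch below. The concatenated front/rear/left/right layout and the NV12-style Y-then-interleaved-UV arrangement are assumptions for illustration; the patent does not specify the exact memory layout.

```python
def parse_camera_frame(raw, cameras=("front", "rear", "left", "right")):
    """Split one combined capture into per-camera buffers, then split each
    camera's YUV data into its Y component and UV component. Assumes the
    four camera frames are simply concatenated, and that within each frame
    the first 2/3 is the Y plane and the last 1/3 the interleaved UV plane
    (as in NV12 4:2:0)."""
    per_cam = len(raw) // len(cameras)
    result = {}
    for i, cam in enumerate(cameras):
        frame = raw[i * per_cam:(i + 1) * per_cam]
        y_len = per_cam * 2 // 3
        result[cam] = {"Y": frame[:y_len], "UV": frame[y_len:]}
    return result
```

Note that slicing `bytes` in Python copies data; a real embedded implementation would instead record offsets into the shared Buffer, in keeping with the zero-copy approach described below.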
The 2D algorithm processing thread and the 3D algorithm processing thread read the data stored in the Buffer simultaneously. The 2D algorithm processing thread processes the acquired data to obtain displayable 2D environment information and transmits it to the display device; the 3D algorithm processing thread processes the acquired data to obtain displayable 3D environment information and transmits it to the display device for display.
In the embodiment, the shot image acquisition thread, the 2D algorithm processing thread and the 3D algorithm processing thread are divided into three independent threads, so that the mutual interference of the threads is avoided, and the processing efficiency is improved.
In one embodiment, referring to fig. 4, the step of executing thread 1 in fig. 3 includes:
step S401, acquiring, by the captured image acquisition thread, captured images from a plurality of cameras of the vehicle, and storing the acquired images in a buffer, wherein the captured images include 2D images and 3D images;
step S402, intercepting at least one frame of 2D image from the cache area each time, sending the 2D image to the 2D algorithm processing thread, and processing the 2D image by the 2D algorithm processing thread to obtain displayable 2D environment information;
step S403, capturing at least one frame of 3D image from the buffer, sending the captured 3D image to the 3D algorithm processing thread, and processing the 3D image by the 3D algorithm processing thread to obtain displayable 3D environment information.
The execution sequence of steps S402 and S403 is not limited, and may be performed simultaneously.
In one embodiment, the method further comprises: defining at least 5 pointers, wherein the pointers comprise a first pointer, a second pointer, a third pointer, a fourth pointer and a fifth pointer, and the pointing positions of the 5 pointers are shown in FIG. 5; in step S402, each time at least one frame of 2D image is captured from the buffer, the first pointer and the second pointer point to a start position (start position 1 in fig. 5) and a stop position (stop position 1 in fig. 5) of this capture, respectively; in step S403, each time at least one frame of 3D image is captured from the buffer, the third pointer and the fourth pointer point to the start position (start position 2 in fig. 5) and the stop position (stop position 2 in fig. 5) of this capture, respectively; the fifth pointer points to the maximum capacity position (marked in fig. 5) of the buffer area, and the positions pointed by the first pointer, the second pointer, the third pointer and the fourth pointer cannot exceed the position pointed by the fifth pointer.
When reading a 2D image, only the first pointer at the start position 1 and the second pointer at the stop position 1 need to be moved, that is, the data position and length are obtained in the Buffer in the form of segments, and the first pointer and the second pointer cannot point beyond the position pointed by the fifth pointer. When reading a 3D image, only the third pointer at the start position 2 and the fourth pointer at the stop position 2 need to be moved, that is, the data position and length are obtained in the Buffer in the form of segments, and the pointing directions of the third pointer and the fourth pointer cannot exceed the position pointed by the fifth pointer.
Therefore, the 2D/3D algorithm processing thread which can run simultaneously can read the data in the Buffer simultaneously through two groups of data reading pointers (one group is the first pointer and the second pointer, and the other group is the third pointer and the fourth pointer).
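The two-reader-group scheme above can be sketched as follows. The class and method names are illustrative assumptions; the essential points from the text are the two independent start/stop pointer pairs, the shared capacity bound (the fifth pointer), and that reads yield segment positions rather than copied data.

```python
class MultiReaderBuffer:
    """Buffer with two independent reader groups, as in the five-pointer
    scheme: each reader keeps its own start/stop pointers, and neither may
    move past the capacity pointer. Reads return (offset, length) segments
    instead of copying data -- the zero-copy idea."""

    def __init__(self, capacity):
        self.capacity = capacity                      # the "fifth pointer"
        self.readers = {"2D": [0, 0], "3D": [0, 0]}   # [start, stop] pairs

    def read_segment(self, reader, length):
        start = self.readers[reader][1]               # resume after last stop
        stop = min(start + length, self.capacity)     # never pass capacity
        self.readers[reader] = [start, stop]
        return start, stop - start                    # segment only, no copy
```

Because each group advances independently, the 2D and 3D processing threads can read the same Buffer concurrently without interfering with each other's positions.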
In one embodiment, Zero-Copy technology is used in at least one of the following operations: acquiring a shot image, intercepting a 2D image and intercepting a 3D image.
Specifically, in step S402 and step S403, each time at least one frame of 2D/3D image is captured from the buffer and sent to the 2D/3D algorithm processing thread, the pointer (the first pointer and the second pointer, or the third pointer and the fourth pointer) for capturing the 2D/3D image may be transferred to the 2D/3D algorithm processing thread, and transferring the pointed data value by means of the pointer may reduce the delay caused by the real copy of a large amount of data, and implement the zero-copy technique.
Optionally, when the thread 1 for acquiring the captured image encapsulates data, the parsed data set is expressed by a Buffer method described in the zero-copy technology, and then the encapsulated data is respectively transmitted to the thread 2 and the thread 3 for 2D and 3D algorithm processing.
Data transmission among the threads can use the zero-copy technology, and multithreading acceleration decouples captured image acquisition from the running of the 2D/3D processing algorithms. In existing schemes, the Camera data (i.e., the data corresponding to a captured image) stored by the Buffer method has only one group of data reading pointers, so multiple threads cannot read the data in the Buffer concurrently. It should be noted that any zero-copy technique implemented with multiple data reading pointers based on the content of the present invention, provided it does not affect the essential content of the present invention, is considered to fall within the protection scope of the present invention.
Meanwhile, with the zero-copy technology, a multi-pointer method is used in the Buffer data so that Camera Buffer data can be obtained concurrently by multiple threads, and pointer copies are passed during data parsing, encapsulation, and transfer, reducing the delay caused by truly copying the data. Camera data acquisition is decoupled from the 2D/3D algorithm processing, and multithreading acceleration together with the zero-copy technology reduces data and algorithm processing delay.
In one embodiment, the method further comprises: receiving control information sent by a vehicle-mounted central control and/or gear controller; and synchronously displaying the 2D environment information and the 3D environment information according to the control signal.
The control information is sent by the vehicle-mounted central control and/or the gear controller: a control button of the vehicle-mounted central control, or the reverse/forward gear, starts and stops the 2D/3D display and switches the on-vehicle UI (user interface) that displays the 2D and 3D environment information. For example, the synchronized 2D and 3D display may be turned on in reverse gear, while only the 2D display is turned on when the vehicle moves forward normally.
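The example above amounts to a small decision rule, sketched here. The function name and the button-override behavior are illustrative assumptions; only the reverse-gear and forward-gear cases come from the text.

```python
def display_mode(gear, center_console_button=False):
    """Sketch of the control logic: reverse gear turns on the synchronized
    2D+3D display; normal forward driving shows 2D only; a center-console
    button press (assumed behavior) can force the full view at any time."""
    if gear == "reverse" or center_console_button:
        return {"2D": True, "3D": True}
    return {"2D": True, "3D": False}
```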
Referring to fig. 6, an embodiment of the invention further provides a driving image display device 60, including:
a parameter obtaining module 601, configured to obtain a 2D display parameter and a 3D display parameter when 2D environment information and 3D environment information are synchronously displayed, where the 2D display parameter is a parameter when the 2D environment information is displayed, and the 3D display parameter is a parameter when the 3D environment information is displayed;
a CPU running information obtaining module 602, configured to obtain running information of a CPU, where the CPU includes multiple cores;
an execution core adjusting module 603, configured to adjust a frequency of the CPU and/or change an execution core according to the 2D display parameter, the 3D display parameter, and/or the operation information of the CPU, where the execution core is a CPU core that runs a target thread, and the target thread is a thread that is used to process 2D environment information and/or 3D environment information.
For more details of the operation principle and the operation mode of the driving image display device 60, reference may be made to the description of the method in fig. 1 to 5, which is not repeated herein.
Embodiments of the present invention further provide a storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps of the method in fig. 1 to 5. The storage medium may be a computer-readable storage medium, and may include, for example, a non-volatile (non-volatile) or non-transitory (non-transitory) memory, and may further include an optical disc, a mechanical hard disk, a solid state hard disk, and the like.
The embodiment of the invention also provides an embedded device. The embedded device may comprise a memory and a processor, the memory storing a computer program operable on the processor, and the processor, when executing the computer program, performing the steps of the method of fig. 1 to 5.
Referring to fig. 7, an embodiment of the present invention further provides a driving image display platform 70, where the platform includes a plurality of cameras 701 for collecting environment information of a vehicle and acquiring a captured image, an embedded device 703, and an on-vehicle display terminal 704 for displaying 2D environment information and/or 3D environment information. Wherein the 2D environment information is displayed in a 2D display area in the in-vehicle display terminal 704, and the 3D environment information is displayed in a 3D display area in the in-vehicle display terminal 704. Optionally, the size of the 2D display area is smaller than the size of the 3D display area. The 2D and 3D displays are based on different surfaces (surfaces) created by the operating system.
Since the 2D and 3D displays are based on different Surfaces, their display frame rates may differ. With the driving image display method provided by the embodiment of the invention, the 2D and 3D algorithms are processed within the output-frame interval of the image acquisition equipment such as the Camera (a commonly used embedded Camera acquisition device outputs at 30 fps, i.e., an interval of about 33 ms), so that the 2D/3D display frame rate can be synchronized with the output frame rate of the Camera acquisition device.
Optionally, the driving image display platform 70 is powered by the vehicle power supply 702. The embedded device can also be independently powered by the vehicle-mounted power supply 702, and the stable power supply of the vehicle-mounted power supply 702 also determines the running stability of the CPU. The panoramic driving image real-time display method shown in fig. 1 to 5 may be executed in an embedded device or a CPU of a driving image display platform of an automobile, where the CPU includes a multi-core structure of an SMP or MPP.
Optionally, the hardware of the embedded device or the driving image display platform 70 may include a CPU based on the X86, ARM, MIPS, or PowerPC architecture, and the software system for executing the above method may include an operating system based on Android, iOS, Windows, or Linux. It should be noted that if the panoramic driving image real-time display method based on the embedded device is implemented with other hardware and software systems based on the content of the present invention, the substantial content of the present invention is not affected, and all such implementations are considered within the protection scope of the present invention.
Specifically, in the embodiment of the present invention, the processor may be a Central Processing Unit (CPU), and the processor may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, and the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
It will also be appreciated that the memory in the embodiments of the subject application can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. Volatile memory can be random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DR RAM).
In summary, the invention provides a driving image display method, device, platform, storage medium, and embedded device, realized under the condition that image acquisition equipment such as a camera outputs at 30 fps: the 2D/3D algorithm is processed within the 33 ms interval between every two frames, ensuring the frame rate and real-time performance of the 2D/3D display. The embodiment of the invention specifically achieves the following effects:
1. the 2D/3D display frame rate may be maintained in synchronization with the output frame rate of the Camera acquisition device.
2. By exploiting the characteristics of multi-core embedded devices, the 2D/3D fps and the CPU frequency and temperature are fused into the CPU scheduling policy, and the 2D/3D algorithm is moved in time to run on a big core or a little core.
3. By using the zero-copy technology and a multi-pointer method in the Buffer data, Camera Buffer data can be obtained concurrently by multiple threads; pointer copies are passed during data parsing, encapsulation, and transfer, reducing the delay caused by truly copying the data.
If the acquisition of captured images and the 2D/3D algorithms were serially coupled, delay would increase and the frame rate would drop. The method therefore decouples Camera data acquisition from 2D/3D algorithm processing, and adopts multithreading acceleration and the zero-copy technology to reduce data and algorithm processing delay.
It should be understood that the term "and/or" herein merely describes an association between objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein indicates an "or" relationship between the associated objects.
The "plurality" appearing in the embodiments of the present application means two or more.
The descriptions of the first, second, etc. appearing in the embodiments of the present application are only for illustrating and differentiating the objects, and do not represent the order or the particular limitation of the number of the devices in the embodiments of the present application, and do not constitute any limitation to the embodiments of the present application.
The term "connect" in the embodiments of the present application refers to various connection manners, such as direct connection or indirect connection, to implement communication between devices, which is not limited in this embodiment of the present application.
Although the present invention is disclosed above, the present invention is not limited thereto. Various changes and modifications may be effected therein by one skilled in the art without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (18)
1. A driving image display method is characterized by comprising the following steps:
when 2D environment information and 3D environment information are synchronously displayed, 2D display parameters and 3D display parameters are obtained, wherein the 2D display parameters are parameters when the 2D environment information is displayed, and the 3D display parameters are parameters when the 3D environment information is displayed;
acquiring running information of a CPU (central processing unit), wherein the CPU comprises a plurality of cores;
adjusting the frequency of the CPU and/or changing an execution core according to the 2D display parameters, the 3D display parameters and/or the running information of the CPU, wherein the execution core is a CPU core for running a target thread, and the target thread is a thread for processing 2D environment information and/or 3D environment information;
wherein the adjusting the frequency of the CPU and/or changing an execution core according to the 2D display parameters, the 3D display parameters, and/or the operation information of the CPU includes:
monitoring the execution conditions of the 2D algorithm and the 3D algorithm according to the 2D display parameters and the 3D display parameters, monitoring the execution conditions of hardware equipment for realizing the method through the running information of the CPU, and adjusting the frequency of the CPU and/or replacing the execution core when detecting that the execution core of the CPU can not support synchronous real-time display of the 2D and 3D environmental information.
2. The method of claim 1, wherein the operating information of the CPU comprises a temperature of the CPU and/or a frequency of the CPU.
3. The method of claim 2, wherein the 2D display parameters comprise a display frame rate for displaying 2D environment information, and wherein the 3D display parameters comprise a display frame rate for displaying 3D environment information.
4. The method of claim 3, wherein the CPU cores comprise at least one big core and a number of little cores, and wherein adjusting the execution cores comprises:
and if the CPU frequency value needing to be set is adjusted to the maximum frequency value and the temperature of the CPU is increased within first preset time, changing the execution core into a large core.
5. The method of claim 4, wherein after changing the execution core to a big core, further comprising:
and if the temperature of the CPU is reduced within a second preset time, changing the execution core into a small core.
6. The method of claim 4, wherein the CPU frequency value to be set is adjusted according to the following formula:
the value of P satisfies the formula:
wherein freq_new is the CPU frequency to be set, freq_max is the maximum frequency, K is a transition factor, load is the current load of the CPU, load_max is the maximum load of the CPU, P is a compensation factor, cpu_temp is the temperature of the CPU, and fps_2D/3D is the 2D/3D display frame rate.
7. The method of any of claims 1 to 3, further comprising:
and controlling the CPU core without executing the task to enter a shallow sleep mode.
8. The method according to any one of claims 1 to 3, wherein before adjusting the execution core according to the 2D display parameters, the 3D display parameters and/or the running information of the CPU, the method further comprises:
acquiring 2D display parameters and 3D display parameters from an application layer;
and acquiring the temperature of the CPU and the frequency of the CPU from a CPU core.
9. The method of claim 1, wherein the target thread comprises at least: the method comprises a shot image acquisition thread, a 2D algorithm processing thread and a 3D algorithm processing thread.
10. The method of claim 9, further comprising:
acquiring, by the captured image acquisition thread, captured images from a plurality of cameras of the vehicle, and storing the acquired images in a buffer, wherein the captured images include 2D images and 3D images;
intercepting at least one frame of 2D image from the cache area each time, sending the frame of 2D image to the 2D algorithm processing thread, and processing the 2D image by the 2D algorithm processing thread to obtain displayable 2D environment information;
and intercepting at least one frame of 3D image from the buffer area every time, sending the frame of 3D image to the 3D algorithm processing thread, and processing the 3D image by the 3D algorithm processing thread to obtain displayable 3D environment information.
11. The method of claim 10, further comprising:
defining at least 5 pointers, the pointers including a first pointer, a second pointer, a third pointer, a fourth pointer, and a fifth pointer;
when at least one frame of 2D image is intercepted from the cache region each time, the first pointer and the second pointer respectively point to the starting position and the stopping position of the current interception;
when at least one frame of 3D image is intercepted from the cache region each time, the third pointer and the fourth pointer respectively point to the starting position and the stopping position of the current interception;
the fifth pointer points to the maximum capacity position of the cache region, and the positions pointed by the first pointer, the second pointer, the third pointer and the fourth pointer cannot exceed the position pointed by the fifth pointer.
12. Method according to any of claims 9 to 11, characterized in that Zero-Copy technology is used in at least one of the following operations: acquiring a shot image, intercepting a 2D image and intercepting a 3D image.
13. The method of claim 1, further comprising:
receiving control information sent by a vehicle-mounted central control and/or gear controller;
and synchronously displaying the 2D environment information and the 3D environment information according to the control information.
14. The method of claim 1, wherein a size of an area displaying the 2D environment information is smaller than a size of an area displaying the 3D environment information when the 2D environment information and the 3D environment information are displayed simultaneously.
15. A driving image display device, comprising:
the device comprises a parameter acquisition module, a parameter display module and a parameter display module, wherein the parameter acquisition module is used for acquiring 2D display parameters and 3D display parameters when synchronously displaying 2D environment information and 3D environment information, the 2D display parameters are parameters when displaying the 2D environment information, and the 3D display parameters are parameters when displaying the 3D environment information;
the CPU operation information acquisition module is used for acquiring the operation information of the CPU, and the CPU comprises a plurality of cores;
an execution core adjusting module, configured to adjust a frequency of the CPU and/or change an execution core according to the 2D display parameter, the 3D display parameter, and/or the operation information of the CPU, where the execution core is a CPU core that runs a target thread, and the target thread is a thread that is used to process 2D environment information and/or 3D environment information;
wherein the adjusting the frequency of the CPU and/or changing an execution core according to the 2D display parameters, the 3D display parameters, and/or the operation information of the CPU includes:
monitoring the execution conditions of the 2D algorithm and the 3D algorithm according to the 2D display parameters and the 3D display parameters, monitoring the operation conditions of hardware equipment applying the device through the operation information of the CPU, and adjusting the frequency of the CPU and/or replacing the execution core when detecting that the execution core of the CPU cannot support synchronous real-time display of the 2D and 3D environment information.
16. A storage medium having a computer program stored thereon, the computer program, when being executed by a processor, performing the steps of the method according to any one of claims 1 to 14.
17. An embedded device, wherein the CPU of the embedded device is multi-core, the embedded device may include a memory and a processor, the memory stores a computer program operable on the processor, and the processor executes the computer program to perform the steps of the method according to any one of claims 1 to 14.
18. A driving image display platform, characterized in that the platform comprises a plurality of cameras for collecting environmental information of vehicles and acquiring shot images, the embedded device of claim 17, and a vehicle-mounted display terminal for displaying 2D environmental information and/or 3D environmental information.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011373475.7A CN112486684B (en) | 2020-11-30 | 2020-11-30 | Driving image display method, device and platform, storage medium and embedded equipment |
PCT/CN2021/127903 WO2022111225A1 (en) | 2020-11-30 | 2021-11-01 | Driving image display method, apparatus and platform, and storage medium and embedded device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011373475.7A CN112486684B (en) | 2020-11-30 | 2020-11-30 | Driving image display method, device and platform, storage medium and embedded equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112486684A CN112486684A (en) | 2021-03-12 |
CN112486684B true CN112486684B (en) | 2022-08-12 |
Family
ID=74937388
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011373475.7A Active CN112486684B (en) | 2020-11-30 | 2020-11-30 | Driving image display method, device and platform, storage medium and embedded equipment |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112486684B (en) |
WO (1) | WO2022111225A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112486684B (en) * | 2020-11-30 | 2022-08-12 | 展讯半导体(成都)有限公司 | Driving image display method, device and platform, storage medium and embedded equipment |
CN115242695B (en) * | 2022-07-22 | 2023-08-15 | 高新兴物联科技股份有限公司 | Method, device and computer readable storage medium for monitoring environmental state of server |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3893983B2 (en) * | 2002-01-17 | 2007-03-14 | ソニー株式会社 | Information providing apparatus, information providing method, storage medium, and computer program |
US20160019062A1 (en) * | 2014-07-16 | 2016-01-21 | Ahmad Yasin | Instruction and logic for adaptive event-based sampling |
US9794340B2 (en) * | 2014-09-15 | 2017-10-17 | Ge Aviation Systems Llc | Mechanism and method for accessing data in a shared memory |
CN105511824A (en) * | 2015-11-30 | 2016-04-20 | 深圳市灵动飞扬科技有限公司 | Split-screen display method and system |
CN106598596A (en) * | 2016-12-14 | 2017-04-26 | 天津光电通信技术有限公司 | OpenCL Image Processing Method Based on Andorid Platform |
CN106951320B (en) * | 2017-01-23 | 2022-03-08 | 斑马信息科技有限公司 | System and method for dynamically adjusting CPU frequency of vehicle machine of internet vehicle |
DE102017109239A1 (en) * | 2017-04-28 | 2018-10-31 | Ilnumerics Gmbh | COMPUTER IMPLEMENTED PROCESS, COMPUTER READABLE MEDIA AND HETEROGICAL COMPUTER SYSTEM |
JP2019057178A (en) * | 2017-09-21 | 2019-04-11 | 東芝メモリ株式会社 | Memory system and control method |
CN107844177A (en) * | 2017-10-18 | 2018-03-27 | 歌尔科技有限公司 | Device parameter method of adjustment, device and electronic equipment |
CN109947569B (en) * | 2019-03-15 | 2021-04-06 | Oppo广东移动通信有限公司 | Method, device, terminal and storage medium for binding core |
CN110083460A (en) * | 2019-03-25 | 2019-08-02 | 华东师范大学 | A kind of design method of the microkernel architecture using event bus technology |
CN110413417A (en) * | 2019-08-02 | 2019-11-05 | 广州小鹏汽车科技有限公司 | Operation optimization method, device and system for in-vehicle system processes |
CN110532091B (en) * | 2019-08-19 | 2022-02-22 | 中国人民解放军国防科技大学 | Graph computation edge vector load balancing method and device based on graph processor |
CN110696720A (en) * | 2019-10-31 | 2020-01-17 | 广东好帮手丰诺电子科技有限公司 | 3D panoramic reversing system with original vehicle key control |
CN112486684B (en) * | 2020-11-30 | 2022-08-12 | 展讯半导体(成都)有限公司 | Driving image display method, device and platform, storage medium and embedded equipment |
- 2020
  - 2020-11-30 CN CN202011373475.7A patent/CN112486684B/en active Active
- 2021
  - 2021-11-01 WO PCT/CN2021/127903 patent/WO2022111225A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2022111225A1 (en) | 2022-06-02 |
CN112486684A (en) | 2021-03-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11704781B2 (en) | Enhanced high-dynamic-range imaging and tone mapping | |
DE112020005206T5 (en) | Determining object orientation from an image using machine learning | |
US11908104B2 (en) | Weighted normalized automatic white balancing | |
CN112486684B (en) | Driving image display method, device and platform, storage medium and embedded equipment | |
US20230024474A1 (en) | Stitching quality assessment for surround view systems | |
US12118353B2 (en) | Performing load and permute with a single instruction in a system on a chip | |
JP2022105256A (en) | Image synthesis in multi-view automotive and robotics systems | |
US20230111014A1 (en) | Using a hardware sequencer in a direct memory access system of a system on a chip | |
US12273632B2 (en) | Image signal processing pipelines for high dynamic range sensors | |
US20120307062A1 (en) | Vehicle-mounted image processing apparatus | |
US20240134645A1 (en) | Using a vector processor to configure a direct memory access system for feature tracking operations in a system on a chip | |
CN110733444A (en) | ADAS driving assistance system based on MPSOC platform | |
US12244938B2 (en) | Brightness based chromaticity weighting for improved illuminant color estimation for auto white balancing | |
CN110730304B (en) | Intelligent camera for accelerating image acquisition and display | |
DE102022117475A1 (en) | TRANSMITTING ERRORS TO AN ISOLATED SECURITY AREA OF A SYSTEM ON A CHIP | |
DE102022131123A1 (en) | CONSISTENT SAMPLING FOR SPATIAL HASHING | |
CN108594818A (en) | Intelligent driving control method, intelligent vehicle-carried equipment and system | |
US20240196105A1 (en) | Fallback mechanism for auto white balancing | |
US12327413B2 (en) | Image stitching with color harmonization for surround view systems and applications | |
JP2020140133A (en) | Display control system, display system, moving object, display control method and program | |
CN113704156B (en) | Sensing data processing device, board card, system and method | |
US20250240381A1 (en) | Color correction matrix calibration for high dynamic range sensors | |
CN111324081A (en) | Intelligent integrated control device and system comprising same | |
US20250139981A1 (en) | Data fragmentation techniques for reduced data processing latency | |
US20240214545A1 (en) | Method and device for naked eye 3d displaying vehicle instrument |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||