Disclosure of Invention
The present disclosure provides a stuck detection method and apparatus, which at least solve the problem in the related art that detecting only whether a CPU is stuck leads to inaccurate stuck detection during image processing. The technical solution of the present disclosure is as follows:
According to a first aspect of an embodiment of the present disclosure, there is provided a stuck detection method, including:
if it is detected that the CPU acquires presentation information for an image, triggering a time-consuming detection sub-thread of the graphics processor;
acquiring, by the time-consuming detection sub-thread, a first processing duration corresponding to the image, wherein the first processing duration is the duration from a drawing start time point corresponding to the image to a rendering completion time point corresponding to the image; and
if the first processing duration is greater than or equal to a first duration threshold, determining, as a stuck detection result of the graphics processor, that the graphics processor is in a stuck state.
Optionally, the acquiring, by the time-consuming detection sub-thread, the first processing duration corresponding to the image includes:
if it is determined that frame callback processing is performed on the image, acquiring the drawing start time point corresponding to the image;
adding a blank image view corresponding to the image, and sending blank image view addition completion information to the time-consuming detection sub-thread;
if it is determined that the graphics processor is triggered to complete rendering of the image views of the image, acquiring, by the time-consuming detection sub-thread, the rendering completion time point corresponding to the image; and
determining the duration from the drawing start time point corresponding to the image to the rendering completion time point corresponding to the image as the first processing duration corresponding to the image.
Optionally, after the triggering of the time-consuming detection sub-thread of the graphics processor, the method further includes:
acquiring, by the time-consuming detection sub-thread, a second processing duration corresponding to the image, wherein the second processing duration is the duration from the drawing start time point corresponding to the image to a rendering time point of the currently rendered image view, and the currently rendered image view is the image view currently being rendered among all image views corresponding to the image; and
if the second processing duration reaches a second duration threshold and it is determined that rendering of all the image views corresponding to the image is not completed, determining, as the stuck detection result of the graphics processor, that the graphics processor is in a stuck state.
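The second-duration check above reduces to a simple predicate: the GPU is judged stuck once the time elapsed since the drawing start time point reaches the second duration threshold while some image views of the image remain unrendered. A minimal sketch (the function and parameter names are illustrative, not from the source):

```python
def second_duration_check(draw_start, now, views_total, views_rendered,
                          second_threshold):
    """Return True when the GPU should be judged stuck: the second processing
    duration (time since the drawing start time point) has reached the second
    duration threshold while not all image views have finished rendering."""
    second_duration = now - draw_start
    return second_duration >= second_threshold and views_rendered < views_total
```

Note that this check can fire before rendering completes, which is what distinguishes it from the first-duration check: it does not need to wait for the rendering completion time point.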
Optionally, the method further comprises:
acquiring sub-durations of processing an image view while the graphics processor is not in the stuck state; and
obtaining a target duration corresponding to a preset number of the sub-durations, and determining the target duration as a duration threshold.
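The aggregation used to turn sub-durations into a target duration is not specified above; one plausible reading (an assumption for illustration only) is to average the sub-durations measured while the GPU was healthy and scale by the preset number of views rendered per image:

```python
def duration_threshold(sub_durations, preset_count):
    """Derive a duration threshold from per-view sub-durations measured while
    the graphics processor was not stuck. The aggregation here (mean
    sub-duration scaled by the preset view count) is an assumption, not
    taken from the source."""
    if not sub_durations:
        raise ValueError("need at least one sub-duration sample")
    mean_sub = sum(sub_durations) / len(sub_durations)
    return mean_sub * preset_count
```

Other aggregations (e.g., a high percentile of the samples) would fit the text equally well; the point is that the threshold is derived from healthy-state measurements rather than hard-coded.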
Optionally, the method further comprises:
if the graphics processor is in the stuck state, adjusting the rendering quantity corresponding to at least one image view rendered simultaneously in the graphics processor.
According to a second aspect of embodiments of the present disclosure, there is provided a stuck detection apparatus, including:
a sub-thread triggering unit, configured to trigger a time-consuming detection sub-thread of the graphics processor if it is detected that the CPU acquires presentation information for an image;
a duration obtaining unit, configured to obtain, by the time-consuming detection sub-thread, a first processing duration corresponding to the image, wherein the first processing duration is the duration from a drawing start time point corresponding to the image to a rendering completion time point corresponding to the image; and
a result determining unit, configured to determine, as a stuck detection result of the graphics processor, that the graphics processor is in a stuck state if the first processing duration is greater than or equal to a first duration threshold.
Optionally, the duration obtaining unit includes a time point obtaining subunit, a view adding subunit, and a duration obtaining subunit, wherein:
the time point obtaining subunit is configured to obtain the drawing start time point corresponding to the image if it is determined that frame callback processing is performed on the image;
the view adding subunit is configured to add a blank image view corresponding to the image, and send blank image view addition completion information to the time-consuming detection sub-thread;
the time point obtaining subunit is further configured to obtain, by the time-consuming detection sub-thread, the rendering completion time point corresponding to the image if it is determined that the graphics processor is triggered to complete rendering of the image views of the image; and
the duration obtaining subunit is configured to determine, as the first processing duration corresponding to the image, the duration from the drawing start time point corresponding to the image to the rendering completion time point corresponding to the image.
Optionally, the duration obtaining unit is further configured to obtain, after the time-consuming detection sub-thread of the graphics processor is triggered, a second processing duration corresponding to the image by the time-consuming detection sub-thread, wherein the second processing duration is the duration from the drawing start time point corresponding to the image to a rendering time point of the currently rendered image view, and the currently rendered image view is the image view currently being rendered among all image views corresponding to the image; and
the result determining unit is further configured to determine, as the stuck detection result of the graphics processor, that the graphics processor is in a stuck state if the second processing duration reaches a second duration threshold and it is determined that rendering of all the image views corresponding to the image is not completed.
Optionally, the apparatus further includes a threshold setting unit, configured to obtain sub-durations of processing an image view while the graphics processor is not in the stuck state, obtain a target duration corresponding to a preset number of the sub-durations, and determine the target duration as a duration threshold.
Optionally, the apparatus further includes a quantity adjusting unit, configured to adjust a rendering quantity corresponding to at least one image view rendered at the same time in the graphics processor if it is determined that the graphics processor is in a stuck state.
According to a third aspect of embodiments of the present disclosure, there is provided a terminal comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor, wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of the preceding aspects.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method of any one of the preceding aspects.
According to a fifth aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the method of any one of the preceding aspects.
The technical solutions provided by the embodiments of the present disclosure bring at least the following beneficial effects:
In some embodiments, if it is detected that the central processing unit acquires presentation information for an image, a time-consuming detection sub-thread of the graphics processor is triggered, and the sub-thread is used to acquire a first processing duration corresponding to the image, namely the duration from the drawing start time point corresponding to the image to the rendering completion time point corresponding to the image; if the first processing duration is greater than or equal to a first duration threshold, it is determined that the stuck detection result of the graphics processor is that the graphics processor is in a stuck state. The first processing duration can thus be obtained through the time-consuming detection sub-thread and used to determine whether the graphics processor is stuck. Compared with schemes that only detect whether the CPU is stuck, this reduces inaccurate stuck detection during image processing, improves the accuracy of stuck detection, and improves the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions of the present disclosure, the technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description, the claims, and the above drawings of the present disclosure are used to distinguish similar objects and are not necessarily used to describe a particular sequence or chronological order. It is to be understood that the data so used may be interchanged where appropriate, so that the embodiments of the disclosure described herein can be implemented in orders other than those illustrated or described herein. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
With the development of science and technology, terminal technology has matured increasingly, improving the convenience of users' production and life. While a user operates a terminal, the display interface of the terminal needs to perform screen imaging. Fig. 1 is a schematic diagram of a terminal framework for a stuck detection method according to an exemplary embodiment. As shown in fig. 1, the terminal 1 includes a central processing unit 11 (CPU), a graphics processor 12 (graphics processing unit, GPU), and a view controller 13. When the terminal 1 performs screen imaging, first, the central processing unit 11 calculates the views to be displayed and the corresponding display modes; second, the graphics processor 12 renders the views to be displayed into a bitmap and stores the bitmap in a frame buffer; finally, the view controller 13 reads the content of the frame buffer at the refresh frequency and displays the views to be displayed on the display interface of the terminal in the corresponding display modes.
In some embodiments, a frame loss occurs when the processing time of the CPU or GPU for the data of a certain frame exceeds a processing threshold, so that the data of the next frame is not yet ready and the terminal keeps displaying the image of the previous frame. If frames are lost for too long, the display interface of the terminal appears stuck and the user experience degrades. Therefore, detecting whether the terminal is stuck is of practical significance for developers to optimize terminal performance and improve the user experience.
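The relation between per-frame processing time and frame loss can be made concrete: at a 60 Hz refresh rate each frame has a budget of about 16.7 ms, and every full budget interval by which processing overruns repeats the previous frame once more. A hedged sketch (the helper name and the 60 Hz assumption are illustrative):

```python
FRAME_BUDGET = 1.0 / 60  # ~16.7 ms per frame at an assumed 60 Hz refresh rate

def frames_dropped(processing_time, budget=FRAME_BUDGET):
    """Estimate how many display deadlines are missed when CPU+GPU work on
    one frame takes `processing_time` seconds: the display keeps showing the
    previous frame for each missed deadline."""
    if processing_time <= budget:
        return 0
    return int(processing_time // budget)
```

For example, a frame that takes 40 ms of combined CPU and GPU work at 60 Hz misses two deadlines, so the previous image stays on screen for those refreshes.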
Fig. 2 is a schematic background diagram illustrating a stuck detection method according to an exemplary embodiment. As shown in fig. 2, during screen imaging, the terminal monitors the time consumed by each cycle of the event-processing runloop in which the CPU calculates the views to be displayed and the corresponding display modes. When the terminal detects that the time consumption of a runloop cycle exceeds a time-consumption threshold, it determines that the terminal is stuck.
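The runloop monitoring described above can be modeled as a watchdog that timestamps the begin and end of each cycle and flags the CPU as stuck when a cycle overruns the time-consumption threshold. On iOS this would typically hook a runloop observer; the sketch below is a simplified stand-in with illustrative names:

```python
import time

class RunloopWatchdog:
    """Flags the CPU as stuck when one runloop cycle exceeds a threshold.

    Stand-in for a runloop-observer scheme: cycle_begin/cycle_end play the
    role of the observer callbacks fired at the start and end of a cycle.
    """

    def __init__(self, threshold_s):
        self.threshold_s = threshold_s
        self._cycle_start = None
        self.stuck = False

    def cycle_begin(self):
        self._cycle_start = time.monotonic()

    def cycle_end(self):
        if self._cycle_start is None:
            return
        if time.monotonic() - self._cycle_start > self.threshold_s:
            self.stuck = True  # this cycle took too long
        self._cycle_start = None
```

Note that, exactly as the next paragraph points out, this scheme only observes CPU-side cycle time; a GPU that stalls during rendering is invisible to it.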
In some embodiments, such a scheme does not cover detecting whether the GPU is stuck. Therefore, if the GPU is stuck during rendering while the CPU processing time does not exceed the threshold, monitoring runloop cycle times alone cannot detect the stuck condition arising in the terminal's image processing, which affects the user experience.
Fig. 3 is a schematic architecture diagram of a stuck detection method according to an exemplary embodiment. As shown in fig. 3, the terminal 110 may determine whether it is stuck by monitoring the time the CPU spends processing each runloop cycle. When the terminal 110 determines that it is stuck, the terminal 110 may upload the stuck information to the server 130 through the network 120. When the server 130 receives the stuck information, the server 130 may compile statistics on it. A developer can then optimize the terminal based on the collected stuck statistics, improving the user experience.
It is readily understood that the terminal includes, but is not limited to, a wearable device, a handheld device, a personal computer, a tablet computer, a vehicle-mounted device, a smartphone, a computing device, or another processing device connected to a wireless modem. Terminal devices in different networks may be called different names, such as user equipment, access terminal, subscriber unit, subscriber station, mobile station, remote terminal, mobile device, user terminal, wireless communication device, user agent, cellular telephone, cordless telephone, personal digital assistant (PDA), or a terminal device in a fifth-generation mobile communication technology (5G) network or a future evolved network. The terminal may be provided with an operating system, which is a program that manages and controls the terminal hardware and terminal applications and is an indispensable system application in the terminal. The operating system may be, for example, an iOS system.
According to some embodiments, the terminal 110 may be connected to the server 130 through the network 120. The network 120 is used to provide a communication link between the terminal 110 and the server 130, and may include various connection types, such as wired links, wireless communication links, or fiber-optic cables. It should be understood that the numbers of terminals 110, networks 120, and servers 130 in fig. 3 are merely illustrative; there may be any number of terminals, networks, and servers as required in practice. For example, the server 130 may be a server cluster formed by a plurality of servers. A user may use the terminal 110 to interact with the server 130 through the network 120 for stuck detection and the like.
Fig. 4 is a flowchart illustrating a stuck detection method according to an exemplary embodiment. As shown in fig. 4, the method may be applied, for example, in an image processing scenario; it may be implemented by a computer program and executed on a terminal with an image display function, and includes the following steps:
In step S11, if it is detected that the central processing unit acquires presentation information for an image, a time-consuming detection sub-thread of the graphics processor is triggered;
According to some embodiments, the execution body of the embodiments of the present disclosure may be a terminal, and the operating system of the terminal may be an iOS system. The central processing unit (CPU), as the operation and control core of a computer system, is the final execution unit for information processing and program running. The CPU in the embodiments of the present disclosure may be a processor in the terminal and may be used to calculate the views to be presented and the specific presentation information. The CPU does not refer to a particular fixed CPU; for example, when the identity of the CPU changes, the CPU referred to changes accordingly.
In some embodiments, the image may be a single image or an image in an image set, where an image set refers to a collection of at least one image. The image set does not refer to a fixed set; for example, when the number of images included in the image set changes, the image set changes accordingly, and when an image included in the image set changes, the image may also change accordingly.
According to some embodiments, the image likewise does not refer to a fixed image. For example, when the image content of the image changes, the image may change accordingly; when the image identifier of the image changes, the image may also change accordingly.
It is easy to understand that the presentation information refers to the presentation information, determined by the CPU, corresponding to the image. The presentation information includes, but is not limited to, the image view information of the presented image, the presentation manner of the image, the presentation position of the image, and the like.
Optionally, the graphics processor (graphics processing unit, GPU), also known as a display core, visual processor, or display chip, is a microprocessor that performs image- and graphics-related operations on personal computers, workstations, game consoles, and some mobile devices (e.g., tablet computers and smartphones). The GPU in the embodiments of the present disclosure refers to a microprocessor that renders view information to be presented into a bitmap and stores it in a frame buffer. The GPU does not refer to a particular fixed GPU; for example, when the identity of the GPU changes, the GPU referred to changes accordingly.
In some embodiments, a thread is the smallest unit of execution that an operating system can schedule. A thread is a single sequential control flow within a process; a process may have many threads, each executing different tasks in parallel. For example, an image processing process may include a GPU image processing thread and a CPU image processing thread. The GPU image processing thread renders the views to be presented into a bitmap and then stores the bitmap in the frame buffer. The CPU image processing thread calculates the views to be presented and the corresponding presentation modes.
It is easy to understand that the time-consuming detection sub-thread refers to a sub-thread added to the GPU image processing thread to monitor the time consumed by the GPU during image processing. That is, the time-consuming detection sub-thread does not affect the operation of the GPU image processing thread; it is only used to obtain the time the GPU spends in the rendering process for a certain image. The time-consuming detection sub-thread may execute concurrently with the GPU image processing thread during image processing. It does not refer to a fixed sub-thread; for example, when a modification instruction for the time-consuming detection sub-thread is received, the sub-thread may change accordingly.
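One way to picture the sub-thread described above is as a monitor that merely receives timestamped events from the render path over a queue, so it never blocks the GPU image processing thread. The Python sketch below is a simplified model; the event names and timestamps are illustrative assumptions:

```python
import queue
import threading

class TimingSubThread(threading.Thread):
    """Model of the time-consuming detection sub-thread: consumes
    (event, timestamp) pairs posted by the render path and derives the first
    processing duration without participating in the rendering work itself."""

    def __init__(self):
        super().__init__(daemon=True)
        self.events = queue.Queue()
        self.first_duration = None

    def run(self):
        draw_start = None
        while True:
            event, ts = self.events.get()
            if event == "draw_start":          # drawing start time point
                draw_start = ts
            elif event == "render_done" and draw_start is not None:
                self.first_duration = ts - draw_start
                return

monitor = TimingSubThread()
monitor.start()
monitor.events.put(("draw_start", 1.20))   # render path posts the start
monitor.events.put(("render_done", 1.29))  # and later the completion
monitor.join()
# monitor.first_duration is now about 0.09 seconds
```

Because the render path only enqueues two timestamps, the cost it pays for being observed is negligible, which matches the statement that the sub-thread does not affect the GPU image processing thread.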
According to some embodiments, when the terminal executes the stuck detection method, if it is detected that the central processing unit acquires presentation information for the image, the terminal triggers the time-consuming detection sub-thread of the graphics processor.
In step S12, the time-consuming detection sub-thread is used to obtain a first processing duration corresponding to the image;
According to some embodiments, the first processing duration is the duration from the drawing start time point corresponding to the image to the rendering completion time point corresponding to the image. The first processing duration does not refer to a fixed duration. For example, when the graphics processor is in a stuck state, or when the rendering speed of the graphics processor changes, the first processing duration changes accordingly.
In some embodiments, when the terminal executes the stuck detection method and detects that the central processing unit acquires the presentation information for the image, the terminal triggers the time-consuming detection sub-thread of the graphics processor. The terminal may then use the time-consuming detection sub-thread to obtain the first processing duration corresponding to the image, that is, the duration from the drawing start time point corresponding to the image to the rendering completion time point corresponding to the image; in other words, the terminal obtains the time from when rendering of the image starts to when rendering of the image is completed.
In step S13, if the first processing duration is greater than or equal to the first duration threshold, it is determined that the stuck detection result of the graphics processor is that the graphics processor is in a stuck state.
According to some embodiments, when the terminal executes the stuck detection method and detects that the central processing unit acquires the presentation information for the image, the terminal triggers the time-consuming detection sub-thread of the graphics processor and uses it to obtain the first processing duration corresponding to the image. When the terminal has obtained the first processing duration, if the first processing duration is greater than or equal to the first duration threshold, the terminal may determine that the stuck detection result of the graphics processor is that the graphics processor is in a stuck state.
In some embodiments, if it is detected that the central processing unit acquires presentation information for the image, the time-consuming detection sub-thread of the graphics processor is triggered and used to acquire the first processing duration corresponding to the image, namely the duration from the drawing start time point corresponding to the image to the rendering completion time point corresponding to the image; if the first processing duration is greater than or equal to the first duration threshold, it is determined that the stuck detection result of the graphics processor is that the graphics processor is in a stuck state. The first processing duration can thus be obtained through the time-consuming detection sub-thread and used to determine whether the graphics processor is stuck. Compared with schemes that only detect whether the CPU is stuck, this reduces inaccurate stuck detection during image processing, improves the accuracy of stuck detection, and improves the user experience.
Fig. 5 is a flowchart illustrating a stuck detection method according to an exemplary embodiment. As shown in fig. 5, the stuck detection method includes the following steps:
In step S21, if it is detected that the central processing unit acquires presentation information for the image, a time-consuming detection sub-thread of the graphics processor is triggered;
The specific process is as described above and will not be repeated here.
In some embodiments, fig. 6 is a timing diagram illustrating a stuck detection method according to an exemplary embodiment. As shown in fig. 6, the terminal may use runloop cycles to detect whether the CPU is in a stuck state. When the terminal detects that the central processing unit acquires the presentation information for the image, the terminal may trigger the time-consuming detection sub-thread of the graphics processor.
According to some embodiments, the terminal controls the central processing unit to obtain the presentation information for the image, for example, the presentation position of the image and the view information to be presented.
In step S22, if it is determined that frame callback processing is performed on the image, a drawing start time point corresponding to the image is acquired;
According to some embodiments, the drawing start time point refers to the time point at which the GPU begins rendering the image. It does not refer to a fixed time point; for example, it may be the current time point at which drawing of the image starts, and when the time point at which the CPU acquires the presentation information for the image changes, the drawing start time point changes accordingly.
In some embodiments, when the terminal detects that the central processor acquires presentation information for the image and triggers a time-consuming detection sub-thread of the graphics processor, if it is determined that frame callback processing is performed on the image, the terminal may acquire a drawing start time point corresponding to the image.
It is easy to understand that the drawing start time point of the image acquired by the terminal may be, for example, 13:25:01.20 on January 5, 2020.
In step S23, adding a blank image view corresponding to the image, and sending blank image view addition completion information to the time-consuming detection sub-thread;
According to some embodiments, the blank image view refers to a simple view used to detect whether the GPU is stuck. It does not refer to a fixed view. An image view refers to a view or control for displaying an image. When the rendering information of the terminal for the image changes, the blank image view may change accordingly.
According to some embodiments, the blank image view addition completion information is used to indicate that the addition of the blank image view has been completed. When the terminal determines that frame callback processing is performed on the image, the terminal may acquire the drawing start time point corresponding to the image, add the blank image view corresponding to the image, and send the blank image view addition completion information to the time-consuming detection sub-thread.
In step S24, if it is determined that the graphics processor is triggered to complete rendering of the image views of the image, the time-consuming detection sub-thread is used to obtain a rendering completion time point corresponding to the image;
According to some embodiments, when the terminal adds the blank image view corresponding to the image, the terminal may control the GPU to render all views corresponding to the image together with the blank image view. That is, when the terminal determines that the graphics processor has completed rendering the image views of the image, the terminal may use the time-consuming detection sub-thread to acquire the rendering completion time point corresponding to the image.
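The hand-off in steps S23 and S24 can be modeled with two events: the main thread adds the blank image view and signals addition completion, and the sub-thread records the rendering completion time point once the (here simulated) rendering finishes. All names, and the sleep standing in for GPU work, are illustrative assumptions:

```python
import threading
import time

timeline = {}
added = threading.Event()     # blank image view addition completion information
rendered = threading.Event()  # all image views of the image have been rendered

def detection_subthread():
    """Wait for the addition-completion info, then for rendering to finish,
    and record the rendering completion time point."""
    added.wait()
    rendered.wait()
    timeline["render_done"] = time.monotonic()

def main_thread():
    timeline["draw_start"] = time.monotonic()  # frame callback fired
    added.set()                 # blank image view added; notify the sub-thread
    time.sleep(0.01)            # stand-in for the GPU rendering the views
    rendered.set()              # blank view, added last, has been rendered

t = threading.Thread(target=detection_subthread)
t.start()
main_thread()
t.join()
first_duration = timeline["render_done"] - timeline["draw_start"]
```

The design choice mirrored here is that the blank view is appended after all real views, so its rendering completion is a proxy for the whole image being rendered.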
In some embodiments, the rendering completion time point corresponding to the image, acquired by the terminal through the time-consuming detection sub-thread, may be, for example, 13:25:01.29 on January 5, 2020.
In step S25, the duration from the drawing start time point corresponding to the image to the rendering completion time point corresponding to the image is determined as the first processing duration corresponding to the image;
The specific process is as described above and will not be repeated here.
According to some embodiments, the first processing duration is a duration between a drawing start time point corresponding to the image and a rendering completion time point corresponding to the image.
In some embodiments, if it is determined that frame callback processing is performed on the image, the terminal may acquire the drawing start time point corresponding to the image. If it is determined that the graphics processor has completed rendering the image views of the image, the time-consuming detection sub-thread is used to obtain the rendering completion time point corresponding to the image. The terminal may then determine the duration from the drawing start time point to the rendering completion time point as the first processing duration corresponding to the image.
In some embodiments, the drawing start time point of the image acquired by the terminal may be, for example, 13:25:01.20 on January 5, 2020, and the rendering completion time point acquired by the terminal through the time-consuming detection sub-thread may be, for example, 13:25:01.29 on January 5, 2020. The duration from the drawing start time point to the rendering completion time point, which the terminal determines as the first processing duration corresponding to the image, is then 0.09 seconds.
In step S26, if the first processing time period is greater than or equal to the first time period threshold, it is determined that the graphics processor is in a stuck state as a result of the stuck detection of the graphics processor.
According to some embodiments, when the terminal obtains the first processing duration, the terminal may determine the stuck detection result of the graphics processor based on it. Specifically, if the first processing duration is greater than or equal to the first duration threshold, the terminal may determine that the stuck detection result of the graphics processor is that the graphics processor is in a stuck state; if the first processing duration is less than the first duration threshold, the terminal may determine that the stuck detection result is that the graphics processor is not in a stuck state.
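The decision in this step reduces to a single comparison against the first duration threshold. A minimal sketch (the threshold value is only an example; per the description it is configurable):

```python
FIRST_DURATION_THRESHOLD = 0.08  # seconds; example value, modifiable at runtime

def stuck_detection_result(first_duration, threshold=FIRST_DURATION_THRESHOLD):
    """Return True if the graphics processor is judged to be in a stuck state,
    i.e. the first processing duration reaches the first duration threshold."""
    return first_duration >= threshold
```

Note the boundary behavior: a first processing duration exactly equal to the threshold is judged stuck, matching the "greater than or equal to" wording of the method.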
In some embodiments, the first duration threshold refers to the threshold used to detect, based on the first processing duration, whether the graphics processor is stuck. It does not refer to a fixed threshold; for example, when the terminal acquires a modification instruction for the first duration threshold, the terminal may modify the first duration threshold based on the modification instruction.
According to some embodiments, the first duration threshold may be, for example, 0.08 s. The drawing start time point of the image acquired by the terminal may be, for example, 13:25:01.20 on January 5, 2020, and the rendering completion time point corresponding to the image, acquired through the time-consuming detection sub-thread, may be, for example, 13:25:01.29 on January 5, 2020. The terminal determines the duration between the two time points, 0.09 seconds, as the first processing duration corresponding to the image. Since the first processing duration of 0.09 seconds is greater than the first duration threshold of 0.08 seconds, the terminal may determine that the stuck detection result of the graphics processor is that the graphics processor is in a stuck state.
In some embodiments, the first duration threshold may be, for example, 0.08 s. The drawing start time point of the image acquired by the terminal may be, for example, 13:25:01.20 on January 5, 2020, and the rendering completion time point corresponding to the image, acquired through the time-consuming detection sub-thread, may be, for example, 13:25:01.25 on January 5, 2020. The terminal determines the duration between the two time points, 0.05 seconds, as the first processing duration corresponding to the image. Since the first processing duration of 0.05 seconds is less than the first duration threshold of 0.08 seconds, the terminal may determine that the stuck detection result of the graphics processor is that the graphics processor is not in a stuck state.
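The decision rule in the two examples above can be sketched as follows. This is a minimal illustration only; the function name, the use of second-of-minute timestamps, and the default threshold are assumptions for the sketch, not part of the disclosed method.

```python
def detect_gpu_stuck(draw_start: float, render_done: float,
                     first_threshold: float = 0.08) -> bool:
    """Return True if the GPU is judged stuck: the duration from the
    drawing start time point to the rendering completion time point
    (the first processing duration) meets or exceeds the threshold."""
    first_processing_duration = render_done - draw_start
    return first_processing_duration >= first_threshold

# Examples from the disclosure (seconds within 13:25 on January 5, 2020):
assert detect_gpu_stuck(1.20, 1.29) is True    # 0.09 s >= 0.08 s -> stuck
assert detect_gpu_stuck(1.20, 1.25) is False   # 0.05 s <  0.08 s -> not stuck
```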
In some embodiments, when determining the first duration threshold, the terminal may acquire the sub-duration for processing one image view while the graphics processor is not in a stuck state, acquire the target duration corresponding to a preset number of such sub-durations, and determine the target duration as the first duration threshold. This improves the accuracy of setting the duration threshold and thus of determining the stuck detection result of the graphics processor, improving the user experience.
According to some embodiments, when determining the first duration threshold, the terminal may acquire the sub-duration for processing one image view while the graphics processor is not in a stuck state; the sub-duration acquired by the terminal may be, for example, 16 ms. With a preset number of 5, the terminal acquires the target duration corresponding to five 16 ms sub-durations, namely 80 ms, and determines the target duration of 80 ms as the first duration threshold.
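The threshold derivation above reduces to multiplying the measured per-view sub-duration by the preset view count. A minimal sketch, with an assumed function name:

```python
def derive_duration_threshold(sub_duration_s: float, preset_count: int) -> float:
    """Derive a duration threshold from the time to process one image
    view measured while the GPU is not stuck (target duration =
    preset number of sub-durations)."""
    return sub_duration_s * preset_count

# 5 views at 16 ms each -> an 80 ms threshold, as in the example above
assert abs(derive_duration_threshold(0.016, 5) - 0.080) < 1e-9
```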
According to some embodiments, when the first processing time period is greater than or equal to the first time period threshold, the terminal determines that the stuck detection result of the graphics processor is that the graphics processor is in a stuck state, and the terminal may display the stuck detection result through the display interface. An exemplary schematic of the terminal interface at this time may be shown in fig. 7.
In some embodiments, if the central processing unit is detected to acquire presentation information for an image, a time-consuming detection sub-thread of the graphics processor is triggered. If it is determined that frame callback processing is performed on the image, a drawing start time point corresponding to the image is acquired, a blank image view corresponding to the image is added, and blank image view addition completion information is sent to the time-consuming detection sub-thread. If it is determined that the graphics processor is triggered to complete rendering of the image view of the image, a rendering completion time point corresponding to the image is acquired by the time-consuming detection sub-thread, and the duration from the drawing start time point to the rendering completion time point can be determined as the first processing duration corresponding to the image. This improves the accuracy of acquiring the first processing duration and therefore of determining the stuck detection result. In addition, if the first processing duration is greater than or equal to the first duration threshold, the graphics processor is determined to be in a stuck state. Compared with detecting only whether the CPU is stuck, obtaining the stuck detection result based on the first duration threshold reduces inaccurate stuck detection in the image processing process and improves the accuracy of the detection result.
Fig. 8 is a flowchart illustrating a method of detecting a jam, according to an exemplary embodiment, as shown in fig. 8, including the steps of:
In step S31, if it is detected that the central processing unit acquires the presentation information for the image, a time-consuming detection sub-thread of the graphics processor is triggered;
the specific process is as described above, and will not be described here again.
In step S32, a time-consuming detection sub-thread is adopted to obtain a second processing duration corresponding to the image;
According to some embodiments, the second processing duration is the duration between the drawing start time point corresponding to the image and the rendering time point of the currently rendered image view, where the currently rendered image view is the image view currently being rendered among all image views corresponding to the image. The second processing duration is not limited to a fixed value. For example, when the currently rendered view changes, the second processing duration changes accordingly; likewise, when the drawing start time point corresponding to the image changes, the second processing duration changes accordingly.
In some embodiments, the drawing start time point of the image acquired by the terminal may be, for example, 13:25:01.20 on January 5, 2020. The rendering time point of the currently rendered image view, acquired by the terminal through the time-consuming detection sub-thread, may be, for example, 13:25:01.29 on January 5, 2020. The terminal determines the duration from the drawing start time point corresponding to the image to the rendering time point of the currently rendered image view as the second processing duration corresponding to the image, which in this example is 0.09 seconds.
In step S33, if the second processing duration reaches the second duration threshold and it is determined that rendering of all image views corresponding to the image is not completed, determining that the graphics processor is in a stuck state as a result of the stuck detection of the graphics processor;
According to some embodiments, when the terminal obtains the second processing duration, the terminal may determine a result of the stuck detection of the graphics processor based on the second processing duration.
According to some embodiments, if the second processing duration reaches the second duration threshold and it is determined that rendering of all image views corresponding to the image is not completed, the terminal may determine that the result of the stuck detection by the graphics processor is that the graphics processor is in a stuck state.
According to some embodiments, when the terminal obtains the second processing duration, the terminal may determine the stuck detection result of the graphics processor based on the second processing duration. Specifically, if the second processing duration is greater than or equal to the second duration threshold and it is determined that rendering of all image views corresponding to the image is not completed, the terminal may determine that the stuck detection result of the graphics processor is that the graphics processor is in a stuck state. If the second processing duration is less than the second duration threshold and it is determined that rendering of all image views corresponding to the image is completed, the terminal may determine that the stuck detection result of the graphics processor is that the graphics processor is not in a stuck state.
In some embodiments, the second duration threshold refers to a threshold for detecting, based on the second processing duration, whether the graphics processor is stuck. The second duration threshold is not limited to a fixed value. For example, when the terminal acquires a modification instruction for the second duration threshold, the terminal may modify the second duration threshold based on the modification instruction.
According to some embodiments, the terminal may determine whether rendering of all image views corresponding to the image is completed by, for example, acquiring the view index of the currently rendered view and checking that index. If the view index indicates that the current view is not the last image view, the terminal may determine that rendering of all image views corresponding to the image is not completed.
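The view-index check above can be sketched as a simple comparison against the total view count. The function name and the zero-based indexing convention are assumptions for this sketch:

```python
def all_views_rendered(current_view_index: int, total_views: int) -> bool:
    """True once the currently rendered view is the last image view
    (zero-based view index, as assumed for this illustration)."""
    return current_view_index >= total_views - 1

assert all_views_rendered(4, 5) is True    # index 4 of 5 views -> last one
assert all_views_rendered(2, 5) is False   # rendering not yet completed
```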
In some embodiments, the second duration threshold may be, for example, 0.1 s. The drawing start time point of the image acquired by the terminal may be, for example, 13:25:01.20 on January 5, 2020, and the rendering time point of the currently rendered image view, acquired through the time-consuming detection sub-thread, may be, for example, 13:25:01.31 on January 5, 2020. The terminal determines the duration from the drawing start time point corresponding to the image to the rendering time point of the currently rendered image view, 0.11 seconds, as the second processing duration corresponding to the image. Since the second processing duration of 0.11 seconds is greater than the second duration threshold of 0.1 seconds and rendering of all image views corresponding to the image is not completed, the terminal may determine that the stuck detection result of the graphics processor is that the graphics processor is in a stuck state.
In some embodiments, the second duration threshold may be, for example, 0.1 s. The drawing start time point of the image acquired by the terminal may be, for example, 13:25:01.20 on January 5, 2020, and the rendering time point of the currently rendered image view, acquired through the time-consuming detection sub-thread, may be, for example, 13:25:01.22 on January 5, 2020. The terminal determines the duration from the drawing start time point corresponding to the image to the rendering time point of the currently rendered image view, 0.02 seconds, as the second processing duration corresponding to the image. Since the second processing duration of 0.02 seconds is less than the second duration threshold of 0.1 seconds and rendering of all image views corresponding to the image is completed, the terminal may determine that the stuck detection result of the graphics processor is that the graphics processor is not in a stuck state.
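The second-duration decision in the two examples above combines the elapsed time with the rendering-progress check. A minimal sketch; the function name, parameter order, and default threshold are illustrative assumptions:

```python
def detect_gpu_stuck_by_progress(draw_start: float, current_render_time: float,
                                 views_done: int, total_views: int,
                                 second_threshold: float = 0.1) -> bool:
    """Stuck if the second processing duration (drawing start to the
    rendering time of the current view) reaches the threshold while
    not all image views have been rendered."""
    second_duration = current_render_time - draw_start
    return second_duration >= second_threshold and views_done < total_views

# Examples from the disclosure (seconds within 13:25 on January 5, 2020):
assert detect_gpu_stuck_by_progress(1.20, 1.31, 3, 5) is True   # 0.11 s, unfinished
assert detect_gpu_stuck_by_progress(1.20, 1.22, 5, 5) is False  # 0.02 s, finished
```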
In some embodiments, when determining the second duration threshold, the terminal may acquire the sub-duration for processing one image view while the graphics processor is not in a stuck state, acquire the target duration corresponding to a preset number of such sub-durations, and determine the target duration as the second duration threshold. This improves the accuracy of setting the second duration threshold and thus of determining the stuck detection result of the graphics processor, improving the user experience.
According to some embodiments, when determining the second duration threshold, the terminal may acquire the sub-duration for processing one image view while the graphics processor is not in a stuck state; the sub-duration acquired by the terminal may be, for example, 16 ms. With a preset number of 5, the terminal acquires the target duration corresponding to five 16 ms sub-durations, namely 80 ms, and determines the target duration of 80 ms as the second duration threshold.
In step S34, if it is determined that the graphics processor is in the stuck state, the rendering number corresponding to at least one image view rendered at the same time in the graphics processor is adjusted.
According to some embodiments, when the terminal determines that the graphics processor is in a stuck state, the terminal may adjust a rendering number corresponding to at least one image view rendered at the same time in the graphics processor.
In some embodiments, the number of renderings corresponding to at least one image view rendered by the terminal at the same time may be, for example, 5. When the terminal determines that the graphics processor is in a stuck state, the terminal can adjust the rendering number corresponding to at least one image view rendered at the same time in the graphics processor from 5 to 3.
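The adjustment in the example above (from 5 concurrently rendered views down to 3) can be sketched as follows. The decrement step and the lower bound are assumptions for this sketch; the disclosure only specifies that the rendering number is reduced when the GPU is stuck:

```python
def adjust_render_count(current_count: int, is_stuck: bool,
                        step: int = 2, min_count: int = 1) -> int:
    """Reduce the number of image views rendered at the same time when
    the graphics processor is in a stuck state; otherwise keep it."""
    if is_stuck:
        return max(min_count, current_count - step)
    return current_count

assert adjust_render_count(5, True) == 3    # example from the disclosure: 5 -> 3
assert adjust_render_count(5, False) == 5   # not stuck: no adjustment
```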
In some embodiments, if the central processing unit is detected to acquire the presentation information for the image, the time-consuming detection sub-thread of the graphics processor is triggered, the second processing duration corresponding to the image is acquired by the time-consuming detection sub-thread, and the stuck detection result of the graphics processor is determined based on the second processing duration. Further, if the graphics processor is in a stuck state, the rendering number corresponding to at least one image view rendered at the same time in the graphics processor is adjusted, which reduces the number of image views rendered simultaneously by the graphics processor. This alleviates the stuck condition of the graphics processor, reduces the inaccurate stuck detection in the image processing process caused by detecting only whether the CPU is stuck, improves the image processing efficiency, and improves the user experience.
Fig. 9 is a block diagram illustrating an apparatus for stuck detection according to an example embodiment. Referring to fig. 9, the stuck detecting apparatus 900 includes a sub-thread triggering unit 901, a duration acquiring unit 902, and a result determining unit 903.
The sub-thread triggering unit 901 is configured to trigger a time-consuming detection sub-thread of the graphics processor if the central processing unit is detected to acquire presentation information for the image;
a time length obtaining unit 902, configured to obtain a first processing time length corresponding to the image by using a time-consuming detection sub-thread, where the first processing time length is a time length from a drawing start time point corresponding to the image to a rendering completion time point corresponding to the image;
The result determining unit 903 is configured to determine that the stuck detection result of the graphics processor is that the graphics processor is in a stuck state if the first processing duration is greater than or equal to the first duration threshold.
According to some embodiments, fig. 10 is a block diagram of a stuck detection apparatus according to an exemplary embodiment. As shown in fig. 10, the duration obtaining unit 902 includes a time point obtaining subunit 912, a view adding subunit 922, and a duration obtaining subunit 932, where the duration obtaining unit 902, configured to obtain the first processing duration corresponding to the image by using the time-consuming detection sub-thread, includes:
a time point obtaining subunit 912, configured to obtain a drawing start time point corresponding to the image if it is determined that frame callback processing is performed on the image;
A view adding subunit 922, configured to add a blank image view corresponding to the image, and send blank image view addition completion information to the time-consuming detection sub-thread;
The time point obtaining subunit 912 is further configured to obtain a rendering completion time point corresponding to the image by using the time-consuming detection sub-thread if it is determined that the graphics processor is triggered to complete rendering of the image view of the image;
A duration obtaining subunit 932, configured to determine the duration from the drawing start time point corresponding to the image to the rendering completion time point corresponding to the image as the first processing duration corresponding to the image.
According to some embodiments, the duration obtaining unit 902 is further configured to obtain, after triggering the time-consuming detection sub-thread of the graphics processor, a second processing duration corresponding to the image by using the time-consuming detection sub-thread, where the second processing duration is a duration between a drawing start time point corresponding to the image and a rendering time point of a current rendered image view, and the current rendered image view is an image view in all image views corresponding to the image;
The result determining unit 903 is further configured to determine that the graphics processor is in a stuck state if the second processing duration reaches the second duration threshold and it is determined that rendering of all image views corresponding to the image is not completed.
Fig. 11 is a block diagram of a stuck detection device, according to some embodiments. As shown in fig. 11, the device 900 further includes a threshold setting unit 904, configured to obtain the sub-duration for processing one image view when the graphics processor is not in a stuck state,
obtain the target duration corresponding to a preset number of sub-durations, and determine the target duration as the duration threshold.
According to some embodiments, fig. 12 is a block diagram of a stuck detection device. As shown in fig. 12, the device 900 further includes a number adjustment unit 905, configured to adjust, if it is determined that the graphics processor is in a stuck state, the rendering number corresponding to at least one image view rendered at the same time in the graphics processor.
In some embodiments, if the sub-thread triggering unit detects that the central processing unit acquires the presentation information for the image, the time-consuming detection sub-thread of the graphics processor is triggered; the duration obtaining unit may acquire, by using the time-consuming detection sub-thread, the first processing duration corresponding to the image, where the first processing duration is the duration between the drawing start time point corresponding to the image and the rendering completion time point corresponding to the image; and the result determining unit may determine that the stuck detection result of the graphics processor is that the graphics processor is in a stuck state if the first processing duration is greater than or equal to the first duration threshold. In this way, the first processing duration can be obtained through the time-consuming detection sub-thread, and whether the graphics processor is stuck is determined based on that duration. This reduces the inaccurate stuck detection in the image processing process caused by detecting only whether the CPU is stuck, improves the accuracy of stuck detection, and improves the user experience.
Referring to fig. 13, a block diagram of a terminal is shown according to an exemplary embodiment. As shown in fig. 13, terminal 1300 may include at least one processor 1301, at least one network interface 1304, a user interface 1303, a memory 1305, and at least one communication bus 1302.
Wherein a communication bus 1302 is used to enable connected communications between these components.
The user interface 1303 may include a speaker and a display; optionally, the user interface 1303 may further include a standard wired interface, a wireless interface, and the like.
The network interface 1304 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
Processor 1301 may include one or more processing cores, among other things. The processor 1301 connects various parts within the entire terminal 1300 using various interfaces and lines, and performs various functions of the terminal 1300 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 1305, and calling data stored in the memory 1305. Alternatively, the processor 1301 may be implemented in at least one hardware form of digital signal processing (Digital Signal Processing, DSP), field-programmable gate array (Field-Programmable Gate Array, FPGA), or programmable logic array (Programmable Logic Array, PLA). Processor 1301 may integrate one or a combination of several of a central processing unit (Central Processing Unit, CPU), a graphics processor (Graphics Processing Unit, GPU), a modem, and the like. The CPU mainly processes the operating system, the user interface, application programs, and the like; the GPU is used for rendering and drawing content to be displayed on the display screen; and the modem is used for processing wireless communication. It will be appreciated that the modem may not be integrated into the processor 1301 and may instead be implemented by a separate chip.
The memory 1305 may include a random access memory (Random Access Memory, RAM) or a read-only memory (Read-Only Memory, ROM). Optionally, the memory 1305 includes a non-transitory computer-readable storage medium. The memory 1305 may be used to store instructions, programs, code sets, or instruction sets. The memory 1305 may include a stored program area that may store instructions for implementing an operating system, instructions for at least one function (e.g., a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described above, etc., and a stored data area that may store data related to the various method embodiments described above. Memory 1305 may also optionally be at least one storage device located remotely from the aforementioned processor 1301. As shown in fig. 13, the memory 1305, as one type of computer storage medium, may include an operating system, a network communication module, a user interface module, and an application program for stuck detection.
In the terminal 1300 shown in fig. 13, the user interface 1303 is mainly used to provide an input interface for a user to obtain data input by the user, and the processor 1301 may be used to invoke the application program for stuck detection in the memory 1305 and specifically execute the steps in the method embodiments of figs. 4-8.
Accordingly, the disclosed embodiments also provide a computer-readable storage medium storing a computer program. The computer-readable storage medium stores a computer program that, when executed by one or more processors, causes the one or more processors to perform the steps in the method embodiments of fig. 4-8.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include both permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. Computer-readable media, as defined herein, do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should be noted that in this document, relational terms such as "first" and "second" and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above is merely a specific embodiment of the disclosure to enable one skilled in the art to understand or practice the disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following the general principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.