CN118042163A - Image processing method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN118042163A (application number CN202211423122.2A)
- Authority
- CN
- China
- Prior art keywords
- image
- filtering
- sub
- region
- filtered
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- All under H—Electricity; H04—Electric communication technique; H04N—Pictorial communication, e.g. television; H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals:
- H04N19/80—Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation
- H04N19/117—Filters, e.g. for pre-processing or post-processing
- H04N19/186—Adaptive coding characterised by the coding unit, the unit being a colour or a chrominance component
- H04N19/31—Hierarchical techniques, e.g. scalability, in the temporal domain
- H04N19/513—Processing of motion vectors
- H04N19/593—Predictive coding involving spatial prediction techniques
- H04N19/82—Details of filtering operations involving filtering within a prediction loop
- H04N19/86—Pre-processing or post-processing specially adapted for video compression involving reduction of coding artifacts, e.g. of blockiness
- H04N19/91—Entropy coding, e.g. variable length coding [VLC] or arithmetic coding
Abstract
An image processing method, an image processing device, an electronic device and a storage medium relate to the field of image processing and address the problem of poor image quality in regions of different permission levels after an image is decoded and filtered. The image processing method is executed by an electronic device and includes: dividing an image to be processed into a plurality of image sub-regions and acquiring the permission level of each image sub-region; and if the permission levels of at least two adjacent image sub-regions among the plurality of image sub-regions differ, filtering the contiguous boundary between the at least two image sub-regions.
Description
Technical Field
The present application relates to the field of image processing, and in particular, to an image processing method, an image processing device, an electronic device, and a storage medium.
Background
In video encoding and decoding, filtering is intended to smooth noise and suppress spurious detail within video image blocks while preserving image edges as far as possible. Typically, the electronic device filters the boundaries between multiple regions of an image to preserve those edges. However, different users have different viewing permissions for different areas of an image. When a low-permission user has no right to view a high-permission area, the electronic device skips filtering the boundary between the high-permission and low-permission areas during video decoding, which degrades the quality of the image.
Disclosure of Invention
The application provides an image processing method, an image processing device, an electronic device and a storage medium, which address the problem of poor image quality in regions of different permission levels after an image is decoded and filtered.
The application adopts the following technical scheme.
In a first aspect, the present application provides an image processing method, executed by an electronic device, comprising: dividing an image to be processed into a plurality of image sub-regions and acquiring the permission level of each image sub-region; and if the permission levels of at least two adjacent image sub-regions among the plurality of image sub-regions differ, filtering the contiguous boundary between the at least two image sub-regions.
In this embodiment, the electronic device divides the image to be processed into a plurality of image sub-regions and acquires the permission level of each. If the permission levels of at least two adjacent image sub-regions differ, the boundary between them may have been left unfiltered during decoding, so the quality of the image to be processed may be low. Therefore, based on decoding information such as the division of the permission areas, the electronic device filters the contiguous boundary between the at least two image sub-regions again, improving the image quality of the different permission areas in the image to be processed.
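As a hedged illustration only (the function and variable names are ours, not the patent's), the embodiment above can be sketched as follows: split the decoded image into fixed-size sub-regions, compare the permission levels of horizontal and vertical neighbours, and re-filter each contiguous boundary whose two sides differ. Any boundary filter can be plugged in as `smooth`:

```python
import numpy as np

def filter_permission_boundaries(image, region_size, levels, smooth):
    """Sketch of the described method: split the decoded image into equal
    sub-regions, look up each sub-region's permission level, and re-filter
    the contiguous boundary between neighbours whose levels differ.
    `levels` is a 2-D grid of permission levels (one entry per sub-region);
    `smooth` is any boundary filter taking and returning a pixel patch.
    All names here are illustrative, not taken from the patent."""
    out = image.astype(np.float64).copy()
    rows, cols = levels.shape
    for r in range(rows):
        for c in range(cols):
            # vertical contiguous boundary with the right-hand neighbour
            if c + 1 < cols and levels[r][c] != levels[r][c + 1]:
                x = (c + 1) * region_size            # column of the boundary
                y0, y1 = r * region_size, (r + 1) * region_size
                out[y0:y1, x - 1:x + 1] = smooth(out[y0:y1, x - 1:x + 1])
            # horizontal contiguous boundary with the neighbour below
            if r + 1 < rows and levels[r][c] != levels[r + 1][c]:
                y = (r + 1) * region_size            # row of the boundary
                x0, x1 = c * region_size, (c + 1) * region_size
                out[y - 1:y + 1, x0:x1] = smooth(out[y - 1:y + 1, x0:x1])
    return out
```

For example, passing `smooth=lambda a: np.full_like(a, a.mean())` replaces each two-pixel-wide boundary strip with its mean, blending the seam between regions of differing permission levels.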
In some embodiments, the permission levels include a first permission level, a second permission level, and a zero permission level; an image sub-region at the zero permission level is one that any user may view.
In some embodiments, before dividing the image to be processed into a plurality of image sub-regions and acquiring the permission level of each, the method includes: receiving the coded bitstream of the image to be processed and decoding it to obtain the decoded image to be processed.
In some embodiments, the filtering process includes: at least one of deblocking vertical filtering, deblocking horizontal filtering, sample adaptive compensation filtering, adaptive loop filtering, and neural network filtering.
In some embodiments, there is a vertical contiguous boundary between the at least two image sub-regions, the filtering includes deblocking vertical filtering, and the filtering includes: determining a target region to be filtered on a vertical side of a first sub-region of the at least two image sub-regions, where the vertical side is the left or right side of the boundary of the first sub-region, and the target region to be filtered is any one of the following: the region to be filtered on the right side of the contiguous boundary, the region to be filtered on the left side of the contiguous boundary, or both the right-side and left-side regions; and performing deblocking vertical filtering on the target region to be filtered.
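A minimal 1-D sketch of choosing and filtering the target region across a vertical boundary within one pixel row follows; the 3-tap-style smoothing and the `side` parameter are illustrative stand-ins, not the codec's actual deblocking filter:

```python
import numpy as np

def deblock_vertical(row, boundary, side="both", strength=0.5):
    """Illustrative 1-D deblocking step across a vertical boundary in a
    single pixel row.  `side` selects the target region named in the text:
    the region to be filtered on the "right" of the contiguous boundary,
    on the "left", or "both".  The simple averaging toward the boundary is
    a stand-in for a real deblocking filter."""
    row = row.astype(np.float64).copy()
    p, q = row[boundary - 1], row[boundary]   # pixels on either side
    if side in ("left", "both"):
        row[boundary - 1] = p + strength * (q - p) / 2   # pull left pixel up
    if side in ("right", "both"):
        row[boundary] = q - strength * (q - p) / 2       # pull right pixel down
    return row
```

Applying the same step to every row of the two sub-regions that share the boundary yields the deblocking vertical filtering described above; filtering column-wise across a horizontal boundary gives the horizontal variant.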
In some embodiments, the at least two adjacent image sub-regions further include a second sub-region horizontally adjacent to the first sub-region, where horizontally adjacent means the second sub-region lies to the left or right of the first sub-region. That the permission levels of the at least two adjacent image sub-regions differ includes: the permission level of the first sub-region is lower than that of the second sub-region; or the permission level of the first sub-region is higher than that of the second sub-region; or the permission level of the first sub-region is not equal to that of the second sub-region; or the permission level of the first sub-region is not equal to that of the second sub-region and the filtering strength of the contiguous boundary is greater than or equal to a preset filtering strength.
In some embodiments, there is a horizontal contiguous boundary between the at least two image sub-regions, the filtering includes deblocking horizontal filtering, and the filtering includes: determining a target region to be filtered on a horizontal side of a first sub-region of the at least two image sub-regions, where the horizontal side is the upper or lower side of the boundary of the first sub-region, and the target region to be filtered is any one of the following: the region to be filtered on the lower side of the contiguous boundary, the region to be filtered on the upper side of the contiguous boundary, or both the lower-side and upper-side regions; and performing deblocking horizontal filtering on the target region to be filtered.
In some embodiments, the at least two adjacent image sub-regions further include a third sub-region vertically adjacent to the first sub-region, where vertically adjacent means the third sub-region lies above or below the first sub-region. That the permission levels of the at least two adjacent image sub-regions differ includes: the permission level of the first sub-region is lower than that of the third sub-region; or the permission level of the first sub-region is higher than that of the third sub-region; or the permission level of the first sub-region is not equal to that of the third sub-region; or the permission level of the first sub-region is not equal to that of the third sub-region and the filtering strength of the contiguous boundary is greater than or equal to a preset filtering strength.
In some embodiments, the filtering includes sample adaptive compensation filtering, and filtering the contiguous boundary between the at least two image sub-regions includes: determining, in a set filtering order, a target region to be filtered from the surrounding sides of a first sub-region of the at least two image sub-regions, the surrounding sides being, in order, the sides on which the permission level of the first sub-region differs from that of the neighbouring sub-regions, and the target region to be filtered being any one of the following: a first side region inside the first sub-region adjoining the contiguous boundary, a second side region outside the first sub-region adjoining the contiguous boundary, or both the first and second side regions; and performing sample adaptive compensation filtering on the target region to be filtered in the set filtering order.
In some embodiments, the filtering includes adaptive loop filtering, and filtering the contiguous boundary between the at least two image sub-regions includes: determining, in a set filtering order, a target region to be filtered from the surrounding sides of a first sub-region of the at least two image sub-regions, the surrounding sides being, in order, the sides on which the permission level of the first sub-region differs from that of the neighbouring sub-regions, and the target region to be filtered being any one of the following: a first side region inside the first sub-region adjoining the contiguous boundary, a second side region outside the first sub-region adjoining the contiguous boundary, or both the first and second side regions; and performing adaptive loop filtering on the target region to be filtered in the set filtering order.
In some embodiments, whether neural-network-based filtering is required for a first sub-region of the at least two image sub-regions is determined from the decoding information of the image to be processed; if the first sub-region requires neural-network-based filtering, the reconstructed pixel values of the first sub-region and the permission level of the first sub-region are input into the neural network; and the filtered reconstructed pixels of the first sub-region output by the network are obtained.
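The input construction described above can be sketched as follows. The channel layout (pixel plane first, then one constant plane per permission level) is our assumption, since the text only names the quantities fed to the network:

```python
import numpy as np

def build_nn_input(recon, own_level, neighbour_levels=()):
    """Hedged sketch of assembling the neural-network filtering input:
    stack the reconstructed pixel values of the first sub-region with a
    constant plane holding its permission level and, optionally, one plane
    per adjacent sub-region's permission level.  The resulting tensor
    would then be fed to whatever filtering network is in use."""
    h, w = recon.shape
    planes = [recon.astype(np.float64),
              np.full((h, w), float(own_level))]     # own permission level
    for lvl in neighbour_levels:                     # neighbours' levels
        planes.append(np.full((h, w), float(lvl)))
    return np.stack(planes)                          # (channels, h, w)
```

Encoding each permission level as a constant plane is one common way to condition a convolutional network on scalar side information; the patent does not specify the encoding.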
In some embodiments, when the first sub-region requires neural-network-based filtering, besides inputting the reconstructed pixel values and the permission level of the first sub-region into the neural network, the method further includes: inputting the permission level of at least one image sub-region adjacent to the first sub-region into the neural network.
In some embodiments, the electronic device may obtain, according to a user's permission level, the information of the image sub-regions whose permission level is lower than or equal to the user's; the information of an image sub-region includes its pixel values.
In some embodiments, filtering the contiguous boundary between the at least two image sub-regions includes: filtering the luminance channel component on the contiguous boundary; or filtering the chrominance channel component on the contiguous boundary; or filtering both the luminance and chrominance channel components on the contiguous boundary.
In a second aspect, the present application provides an image processing apparatus comprising: an image dividing unit, an image decoding unit, and an image filtering unit. The image dividing unit is used to divide the image to be processed into a plurality of image sub-regions; the image decoding unit is used to acquire the permission level of each image sub-region; and the image filtering unit is used to filter the contiguous boundary between at least two adjacent image sub-regions of the plurality of image sub-regions if their permission levels differ.
In some embodiments, the permission level of each image sub-region indicates whether that sub-region is a designated permission area; if it is, the electronic device determines its permission level. The permission levels include a first permission level and a second permission level.
In some embodiments, the image decoding unit is further configured to receive the coded bitstream of the image to be processed and decode it to obtain the decoded image to be processed.
In some embodiments, the image filtering unit is further configured to determine a target region to be filtered on a vertical side of a first sub-region of the at least two image sub-regions, where the vertical side is the left or right side of the boundary of the first sub-region, and the target region to be filtered is any one of the following: the region to be filtered on the right side of the contiguous boundary, the region to be filtered on the left side of the contiguous boundary, or both the right-side and left-side regions; the image filtering unit is further configured to perform deblocking vertical filtering on the target region to be filtered.
In some embodiments, the at least two adjacent image sub-regions further include a second sub-region horizontally adjacent to the first sub-region, where horizontally adjacent means the second sub-region lies to the left or right of the first sub-region. That the permission levels of the at least two adjacent image sub-regions differ includes: the permission level of the first sub-region is lower than that of the second sub-region; or the permission level of the first sub-region is higher than that of the second sub-region; or the permission level of the first sub-region is not equal to that of the second sub-region; or the permission level of the first sub-region is not equal to that of the second sub-region and the filtering strength of the contiguous boundary is greater than or equal to a preset filtering strength.
In some embodiments, the image filtering unit is further configured to determine a target region to be filtered on a horizontal side of a first sub-region of the at least two image sub-regions, where the horizontal side is the upper or lower side of the boundary of the first sub-region, and the target region to be filtered is any one of the following: the region to be filtered on the lower side of the contiguous boundary, the region to be filtered on the upper side of the contiguous boundary, or both the lower-side and upper-side regions; the image filtering unit is further configured to perform deblocking horizontal filtering on the target region to be filtered.
In some embodiments, the at least two adjacent image sub-regions further include a third sub-region vertically adjacent to the first sub-region, where vertically adjacent means the third sub-region lies above or below the first sub-region. That the permission levels of the at least two adjacent image sub-regions differ includes: the permission level of the first sub-region is lower than that of the third sub-region; or the permission level of the first sub-region is higher than that of the third sub-region; or the permission level of the first sub-region is not equal to that of the third sub-region; or the permission level of the first sub-region is not equal to that of the third sub-region and the filtering strength of the contiguous boundary is greater than or equal to a preset filtering strength.
In some embodiments, the image filtering unit is further configured to determine, in a set filtering order, a target region to be filtered from the surrounding sides of a first sub-region of the at least two image sub-regions, the surrounding sides being, in order, the sides on which the permission level of the first sub-region differs from that of the neighbouring sub-regions, and the target region to be filtered being any one of the following: a first side region inside the first sub-region adjoining the contiguous boundary, a second side region outside the first sub-region adjoining the contiguous boundary, or both the first and second side regions; the image filtering unit is further configured to perform sample adaptive compensation filtering on the target region to be filtered in the set filtering order.
In some embodiments, the image filtering unit is further configured to determine, in a set filtering order, a target region to be filtered from the surrounding sides of a first sub-region of the at least two image sub-regions, the surrounding sides being, in order, the sides on which the permission level of the first sub-region differs from that of the neighbouring sub-regions, and the target region to be filtered being any one of the following: a first side region inside the first sub-region adjoining the contiguous boundary, a second side region outside the first sub-region adjoining the contiguous boundary, or both the first and second side regions; the image filtering unit is further configured to perform adaptive loop filtering on the target region to be filtered in the set filtering order.
In some embodiments, the image filtering unit is further configured to determine, according to the decoding information of the image to be processed, whether neural-network-based filtering is required for a first sub-region of the at least two image sub-regions; if the first sub-region requires neural-network-based filtering, to input the reconstructed pixel values of the first sub-region and the permission level of the first sub-region into the neural network; and to obtain the filtered reconstructed pixels of the first sub-region output by the network.
In some embodiments, the image filtering unit is further configured to input the authority level of at least one image sub-region adjacent to the first sub-region into the neural network.
In some embodiments, the image filtering unit is further configured to filter the luminance channel component on the contiguous boundary between the at least two image sub-regions; or to filter the chrominance channel component on the contiguous boundary; or to filter both the luminance and chrominance channel components on the contiguous boundary.
In a third aspect, the present application provides an electronic device comprising a memory and a processor, the memory coupled to the processor and storing computer program code comprising computer instructions; when the processor executes the computer instructions, the electronic device performs the image processing method of the first aspect and any one of its possible implementations.
In a fourth aspect, the present application provides a computer-readable storage medium comprising: computer software instructions; the computer software instructions, when run in an image processing apparatus, cause the image processing apparatus to implement the image processing method of the first aspect described above and any one of its possible implementations.
In a fifth aspect, the present application provides a computer program product which, when run on an image processing apparatus, causes the image processing apparatus to perform the image processing method of the first aspect and any one of its possible implementations. For the advantageous effects of the second to fifth aspects, reference may be made to the corresponding descriptions of the first aspect; they are not repeated here.
The implementations provided in the above aspects may be further combined to provide additional implementations.
Drawings
FIG. 1 is a schematic diagram of an image processing flow framework provided by the application;
FIG. 2 is a schematic diagram of a deblocking filtered region according to the present application;
FIG. 3 is a schematic diagram of a filter block boundary according to the present application;
FIG. 4 is a schematic diagram of a filter according to the present application;
FIG. 5 is a schematic diagram of an image processing apparatus according to the present application;
FIG. 6 is a schematic flow chart of an image processing method according to the present application;
FIG. 7 is a schematic diagram of a detection area according to the present application;
FIG. 8 is a schematic diagram of a filtering region according to the present application;
FIG. 9 is a flowchart of another image processing method according to the present application;
FIG. 10 is a schematic view of another filtering region according to the present application;
FIG. 11 is a flowchart of another image processing method according to the present application;
FIG. 12 is a schematic view of another filtering region according to the present application;
FIG. 13 is a flowchart of another image processing method according to the present application;
FIG. 14 is a schematic view of a filtering region according to another embodiment of the present application;
FIG. 15 is a flowchart of another image processing method according to the present application;
FIG. 16 is a schematic view of another filtering region according to the present application;
FIG. 17 is a flowchart of another image processing method according to the present application;
FIG. 18 is a flowchart of another image processing method according to the present application;
Fig. 19 is a schematic diagram of a hardware structure of an electronic device according to the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In order to clearly describe the technical solution of the embodiments of the present application, in the embodiments of the present application, the terms "first", "second", etc. are used to distinguish the same item or similar items having substantially the same function and effect, and those skilled in the art will understand that the terms "first", "second", etc. are not limited in number and execution order.
In order to enable those skilled in the art to better understand the technical solutions provided by the embodiments of the present application, the following is a brief description of some technical terms related to the embodiments of the present application and main processes of video encoding and decoding.
1. Technical terminology
1. Deblocking filtering (deblocking filter, DBF): for removing block boundary effects generated by block coding, including deblocking vertical filtering and deblocking horizontal filtering.
2. Sample adaptive compensation filtering (sample adaptive offset, SAO): samples are classified based on their pixel values and the gradient values of surrounding blocks, and a different compensation value is added for each class of pixel values, so that the reconstructed image is closer to the original image. The basic principle of SAO is to add negative compensation values to the peak pixels and positive compensation values to the trough pixels in the reconstructed curve. SAO takes a coding tree unit (CTU) as its basic unit and comprises two main compensation forms, edge offset (EO) and band offset (BO), and introduces a parameter fusion technique. Sample offset compensation filtering includes basic sample offset compensation (the SAO described above), enhanced sample offset compensation (ESAO), and cross-component sample offset compensation (CCSAO, which is performed only for chroma).
3. Adaptive loop filtering (adaptive loop filter, ALF): enhanced filtering is performed on the reconstructed image through a Wiener filter, so that the reconstructed image is closer to the original image. Adaptive loop filtering may enable enhanced adaptive correction filtering (EALF).
4. Neural network (Neural Network, NN): a neural network is a computational model formed by a large number of interconnected nodes (or neurons). In an artificial neural network, the neuron processing units may represent different objects, such as features, letters, concepts, or some meaningful abstract patterns. The processing units in the network fall into three categories: input units, output units, and hidden units. Input units receive signals and data from the outside world; output units output the system's processing results; hidden units lie between the input and output units and cannot be observed from outside the system. The connection weights between neurons reflect the connection strength between units, and the representation and processing of information are embodied in the connection relationships of the network's processing units. An artificial neural network is a non-programmed, brain-like information processing approach; its essence is to obtain a parallel, distributed information processing capability through network transformation and dynamic behavior, simulating the information processing functions of the human brain's nervous system to different degrees and at different levels. Currently, in the field of video processing, common neural networks include convolutional neural networks (CNNs), recurrent neural networks (RNNs), fully connected networks, and the like.
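As a concrete illustration of the input, hidden, and output units and the connection weights described above, the following is a minimal Python sketch of a forward pass through a tiny fully connected network with one ReLU hidden layer. The weights here are hypothetical, chosen purely for illustration.

```python
def forward(x, w_hidden, w_out):
    """Forward pass of a tiny fully connected network:
    input units -> one ReLU hidden layer -> one output unit."""
    # Each hidden unit is a weighted sum of the inputs, passed through ReLU.
    hidden = [max(0.0, sum(wi * xi for wi, xi in zip(row, x))) for row in w_hidden]
    # The output unit is a weighted sum of the hidden units.
    return sum(wo * h for wo, h in zip(w_out, hidden))

# Hypothetical weights: 2 inputs, 2 hidden units, 1 output.
w_hidden = [[1.0, -1.0], [0.5, 0.5]]
w_out = [1.0, 2.0]
y = forward([3.0, 1.0], w_hidden, w_out)  # hidden = [2.0, 2.0] -> y = 6.0
```

In practice, the loop-filter networks referred to in this document are much larger convolutional networks whose weights are learned rather than hand-set.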
2. Main flow of video coding and decoding
As shown in fig. 1, taking video coding as an example, video coding generally includes processes of prediction, transformation, quantization, entropy coding, and the like, and further, the coding process may be implemented according to the framework in fig. 1.
Prediction can be divided into intra-frame prediction and inter-frame prediction. Intra-frame prediction uses surrounding coded blocks as references to predict the current uncoded block, effectively removing redundancy in the spatial domain. Inter-frame prediction uses neighboring coded pictures to predict the current picture, effectively removing redundancy in the temporal domain.
Transformation refers to converting an image from the spatial domain to the transform domain, where it is represented by transform coefficients. Most images contain many flat or slowly varying areas, and a suitable transformation converts the image from a scattered distribution in the spatial domain to a relatively concentrated distribution in the transform domain, removing the frequency-domain correlation between signals; together with the quantization process, the code stream can be effectively compressed.
Entropy coding is a lossless coding method that converts a series of element symbols into a binary code stream for transmission or storage; the input symbols may include quantized transform coefficients, motion vector information, prediction mode information, transform and quantization related syntax, and the like. Entropy coding can effectively remove the redundancy of video element symbols.
The foregoing description takes encoding as an example. Video decoding is the inverse of video encoding; that is, video decoding generally includes entropy decoding, prediction, inverse quantization, inverse transformation, filtering, and the like, where the implementation principle of each process is the same as or similar to that of the corresponding process in encoding. The filtering process includes deblocking filtering, sample adaptive compensation filtering, adaptive loop filtering, and neural-network-based filtering.
3. Deblocking filtering (deblocking filter, DBF)
The implementation of the DBF filtering process is briefly described below.
The DBF filtering process includes two processes: filtering decision and filtering operation.
The filtering decision includes: 1) acquiring the boundary filter strength (BS value); 2) the filtering switch decision; 3) the filtering strength selection. For chrominance, only step 1) is performed, and the BS value of luminance is directly reused. For chrominance, the filtering operation is performed only when the BS value is 2 (i.e., at least one of the image areas on the two sides is in intra mode).
The filtering operation includes: 1) Strong filtering and weak filtering for luminance; 2) Filtering for chrominance.
The boundary to be filtered that satisfies one of the following conditions does not require DBF filtering:
Condition 1: If the boundary to be filtered is an image boundary, the boundary does not need to be filtered.
Condition 2: if the boundary to be filtered is a slice boundary and the cross-slice loop filter enable flag is 0, then the boundary does not require filtering.
Condition 3: if the boundary to be filtered is a luminance filtering boundary and the luminance filtering boundary is not a boundary of a luminance coding block or a luminance transformation block, the boundary does not need filtering.
Condition 4: if the boundary to be filtered is a chroma filtering boundary, and the chroma filtering boundary is not a boundary of a chroma coding block or a chroma transform block, and the luma filtering boundary corresponding to the chroma filtering boundary is not a boundary of a luma coding block, the boundary does not need filtering.
Condition 5: if the boundary to be filtered is a luminance filtering boundary and the sub-block transform flag of the coding unit where the luminance filtering boundary is located is 1 and the luminance filtering boundary is not the boundary of the luminance coding block, the boundary does not need filtering.
Condition 6: if the intra prediction mode flag of the string copy of the coding unit where the boundary to be filtered is located is 1, the boundary does not need filtering.
The DBF filtering process generally performs deblocking vertical filtering and then deblocking horizontal filtering in units of 8x8, modifying at most 3 pixels on each side of the contiguous boundary of the current block (the image area to be filtered) and using at most 4 pixels on each side of the contiguous boundary as filter input; therefore, the vertical/horizontal filtering of different blocks does not affect each other and can be performed in parallel. As shown in fig. 2, for the current 8x8 block, vertical filtering is performed on the left 3 columns of the current block and the right 3 columns of the left block, and then horizontal filtering is performed on the upper 3 rows of the current block and the lower 3 rows of pixels of the upper block.
The derivation of the boundary strength, i.e., BS value, is as follows:
As shown in fig. 3, which is a schematic diagram of the vertical contiguous boundary (vertical boundary) and the horizontal contiguous boundary (horizontal boundary) of the current block, the 8 pixel samples on the two sides of the vertical boundary/horizontal boundary are denoted p0, p1, p2, p3 and q0, q1, q2, q3, respectively. It should be noted that the boundary filter strength BS value is calculated for the pixels of the current block; that is, the boundary is the boundary with respect to the pixels of the current block, not the boundary of the current luminance block or the current chrominance block.
Boundary filtering strength determination mode one:
The boundary filter strength BS value is equal to 0 if all the following conditions are satisfied.
A) The quantized coefficients of the transform blocks of the coding units where p0 and q0 are located are all 0. Here, if p0 (or q0) is a luminance sample and the coding unit where p0 (or q0) is located contains only luminance samples, the coding unit where p0 (or q0) is located refers to the luminance coding unit containing p0 (or q0); if p0 (or q0) is a chrominance sample and the coding unit where p0 (or q0) is located contains only chrominance samples, the coding unit where p0 (or q0) is located refers to the coding unit containing the luminance sample corresponding to p0 (or q0); otherwise (i.e., the coding unit where p0 or q0 is located contains both luminance and chrominance samples), the coding unit where p0 (or q0) is located refers to the coding unit containing p0 (or q0).
B) The prediction type of the coding units where p0 and q0 are located is not intra.
C) Let BP and BQ be the 4x4 luminance coding blocks where p0 and q0 are located, respectively, and let the motion information of BP and BQ satisfy conditions 1 and 2 simultaneously, or satisfy conditions 3, 4, and 5 simultaneously.
1) The L0 reference indices of the spatial motion information storage units corresponding to BP and BQ are both equal to -1; or the reference frames corresponding to the L0 reference indices of the spatial motion information storage units corresponding to BP and BQ are the same frame, and the difference of every component of the L0 motion vectors of the two spatial motion information storage units is less than one integer pixel.
2) The L1 reference indices of the spatial motion information storage units corresponding to BP and BQ are both equal to -1; or the reference frames corresponding to the L1 reference indices of the spatial motion information storage units corresponding to BP and BQ are the same frame, and the difference of every component of the L1 motion vectors of the two spatial motion information storage units is less than one integer pixel.
3) One of the following conditions is satisfied:
The L0 reference index of the spatial motion information storage unit corresponding to BP is equal to -1, and the L1 reference index of the spatial motion information storage unit corresponding to BQ is equal to -1.
The reference frame corresponding to the L0 reference index of the spatial motion information storage unit corresponding to BP and the reference frame corresponding to the L1 reference index of the spatial motion information storage unit corresponding to BQ are the same frame, and the difference of every component between the L0 motion vector of the spatial motion information storage unit corresponding to BP and the L1 motion vector of the spatial motion information storage unit corresponding to BQ is less than one integer pixel.
4) One of the following conditions is satisfied:
Condition 1: The L0 reference index of the spatial motion information storage unit corresponding to BQ is equal to -1, and the L1 reference index of the spatial motion information storage unit corresponding to BP is equal to -1.
Condition 2: The reference frame corresponding to the L0 reference index of the spatial motion information storage unit corresponding to BQ and the reference frame corresponding to the L1 reference index of the spatial motion information storage unit corresponding to BP are the same frame, and the difference of every component between the L0 motion vector of the spatial motion information storage unit corresponding to BQ and the L1 motion vector of the spatial motion information storage unit corresponding to BP is less than one integer pixel.
5) The reference frame corresponding to the L0 reference index of the spatial motion information storage unit corresponding to BP and the reference frame corresponding to the L0 reference index of the spatial motion information storage unit corresponding to BQ are not the same frame; and the reference frame corresponding to the L1 reference index of the spatial motion information storage unit corresponding to BP and the reference frame corresponding to the L1 reference index of the spatial motion information storage unit corresponding to BQ are not the same frame.
Otherwise, calculating the value of the boundary filtering strength Bs according to the second boundary filtering strength determining mode.
Boundary filtering strength determination mode two:
Step one: calculate the average quantization parameter QPav of the coding units where p0 and q0 are located. For luminance samples, the quantization parameters of the luminance coding blocks are used; for chrominance samples, the quantization parameters of the chrominance coding blocks are used. Let the quantization parameter of the coding unit where p0 is located be QPp and that of the coding unit where q0 is located be QPq; then:
QPav = (QPp + QPq + 1) >> 1
Step two: calculate the indices IndexA and IndexB.
Step three: look up tables according to IndexA and IndexB to obtain the values of α′ and β′, and obtain the values of α and β according to BitDepth.
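The averaging in step one can be sketched as follows; the shift reproduces the `(QPp + QPq + 1) >> 1` rounding of the formula above. The IndexA/IndexB table lookups of steps two and three are omitted, since the tables themselves are not reproduced in this excerpt.

```python
def qp_av(qp_p, qp_q):
    # Average quantization parameter of the coding units on the two sides,
    # with +1 added so that ties round up: QPav = (QPp + QPq + 1) >> 1.
    return (qp_p + qp_q + 1) >> 1

print(qp_av(27, 32))  # 30: (27 + 32 + 1) >> 1, rounding the average 29.5 up
```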
Illustratively, "> >" is a right shift operation for substitution division, i.e., "> >5" corresponds to division by 25 (i.e., 32). In addition, in the embodiment of the present application, the multiplication (i.e. "x") may be replaced by a left shift manner when actually implemented. For example, a times 4 may be replaced by a shift to the left by 2 bits, i.e. by a < < 2; a times 10, can be replaced by (a < < 3) + (a < < 1).
Illustratively, "<" is a left shift operation for replacing multiplication, i.e., "a < <2" corresponds to multiplication by 22 (i.e., 4).
Illustratively, in view of the division operation implemented by shifting, the result of the operation is typically rounded directly,
That is, when the result of the operation is a non-integer between N and n+1, the result is N, and when the fractional part is greater than 0.5, the accuracy of the result is n+1 is considered to be higher, so, in order to improve the accuracy of the determined pixel value, 1/2 of the denominator (i.e., the dividend) may be added to the numerator of the weighted sum, so as to achieve the rounding effect.
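The shift-for-arithmetic substitutions and the half-divisor rounding trick described above can be checked with a short sketch:

```python
a = 13
assert a >> 5 == a // 32              # ">> 5" divides by 2**5 = 32 (truncating)
assert a << 2 == a * 4                # "a << 2" multiplies by 2**2 = 4
assert (a << 3) + (a << 1) == a * 10  # a*10 expressed as two shifts and an add

def shift_div_rounded(numerator, shift):
    # Add half of the divisor (1 << (shift - 1)) before shifting, so the
    # truncating right shift rounds to the nearest integer instead of down.
    return (numerator + (1 << (shift - 1))) >> shift

assert 13 >> 2 == 3                   # plain shift truncates 13/4 = 3.25 -> 3
assert shift_div_rounded(14, 2) == 4  # 14/4 = 3.5 rounds to 4
```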
Step four: if DeblockingFilterType is 1 and Abs(p0 - q0) is greater than or equal to 4 × α, Bs is equal to 0; otherwise, the Bs value is calculated as follows.
1) Calculate fS, with the values of fL and fR initialized to 0.
2) Determine the Bs value from fS, distinguishing the following cases.
Case one: when fS is equal to 6, if Abs(p0 - p1) is less than or equal to β/4 and Abs(q0 - q1) is less than or equal to β/4 and Abs(p0 - q0) is less than α, then Bs is equal to 4; otherwise, Bs is equal to 3.
Case two: when fS is equal to 6 and DeblockingFilterType is equal to 1, if Abs(p0 - p1) is less than or equal to β/4 and Abs(q0 - q1) is less than or equal to β/4 and Abs(p0 - p3) is less than or equal to β/2 and Abs(q0 - q3) is less than or equal to β/2 and Abs(p0 - q0) is less than α, then Bs is equal to 4; otherwise, Bs is equal to 3.
Case three: when fS is equal to 5, if p0 is equal to p1 and q0 is equal to q1, then Bs is equal to 3; otherwise, Bs is equal to 2.
Case four: when fS is equal to 5 and DeblockingFilterType is equal to 1, if p0 is equal to p1 and q0 is equal to q1 and Abs(p2 - q2) is less than α, then Bs is equal to 3; otherwise, Bs is equal to 2.
Case five: when fS is equal to 4, Bs is equal to 2 if fL is equal to 2; otherwise, Bs is equal to 1.
Case six: when fS is equal to 3, Bs is equal to 1 if Abs(p1 - q1) is less than β; otherwise, Bs is equal to 0.
Case seven: when fS takes any other value, Bs is equal to 0.
3) If the Bs value obtained in step 2) is not equal to 0 and the filtered boundary is a chroma coding block boundary, the Bs value is decreased by 1.
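The fS-to-Bs case table above can be sketched as follows, restricted to cases one, three, five, six, and seven (i.e., ignoring the DeblockingFilterType == 1 variants in cases two and four) plus the chroma decrement of step 3). Variable names follow the text; the derivation of fS and fL themselves is outside this excerpt.

```python
def boundary_strength(fs, fl, p0, p1, q0, q1, alpha, beta,
                      is_chroma_boundary=False):
    """Bs derivation sketch for the non-DeblockingFilterType-1 branches."""
    if fs == 6:    # case one
        bs = 4 if (abs(p0 - p1) <= beta // 4 and abs(q0 - q1) <= beta // 4
                   and abs(p0 - q0) < alpha) else 3
    elif fs == 5:  # case three
        bs = 3 if (p0 == p1 and q0 == q1) else 2
    elif fs == 4:  # case five
        bs = 2 if fl == 2 else 1
    elif fs == 3:  # case six
        bs = 1 if abs(p1 - q1) < beta else 0
    else:          # case seven
        bs = 0
    if bs != 0 and is_chroma_boundary:
        bs -= 1    # step 3): chroma coding block boundaries use Bs - 1
    return bs
```

Cases two and four (DeblockingFilterType == 1) add the extra p3/q3 and p2/q2 comparisons quoted above on top of this skeleton.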
(II) Determining the deblocking filter adjustment parameters
Deblocking filter adjustment parameters DbrThresold, dbrOffset0, dbrOffset1, dbrAltOffset0, and DbrAltOffset1 are determined.
If the current boundary is a vertical boundary and PictureDbrVEnableFlag is 1, or the current boundary is a horizontal boundary and PictureDbrHEnableFlag is 1, then the value of PictureDbrEnableFlag is 1; otherwise, the value of PictureDbrEnableFlag is 0.
If the current boundary is a vertical boundary and PictureAltDbrVEnableFlag is 1, or the current boundary is a horizontal boundary and PictureAltDbrHEnableFlag is 1, then the value of PictureAltDbrEnableFlag is 1; otherwise, the value of PictureAltDbrEnableFlag is 0.
For vertical boundaries:
For horizontal boundaries:
(III) Boundary filtering process when the luminance component Bs is equal to 4
When the boundary filter strength Bs value is 4, the calculation process for filtering p0, p1, p2 and q0, q1, q2 is as follows (P0, P1, P2 and Q0, Q1, Q2 are the filtered values):
If PictureDbrEnableFlag has a value of 1, the values of P 0、P1、P2、Q0、Q1 and Q 2 are adjusted by 0.
(IV) Boundary filtering process when the luminance component Bs is equal to 3
When the boundary filter strength Bs value is 3, the calculation process for filtering p0, p1 and q0, q1 is as follows (P0, P1 and Q0, Q1 are the filtered values):
If PictureDbrEnableFlag has a value of 1, the values of P 0、P1、Q0 and Q 1 are adjusted by 0.
(V) Boundary filtering process when the luminance component Bs is equal to 2
When the value of the boundary filter strength Bs is 2, the calculation process for filtering p0 and q0 is as follows (P0 and Q0 are the filtered values):
If PictureDbrEnableFlag has a value of 1, the values of P0 and Q0 are adjusted as specified in 3.2.11.
(VI) Boundary filtering process when the luminance component Bs is equal to 1
When the value of the boundary filter strength Bs is 1, the calculation process for filtering p0 and q0 is as follows (P0 and Q0 are the filtered values):
If PictureDbrEnableFlag has a value of 1, the values of P 0 and Q 0 are adjusted by 0.
(VII) Boundary filtering process when the luminance component Bs is equal to 0
When the value of the boundary filter strength Bs is 0 and the value of PictureAltDbrEnableFlag is 1, the calculation process for the filtering adjustment of p0 and q0 is as follows (P0 and Q0 are the filter-adjusted values):
(VIII) Boundary filtering process when the chroma component Bs is greater than 0
When the value of the boundary filter strength Bs is greater than 0, the calculation process for filtering p0 and q0 is as follows (P0 and Q0 are the filtered values):
When the value of the boundary filter strength Bs is equal to 3, the calculation process for filtering p1 and q1 is as follows (P1 and Q1 are the filtered values):
(IX) Deblocking filter adjustment process
Adjust the values of Pi and Qi (i can be 0, 1, or 2):
Clip1(x) limits x to the range [0, 2^bit_depth - 1] (including 0 and 2^bit_depth - 1). bit_depth is the bit depth of the image, typically 8, 10, or 12.
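The Clip1 operation defined above is a plain clamp to the valid sample range:

```python
def clip1(x, bit_depth=8):
    # Limit x to [0, 2**bit_depth - 1], inclusive on both ends.
    max_val = (1 << bit_depth) - 1
    return min(max(x, 0), max_val)

assert clip1(-7) == 0                     # below range clamps to 0
assert clip1(300) == 255                  # 8-bit range is [0, 255]
assert clip1(1100, bit_depth=10) == 1023  # 10-bit range is [0, 1023]
```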
4. Sample adaptive compensation filtering (sample adaptive offset, SAO)
SAO filtering is used to eliminate ringing effects. The ringing effect is the phenomenon that ripples appear around edges after decoding due to quantization distortion of the high-frequency AC coefficients; the larger the transform block size, the more obvious the ringing effect. The basic principle of SAO is to add negative compensation values to the peak pixels and positive compensation values to the trough pixels in the reconstructed curve. SAO takes a coding tree unit (CTU) as its basic unit and comprises two main compensation forms, edge offset (EO) and band offset (BO), and introduces a parameter fusion technique.
The sample offset compensation filtering includes base sample offset compensation (SAO described above), enhanced sample offset compensation (ESAO described above), and cross-component sample offset compensation (CCSAO described above, which is done for chroma only).
Basic process of SAO:
If SAO is enabled for the current component of the current block, a basic sample offset compensation unit is derived, the basic sample offset compensation information corresponding to the current basic sample offset compensation unit is derived, and finally each component of each sample in the current basic sample offset compensation unit is processed to obtain the offset sample value. Otherwise, the value of the corresponding component of the filtered sample is directly taken as the value of the sample component after offset.
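The peak/trough compensation principle can be illustrated with a simplified edge-offset classifier along a single direction. The category labels and the one-direction comparison here are illustrative only, not the full SAO classification:

```python
def eo_category(left, cur, right):
    # Compare the current sample with its two neighbors along one EO direction.
    sign = lambda d: (d > 0) - (d < 0)
    s = sign(cur - left) + sign(cur - right)
    # -2: local trough, +2: local peak; +/-1: edge samples; 0: no category.
    return {-2: "valley", -1: "concave", 1: "convex", 2: "peak"}.get(s, "none")

assert eo_category(30, 10, 30) == "valley"  # trough: would receive a positive offset
assert eo_category(10, 30, 10) == "peak"    # peak: would receive a negative offset
assert eo_category(10, 10, 10) == "none"    # flat: no edge-offset compensation
```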
Basic process of ESAO (ESAO and SAO are mutually exclusive; i.e., if SAO is enabled, ESAO is not enabled):
If ESAO is enabled for the current component of the current block, an enhanced sample offset compensation unit is derived, and then each component of each sample in the enhanced sample offset compensation unit is processed to obtain the offset sample value; otherwise, the value of the corresponding component of the filtered sample is directly taken as the value of the sample component after offset.
Basic process of CCSAO:
If CCSAO is enabled for the current component of the current block, a cross-component sample offset compensation unit is derived, and then each component sample in the current cross-component sample offset compensation unit is processed to obtain the cross-component offset sample value. Otherwise, the value of the corresponding component of the offset sample is directly taken as the value of the sample component after the cross-component offset.
For chroma pixels that require CCSAO, the classification of the current chroma pixel is determined based on the deblocking-filtered (adjusted) luminance value at the corresponding location (the Y5 described above) and the current chroma pixel (the pixel after ESAO). Based on the classification, a compensation value for the current chroma pixel is determined, and the compensation value is added to the current chroma pixel to obtain the CCSAO-filtered chroma pixel value.
5. Adaptive loop filtering (adaptive loop filter, ALF)
ALF filtering is an optimal filter in the mean-square sense, i.e., a Wiener filter, calculated from the original signal and the distorted signal. Adaptive correction filtering (the ALF described above) may be enabled with or without enhanced adaptive correction filtering (the EALF described above).
The basic process of ALF is as follows:
If ALF is not enabled for the current component of the current image, the value of the offset sample component is directly taken as the value of the corresponding reconstructed sample component; otherwise, adaptive correction filtering is performed on the corresponding offset sample components.
The adaptive correction filtering units are derived from the largest coding units and are processed sequentially in raster scan order. First, the adaptive correction filter coefficients of each component are decoded; then the adaptive correction filtering unit is derived and the index of the adaptive correction filter coefficients for the luminance component of the current adaptive correction filtering unit is determined; finally, adaptive correction filtering is performed on the luminance and chrominance components of the adaptive correction filtering unit to obtain the reconstructed samples.
If EALF is enabled for the current image, each group of ALF coefficients of the current image has 15 filter coefficients, whose shape is shown in (a) of fig. 4, and the maximum number of adaptive correction filters for the luminance component of the current image is 64; otherwise, each group of ALF coefficients of the current image has 9 filter coefficients, whose shape is shown in (b) of fig. 4, and the maximum number of adaptive correction filters for the luminance component of the current image is 16.
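The "optimal filter in the mean-square sense" can be illustrated in its simplest one-tap form: the scalar gain a minimizing sum((orig - a·dist)²) is sum(orig·dist)/sum(dist²). The real ALF solves the same least-squares problem jointly for the 9- or 15-tap shapes of fig. 4; this single-coefficient sketch is for illustration only.

```python
def wiener_gain(orig, dist):
    """One-tap Wiener 'filter': the scalar a minimizing the mean squared
    error sum((o - a*d)**2), i.e. a = sum(o*d) / sum(d*d)."""
    num = sum(o * d for o, d in zip(orig, dist))
    den = sum(d * d for d in dist)
    return num / den

# If the distorted signal is the original scaled by 2, the optimal gain
# that brings it back toward the original is 0.5.
assert wiener_gain([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]) == 0.5
```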
In the DBF, SAO, and ALF filtering processes alike, pixels are classified based on the pixel values at the boundary to be filtered, or based on the relationship (for example, which is higher or lower) between the pixel values of the current block and the pixel values of the neighboring blocks on the surrounding sides of the current block, and different filtering operations are then performed for the different classes.
As described above, in video encoding and decoding, the purpose of the filtering process is to smooth and denoise the video image blocks and remove artifacts, while preserving image edges to the maximum extent. Typically, the electronic device filters the boundaries of multiple regions in the image to preserve the edges of the image.
To protect the data security of users, an image may generally be provided with authority areas (e.g., a high-authority-level area and a low-authority-level area) and a non-authority area (a zero-authority-level area), and the authority areas may include a plurality of areas of different authority levels. An authority area is an area whose image content can be correctly decoded only by a user having the associated authority. For example, only a user with high authority can correctly view image areas of any authority level, while a user with only low authority cannot view image areas of a higher authority level and can only view image areas of the low authority level and of the zero authority level.
Therefore, because different users have different viewing rights for different areas of an image, when a low-authority user does not have viewing rights for a high-authority area of the image, the electronic device cannot filter the contiguous boundaries between image areas of different authority levels during video decoding, resulting in poor image quality.
To solve this problem, embodiments of the present application provide an image processing method and apparatus, an electronic device, and a storage medium. In the method, the electronic device divides an image to be processed into a plurality of image sub-regions and acquires the authority level of each image sub-region. If the authority levels of at least two adjacent image sub-regions differ, the contiguous boundary between the two sub-regions may still be left unfiltered after decoding-time filtering, so the quality of the image to be processed is not high. Therefore, based on decoding information such as the division of the authority areas, the electronic device can filter the contiguous boundary between the at least two image sub-regions again, thereby improving the image quality of the different authority areas in the image to be processed.
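A minimal sketch of the boundary-selection idea (the function and the grid layout are hypothetical, for illustration only): given sub-regions arranged on a grid with one authority level each, collect the contiguous boundaries between adjacent sub-regions whose authority levels differ, which are the boundaries the method re-filters.

```python
def boundaries_to_refilter(regions):
    """regions: dict mapping (row, col) grid position -> authority level.
    Returns the pairs of horizontally/vertically adjacent sub-regions whose
    authority levels differ, i.e. the contiguous boundaries to re-filter."""
    pairs = []
    for (r, c), level in regions.items():
        for nb in ((r, c + 1), (r + 1, c)):  # right and bottom neighbors
            if nb in regions and regions[nb] != level:
                pairs.append(((r, c), nb))
    return pairs

# Hypothetical 1x3 strip: high / low / zero authority levels.
grid = {(0, 0): 2, (0, 1): 1, (0, 2): 0}
assert boundaries_to_refilter(grid) == [((0, 0), (0, 1)), ((0, 1), (0, 2))]
```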
The image processing method provided by the embodiment of the application can be applied to the image processing device 11 shown in fig. 5, wherein the image processing device 11 comprises an image dividing unit 101, an image filtering unit 102 and an image decoding unit 103.
An image dividing unit 101, configured to divide the image to be processed into a plurality of image sub-regions.
An image decoding unit 103 for acquiring the authority level of each image sub-area.
And the image filtering unit 102 is configured to perform filtering processing on a boundary of the at least two image subregions if authority levels of at least two adjacent image subregions in the plurality of image subregions are different.
The image decoding unit 103 is further configured to receive a decoded code stream of the image to be processed, perform decoding based on the decoded code stream, and obtain a decoded image to be processed.
In some embodiments, the image filtering unit 102 is further configured to determine a target region to be filtered on a vertical side of a first sub-region of the at least two image sub-regions, where the vertical side is the left side or the right side of the boundary of the first sub-region, and the target region to be filtered is any one of the following: the region to be filtered on the right side of the contiguous boundary, the region to be filtered on the left side of the contiguous boundary, or both the right-side and left-side regions to be filtered of the contiguous boundary. The image filtering unit 102 is further configured to perform deblocking vertical filtering on the target region to be filtered.
In some embodiments, the image filtering unit 102 is further configured to determine a target region to be filtered on a horizontal side of a first sub-region of the at least two image sub-regions, where the horizontal side is the upper side or the lower side of the boundary of the first sub-region, and the target region to be filtered is any one of the following: the region to be filtered on the lower side of the contiguous boundary, the region to be filtered on the upper side of the contiguous boundary, or both the lower-side and upper-side regions to be filtered of the contiguous boundary. The image filtering unit 102 is further configured to perform deblocking horizontal filtering on the target region to be filtered.
In some embodiments, the image filtering unit 102 is further configured to determine, according to a set filtering order, a target region to be filtered from the surrounding sides of a first sub-region of the at least two image sub-regions, where the surrounding sides are, in order, the sides adjoining other sub-regions whose authority levels differ from that of the first sub-region, and the target region to be filtered is any one of the following: a first side region inside the first sub-region facing the contiguous boundary, a second side region outside the first sub-region facing the contiguous boundary, or both the first side region and the second side region. The image filtering unit 102 is further configured to perform sample adaptive compensation filtering on the target region to be filtered according to the set filtering order.
In some embodiments, the image filtering unit 102 is further configured to determine, according to a set filtering order, a target region to be filtered from the surrounding sides of a first sub-region of the at least two image sub-regions, where the surrounding sides are, in order, the sides adjoining other sub-regions whose authority levels differ from that of the first sub-region, and the target region to be filtered is any one of the following: a first side region inside the first sub-region facing the contiguous boundary, a second side region outside the first sub-region facing the contiguous boundary, or both the first side region and the second side region. The image filtering unit 102 is further configured to perform adaptive loop filtering on the target region to be filtered according to the set filtering order.
In some embodiments, the image filtering unit 102 is further configured to determine, according to decoding information of the image to be processed, whether filtering processing based on a neural network needs to be performed on a first sub-region of the at least two image sub-regions; in a case where the first sub-region needs to be filtered based on the neural network, input the reconstructed pixel values of the first sub-region and the authority level of the first sub-region into the neural network; and obtain the filtered reconstructed pixels of the first sub-region output by the neural network.
In some embodiments, the image filtering unit 102 is further configured to input the authority level of at least one image sub-area adjacent to the first sub-area into the neural network.
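The data flow of the neural-network filtering described above can be sketched as follows. This is a minimal illustration only: a 3x3 mean filter stands in for the trained network, and the function name and input encoding (constant planes for the authority levels) are hypothetical, not the patent's actual model.

```python
import numpy as np

def nn_filter_subregion(recon, level, neighbor_levels=()):
    """Hypothetical stand-in for the neural-network filter described above.

    recon           -- 2-D array of reconstructed pixel values of the first sub-region
    level           -- authority level of the first sub-region (scalar)
    neighbor_levels -- optional authority levels of adjacent image sub-regions

    A real codec would feed these inputs to a trained CNN; here a 3x3 mean
    filter stands in for the network so the data flow can be demonstrated.
    """
    h, w = recon.shape
    # Stack the pixel plane with constant planes encoding the authority levels,
    # mirroring "inputting the reconstructed pixel values ... and the authority level".
    planes = [recon.astype(float), np.full((h, w), float(level))]
    planes += [np.full((h, w), float(l)) for l in neighbor_levels]
    x = np.mean(planes, axis=0)          # trivial fusion of the input planes
    out = x.copy()
    # 3x3 mean filter over the interior (border pixels kept unchanged).
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i, j] = x[i - 1:i + 2, j - 1:j + 2].mean()
    return out
```

A uniform sub-region passes through unchanged, which is the expected behaviour of any smoothing stand-in.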
In some embodiments, the image filtering unit 102 is further configured to perform filtering processing on the luminance channel component at the contiguous boundary between the at least two image sub-regions; or to perform filtering processing on the chrominance channel component at the contiguous boundary between the at least two image sub-regions; or to perform filtering processing on both the luminance channel component and the chrominance channel component at the contiguous boundary between the at least two image sub-regions.
Fig. 6 is a flowchart of an image processing method according to an embodiment of the present application. The image processing method provided by the application can be applied to the image processing device shown in fig. 5. As shown in fig. 6, the image processing method provided by the present application specifically may include the following steps:
S101, dividing an image to be processed into a plurality of image sub-regions, and acquiring the authority level of each image sub-region.
Wherein the authority levels are divided into at least two different levels.
In some embodiments, the image may be provided with an authority area and a non-authority area (zero-authority-level area). Image sub-regions in the non-authority area are viewable by any user. The authority area may include a plurality of areas of different levels; for example, it may include a first authority level area (high authority level area), a second authority level area (medium authority level area), and a third authority level area (low authority level area), where the authority level of the first authority level area is higher than that of the second, and that of the second is higher than that of the third. Each authority level may further be divided into a plurality of sub-levels, which this embodiment does not particularly limit.
In some embodiments, the authority level of a target area in the image is configured according to the category of the target area. For example, in a portrait image the face is a high authority level region, the upper body is a low authority level region, and the lower body is a zero authority level region; in a vehicle image the license plate is a high authority level region, the windows are a low authority level region, the vehicle body is a zero authority level region, and so on.
In some embodiments, the authority levels of the image sub-regions in the image to be processed may be divided into a high authority level region, a low authority level region, and a zero authority level region; as an example, as shown in fig. 7, the plurality of image sub-regions may include a low authority level region, a high authority level region, and a zero authority level region.
It will be appreciated that users with different authority may view different regions of the image: a high-authority user may view image sub-regions at any authority level (e.g., the high, low, and zero authority level regions), a low-authority user may view image sub-regions at the low and zero authority levels, and a non-authority user may view only image sub-regions at the zero authority level.
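The viewing rule above can be sketched as a simple permission check. The function name and the numeric level constants below are hypothetical, chosen only to illustrate the high/low/zero hierarchy:

```python
# Hypothetical numeric encoding of the authority hierarchy described above.
HIGH, LOW, ZERO = 2, 1, 0

def viewable_regions(user_level, region_levels):
    """Return the names of the sub-regions the user may view.

    A user may view every sub-region whose authority level does not exceed
    their own, matching the high/low/zero rule in the text.
    """
    return [name for name, lvl in region_levels.items() if lvl <= user_level]

# Example layout, mirroring the portrait example in the text.
regions = {"face": HIGH, "upper_body": LOW, "background": ZERO}
```

For instance, `viewable_regions(LOW, regions)` yields the upper-body and background sub-regions but excludes the face.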
In some embodiments, before the image to be processed is divided into a plurality of image sub-regions, the image decoding unit 103 receives the decoded code stream of the image to be processed and performs decoding based on the code stream, so as to obtain the decoded image to be processed.
In other embodiments, when the image to be processed includes a target image area, the target image area is taken as the filtering-processing object: the image decoding unit 103 receives the decoded code stream of the image to be processed and decodes the target image area based on the code stream. Once the target image area finishes decoding, it can be directly divided into a plurality of image sub-regions and the authority level of each image sub-region acquired, without waiting for the whole image to be processed to finish decoding.
In other embodiments, when the image to be processed includes a target image area, the target image area may be taken as the filtering object and expanded toward its surrounding sides (left, right, upper, or lower) to obtain a detection area of width CW and height CH. The detection area is divided into a plurality of NxN (N is preferably 2, 4, or 8) pixel-unit image sub-block areas, where the authority level within each image sub-block area is the same. The detection area may be extended by different or identical numbers of pixel units in the four directions.
For example, as shown in fig. 8 (a), the detection area may be obtained by extending the target image area to the left and upward by T (T is preferably 4, 8, or 16) unit pixels.
For example, as shown in fig. 8 (b), an upper-right area may further be added to the detection area of fig. 8 (a), the whole then serving as the detection area.
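The detection-area construction just described can be sketched as below. The function, its parameters, and the default extension directions are illustrative assumptions, not the patent's actual procedure:

```python
def expand_and_tile(x, y, w, h, t=4, n=4, extend=("left", "up")):
    """Sketch of the detection-area construction described above (names hypothetical).

    (x, y, w, h) -- target image area (top-left corner, width, height)
    t            -- number of unit pixels to extend per chosen direction
    n            -- side length of the NxN image sub-block areas
    Returns the expanded detection area and the origins of the NxN sub-blocks.
    """
    if "left" in extend:
        x -= t; w += t
    if "up" in extend:
        y -= t; h += t
    if "right" in extend:
        w += t
    if "down" in extend:
        h += t
    cw, ch = w, h                          # detection-area width CW and height CH
    blocks = [(x + j, y + i)               # origin of each NxN sub-block area
              for i in range(0, ch, n)
              for j in range(0, cw, n)]
    return (x, y, cw, ch), blocks
```

With the fig. 8 (a) style extension (left and up), an 8x8 target area at (8, 8) with t=4 becomes a 12x12 detection area tiled into nine 4x4 sub-blocks.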
S102, if the authority levels of at least two adjacent image sub-regions among the plurality of image sub-regions are different, performing filtering processing on the contiguous boundary (boundary to be filtered) between the at least two image sub-regions.
In some embodiments, when the image to be processed includes the target image area of the filtering processing object, after the target image area is decoded by the image decoding unit 103, the target image area may be directly divided into a plurality of image sub-areas, and if the authority levels of at least two adjacent image sub-areas in the plurality of image sub-areas are different, the filtering processing may be directly performed on the boundary between the two image sub-areas, without waiting for the whole image to be processed to complete the decoding before performing the filtering processing.
In some embodiments, the filtering process includes: at least one of deblocking vertical filtering, deblocking horizontal filtering, sample adaptive compensation filtering, adaptive loop filtering, and neural network filtering.
In some embodiments, the at least two adjacent image sub-regions may be two contiguous image sub-regions, or two image sub-regions that are not contiguous but close to each other.
In some embodiments, filtering processing may be performed on the luminance channel component at the contiguous boundary between the at least two image sub-regions; or on the chrominance channel component at the contiguous boundary between the at least two image sub-regions; or on both the luminance channel component and the chrominance channel component at the contiguous boundary between the at least two image sub-regions.
In some embodiments, the image processing apparatus may acquire the image area information of any authority level in the image to be processed together with the image area information of the non-authority area; or the image processing apparatus may acquire the image area information of the second authority level and the image area information of the non-authority area in the image to be processed.
It can be understood that the image to be processed is divided into a plurality of image sub-regions and the authority level of each image sub-region is acquired, so that the authority levels can be compared. If the authority levels of at least two adjacent image sub-regions differ, filtering may not have been performed across those two sub-regions by the in-loop filtering of the decoding process, so the quality of the image to be processed is not high. Therefore, performing filtering again on the contiguous boundary between the at least two image sub-regions, based on decoding information such as the division of the authority areas, can improve the image quality of the different authority areas in the image to be processed.
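The boundary-selection step just described can be sketched as follows, assuming (purely for illustration) a grid of equally sized sub-regions with one authority level each:

```python
def boundaries_to_filter(level_grid):
    """Find contiguous boundaries between sub-regions whose authority
    levels differ; level_grid[i][j] is the level of sub-region (i, j).

    Returns (kind, (i, j)) pairs: 'v' marks a vertical boundary to the
    right of sub-region (i, j), 'h' a horizontal boundary below it.
    """
    rows, cols = len(level_grid), len(level_grid[0])
    out = []
    for i in range(rows):
        for j in range(cols):
            # Vertical boundary: horizontally adjacent levels differ.
            if j + 1 < cols and level_grid[i][j] != level_grid[i][j + 1]:
                out.append(("v", (i, j)))
            # Horizontal boundary: vertically adjacent levels differ.
            if i + 1 < rows and level_grid[i][j] != level_grid[i + 1][j]:
                out.append(("h", (i, j)))
    return out
```

Each returned boundary is then a candidate for the deblocking, sample adaptive compensation, adaptive loop, or neural-network filtering discussed below.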
In some embodiments, if there is a vertical boundary between the at least two image sub-regions (i.e., the two image sub-regions are horizontally contiguous), the filtering process includes deblocking vertical filtering. If the DBF filtering condition is satisfied, deblocking vertical filtering may be performed on the contiguous boundary between the at least two image sub-regions. Referring to fig. 9, the decoding method provided in this embodiment includes steps Sa1 to Sa2 as follows.
Sa1, determining a target to-be-filtered region on the vertical side of a first sub-region of the at least two image sub-regions.
The vertical side is the left side or the right side of the contiguous boundary of the first sub-region, and the target to-be-filtered region is any one of the following: the right-side to-be-filtered region of the contiguous boundary, the left-side to-be-filtered region of the contiguous boundary, or both the right-side and the left-side to-be-filtered regions.
Sa2, performing deblocking vertical filtering on the target to-be-filtered region.
In some embodiments, the at least two adjacent image sub-regions further include a second sub-region, the second sub-region being horizontally adjacent to the first sub-region; here, horizontally adjacent means that the second sub-region is located to the left or the right of the first sub-region in the horizontal direction. That the authority levels of the adjacent at least two image sub-regions are different includes any one of the following cases:
1. the authority level of the first sub-area is lower than the authority level of the second sub-area.
2. The authority level of the first sub-area is higher than the authority level of the second sub-area.
3. The permission level of the first sub-area is not equal to the permission level of the second sub-area.
4. The authority level of the first sub-region is not equal to that of the second sub-region, and the filtering strength (BS value) of the contiguous boundary is greater than or equal to a preset filtering strength.
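The four alternative conditions above can be expressed as a predicate. The `mode` parameter and the default threshold below are hypothetical, chosen only to enumerate the listed cases:

```python
def should_deblock_vertical(level_p, level_q, bs=None, bs_threshold=1, mode=3):
    """Evaluate one of the four conditions listed above for two horizontally
    adjacent sub-regions (level_p = first sub-region, level_q = second).

    mode 1: first lower than second; mode 2: first higher; mode 3: levels
    unequal; mode 4: levels unequal and boundary strength BS >= threshold.
    """
    if mode == 1:
        return level_p < level_q
    if mode == 2:
        return level_p > level_q
    if mode == 3:
        return level_p != level_q
    if mode == 4:
        return level_p != level_q and bs is not None and bs >= bs_threshold
    return False
```

Which mode applies would be fixed by the encoder/decoder configuration; only one of the four cases holds for a given embodiment.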
Taking the case shown in fig. 10 (a), where the second sub-region is located on the left side of the first sub-region, the deblocking vertical filtering performed on the target to-be-filtered region under the different authority-level cases of the adjacent at least two image sub-regions is described in detail below.
In the first embodiment, in the case that the authority level of the first sub-region is lower than that of the second sub-region, the target to-be-filtered region is the right-side to-be-filtered region of the contiguous boundary (within the first sub-region), and deblocking vertical filtering is performed on the target to-be-filtered region. The right-side to-be-filtered region is a pixel region n columns wide to the right of the contiguous boundary, where n is preferably 1.
In the second embodiment, in the case that the authority level of the first sub-region is higher than that of the second sub-region, the target to-be-filtered region is the right-side to-be-filtered region of the contiguous boundary (within the first sub-region), and deblocking vertical filtering is performed on the target to-be-filtered region. The right-side to-be-filtered region is a pixel region n columns wide to the right of the contiguous boundary, where n is preferably 1.
In the third embodiment, in the case that the authority level of the first sub-region is not equal to that of the second sub-region, the target to-be-filtered region is the right-side to-be-filtered region of the contiguous boundary (within the first sub-region), and deblocking vertical filtering is performed on the target to-be-filtered region. The right-side to-be-filtered region is a pixel region n columns wide to the right of the contiguous boundary, where n is preferably 1.
In the fourth embodiment, in the case that the authority level of the first sub-region is not equal to that of the second sub-region and the BS value of the contiguous boundary is greater than or equal to 1, the target to-be-filtered region is the right-side to-be-filtered region of the contiguous boundary (within the first sub-region), and deblocking vertical filtering is performed on the target to-be-filtered region. The right-side to-be-filtered region is a pixel region n columns wide to the right of the contiguous boundary, where n is preferably 1.
As shown in fig. 10 (b), in the first to fourth embodiments, the target to-be-filtered region is the right-side to-be-filtered region of the contiguous boundary (within the first sub-region).
In the fifth embodiment, in the case that the authority level of the first sub-region is lower than that of the second sub-region, the target to-be-filtered region is the left-side to-be-filtered region of the contiguous boundary (within the second sub-region), and deblocking vertical filtering is performed on the target to-be-filtered region. The left-side to-be-filtered region is a pixel region n columns wide to the left of the contiguous boundary, where n is preferably 1.
In the sixth embodiment, in the case that the authority level of the first sub-region is higher than that of the second sub-region, the target to-be-filtered region is the left-side to-be-filtered region of the contiguous boundary (within the second sub-region), and deblocking vertical filtering is performed on the target to-be-filtered region. The left-side to-be-filtered region is a pixel region n columns wide to the left of the contiguous boundary, where n is preferably 1.
In the seventh embodiment, in the case that the authority level of the first sub-region is not equal to that of the second sub-region, the target to-be-filtered region is the left-side to-be-filtered region of the contiguous boundary (within the second sub-region), and deblocking vertical filtering is performed on the target to-be-filtered region. The left-side to-be-filtered region is a pixel region n columns wide to the left of the contiguous boundary, where n is preferably 1.
In the eighth embodiment, in the case that the authority level of the first sub-region is not equal to that of the second sub-region and the BS value of the contiguous boundary is greater than or equal to 1, the target to-be-filtered region is the left-side to-be-filtered region of the contiguous boundary (within the second sub-region), and deblocking vertical filtering is performed on the target to-be-filtered region. The left-side to-be-filtered region is a pixel region n columns wide to the left of the contiguous boundary, where n is preferably 1.
As shown in fig. 10 (c), in the fifth to eighth embodiments, the target to-be-filtered region is the left-side to-be-filtered region of the contiguous boundary (within the second sub-region).
In the ninth embodiment, in the case that the authority level of the first sub-region is lower than that of the second sub-region, the target to-be-filtered region consists of the right-side and left-side to-be-filtered regions of the contiguous boundary (within the first sub-region and within the second sub-region), and deblocking vertical filtering is performed on the target to-be-filtered region. The target to-be-filtered region is a pixel region of width 2n, consisting of the n columns to the right and the n columns to the left of the contiguous boundary, where n is preferably 1.
In the tenth embodiment, in the case that the authority level of the first sub-region is higher than that of the second sub-region, the target to-be-filtered region consists of the right-side and left-side to-be-filtered regions of the contiguous boundary (within the first sub-region and within the second sub-region), and deblocking vertical filtering is performed on the target to-be-filtered region. The target to-be-filtered region is a pixel region of width 2n, consisting of the n columns to the right and the n columns to the left of the contiguous boundary, where n is preferably 1.
In the eleventh embodiment, in the case that the authority level of the first sub-region is not equal to that of the second sub-region, the target to-be-filtered region consists of the right-side and left-side to-be-filtered regions of the contiguous boundary (within the first sub-region and within the second sub-region), and deblocking vertical filtering is performed on the target to-be-filtered region. The target to-be-filtered region is a pixel region of width 2n, consisting of the n columns to the right and the n columns to the left of the contiguous boundary, where n is preferably 1.
In the twelfth embodiment, in the case that the authority level of the first sub-region is not equal to that of the second sub-region and the BS value of the contiguous boundary is greater than or equal to 1, the target to-be-filtered region consists of the right-side and left-side to-be-filtered regions of the contiguous boundary (within the first sub-region and within the second sub-region), and deblocking vertical filtering is performed on the target to-be-filtered region. The target to-be-filtered region is a pixel region of width 2n, consisting of the n columns to the right and the n columns to the left of the contiguous boundary, where n is preferably 1.
As shown in fig. 10 (d), in the ninth to twelfth embodiments, the target to-be-filtered region consists of the right-side and left-side to-be-filtered regions of the contiguous boundary (within the first sub-region and within the second sub-region).
It will be appreciated that if there is a vertical boundary between the at least two image sub-regions and the filtering process includes deblocking vertical filtering, a target to-be-filtered region on a vertical side of the first sub-region of the at least two image sub-regions may be determined and deblocking vertical filtering performed on it. Based on the authority levels of the two image sub-regions, the corresponding target to-be-filtered region can be selected for deblocking vertical filtering. When the image processing apparatus acquires the information of the corresponding image area according to the authority level of the user, the method can, for a user of any authority level, perform deblocking vertical filtering again on the decoded image, improving the image quality of the different authority areas in the image to be processed and thereby the user experience at each authority level.
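A minimal numerical sketch of deblocking vertical filtering on an n-column target to-be-filtered region follows. A simple 3-tap smoother stands in for the codec's actual DBF, and the function signature is illustrative:

```python
import numpy as np

def deblock_vertical(img, boundary_col, target="both", n=1):
    """Sketch of deblocking vertical filtering beside a vertical boundary.

    boundary_col -- index of the first column to the right of the boundary
    target       -- 'right', 'left', or 'both', matching the embodiments above
    Each pixel in the target columns is smoothed with its horizontal
    neighbours (a stand-in for a real deblocking filter).
    """
    out = img.astype(float).copy()
    cols = []
    if target in ("right", "both"):
        cols += list(range(boundary_col, boundary_col + n))
    if target in ("left", "both"):
        cols += list(range(boundary_col - n, boundary_col))
    src = img.astype(float)
    for c in cols:
        if 0 < c < img.shape[1] - 1:
            # simple 3-tap low-pass across the boundary
            out[:, c] = (src[:, c - 1] + 2 * src[:, c] + src[:, c + 1]) / 4
    return out
```

On a hard 0/8 step edge with n = 1, the two boundary columns are pulled toward each other while pixels outside the target region are untouched.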
In some embodiments, if there is a horizontal boundary between the at least two image sub-regions (i.e., the two image sub-regions are vertically contiguous), the filtering process includes deblocking horizontal filtering. If the DBF filtering condition is satisfied, deblocking horizontal filtering is performed on the contiguous boundary between the at least two image sub-regions. Referring to fig. 11, the decoding method provided in this embodiment includes steps Sb1 to Sb2 as follows.
Sb1, determining a target area to be filtered on the horizontal side of a first subarea in at least two image subareas.
The horizontal side is the upper side or the lower side of the contiguous boundary of the first sub-region, and the target to-be-filtered region is any one of the following: the upper-side to-be-filtered region of the contiguous boundary, the lower-side to-be-filtered region of the contiguous boundary, or both the upper-side and the lower-side to-be-filtered regions.
Sb2, performing deblocking horizontal filtering on the target to-be-filtered region.
In some embodiments, the at least two adjacent image sub-regions further include a third sub-region, the third sub-region being vertically adjacent to the first sub-region; here, vertically adjacent means that the third sub-region is located above or below the first sub-region in the vertical direction. That the authority levels of the adjacent at least two image sub-regions are different includes any one of the following cases:
1. The authority level of the first sub-area is lower than the authority level of the third sub-area.
2. The authority level of the first sub-area is higher than the authority level of the third sub-area.
3. The permission level of the first sub-area is not equal to the permission level of the third sub-area.
4. The authority level of the first sub-region is not equal to that of the third sub-region, and the filtering strength (BS value) of the contiguous boundary is greater than or equal to a preset filtering strength.
Taking the case shown in fig. 12 (a), where the third sub-region is located above the first sub-region, the deblocking horizontal filtering performed on the target to-be-filtered region under the different authority-level cases of the adjacent at least two image sub-regions is described in detail below.
In the thirteenth embodiment, in the case that the authority level of the first sub-region is lower than that of the third sub-region, the target to-be-filtered region is the lower-side to-be-filtered region of the contiguous boundary (within the first sub-region), and deblocking horizontal filtering is performed on the target to-be-filtered region. The lower-side to-be-filtered region is a pixel region n rows wide below the contiguous boundary, where n is preferably 1.
In the fourteenth embodiment, in the case that the authority level of the first sub-region is higher than that of the third sub-region, the target to-be-filtered region is the lower-side to-be-filtered region of the contiguous boundary (within the first sub-region), and deblocking horizontal filtering is performed on the target to-be-filtered region. The lower-side to-be-filtered region is a pixel region n rows wide below the contiguous boundary, where n is preferably 1.
In the fifteenth embodiment, in the case that the authority level of the first sub-region is not equal to that of the third sub-region, the target to-be-filtered region is the lower-side to-be-filtered region of the contiguous boundary (within the first sub-region), and deblocking horizontal filtering is performed on the target to-be-filtered region. The lower-side to-be-filtered region is a pixel region n rows wide below the contiguous boundary, where n is preferably 1.
In the sixteenth embodiment, in the case that the authority level of the first sub-region is not equal to that of the third sub-region and the BS value of the contiguous boundary is greater than or equal to 1, the target to-be-filtered region is the lower-side to-be-filtered region of the contiguous boundary (within the first sub-region), and deblocking horizontal filtering is performed on the target to-be-filtered region. The lower-side to-be-filtered region is a pixel region n rows wide below the contiguous boundary, where n is preferably 1.
As shown in fig. 12 (b), in the thirteenth to sixteenth embodiments, the target to-be-filtered region is the lower-side to-be-filtered region of the contiguous boundary (within the first sub-region).
In the seventeenth embodiment, in the case that the authority level of the first sub-region is lower than that of the third sub-region, the target to-be-filtered region is the upper-side to-be-filtered region of the contiguous boundary (within the third sub-region), and deblocking horizontal filtering is performed on the target to-be-filtered region. The upper-side to-be-filtered region is a pixel region n rows wide above the contiguous boundary, where n is preferably 1.
In the eighteenth embodiment, in the case that the authority level of the first sub-region is higher than that of the third sub-region, the target to-be-filtered region is the upper-side to-be-filtered region of the contiguous boundary (within the third sub-region), and deblocking horizontal filtering is performed on the target to-be-filtered region. The upper-side to-be-filtered region is a pixel region n rows wide above the contiguous boundary, where n is preferably 1.
In the nineteenth embodiment, in the case that the authority level of the first sub-region is not equal to that of the third sub-region, the target to-be-filtered region is the upper-side to-be-filtered region of the contiguous boundary (within the third sub-region), and deblocking horizontal filtering is performed on the target to-be-filtered region. The upper-side to-be-filtered region is a pixel region n rows wide above the contiguous boundary, where n is preferably 1.
In the twentieth embodiment, in the case that the authority level of the first sub-region is not equal to that of the third sub-region and the BS value of the contiguous boundary is greater than or equal to 1, the target to-be-filtered region is the upper-side to-be-filtered region of the contiguous boundary (within the third sub-region), and deblocking horizontal filtering is performed on the target to-be-filtered region. The upper-side to-be-filtered region is a pixel region n rows wide above the contiguous boundary, where n is preferably 1.
As shown in fig. 12 (c), in the seventeenth to twentieth embodiments, the target to-be-filtered region is the upper-side to-be-filtered region of the contiguous boundary (within the third sub-region).
In the twenty-first embodiment, in the case that the authority level of the first sub-region is lower than that of the third sub-region, the target to-be-filtered region consists of the lower-side and upper-side to-be-filtered regions of the contiguous boundary (within the first sub-region and within the third sub-region), and deblocking horizontal filtering is performed on the target to-be-filtered region. The target to-be-filtered region is a pixel region of width 2n, consisting of the n rows below and the n rows above the contiguous boundary, where n is preferably 1.
In the twenty-second embodiment, in the case that the authority level of the first sub-region is higher than that of the third sub-region, the target to-be-filtered region consists of the lower-side and upper-side to-be-filtered regions of the contiguous boundary (within the first sub-region and within the third sub-region), and deblocking horizontal filtering is performed on the target to-be-filtered region. The target to-be-filtered region is a pixel region of width 2n, consisting of the n rows below and the n rows above the contiguous boundary, where n is preferably 1.
In the twenty-third embodiment, in the case that the authority level of the first sub-region is not equal to that of the third sub-region, the target to-be-filtered region consists of the lower-side and upper-side to-be-filtered regions of the contiguous boundary (within the first sub-region and within the third sub-region), and deblocking horizontal filtering is performed on the target to-be-filtered region. The target to-be-filtered region is a pixel region of width 2n, consisting of the n rows below and the n rows above the contiguous boundary, where n is preferably 1.
In the twenty-fourth embodiment, in the case that the authority level of the first sub-region is not equal to that of the third sub-region and the BS value of the contiguous boundary is greater than or equal to 1, the target to-be-filtered region consists of the lower-side and upper-side to-be-filtered regions of the contiguous boundary (within the first sub-region and within the third sub-region), and deblocking horizontal filtering is performed on the target to-be-filtered region. The target to-be-filtered region is a pixel region of width 2n, consisting of the n rows below and the n rows above the contiguous boundary, where n is preferably 1.
As shown in fig. 12 (d), in the twenty-first to twenty-fourth embodiments, the target to-be-filtered region consists of the lower-side and upper-side to-be-filtered regions of the contiguous boundary (within the first sub-region and within the third sub-region).
It will be appreciated that if there is a horizontal boundary between the at least two image sub-regions and the filtering process includes deblocking horizontal filtering, a target to-be-filtered region on a horizontal side of the first sub-region of the at least two image sub-regions may be determined and deblocking horizontal filtering performed on it. Based on the authority levels of the two image sub-regions, the corresponding target to-be-filtered region can be selected for deblocking horizontal filtering. When the image processing apparatus acquires the information of the corresponding image area according to the authority level of the user, the method can, for a user of any authority level, perform deblocking horizontal filtering again on the decoded image, improving the image quality of the different authority areas in the image to be processed and thereby the user experience at each authority level.
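The horizontal case can be sketched as the row-wise mirror of the vertical one. As before, a simple 3-tap smoother is an illustrative stand-in for the codec's actual DBF:

```python
import numpy as np

def deblock_horizontal(img, boundary_row, target="both", n=1):
    """Sketch of deblocking horizontal filtering beside a horizontal boundary.

    boundary_row -- index of the first row below the contiguous boundary
    target       -- 'lower', 'upper', or 'both', matching the embodiments above
    """
    out = img.astype(float).copy()
    rows = []
    if target in ("lower", "both"):
        rows += list(range(boundary_row, boundary_row + n))
    if target in ("upper", "both"):
        rows += list(range(boundary_row - n, boundary_row))
    src = img.astype(float)
    for r in rows:
        if 0 < r < img.shape[0] - 1:
            # simple 3-tap low-pass across the boundary
            out[r, :] = (src[r - 1, :] + 2 * src[r, :] + src[r + 1, :]) / 4
    return out
```

On a hard 0/8 step edge the two boundary rows are smoothed while the rest of the image is left untouched, mirroring the vertical sketch exactly.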
In some embodiments, the filtering process includes sample adaptive compensation filtering. If the authority levels of at least two adjacent image sub-regions among the plurality of image sub-regions are different, sample adaptive compensation filtering is performed on the contiguous boundary between the at least two image sub-regions. Referring to fig. 13, the decoding method provided in this embodiment includes steps Sc1 to Sc2 as follows.
Sc1, determining a target region to be filtered from the surrounding side of a first subarea in at least two image subareas according to a set filtering sequence.
Wherein the surrounding sides include, in order: the left side, the upper side, the right side, and the lower side of the contiguous boundary between the first sub-region and the other sub-regions. In the case that the authority level of the first sub-region is different from that of the other sub-regions, the target to-be-filtered region is any one of the following: a first side region adjacent to the contiguous boundary within the first sub-region, a second side region adjacent to the contiguous boundary outside the first sub-region, or both the first side region and the second side region.
In some embodiments, the authority levels of the other sub-regions around the first sub-region may be judged in turn according to the set filtering order, and the target to-be-filtered region is determined when the authority level of another surrounding sub-region differs from that of the first sub-region. The target to-be-filtered region may cover the following two cases:
In the first case, if the other sub-region whose authority level differs from that of the first sub-region is located on a vertical side of the first sub-region, the first side region is a pixel region n columns wide extending from the contiguous boundary toward the first sub-region, and the second side region is a pixel region n columns wide extending from the contiguous boundary toward the other sub-region.
In the second case, if the other sub-region whose authority level differs from that of the first sub-region is located on a horizontal side of the first sub-region, the first side region is a pixel region n rows wide extending from the contiguous boundary toward the first sub-region, and the second side region is a pixel region n rows wide extending from the contiguous boundary toward the other sub-region.
Both of the above conditions may exist simultaneously.
Illustratively, as shown in fig. 14, around the first sub-region there are a left adjacent sub-region A, an upper-left adjacent sub-region B, an upper adjacent sub-region C, and an upper-right adjacent sub-region D. The authority levels of the other sub-regions around the first sub-region are judged in turn in the filtering order from right to left and then from top to bottom. In the case that the authority levels of adjacent sub-regions A and C differ from that of the first sub-region, while those of adjacent sub-regions B and D are the same as that of the first sub-region, the target to-be-filtered region is determined to be: a column pixel region of width 1 to the left and a column pixel region of width 1 to the right of the contiguous boundary with adjacent sub-region A, together with a row pixel region of width 1 above and a row pixel region of width 1 below the contiguous boundary with adjacent sub-region C, i.e., the hatched region shown in fig. 14.
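The fig. 14 style selection can be sketched by building a mask of the in-region half of the target to-be-filtered region (the half that lies outside the first sub-region would be marked in the neighbour's plane). The function name, the side-keyed dictionary, and the boolean-mask representation are illustrative assumptions:

```python
import numpy as np

def sao_target_mask(h, w, level, neighbors, n=1):
    """Mark the first-side regions of an h x w first sub-region for SAO.

    neighbors maps a side ('left', 'up', 'right', 'down') to the authority
    level of the contiguous sub-region on that side; for every neighbour
    whose level differs, the n-wide strip just inside the sub-region along
    that contiguous boundary is marked.
    """
    mask = np.zeros((h, w), dtype=bool)
    for side in ("left", "up", "right", "down"):   # set filtering order
        if side in neighbors and neighbors[side] != level:
            if side == "left":
                mask[:, :n] = True
            elif side == "up":
                mask[:n, :] = True
            elif side == "right":
                mask[:, -n:] = True
            elif side == "down":
                mask[-n:, :] = True
    return mask
```

With differing left and upper neighbours (sub-regions A and C in the figure) and same-level B and D, the mask is the left column plus the top row of the first sub-region, matching the in-region half of the hatched area.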
In other embodiments, the target region to be filtered is determined when the authority level of another sub-region surrounding the first sub-region is higher than that of the first sub-region. The target region to be filtered may fall into the following two cases:
In the first case, if another sub-region whose authority level is higher than that of the first sub-region is located on a vertical side of the first sub-region, the first side region is a pixel region n columns wide extending from the connected boundary toward the first sub-region, and the second side region is a pixel region n columns wide extending from the connected boundary toward the other sub-region.
In the second case, if another sub-region whose authority level is higher than that of the first sub-region is located on a horizontal side of the first sub-region, the first side region is a pixel region n rows wide extending from the connected boundary toward the first sub-region, and the second side region is a pixel region n rows wide extending from the connected boundary toward the other sub-region.
Both cases may exist simultaneously.
Sc2: perform sample adaptive compensation filtering on the target region to be filtered in the set filtering order.
In some embodiments, as shown in fig. 14 above, sample adaptive compensation filtering is performed on the target region to be filtered, that is, the hatched region in fig. 14, in the filtering order from right to left and from top to bottom.
In some embodiments, the adjacent sub-regions on the periphery of the first sub-region that are not filtered retain their original pixel values. For example, as shown in fig. 14, the pixel values at the boundary between the first sub-region and the adjacent sub-region B remain unchanged.
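The rule that unfiltered neighbors keep their original pixel values can be shown with a toy one-dimensional example. The 3-tap average below is only a placeholder for the actual sample adaptive compensation operator, which the text does not specify; the point illustrated is that samples outside the target region are copied through unchanged.

```python
def filter_only_target(pixels, target_idx):
    """Apply a simple 3-tap average only at the indices in `target_idx`
    (a stand-in for the real compensation filter); every other sample
    keeps its original value, as in fig. 14."""
    out = list(pixels)
    for i in target_idx:
        lo, hi = max(i - 1, 0), min(i + 1, len(pixels) - 1)
        window = pixels[lo:hi + 1]  # windows read the original samples
        out[i] = sum(window) // len(window)
    return out

row = [10, 10, 10, 90, 90, 90]
# Filter the one-pixel strip on each side of the boundary (indices 2, 3).
filtered = filter_only_target(row, {2, 3})
# → [10, 10, 36, 63, 90, 90]: only the two boundary samples change.
```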
It can be understood that if the authority levels of at least two adjacent image sub-regions among the plurality of image sub-regions differ, the target region to be filtered can be determined from the surrounding sides of a first sub-region of the at least two image sub-regions in the set filtering order, and sample adaptive compensation filtering can be performed on the target region to be filtered in that order. Based on the authority levels of the adjacent sub-regions on the periphery of the first sub-region, a corresponding target region to be filtered can be selected for sample adaptive compensation filtering. Where the image processing apparatus acquires the information of the corresponding image regions according to the user's authority level, this method allows the decoded image to be re-filtered with sample adaptive compensation for a user of any authority level, improving the image quality of the differently-authorized regions in the image to be processed and thereby improving the user experience at every authority level.
In some embodiments, the filtering process includes adaptive loop filtering. If the authority levels of at least two adjacent image sub-regions among the plurality of image sub-regions differ, adaptive loop filtering is performed on the connected boundary between the at least two image sub-regions. Referring to fig. 15, the decoding method provided in this embodiment includes the following steps Sd1 to Sd2.
Sd1: determine a target region to be filtered from the surrounding sides of a first sub-region of the at least two image sub-regions in the set filtering order.
The surrounding sides include, in order: the left, upper, right, and lower sides of the connected boundaries between the first sub-region and the other sub-regions. Where the authority level of the first sub-region differs from that of another sub-region, the target region to be filtered is any one of the following: a first side region inside the first sub-region adjoining the connected boundary; a second side region outside the first sub-region adjoining the connected boundary; or both the first side region and the second side region.
In some embodiments, the authority levels of the other sub-regions surrounding the first sub-region may be determined sequentially in the set filtering order, and the target region to be filtered is determined whenever the authority level of another sub-region differs from that of the first sub-region. The target region to be filtered may fall into the following two cases:
In the first case, if another sub-region whose authority level differs from that of the first sub-region is located on a vertical side of the first sub-region, the first side region is a pixel region n columns wide extending from the connected boundary toward the first sub-region, and the second side region is a pixel region n columns wide extending from the connected boundary toward the other sub-region.
In the second case, if another sub-region whose authority level differs from that of the first sub-region is located on a horizontal side of the first sub-region, the first side region is a pixel region n rows wide extending from the connected boundary toward the first sub-region, and the second side region is a pixel region n rows wide extending from the connected boundary toward the other sub-region.
Both cases may exist simultaneously.
Illustratively, as shown in fig. 16, the first sub-region is surrounded by a left adjacent sub-region A, an upper-left adjacent sub-region B, an upper adjacent sub-region C, and an upper-right adjacent sub-region D. The authority levels of the surrounding sub-regions are determined sequentially in the filtering order from right to left and from top to bottom. Where the authority levels of adjacent sub-regions B, C, and D differ from that of the first sub-region, and the authority level of adjacent sub-region A is the same as that of the first sub-region, the target region to be filtered is determined to be a row pixel region of width 1 on each side (downward and upward) of the connected boundaries with adjacent sub-regions B, C, and D, that is, the hatched region shown in fig. 16.
In other embodiments, the target region to be filtered is determined when the authority level of another sub-region surrounding the first sub-region is higher than that of the first sub-region. The target region to be filtered may fall into the following two cases:
In the first case, if another sub-region whose authority level is higher than that of the first sub-region is located on a vertical side of the first sub-region, the first side region is a pixel region n columns wide extending from the connected boundary toward the first sub-region, and the second side region is a pixel region n columns wide extending from the connected boundary toward the other sub-region.
In the second case, if another sub-region whose authority level is higher than that of the first sub-region is located on a horizontal side of the first sub-region, the first side region is a pixel region n rows wide extending from the connected boundary toward the first sub-region, and the second side region is a pixel region n rows wide extending from the connected boundary toward the other sub-region.
Both cases may exist simultaneously.
Sd2: perform adaptive loop filtering on the target region to be filtered in the set filtering order.
In some embodiments, as shown in fig. 16 above, adaptive loop filtering is performed on the target region to be filtered, that is, the hatched region in fig. 16, in the filtering order from right to left and from top to bottom.
In some embodiments, the adjacent sub-regions on the periphery of the first sub-region that are not filtered retain their original pixel values. For example, as shown in fig. 16, the pixel values at the boundary between the first sub-region and the adjacent sub-region A remain unchanged.
It can be understood that if the authority levels of at least two adjacent image sub-regions among the plurality of image sub-regions differ, the target region to be filtered can be determined from the surrounding sides of a first sub-region of the at least two image sub-regions in the set filtering order, and adaptive loop filtering can be performed on the target region to be filtered in that order. Based on the authority levels of the adjacent sub-regions on the periphery of the first sub-region, a corresponding target region to be filtered can be selected for adaptive loop filtering. Where the image processing apparatus acquires the information of the corresponding image regions according to the user's authority level, this method allows the decoded image to be re-filtered with adaptive loop filtering for a user of any authority level, improving the image quality of the differently-authorized regions in the image to be processed and thereby improving the user experience at every authority level.
In some embodiments, the filtering process includes a neural-network-based filtering process. If the authority levels of at least two adjacent image sub-regions among the plurality of image sub-regions differ, neural-network-based filtering may be performed on the connected boundary between the at least two image sub-regions. Referring to fig. 17, the decoding method provided in this embodiment includes the following steps Se1 to Se2.
Se1: where the first sub-region requires neural-network-based filtering, input the pixel values of the first sub-region and the authority level of the first sub-region into the neural network.
In some embodiments, the pixel values of the first sub-region may be reconstructed pixel values, that is, the pixel values of the first sub-region after the decoding filtering process; they may also be pixel values that have undergone other filtering processes after decoding (such as deblocking vertical filtering).
In some embodiments, where the first sub-region requires neural-network-based filtering, the reconstructed pixel values of the first sub-region and the authority level of the first sub-region, together with other decoding information, are input into the neural network. The other decoding information includes the above-mentioned boundary filter strength (Bs) value, the quantization coefficients of the coding-unit transform block, and the like; this embodiment is not particularly limited in this respect.
In some embodiments, the neural network may be a convolutional neural network (CNN) or another type of neural network; this embodiment is not particularly limited in this respect.
In some embodiments, step Se1 may comprise step Se12.
Se12: input the authority level of at least one image sub-region adjacent to the first sub-region into the neural network.
Illustratively, the authority levels of the adjacent sub-regions A, B, C, and D around the first sub-region in fig. 16 are all input into the neural network.
Se2: obtain the output filtered pixels of the first sub-region.
As shown in fig. 18, the reconstructed pixel values of the first sub-region, the authority level of at least one image sub-region adjacent to the first sub-region, and other decoding information are input into the neural network, and the output filtered reconstructed pixels of the first sub-region are acquired.
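The network inputs enumerated above (reconstructed pixels, authority levels, other decoding information such as the Bs value) can be sketched as stacked constant channels. The channel layout below is an assumption for illustration — fig. 18 does not prescribe one — and plain nested lists stand in for a real tensor library.

```python
def build_nn_input(recon, own_level, neighbor_levels, bs_value):
    """Stack the network inputs as channels of equal spatial size:
    channel 0: reconstructed pixel values of the first sub-region;
    channel 1: the first sub-region's authority level (constant plane);
    next channels: authority levels of adjacent sub-regions;
    last channel: boundary filter strength Bs as a constant plane."""
    h, w = len(recon), len(recon[0])
    const = lambda v: [[v] * w for _ in range(h)]  # constant plane
    channels = [recon, const(own_level)]
    channels += [const(lv) for lv in neighbor_levels]
    channels.append(const(bs_value))
    return channels

# A 2x2 sub-region with four adjacent sub-regions (e.g. A, B, C, D)
# and a boundary filter strength of 2.
x = build_nn_input([[50, 60], [70, 80]], own_level=1,
                   neighbor_levels=[2, 1, 0, 1], bs_value=2)
# → 7 channels, each 2x2.
```

A real implementation would feed this stack to the CNN and read the filtered reconstructed pixels from its output; that step is omitted here since the network itself is unspecified.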
It can be understood that, based on the authority levels of the image sub-regions, a corresponding target region to be filtered can be selected for neural-network-based filtering. Where the image processing apparatus acquires the information of the corresponding image regions according to the user's authority level, this method allows the decoded image to be re-filtered with the neural network for a user of any authority level, improving the image quality of the differently-authorized regions in the image to be processed and thereby improving the user experience at every authority level.
In some embodiments, the processes Sa1 to Sa2, Sb1 to Sb2, Sc1 to Sc2, and Sd1 to Sd2 may be performed sequentially; that is, the outputs of Sa1 to Sa2 are the inputs of Sb1 to Sb2, the outputs of Sb1 to Sb2 are the inputs of Sc1 to Sc2, and the outputs of Sc1 to Sc2 are the inputs of Sd1 to Sd2.
In other embodiments, the processes Sa1 to Sa2, Sb1 to Sb2, Sc1 to Sc2, Sd1 to Sd2, and Se1 to Se2 may be performed sequentially; that is, the outputs of Sa1 to Sa2 are the inputs of Sb1 to Sb2, the outputs of Sb1 to Sb2 are the inputs of Sc1 to Sc2, the outputs of Sc1 to Sc2 are the inputs of Sd1 to Sd2, and the outputs of Sd1 to Sd2 are the inputs of Se1 to Se2.
In other embodiments, the above processes Sa1 to Sa2, Sb1 to Sb2, Sc1 to Sc2, Sd1 to Sd2, and Se1 to Se2 may be performed in any combination.
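The sequential combination described above, in which each stage's output is the next stage's input, can be sketched as simple function composition. The stage functions here are trivial placeholders (they only record their order); only the chaining logic mirrors the text.

```python
from functools import reduce

def run_pipeline(image, stages):
    """Apply the selected filtering stages in order; the output of each
    stage is the input of the next, as described for Sa..Se."""
    return reduce(lambda img, stage: stage(img), stages, image)

# Placeholder stages standing in for deblocking (Sa/Sb), sample adaptive
# compensation (Sc), adaptive loop filtering (Sd), and the NN filter (Se).
trace = []
def make_stage(name):
    def stage(img):
        trace.append(name)  # record the execution order
        return img          # a real stage would return filtered pixels
    return stage

out = run_pipeline("decoded-image",
                   [make_stage(s) for s in ("Sa", "Sb", "Sc", "Sd", "Se")])
```

Because the stages are plain functions, any subset or reordering ("any combination") is expressed by changing the list passed to `run_pipeline`.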
It can be understood that, based on the authority levels of the image sub-regions, a corresponding target region to be filtered can be selected for the filtering process. Pixels on one side or both sides of a sub-region boundary can be selected for filtering according to the filtering requirements of the image to be processed, and different filtering methods can be combined, in the combination order, to filter the image to be processed. This improves the image quality of the differently-authorized regions in the image to be processed and thereby improves the user experience.
The present application also provides an electronic device. As shown in fig. 19, which is a schematic structural diagram of an electronic device provided by the present application, the electronic device 12 includes a processor 201 and a communication interface 202, which are coupled to each other. It is understood that the communication interface 202 may be a transceiver or an input-output interface. Optionally, the electronic device 12 may further include a memory 203 for storing instructions executed by the processor 201, input data required by the processor 201 to execute instructions, or data generated after the processor 201 executes instructions.
In some embodiments, the processor 201 and the communication interface 202 are configured to perform the functions of the image dividing unit 101, the image filtering unit 102, and the image decoding unit 103 described above.
The specific connection medium between the communication interface 202, the processor 201, and the memory 203 is not limited in the embodiments of the present application. In the embodiments of the present application, the communication interface 202, the processor 201, and the memory 203 are connected through the bus 204 shown in fig. 19, where the bus 204 is drawn with a thick line; the connection manner between other components is only schematically illustrated and is not limiting. The bus may be classified as an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 19, but this does not mean there is only one bus or only one type of bus.
The memory 203 may be used to store software programs and modules, such as program instructions/modules corresponding to the decoding method or the encoding method provided in the embodiments of the present application, and the processor 201 executes the software programs and modules stored in the memory 203, thereby performing various functional applications and data processing. The communication interface 202 may be used for communication of signaling or data with other devices. The electronic device 12 may have a plurality of communication interfaces 202 in the present application.
It is to be appreciated that the processor in embodiments of the application may be a central processing unit (CPU), a neural processing unit (NPU), or a graphics processing unit (GPU), and may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. The general-purpose processor may be a microprocessor or any conventional processor.
The method steps in the embodiments of the present application may be implemented by hardware, or by a processor executing software instructions. The software instructions may consist of corresponding software modules, which may be stored in random access memory (RAM), flash memory, read-only memory (ROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Alternatively, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC, and the ASIC may reside in a network device or terminal device. The processor and the storage medium may also reside as discrete components in a network device or terminal device.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer programs or instructions. When the computer program or instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are performed in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, a network device, a user device, or other programmable apparatus. The computer program or instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wired or wireless means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, or tape), an optical medium (e.g., a digital video disc (DVD)), or a semiconductor medium (e.g., a solid-state drive (SSD)).
In the various embodiments of the application, where there is no special description or logical conflict, terms and descriptions between the embodiments are consistent and may reference each other, and features of the embodiments may be combined to form new embodiments according to their inherent logic. In the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. In the textual descriptions of the present application, the character "/" generally indicates an "or" relationship between the associated objects; in the formulas of the present application, the character "/" indicates a "division" relationship between the associated objects.
It will be appreciated that the various numerical numbers referred to in the embodiments of the present application are merely for ease of description and are not intended to limit the scope of the embodiments of the present application. The sequence number of each process does not mean the sequence of the execution sequence, and the execution sequence of each process should be determined according to the function and the internal logic.
Claims (17)
1. An image processing method, characterized in that the image processing method is performed by an electronic device, comprising:
dividing an image to be processed into a plurality of image subareas, and acquiring the authority level of each image subarea;
and if the authority levels of at least two adjacent image subareas in the plurality of image subareas are different, filtering the connected boundary between the at least two image subareas.
2. The method of claim 1, wherein the authority levels comprise a first authority level, a second authority level, and a zero authority level, the image sub-region of the zero authority level being an image sub-region viewable by users of any authority level.
3. The method according to claim 1 or 2, characterized in that before the dividing of the image to be processed into a plurality of image sub-areas and the obtaining of the authority level of each image sub-area, the method comprises:
receiving a code stream of the image to be processed, and decoding the code stream to obtain the decoded image to be processed and the decoding information of the image to be processed.
4. The method according to claim 1 or 2, wherein the filtering process comprises: at least one of deblocking vertical filtering, deblocking horizontal filtering, sample adaptive compensation filtering, adaptive loop filtering, and neural network filtering.
5. The method of claim 4, wherein there is a vertical boundary between the at least two image sub-regions, wherein the filtering comprises deblocking vertical filtering, and wherein the filtering the boundary between the at least two image sub-regions comprises:
determining a target region to be filtered on the vertical side of a first sub-region of the at least two image sub-regions; wherein the vertical side is the left side or the right side of the connected boundary of the first sub-region, and the target region to be filtered is any one of the following: a region to be filtered on the right side of the connected boundary, a region to be filtered on the left side of the connected boundary, or the regions to be filtered on both the right side and the left side of the connected boundary; and
performing deblocking vertical filtering on the target region to be filtered.
6. The method of claim 5, wherein the adjacent at least two image sub-regions further comprise a second sub-region horizontally adjacent to the first sub-region, and the authority levels of the adjacent at least two image sub-regions being different comprises:
the authority level of the first sub-region is lower than that of the second sub-region;
or the authority level of the first sub-region is higher than that of the second sub-region;
or the authority level of the first sub-region is not equal to that of the second sub-region;
or the authority level of the first sub-region is not equal to that of the second sub-region, and the filtering strength of the connected boundary is greater than or equal to a preset filtering strength.
7. The method of claim 4, wherein there is a horizontally-oriented contiguous boundary between the at least two image sub-regions, the filtering includes deblocking horizontal filtering, and the filtering the contiguous boundary between the at least two image sub-regions includes:
determining a target region to be filtered on the horizontal side of a first sub-region of the at least two image sub-regions; wherein the horizontal side is the upper side or the lower side of the connected boundary of the first sub-region, and the target region to be filtered is any one of the following: a region to be filtered on the lower side of the connected boundary, a region to be filtered on the upper side of the connected boundary, or the regions to be filtered on both the lower side and the upper side of the connected boundary; and
performing deblocking horizontal filtering on the target region to be filtered.
8. The method of claim 7, wherein the adjacent at least two image sub-regions further comprise a third sub-region vertically adjacent to the first sub-region, and the authority levels of the adjacent at least two image sub-regions being different comprises:
the authority level of the first sub-region is lower than that of the third sub-region;
or the authority level of the first sub-region is higher than that of the third sub-region;
or the authority level of the first sub-region is not equal to that of the third sub-region;
or the authority level of the first sub-region is not equal to that of the third sub-region, and the filtering strength of the connected boundary is greater than or equal to a preset filtering strength.
9. The method of claim 4, wherein the filtering comprises sample adaptive compensation filtering, and wherein filtering the meeting boundary between the at least two image sub-regions comprises:
determining, according to the set filtering sequence, a target region to be filtered from the surrounding side of a first sub-region in the at least two image sub-regions; wherein the surrounding side includes, in order: the left side, the upper side, the right side, and the lower side of the connected boundary between the first sub-region and other sub-regions; and where the authority level of the first sub-region is different from that of the other sub-regions, the target region to be filtered is any one of the following: a first side region inside the first sub-region adjoining the connected boundary, a second side region outside the first sub-region adjoining the connected boundary, or both the first side region and the second side region; and
performing sample adaptive compensation filtering on the target region to be filtered according to the set filtering sequence.
10. The method of claim 4, wherein the filtering comprises adaptive loop filtering, and wherein filtering the contiguous boundary between the at least two image sub-regions comprises:
determining, according to the set filtering sequence, a target region to be filtered from the surrounding side of a first sub-region in the at least two image sub-regions; wherein the surrounding side includes, in order: the left side, the upper side, the right side, and the lower side of the connected boundary between the first sub-region and other sub-regions; and where the authority level of the first sub-region is different from that of the other sub-regions, the target region to be filtered is any one of the following: a first side region inside the first sub-region adjoining the connected boundary, a second side region outside the first sub-region adjoining the connected boundary, or both the first side region and the second side region; and
performing adaptive loop filtering on the target region to be filtered according to the set filtering sequence.
11. The method as recited in claim 4, further comprising:
judging, according to the decoding information of the image to be processed, whether a first sub-region in the at least two image sub-regions needs to be subjected to neural-network-based filtering;
inputting, where the first sub-region needs to be subjected to neural-network-based filtering, the reconstructed pixel values of the first sub-region and the authority level of the first sub-region into the neural network; and
obtaining the output filtered reconstructed pixels of the first sub-region.
12. The method according to claim 11, wherein, in the case where the first sub-region needs to be subjected to a filtering process based on a neural network, inputting the pixel value of the first sub-region and the authority level of the first sub-region into the neural network, further comprises:
the authority level of at least one image subarea adjacent to the first subarea is input into the neural network.
13. The method according to claim 1, wherein the electronic device obtains information of the image sub-area lower than or equal to the authority level of the user according to the authority level of the user; the information of the image sub-area comprises pixel values of the image sub-area.
14. The method according to claim 1, wherein if the authority levels of at least two adjacent image subareas in the plurality of image subareas are different, filtering the connected boundary between the at least two image subareas comprises:
filtering the luminance channel component on the connected boundary between the at least two image subareas;
or filtering the chrominance channel component on the connected boundary between the at least two image subareas;
or filtering both the luminance channel component and the chrominance channel component on the connected boundary between the at least two image subareas.
15. An image processing apparatus, characterized in that the image processing apparatus comprises: the device comprises an image dividing unit, an image filtering unit and an image decoding unit;
wherein the image dividing unit, the image filtering unit, and the image decoding unit are configured to implement the method of any one of claims 1 to 14.
16. An electronic device comprising a processor and a memory, the memory for storing computer instructions, the processor for invoking and executing the computer instructions from the memory to perform the method of any of claims 1 to 14.
17. A computer readable storage medium, characterized in that the storage medium has stored therein a computer program or instructions which, when executed by an electronic device, implement the method of any one of claims 1 to 14.
Priority Applications (3)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202211423122.2A (CN118042163A) | 2022-11-14 | 2022-11-14 | Image processing method and device, electronic equipment and storage medium |
| PCT/CN2024/072113 (WO2024104504A1) | 2022-11-14 | 2024-01-12 | Image processing method and apparatus, and electronic device and storage medium |
| GBGB2508081.3A (GB202508081D0) | 2022-11-14 | 2024-01-12 | Image processing method and apparatus, and electronic device and storage medium |
Applications Claiming Priority (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202211423122.2A (CN118042163A) | 2022-11-14 | 2022-11-14 | Image processing method and device, electronic equipment and storage medium |
Publications (1)

| Publication Number | Publication Date |
| --- | --- |
| CN118042163A | 2024-05-14 |
Family ID: 90984626
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202211423122.2A (CN118042163A, pending) | Image processing method and device, electronic equipment and storage medium | 2022-11-14 | 2022-11-14 |
Country Status (3)

| Country | Link |
| --- | --- |
| CN | CN118042163A |
| GB | GB202508081D0 |
| WO | WO2024104504A1 |
Family Cites Families (7)

| Publication Number | Priority Date | Publication Date | Assignee | Title |
| --- | --- | --- | --- | --- |
| KR100669905B1 | 2004-12-02 | 2007-01-16 | 한국전자통신연구원 | Apparatus and method for removing blocking phenomenon using block boundary region characteristics |
| CN105491443A | 2014-09-19 | 2016-04-13 | 中兴通讯股份有限公司 | Method and device for processing and accessing images |
| EP3777145A1 | 2018-03-28 | 2021-02-17 | Huawei Technologies Co., Ltd. | An image processing device and method for performing efficient deblocking |
| PL3931748T3 | 2019-03-11 | 2024-08-12 | Huawei Technologies Co., Ltd. | Sub-image based slice addresses in video encoding |
| CN112514390B | 2020-03-31 | 2023-06-20 | 深圳市大疆创新科技有限公司 | Video coding method and device |
| CN114078134B | 2020-08-21 | 2025-02-25 | Oppo广东移动通信有限公司 | Image processing method, device, equipment, computer storage medium and system |
| CN117044212A | 2021-03-09 | 2023-11-10 | 现代自动车株式会社 | Video coding and decoding method and device using deblocking filtering based on segmentation information |
- 2022-11-14 | CN | CN202211423122.2A patent/CN118042163A/en | active, Pending
- 2024-01-12 | WO | PCT/CN2024/072113 patent/WO2024104504A1/en | active, Application Filing
- 2024-01-12 | GB | GBGB2508081.3A patent/GB202508081D0/en | active, Pending
Also Published As
Publication number | Publication date |
---|---|
WO2024104504A1 (en) | 2024-05-23 |
GB202508081D0 (en) | 2025-07-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
RU2154918C1 (en) | Method and device for loop filtration of image data | |
JPH08186714A (en) | Noise removal of picture data and its device | |
US7003173B2 (en) | Filter for combined de-ringing and edge sharpening | |
Zhang et al. | Low-rank-based nonlocal adaptive loop filter for high-efficiency video compression | |
KR100308016B1 (en) | Block and Ring Phenomenon Removal Method and Image Decoder in Compressed Coded Image | |
CN106331709A (en) | Method and apparatus for processing video using in-loop processing | |
TW202103492A (en) | Image encoding device, image decoding device, and storage medium | |
IL225591A (en) | Method and apparatus for adaptive loop filtering | |
US7092580B2 (en) | System and method using edge processing to remove blocking artifacts from decompressed images | |
KR102495550B1 (en) | Deblocking filter method and apparatus | |
US12238342B2 (en) | Video compression with in-loop sub-image level controllable noise generation | |
WO2019196941A1 (en) | Adaptive implicit transform setting | |
CN1362832A (en) | Data processing method | |
JP2022533074A (en) | Deblocking filter for video coding | |
JP2024511272A (en) | Intra prediction method, encoder, decoder and storage medium | |
CN101742292A (en) | Image content information-based loop filtering method and filter | |
KR100675498B1 (en) | Filtering device and method | |
CN109565592B (en) | Video coding device and method using partition-based video coding block partitioning | |
CN118042163A (en) | Image processing method and device, electronic equipment and storage medium | |
CN108521575A (en) | Noise reduction method and device for image noise | |
CN104506867A (en) | Sample adaptive offset parameter estimation method and device | |
Alanazi | An Optimized Implementation of a Novel Nonlinear Filter for Color Image Restoration. | |
JP4065287B2 (en) | Method and apparatus for removing noise from image data | |
KR20040111436A (en) | Video signal post-processing method | |
CN114697650A (en) | Intra-frame division method based on down-sampling, related device equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||