GB2498225A - Encoding and Decoding Information Representing Prediction Modes - Google Patents
Encoding and Decoding Information Representing Prediction Modes
- Publication number
- GB2498225A (application GB1200285.3A / GB201200285A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- prediction mode
- image portion
- mode value
- value
- current image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION; H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals:
- H04N19/157—Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
- H04N19/103—Selection of coding mode or of prediction mode
- H04N19/11—Selection of coding mode or of prediction mode among a plurality of spatial predictive coding modes
- H04N19/17—Adaptive coding characterised by the coding unit, the unit being an image region, e.g. an object
- H04N19/176—Adaptive coding characterised by the coding unit, the unit being an image region, the region being a block, e.g. a macroblock
- H04N19/463—Embedding additional information in the video signal during the compression process by compressing encoding parameters before transmission
- H04N19/593—Predictive coding involving spatial prediction techniques
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
Information representing a prediction mode (one of a plurality) for encoding a current image portion by intra coding is encoded. A current image portion prediction mode value (PMV) is compared with reference, or most probable, PMVs derived from modes of reference image portions to determine an encoding process from among a plurality to encode the information. If the current portion PMV is different from the reference PMVs, the former is compared with a further PMV, and an encoding process is selected based on this further comparison; the selected process may comprise encoding information representative of the current portion PMV. Also disclosed is deriving a first reference PMV based on the PMV of a single reference image portion, and comparing the PMV of the current portion with at least the first reference PMV to determine a process, from among a plurality, to encode the mode information of the current portion. A method of representing mode information by an angular index representing the angular direction of the current image portion prediction mode is also disclosed. If the angular index value of the current portion is different from the reference angular index values, the determined encoding process comprises encoding the angular index value of the current portion.
Description
METHOD AND DEVICE FOR ENCODING OR DECODING INFORMATION
REPRESENTING PREDICTION MODES
Field of the invention
The invention relates to a method and a device for encoding or decoding mode information representative of a prediction mode. Particularly, but not exclusively, the invention relates to intra mode coding in the High Efficiency Video Coding (HEVC) standard under development.
Description of the prior art
Video applications are continuously moving towards higher resolution. A large quantity of video material is distributed in digital form over broadcast channels, digital networks and packaged media, with a continuous evolution towards higher quality and resolution (e.g. higher number of pixels per frame, higher frame rate, higher bit-depth or extended color gamut). This technology evolution puts higher pressure on distribution networks that are already facing difficulties in bringing HDTV resolution and high data rates economically to the end user. Consequently, any further data rate increases will put additional pressure on the networks. To handle this challenge, ITU-T and ISO/MPEG decided in January 2010 to launch a new video coding standard project, named High Efficiency Video Coding (HEVC).
The HEVC codec design is similar to that of previous so-called block-based hybrid transform codecs such as H.263, H.264, MPEG-1, MPEG-2, MPEG-4 and SVC. Video compression algorithms such as those standardized by the standardization bodies ITU, ISO and SMPTE exploit the spatial and temporal redundancies of the images in order to generate data bit streams of reduced size compared with the original video sequences. Such compression techniques render the transmission and/or storage of the video sequences more effective.
Figure 1 shows an example of an image coding structure used in HEVC.
A video sequence is made up of a sequence of digital images 101 represented by one or more matrices the coefficients of which represent pixels.
An image 101 is made up of one or more slices 102. A slice may be part of the image or, in some cases, the entire image. Slices are divided into non-overlapping blocks, typically referred to as Largest Coding Units (LCUs) 103; LCUs are generally blocks of size 64 pixels x 64 pixels. Each LCU may in turn be iteratively divided into smaller variable size Coding Units (CUs) 104 using a quadtree decomposition.
During video compression in HEVC, each block of an image being processed is predicted spatially by an "Intra" predictor (so-called "Intra" coding mode), or temporally by an "Inter" predictor (so-called "Inter" coding mode).
Each predictor is a block of pixels issued from the same image or from a previously coded image. In Intra coding mode the predictor (Intra predictor) used for the current block being coded is a block of pixels constructed from information already encoded of the current image. By virtue of the identification of the predictor block and the coding of the residual, it is possible to reduce the quantity of information actually to be encoded.
A CU is thus coded according to an intra coding mode, (samples of the CU are spatially predicted from neighboring samples of the CU) or to an inter coding mode (samples of the CU are temporally predicted from samples of previously coded slices).
Once the CU samples have been predicted, the residual signal between the original CU samples and the prediction CU samples is generated. This residual is then coded after having applied transform and quantization processes.
In the current HEVC design, as well as in previous designs such as MPEG-4 AVC/H.264, intra coding involves deriving an intra prediction block from reconstructed neighboring samples 201 of the block to be encoded (decoded), as illustrated schematically in Figure 2A. Referring to Figure 2B, when coding a current CU 202, Intra mode coding makes use of two neighbouring CUs that have already been coded, namely the Top and Left CUs 203 and 204.
In intra mode coding multiple prediction modes are supported, including directional or non-directional intra prediction modes. When a CU is intra coded, its related intra prediction mode is coded in order to inform a decoder how to decode the coded CU.
Figure 3 illustrates the intra prediction modes 'intraPredMode' supported in the current HEVC design, along with their related mode values used to identify the corresponding intra prediction mode. The number of supported modes depends on the size of a coding unit (CU). As at the filing date of the present application the HEVC specification is still subject to change, but at present the following supported modes are contemplated: 18 modes for a 4x4 CU (modes 0 to 17), and 35 modes for CUs of other sizes (8x8 to 64x64).
The intra prediction modes include prediction modes which are not directional including a planar prediction mode and a DC mode.
The other modes are directional, which means that the samples are predicted according to a given angular direction. In Figure 3 (i), intra prediction modes not supported by 4x4 CUs are indicated by shaded boxes.
It can be noticed in Figure 3 (i) that the intra prediction modes are numbered in a specific order, more or less reflecting the probabilities of occurrence of the different intra prediction modes in the initial design of the standard specification. For instance, modes 0 (Planar), 1 (Vertical), 2 (Horizontal) and 3 (DC) are statistically the four most commonly used modes.
This specific order requires the use of a look-up table 304 that gives, for the angular modes, the link between a mode number ('intraPredMode') and its corresponding angular index 302 (noted 'intraPredOrder' in table 304). An additional look-up table 305 is also used to associate an angular value 303 (noted 'intraPredAngle' in table 305) with the angular index. As depicted in Figure 3(iv), the intraPredAngle actually indicates the side opposite the angle when the adjacent side length is equal to 32.
The following definitions summarize intra mode representation:
* intraPredMode (301) represents the different possible intra prediction modes defined in HEVC, and corresponds to the values that are actually coded/decoded.
* intraPredOrder (302) corresponds to the index of the angular mode, when angular prediction applies.
* intraPredAngle (303) corresponds to a displacement value (306), directly linked to the angular value, to be applied when angular intra prediction applies.
* Look-up table (304) establishes the link between intraPredMode and intraPredOrder.
* Look-up table (305) establishes the link between intraPredOrder and intraPredAngle.
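As a sketch of how the two tables chain together (the table contents below are hypothetical placeholders; the real HEVC tables are fixed by the specification and are larger):

```python
# Hypothetical illustration of the two look-up tables described above;
# the actual HEVC table values are not reproduced here.
MODE_TO_ORDER = {1: 0, 2: 8, 5: 4}    # intraPredMode -> intraPredOrder
ORDER_TO_ANGLE = {0: 0, 4: 32, 8: 0}  # intraPredOrder -> intraPredAngle

def displacement_for(intra_pred_mode):
    """Chain the two tables: mode number -> angular index -> displacement.
    An intraPredAngle of 32 corresponds to a 45-degree direction
    (opposite side of 32 for an adjacent side of length 32)."""
    order = MODE_TO_ORDER[intra_pred_mode]
    return ORDER_TO_ANGLE[order]
```

The indirection through intraPredOrder is what allows the mode numbering to follow mode probabilities while the angular geometry stays in a separate table.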
Figure 4 is a flowchart illustrating steps of a known method of Intra mode coding performed in the current HEVC design. In a first step 401 the Intra prediction modes of the neighbouring Top and Left CUs 203 and 204, as illustrated in Figure 2B, are identified. The two CUs may share the same Intra prediction mode or may have different Intra prediction modes. Accordingly, in step 401 one or two different intra prediction modes can be identified. In step 402, two so-called 'Most Probable Modes' (MPMs) are derived from the identified neighbouring Top and Left intra prediction modes. In step 403 the prediction mode of the current coding unit is then compared to the two MPMs. If the prediction mode of the current coding unit is equal to either of the MPMs then in step 404 a first coding process (process 1) is applied.
This first coding process involves coding a flag signaling that the mode of the current block is equal to one of the MPMs, and then, coding the index of the MPM concerned.
If in step 403 it is determined that the prediction mode of the current block is not equal to one of the two MPMs, then in step 405 a second coding process (process 2) is applied.
The second coding process involves coding the mode value of the current block using a longer code word compared to coding of the MPM index.
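The two processes of Figure 4 can be sketched as follows. This is a simplified model only: entropy coding is replaced by symbolic syntax elements, and the MPM derivation shown is a naive placeholder for step 402, not the normative rule.

```python
def derive_mpms(left_mode, top_mode):
    """Simplified MPM derivation (step 402): take the Left and Top modes;
    if they coincide, complete the pair with an arbitrary default."""
    if left_mode == top_mode:
        return [left_mode, 0 if left_mode != 0 else 1]
    return [left_mode, top_mode]

def encode_intra_mode(current_mode, left_mode, top_mode):
    """Known method of Figure 4: process 1 (flag + MPM index) when the
    current mode matches an MPM, otherwise process 2 (longer codeword
    carrying the mode value)."""
    mpms = derive_mpms(left_mode, top_mode)
    if current_mode in mpms:
        return {"mpm_flag": 1, "mpm_idx": mpms.index(current_mode)}
    return {"mpm_flag": 0, "remaining_mode": current_mode}
```

The short codeword of process 1 is what makes MPM hits cheap; process 2 always pays for a full mode value.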
Using MPMs makes the coding process more efficient. The mode of the CU is often equal to one of the MPMs. As fewer bits are used to signal an MPM than to signal a remaining mode, the overall coding cost is reduced.
A drawback of the current design, however, appears when the mode of the current CU is not one of the MPMs. The remaining mode may then need to be coded using non-fixed-length code words, which renders the decoding process design more complex. For instance, when 35 prediction modes are initially supported, 33 remaining modes are possible (once the 2 MPMs have been removed from the list of possible modes). Coding a value among 33 possible values requires 5 or 6 bits. If only 32 remaining modes were possible, fixed-length coding (FLC) with 5 bits could have been used.
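The bit-count argument can be checked directly; `fixed_length_bits` is an illustrative helper, not part of the standard:

```python
import math

def fixed_length_bits(n_values):
    """Bits needed by a fixed-length code over n_values symbols."""
    return math.ceil(math.log2(n_values))

# 35 supported modes minus the 2 MPMs leaves 33 remaining modes: a
# fixed-length code would need 6 bits, so a 5-or-6-bit variable-length
# code is used; with only 32 remaining modes, 5 bits would suffice.
```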
Another drawback of the current design is that in all cases of the decoding process, the full MPM derivation process, involving the identification of the top and left CU modes (401) and the derivation of the MPMs (402), is required leading to added complexity.
When the current CU is of 4x4 size and a neighbouring CU is of a larger size, it is possible that the prediction mode derived from the neighbouring CU is not supported by the 4x4 CU. When deriving the mode values from the neighbouring CUs, a mapping process is therefore required. In the current HEVC design, the following process applies: if a neighbouring mode is supported by the 4x4 CU, it is not modified; otherwise it is forced to planar mode.
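The mapping process can be sketched as follows, assuming the 18-mode set of the 4x4 CU described with Figure 3 (`SUPPORTED_4X4` and `PLANAR` are illustrative names):

```python
PLANAR = 0                      # planar mode value (mode 0 in Figure 3)
SUPPORTED_4X4 = set(range(18))  # a 4x4 CU supports modes 0 to 17

def map_neighbour_mode(neighbour_mode):
    """Mapping process for 4x4 CUs: keep a supported neighbouring mode
    unchanged, otherwise force it to planar mode."""
    return neighbour_mode if neighbour_mode in SUPPORTED_4X4 else PLANAR
```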
Figure 5 schematically illustrates a known decoding tree of the intra mode. In this figure, italic bold font is used to indicate syntax elements that are decoded. Blocks correspond to operations performed during the decoding process.
Firstly the flag MPM_flag indicating whether or not the prediction mode is one of the MPMs (MPM0 505 and MPM1 506) is decoded in step 501. If the MPM_flag is equal to 1, the MPM index mpm_idx is decoded in step 502 to identify whether the mode is equal to MPM0 or to MPM1. If the flag MPM_flag is equal to 0, remaining_mode is decoded in step 503. Variable-length code (VLC) decoding is used (4 to 5 bits for 4x4 CUs, 5 to 6 bits for other CUs). The prediction mode is finally deduced from the remaining mode value and from the MPM values in step 504.
To summarize, the decoding process works as follows:
- If MPM_flag = 1 is decoded, the prediction mode is one of the MPMs:
  - If mpm_idx = 0 is decoded, mode = MPM0
  - Else, mode = MPM1
- Else, remaining_mode is decoded using a VLC codeword; the prediction mode is deduced from remaining_mode and the MPM values.
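The decoding tree above can be modelled as follows; the bitstream is abstracted as a dictionary of already-parsed syntax elements, and the remaining-mode reconstruction shown is a common scheme assumed for illustration (the exact rule is fixed by the specification).

```python
def decode_intra_mode(syntax, mpm0, mpm1):
    """Decoding tree of Figure 5 over already-parsed syntax elements."""
    if syntax["mpm_flag"] == 1:
        return mpm0 if syntax["mpm_idx"] == 0 else mpm1
    # Deduce the mode from remaining_mode and the MPM values: each MPM
    # at or below the running value shifts it up by one, skipping the
    # two values that the MPMs already cover.
    mode = syntax["remaining_mode"]
    for mpm in sorted((mpm0, mpm1)):
        if mode >= mpm:
            mode += 1
    return mode
```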
A drawback of the above described methods is their complexity and heavy use of processing resources.
Summary of the Invention
The present invention has been devised to address one or more of the foregoing concerns.
According to a first aspect of the invention there is provided a method of encoding mode information representing a prediction mode for encoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes, each prediction mode being represented by a prediction mode value, the method comprising: comparing a prediction mode value of the current image portion to be encoded with reference prediction mode values derived from prediction modes of respective image portions in order to determine an encoding process, from among a plurality of encoding processes, to encode the mode information; wherein if the prediction mode value of the current image portion is different from the reference prediction mode values, the prediction mode value of the current image portion is compared with a further predefined prediction mode value, the method further comprising selecting, based on the further comparison, an encoding process for encoding the mode information.
Since there is an added opportunity to encode the mode information by means of a simple flag, due to the comparison with the further predefined prediction mode value, the coding process is simplified. At the decoder side, the process is also simplified: after having decoded the first flag signalling that the mode is not one of the MPMs, and then the second flag signalling that the mode value is the further predefined prediction mode value, the mode value is directly obtained without additional processing.
In an embodiment, in the case where the prediction mode value of the current image portion is equal to the further predefined prediction mode value, the selected encoding process comprises encoding information indicating a predefined relationship between the prediction mode value of the current image portion and the further predefined prediction mode value; otherwise, if the prediction mode value of the current image portion is different from the further predefined prediction mode value, the selected encoding process comprises encoding information representative of the prediction mode value of the current image portion.
In an embodiment the further predefined prediction mode value is set to a prediction mode value corresponding to a planar prediction mode.
In an embodiment the further predefined prediction mode value is set to a mode value corresponding to a horizontal prediction mode, a vertical prediction mode or a DC prediction mode.
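The encoding side of the first aspect can be sketched as follows. This is an illustrative model only: function and field names such as `encode_mode_info` and `further_flag` are our own, entropy coding is abstracted into symbolic syntax elements, and the planar-mode embodiment is assumed for the further predefined value.

```python
PLANAR = 0  # one embodiment sets the further predefined value to planar

def encode_mode_info(current_mode, mpm0, mpm1, further_mode=PLANAR):
    """First aspect (sketch): compare with the reference prediction mode
    values, then with the further predefined mode value, before falling
    back to encoding the full mode value."""
    if current_mode == mpm0:
        return {"mpm_flag": 1, "mpm_idx": 0}
    if current_mode == mpm1:
        return {"mpm_flag": 1, "mpm_idx": 1}
    if current_mode == further_mode:
        # A single extra flag signals the predefined relationship.
        return {"mpm_flag": 0, "further_flag": 1}
    return {"mpm_flag": 0, "further_flag": 0, "remaining_mode": current_mode}
```

The extra branch is what lets a frequent non-MPM mode (e.g. planar) be signalled with a single flag instead of a long remaining-mode codeword.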
In an embodiment the further predefined prediction mode value is dependent upon the content of the image being encoded.
In an embodiment the further predefined prediction mode value can be signalled in the bitstream, at image level, or at image portion level.
In an embodiment the further predefined prediction mode value depends on mode probabilities representative of the probability of occurrence of respective prediction modes.
In an embodiment the mode probabilities are regularly computed and the further predefined prediction mode value is adaptively derived based on said mode probabilities.
In an embodiment the reference prediction mode values comprise a first reference prediction mode value based on the prediction mode of a single reference image portion and a second reference prediction mode value based on the respective prediction modes of at least two reference image portions.
In an embodiment the single reference image portion comprises the left neighbouring image portion of the current image portion.
In an embodiment in the case where the prediction mode value of the single reference image portion corresponds to the further predefined prediction mode value the first reference prediction mode value is set to a second predefined prediction mode value, otherwise the first reference prediction mode value is set to the prediction mode value of the single reference image portion.
In an embodiment in the case where the further predefined prediction mode value is planar mode and the prediction mode value of the single reference image portion corresponds to a planar prediction mode, the first reference prediction mode value is set to a DC prediction mode value (i.e. the second predefined prediction mode value is a DC prediction mode value); otherwise the first reference prediction mode value is set to the prediction mode value of the single reference image portion.
In an embodiment the two reference image portions, comprise a first neighbouring image portion and a second neighbouring image portion.
In an embodiment the two reference image portions comprise the left neighbouring image portion as the first neighbouring image portion and the top neighbouring image portion as the second neighbouring image portion of the current image portion.
In an embodiment in the case where the prediction mode value of the second neighbouring image portion corresponds to the further predefined prediction mode value or to the prediction mode value of the first neighbouring image portion of the current image portion, the second reference prediction mode value is set to a prediction mode value corresponding to an angular direction adjacent and/or superior to the angular direction of the first neighbouring image portion; otherwise the second reference prediction mode value is set to the prediction mode value of the second neighbouring image portion.
In an embodiment in the case where the further predefined prediction mode value is planar mode and the prediction mode value of the second neighbouring image portion corresponds to a planar prediction mode or to the prediction mode value of the left neighbouring image portion of the current image portion, the second reference prediction mode value is set to a prediction mode value corresponding to an angular direction superior to the angular direction of the first neighbouring image portion, otherwise the second reference prediction mode value is set to the prediction mode value of the second neighbouring image portion.
In an embodiment in the case where the prediction mode value of the first neighbouring image portion corresponds to a non directional prediction mode, the second reference prediction mode value is set to a prediction mode value corresponding to a third predefined prediction mode.
In an embodiment the third predefined prediction mode is the vertical prediction mode.
In an embodiment in the case where the prediction mode value of the second neighbouring image portion corresponds to the further predefined prediction mode value or to the prediction mode value of the first neighbouring image portion of the current image portion, then the second reference prediction mode value is set to a fourth predefined prediction mode value, otherwise the second reference prediction mode value is set to the prediction mode value of the second neighbouring image portion.
In an embodiment in the case where the further predefined prediction mode value is planar mode and the prediction mode value of the second neighbouring image portion corresponds to a planar prediction mode or to the prediction mode value of the first neighbouring image portion of the current image portion, then the second reference prediction mode value is set to a fourth predefined prediction mode value, otherwise the second reference prediction mode value is set to the prediction mode value of the second neighbouring image portion.
In an embodiment the fourth predefined prediction mode value corresponds to a DC prediction mode.
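The derivation embodiments above can be sketched together. The function names, the mode values, and the "next angular direction" rule (modelled naively as `left_mode + 1`) are illustrative assumptions, not the normative derivation.

```python
PLANAR, VERTICAL, DC = 0, 1, 3   # initial-design values (Figure 3)

def derive_first_reference(left_mode, further_mode=PLANAR):
    """First reference value from the single (left) portion: substitute
    DC when the left mode equals the further predefined value."""
    return DC if left_mode == further_mode else left_mode

def derive_second_reference(left_mode, top_mode, further_mode=PLANAR):
    """Second reference value from the top portion: when it duplicates
    the left mode or equals the further predefined value, fall back to
    an adjacent/superior angular direction (sketched here as the next
    mode value), or to vertical when the left mode is non-directional."""
    if top_mode == further_mode or top_mode == left_mode:
        if left_mode in (PLANAR, DC):   # non-directional left mode
            return VERTICAL
        return left_mode + 1            # hypothetical "next direction"
    return top_mode
```

Deriving the first reference value from the left portion alone means the second derivation can be skipped entirely whenever the current mode already matches the first value.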
According to a further aspect of the invention, there is provided an encoder for encoding mode information representing a prediction mode for encoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes each prediction mode being represented by a prediction mode value, the encoder comprising: comparison means for comparing a prediction mode value of the current image portion to be encoded with reference prediction mode values, each derived from one or more prediction modes of respective image portions in order to determine an encoding process, from among a plurality of encoding processes, to encode the mode information; the comparison means being configured to compare the prediction mode value of the current image portion with a further predefined prediction mode value in the case where the prediction mode value of the current image portion is different from the reference prediction mode values; and selection means for selecting, based on the further comparison, an encoding process for encoding the mode information; and encoding means for encoding the mode information using the selected encoding process.
According to another aspect there is provided a method of decoding mode information representing a prediction mode for decoding of a current image portion by an intra mode decoding process, the prediction mode being one of a plurality of prediction modes, the method comprising: receiving a codeword related to the prediction mode of the current image portion; determining, based on the codeword, the prediction mode from among the plurality of prediction modes for decoding the current image portion; decoding the current image portion using the determined prediction mode wherein in the case where the prediction mode value of the current image portion is different from reference prediction mode values derived from prediction modes of respective image portions and is equal to a further predefined prediction mode value, the codeword comprises a flag indicative that the prediction mode is the further predefined prediction mode and the decoding step comprises decoding the current image portion using the predefined prediction mode; otherwise in the case where the prediction mode value of the current image portion is different from reference prediction mode values derived from prediction modes of respective image portions and is different from the further predefined prediction mode value, the codeword comprises information representative of the prediction mode value of the current image portion and the decoding step comprises decoding the current image portion using the prediction mode represented by the prediction mode value.
A further aspect provides a decoder for decoding mode information representing a prediction mode for decoding of a current image portion by an intra mode decoding process, the prediction mode being one of a plurality of prediction modes, the decoder comprising: reception means for receiving a codeword related to the prediction mode of the current image portion; determining means for determining, based on the codeword, the prediction mode from among the plurality of prediction modes for decoding the current image portion; decoding means for decoding the current image portion using the determined prediction mode wherein in the case where the prediction mode value of the current image portion is different from reference prediction mode values derived from prediction modes of respective image portions and is equal to a further predefined prediction mode value, the codeword comprises a flag indicative that the prediction mode is the further predefined prediction mode and the decoding means is configured to decode the current image portion using the predefined prediction mode; otherwise in the case where the prediction mode value of the current image portion is different from reference prediction mode values derived from prediction modes of respective image portions and is different from the further predefined prediction mode value, the codeword comprises information representative of the prediction mode value of the current image portion, and the decoding means is configured to decode the current image portion using the prediction mode represented by the prediction mode value.
According to a second aspect of the invention there is provided a method of encoding mode information representing a prediction mode for encoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes, the method comprising: deriving a first reference prediction mode value based on the prediction mode value of a single reference image portion; and comparing the prediction mode value of the current image portion with at least the first reference prediction mode value in order to determine an encoding process, from among a plurality of encoding processes, to encode the mode information of the current image portion.
Since only one neighbouring image portion is accessed for the derivation of the first reference prediction mode value, the coding process is simplified.
Moreover, if the prediction mode of the current image portion is equal to the first reference prediction mode value, the encoding process for the first reference prediction mode value is invoked. In that case it is not necessary, either at the encoder side or at the decoder side, to derive the second reference prediction mode value.
In an embodiment if the prediction mode of the current image portion is not equal to the first reference prediction mode value, the prediction mode of the current portion is compared with a second reference prediction mode value.
The second reference prediction mode value may be derived based on the respective prediction modes of at least two reference image portions.
In an embodiment, the single reference image portion comprises a neighbouring image portion, for example the left neighbouring image portion of the current image portion.
In an embodiment, in the case where the prediction mode value of the single reference image portion corresponds to the further predefined prediction mode value, the first reference prediction mode value is set to a second predefined prediction mode value; otherwise the first reference prediction mode value is set to the prediction mode value of the single reference image portion.
In an embodiment in the case where the further predefined prediction mode value is the planar mode and the prediction mode value of the single reference image portion corresponds to a planar prediction mode, the first reference prediction mode value is set to a DC prediction mode value (i.e. the second predefined prediction mode value is a DC prediction mode value), otherwise the first reference prediction mode value is set to the prediction mode value of the single reference image portion.
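By way of illustration only, the substitution rule described above may be sketched as follows. The numeric mode values here are hypothetical placeholders, not normative codec values:

```python
# Hypothetical mode numbering for illustration only; actual values are a
# design choice of the codec.
PLANAR = 0
DC = 1

def derive_first_mpm(left_mode):
    """Derive the first reference prediction mode value (MPM1) from a
    single reference image portion (here, the left neighbour).

    If the neighbour uses the further predefined mode (planar), MPM1 is
    substituted with a second predefined mode (DC); otherwise MPM1 is
    simply the neighbour's prediction mode value."""
    if left_mode == PLANAR:
        return DC
    return left_mode
```

Because only the left neighbour is read, this derivation needs no access to the top neighbour's mode.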
In an embodiment, the two reference image portions comprise a first neighbouring image portion and a second neighbouring image portion of the current image portion. The first neighbouring image portion may be the left neighbouring image portion, and the second neighbouring image portion may be the top neighbouring image portion.
In an embodiment, in the case where the prediction mode value of the second neighbouring image portion corresponds to the further predefined prediction mode value or to the prediction mode value of the first neighbouring image portion of the current image portion, the second reference prediction mode value is set to a prediction mode value corresponding to an angular direction adjacent, for example superior, to the angular direction of the first neighbouring image portion, otherwise the second reference prediction mode value is set to the prediction mode value of the second neighbouring image portion.
In an embodiment, in the case where the prediction mode value of the first neighbouring image portion corresponds to a non directional prediction mode, the second reference prediction mode value is set to a third predefined prediction mode value, for example a mode value corresponding to a vertical prediction mode.
In an embodiment, in the case where the prediction mode value of the second neighbouring image portion corresponds to the further predefined prediction mode value or to the prediction mode value of the first neighbouring image portion of the current image portion, then the second reference prediction mode value is set to a fourth predefined prediction mode value, otherwise the second reference prediction mode value is set to the prediction mode value of the second neighbouring image portion.
In an embodiment, the fourth predefined prediction mode value corresponds to a DC prediction mode.
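The second reference prediction mode derivation of the preceding embodiment (substituting a fourth predefined mode when the top neighbour's mode is redundant) may be sketched as below. The default planar and DC values are illustrative assumptions, not normative:

```python
def derive_second_mpm(left_mode, top_mode, planar=0, dc=1):
    """Derive the second reference prediction mode value (MPM2) from two
    reference image portions (left and top neighbours).

    If the top neighbour's mode equals the further predefined mode
    (planar) or duplicates the left neighbour's mode, MPM2 is set to a
    fourth predefined mode (here DC); otherwise MPM2 is the top
    neighbour's mode value."""
    if top_mode == planar or top_mode == left_mode:
        return dc
    return top_mode
```

The substitution guarantees that the two reference prediction mode values are distinct, so each codeword identifies a unique mode.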
In an embodiment, each prediction mode value comprises an angular index representative of the angular direction of the corresponding prediction mode; the angular index of the current image portion is compared with first and second reference angular indexes to determine the encoding process; and in the case where the angular index of the current image portion is different from the reference angular indexes, the determined encoding process comprises encoding the angular index of the current image portion.
In an embodiment angular indexes with even numbered values correspond to prediction modes supported by image portions of a specific size, for example of 4x4 pixels.
In an embodiment non-directional prediction modes are attributed angular index values greater than the angular index values of directional prediction modes.
In an embodiment a DC prediction mode has an even number. In an embodiment a planar prediction mode has an odd angular index. In an embodiment a planar prediction mode has an odd angular index of 33.
In an embodiment, in the case where the prediction mode of the reference image portion is not supported by the image portion to be encoded the angular index of the current image portion is set to the closest even number to the angular index of the reference image portion.
In an embodiment, in the case where all the authorized angular indexes, except for a further predefined prediction mode, such as for example the planar mode, of the reference image portion are even, the angular index is divided by two, or, equivalently right-shifted by one bit, prior to encoding.
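As a minimal sketch of the halving step (the planar index of 33 is one of the illustrative values used later in the description, not a fixed requirement):

```python
def compress_angular_index(index, planar_index=33):
    """When all authorized angular indexes of the reference image
    portion, except the planar mode, are even, the low bit of each index
    is redundant: dividing by two (a one-bit right shift) before
    encoding saves one bit of signalling."""
    if index == planar_index:
        return index  # the planar exception keeps its odd index
    return index >> 1
```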
A further aspect of the invention provides an encoder for encoding mode information representing a prediction mode for encoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes, each prediction mode being represented by a prediction mode value, the encoder comprising: means for deriving a first reference prediction mode value based on the prediction mode of a single reference image portion; and means for comparing the prediction mode value of the current image portion with at least the first reference prediction mode value in order to determine an encoding process, from among a plurality of encoding processes, to encode the mode information of the current image portion.
The encoder may further comprise means for deriving a second reference prediction mode value based on the respective prediction modes of at least two reference image portions, and the means for comparing may be configured to further compare the prediction mode value of the current image portion with the second reference prediction mode value.
A further aspect of the invention provides a method of decoding mode information representing a prediction mode for decoding of a current image portion by an intra mode decoding process, the prediction mode being one of a plurality of prediction modes, each prediction mode being represented by a prediction mode value, the method comprising: receiving a codeword related to the prediction mode of the current image portion; determining, based on the codeword, whether the prediction mode value of the current image portion corresponds to reference prediction mode values; wherein in the case where the codeword indicates that the prediction mode value of the current image portion is equal to a first reference prediction mode value, the prediction mode value of the current image portion is derived from the prediction mode value of a single reference image portion, and in the case where the codeword indicates that the prediction mode value of the current image portion is equal to a second reference prediction mode value the prediction mode value of the current portion is derived from the prediction mode values of at least two reference image portions, otherwise it is determined that the codeword comprises data representative of the prediction mode value of the current image portion.

Another aspect of the invention provides a decoder for decoding mode information representing a prediction mode for decoding of a current image portion by an intra mode decoding process, the prediction mode being one of a plurality of prediction modes, each prediction mode being represented by a prediction mode value, the decoder comprising: reception means for receiving a codeword related to the prediction mode of the current image portion; determining means for determining, based on the codeword, whether the prediction mode value of the current image portion corresponds to reference prediction mode values; deriving means for deriving the prediction mode value, wherein the deriving means is
configured to derive the prediction mode value of the current image portion from the prediction mode value of a single reference image portion, in the case where the codeword indicates that the prediction mode value of the current image portion is equal to a first reference prediction mode value, to derive the prediction mode value of the current portion from the prediction mode values of at least two reference image portions in the case where the codeword indicates that the prediction mode value of the current image portion is equal to a second reference prediction mode value; to derive the prediction mode value from the codeword in the case where it is determined that the codeword comprises data representative of the prediction mode value of the current image portion.
According to a third aspect of the invention there is provided a method of encoding mode information representing a prediction mode for encoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes, the method comprising: representing each prediction mode by an angular index representative of the angular direction of the corresponding prediction mode, comparing the angular index of the current image portion with reference angular indexes in order to determine an encoding process, from among a plurality of encoding processes, to encode the mode information; wherein in the case where the angular index of the current image portion is different from the reference angular indexes, the determined encoding process comprises encoding the angular index of the current image portion.
In an embodiment, angular indexes with even numbered values correspond to prediction modes supported by image portions of predetermined size and/or shape, for example square blocks of 4x4 pixels.
In an embodiment, non-directional prediction modes are attributed an angular index value having a value greater than the angular index values of directional prediction modes.
In an embodiment, a DC prediction mode has an even numbered angular index.
In an embodiment, a planar prediction mode has an odd angular index.
In an embodiment, a planar prediction mode has an angular index of 33.
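Taken together, the numbering embodiments above can be illustrated as follows; the assignment is a hypothetical example consistent with the stated properties (directional modes at low indexes, DC even, planar odd and largest):

```python
# Hypothetical angular index assignment, for illustration only.
DIRECTIONAL_INDEXES = list(range(32))  # directional modes take indexes 0..31
DC_INDEX = 32                          # non-directional, even numbered
PLANAR_INDEX = 33                      # non-directional, odd, the largest index

def is_supported_by_small_block(index):
    """Even numbered angular indexes correspond to modes supported by
    image portions of the predetermined small size (e.g. 4x4 pixels);
    the planar mode is treated as always available."""
    return index == PLANAR_INDEX or index % 2 == 0
```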
In an embodiment, in the case where the prediction mode of the reference image portion is not supported by the image portion to be encoded, the angular index of the current image portion is set to the closest even-numbered angular index to the angular index value of the reference image portion.
In an embodiment, in the case where all the authorized angular indexes, except for a further predefined prediction mode such as the planar mode, of the reference image portion are even, the angular index is divided by two, or, equivalently, right-shifted by one bit, prior to encoding.

A further aspect of the invention relates to an encoder for encoding mode information representing a prediction mode for encoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes, wherein each prediction mode is represented by an angular index representative of the angular direction of the corresponding prediction mode; the encoder comprising: comparison means for comparing the angular index of the current image portion with reference angular indexes in order to determine an encoding process, from among a plurality of encoding processes, to encode the mode information; and encoding means for encoding the angular index of the current image portion in the case where the angular index of the current image portion is different from the reference angular indexes.
A further aspect of the invention provides a method of decoding mode information representing a prediction mode for decoding of a current image portion by an intra mode decoding process, the prediction mode being one of a plurality of prediction modes, each prediction mode being represented by a prediction mode value, the method comprising: receiving a codeword related to the prediction mode of the current image portion; wherein in the case where the angular index value of the current image portion is different from reference angular index values derived from one or more reference image portions, the codeword is an angular index value of the current image portion and the method further comprises decoding the angular index value of the current image portion.
Another aspect of the invention provides a decoder for decoding mode information representing a prediction mode for decoding of a current image portion by an intra mode decoding process, the prediction mode being one of a plurality of prediction modes, each prediction mode being represented by a prediction mode value, the decoder comprising: reception means for receiving a codeword related to the prediction mode of the current image portion, wherein in the case where the angular index value of the current image portion is different from reference angular index values derived from one or more reference image portions, the codeword is an angular index value of the current image portion; and decoding means for decoding the angular index value of the current image portion.
At least parts of the methods according to the invention may be computer implemented. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit", "module" or "system". Furthermore, the present invention may take the form of a computer program product embodied in any tangible medium of expression having computer usable program code embodied in the medium.
Since the present invention can be implemented in software, the present invention can be embodied as computer readable code for provision to a programmable apparatus on any suitable carrier medium. A tangible carrier medium may comprise a storage medium such as a floppy disk, a CD-ROM, a hard disk drive, a magnetic tape device or a solid state memory device and the like. A transient carrier medium may include a signal such as an electrical signal, an electronic signal, an optical signal, an acoustic signal, a magnetic signal or an electromagnetic signal, e.g. a microwave or RF signal.
Embodiments of the invention will now be described, by way of example only, and with reference to the following drawings in which:

Figure 1 schematically illustrates an example of a HEVC coding structure;

Figures 2A and 2B are schematic diagrams for use in explaining how an intra prediction block is derived in a known HEVC design;

Figure 3 schematically illustrates intra prediction modes in a known HEVC design;

Figure 4 is a flowchart for use in explaining intra mode coding in a known HEVC design;

Figure 5 is a flowchart for use in explaining an intra mode decoding tree in a known HEVC design;

Figure 6 is a schematic block diagram illustrating a data communication system in which one or more embodiments of the invention may be implemented;

Figure 7 is a schematic block diagram illustrating a processing device configured to implement at least one embodiment of the present invention;

Figure 8 is a schematic block diagram of an encoder according to at least one embodiment of the invention;

Figure 9 is a schematic block diagram of a decoder according to at least one embodiment of the invention;

Figure 10 is a flow chart illustrating steps of a method according to a first embodiment of the invention for encoding mode information representing a prediction mode;

Figure 11A is a flow chart illustrating steps of a method according to a second embodiment of the invention for encoding mode information representing a prediction mode;

Figure 11B is a flow chart illustrating steps of a method according to the second embodiment of the invention for encoding mode information representing a prediction mode;

Figure 12 is a flow chart illustrating steps of a method according to a third embodiment of the invention for encoding mode information representing a prediction mode;

Figure 13 schematically illustrates numbering of prediction modes according to the third embodiment of the invention; and

Figure 14 is a flow chart illustrating steps of a method according to a further
embodiment of the invention for decoding mode information representing a prediction mode.

Figure 6 illustrates a data communication system in which one or more embodiments of the invention may be implemented. The data communication system comprises a sending device, in this case a server 601, which is operable to transmit data packets of a data stream to a receiving device, in this case a client terminal 602, via a data communication network 600. The data communication network 600 may be a Wide Area Network (WAN) or a Local Area Network (LAN). Such a network may for example be a wireless network (WiFi / 802.11a, b or g), an Ethernet network, an Internet network or a mixed network composed of several different networks. In a particular embodiment of the invention the data communication system may be, for example, a digital television broadcast system in which the server 601 sends the same data content to multiple clients.
The data stream 604 provided by the server 601 may be composed of multimedia data representing video and audio data. Audio and video data streams may, in some embodiments, be captured by the server 601 using a microphone and a camera respectively. In some embodiments data streams may be stored on the server 601 or received by the server 601 from another data provider. The video and audio streams are coded by an encoder of the server 601 in particular for them to be compressed for transmission.
In order to obtain a better ratio of the quality of transmitted data to quantity of transmitted data, the compression of the video data may be of motion compensation type, for example in accordance with the HEVC format or H.264/AVC format.
A decoder of the client 602 decodes the data stream received via the network 600. The reconstructed images may be displayed on a display device and received audio data may be reproduced by a loudspeaker.
Figure 7 schematically illustrates a processing device 700 configured to implement at least one embodiment of the present invention. The processing device 700 may be a device such as a micro-computer, a workstation or a light portable device such as a smart phone or a portable computer. The device 700 comprises a communication bus 713 connected to:
- a central processing unit 711, such as a microprocessor, denoted CPU;
- a read only memory 707, denoted ROM, for storing computer programs for implementing embodiments of the invention;
- a random access memory 712, denoted RAM, which may be used for storing the executable code of the method of embodiments of the invention as well as the registers adapted to record variables and parameters necessary for implementing the method of encoding a sequence of digital images and/or the method of decoding a bitstream according to embodiments of the invention; and
- a communication interface 702 connected to a communication network 703 over which data to be processed is transmitted or received.

Optionally, the apparatus 700 may also include the following components:
- a data storage means 704, such as a hard disk, for storing computer programs for implementing methods of one or more embodiments of the invention and data used or produced during the implementation of one or more embodiments of the invention;
- a disk drive 705 for a disk 706, the disk drive being adapted to read data from the disk 706 or to write data onto said disk;
- a screen 709 for displaying data and/or serving as a graphical interface with the user, by means of a keyboard 710 or any other pointing means.
The apparatus 700 can be connected to various peripherals, such as for example a digital camera 720 or a microphone 708, each being connected to an input/output card (not shown) so as to supply multimedia data to the apparatus 700.
The communication bus 713 provides communication and interoperability between the various elements included in the apparatus 700 or connected to it. The representation of the communication bus is not limiting and in particular the central processing unit is operable to communicate instructions to any element of the apparatus 700 directly or by means of another element of the apparatus 700.
The disk 706 can be replaced by any information medium such as for example a compact disk (CD-ROM), rewritable or not, a ZIP disk or a memory card and, in general terms, by an information storage means that can be read by a microcomputer or by a microprocessor, integrated or not into the apparatus, possibly removable and adapted to store one or more programs whose execution enables the method of encoding a sequence of digital images and/or the method of decoding a bitstream according to the invention to be implemented.
The executable code may be stored either in read only memory 707, on the hard disk 704 or on a removable digital medium such as for example a disk 706 as described previously. Moreover in some embodiments, the executable code of the programs can be received by means of the communication network 703, via the interface 702, in order to be stored in one of the storage means of the apparatus 700 before being executed, such as the hard disk 704.
The central processing unit 711 is adapted to control and direct the execution of the instructions or portions of software code of the program or programs according to the invention, instructions that are stored in one of the aforementioned storage means. On powering up, the program or programs that are stored in a non-volatile memory, for example on the hard disk 704 or in the read only memory 707, are transferred into the random access memory 712, which then contains the executable code of the program or programs, as well as registers for storing the variables and parameters necessary for implementing the invention.
In this embodiment, the apparatus is a programmable apparatus which uses software to implement the invention. However, alternatively, the present invention may be implemented in hardware (for example, in the form of an Application Specific Integrated Circuit or ASIC).
Figure 8 illustrates a block diagram of an encoder according to at least one embodiment of the invention. The encoder is represented by connected modules, each module being adapted to implement, for example in the form of programming instructions to be executed by the CPU 711 of device 700, at least one corresponding step of a method implementing at least one embodiment of encoding an image of a sequence of images according to one or more embodiments of the invention.
An original sequence of digital images i0 to in 801 is received as an input by the encoder 80. Each digital image is represented by a set of samples, known as pixels. A bitstream 810 is output by the encoder 80 after implementation of the encoding process.
The bitstream 810 comprises a plurality of encoding units or slices, each slice comprising a slice header for transmitting encoding values of encoding parameters used to encode the slice, such as prediction mode information, and a slice body, comprising encoded video data.
The input digital images i0 to in 801 are divided into blocks of pixels by module 802. The blocks correspond to image portions and may be of variable sizes (e.g. 4x4, 8x8, 16x16, 32x32 pixels). A coding mode is selected for each input block or coding unit. Two families of coding modes are provided: coding modes based on spatial prediction coding (Intra prediction), and coding modes based on temporal prediction (Inter coding, Merge, SKIP). The possible coding modes are tested.
Module 803 implements Intra prediction, in which a given block to be encoded is predicted by a predictor computed from pixels of the neighbourhood of said block to be encoded. An indication of the selected Intra predictor and the difference between the given block and its predictor is encoded to provide a residual if the Intra coding is selected.
Temporal prediction is implemented by motion estimation module 804 and motion compensation module 805. Firstly a reference image from among a set of reference images 816 is selected, and a portion of the reference image, also called reference area or image portion, which is the closest area to the given block to be encoded, is selected by the motion estimation module 804.
Motion compensation module 805 then predicts the block to be encoded using the selected area. The difference between the selected reference area and the given block, also called a residual block, is computed by the motion compensation module 805. The selected reference area is indicated by a motion vector.
Thus in both cases (spatial and temporal prediction), a residual is computed by subtracting the prediction from the original block.
In the INTRA prediction implemented by module 803, a prediction direction is encoded. In the temporal prediction, at least one motion vector is encoded. Information relative to the motion vector and the residual block is encoded if the inter prediction is selected. The encoding of mode information representing a prediction mode will be explained in more detail hereafter with reference to any one of Figures 10 to 14.
To further reduce the bitrate, assuming that motion is homogeneous, the motion vector is encoded by difference with respect to a motion vector predictor.
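A minimal sketch of this differential coding and its inverse (the vector representation as coordinate pairs is an illustrative assumption):

```python
def encode_mv(mv, predictor):
    """Encode a motion vector as its difference from a predictor.
    Assuming homogeneous motion, the residual components are small and
    therefore cheap to entropy-code."""
    return (mv[0] - predictor[0], mv[1] - predictor[1])

def decode_mv(mvd, predictor):
    # The decoder mirrors the encoder: the same predictor is added back
    # to the transmitted residual to recover the motion vector.
    return (mvd[0] + predictor[0], mvd[1] + predictor[1])
```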
Motion vector predictors of a set of motion information predictors are obtained from the motion vectors field 818 by a motion vector prediction and coding module 817.
The encoder 80 further comprises a selection module 806 for selection of the coding mode by applying an encoding cost criterion, such as a rate-distortion criterion.
In order to further reduce redundancies, a transform is applied by transform module 807 to the residual block; the transformed data obtained is then quantized by quantization module 808 and entropy encoded by entropy encoding module 809. Finally, the encoded residual block of the current block being encoded is inserted into the bitstream 810, along with the information relative to the predictor used, such as the index of the selected motion vector predictor. For the blocks encoded in 'SKIP' mode, only an index to the predictor is encoded in the bitstream, without any residual block.
The encoder 80 also performs decoding of the encoded image in order to produce a reference image for the motion estimation of the subsequent images. This enables the encoder and the decoder receiving the bitstream to have the same reference frames. The inverse quantization module 811 performs inverse quantization of the quantized data, followed by an inverse transform by reverse transform module 812. The reverse intra prediction module 813 uses the prediction information to determine which predictor to use for a given block and the reverse motion compensation module 814 actually adds the residual obtained by module 812 to the reference area obtained from the set of reference images 816. Optionally, a deblocking filter 815 is applied to remove the blocking effects and enhance the visual quality of the decoded image. The same deblocking filter is applied at the decoder, so that, if there is no transmission loss, the encoder and the decoder apply the same processing.
Figure 9 illustrates a block diagram of a decoder according to at least one embodiment of the invention. The decoder is represented by connected modules, each module being adapted to implement, for example in the form of programming instructions to be executed by the CPU 711 of device 700, a corresponding step of a decoding method.
The decoder 90 receives a bitstream 901 comprising encoding units, each one being composed of a header containing information on encoding parameters and a body containing the encoded video data. As explained with respect to Figure 8, the encoded video data is entropy encoded, and the motion vector predictors' indexes are encoded, for a given block, on a predefined number of bits. The received encoded video data is entropy decoded by module 902. The residual data are then dequantized by module 903 and then a reverse transform is applied by module 904 to obtain pixel values.
The mode data indicating the coding mode are also entropy decoded and based on the mode, an INTRA type decoding or an INTER type decoding is performed on the encoded blocks of image data.
In the case of INTRA mode, an INTRA predictor is determined by intra reverse prediction module 905 based on the intra prediction mode specified in the bitstream.
If the mode is INTER, the motion prediction information is extracted from the bitstream so as to find the reference area used by the encoder. The motion prediction information is composed of the reference frame index and the motion vector residual. The motion vector predictor is added to the motion vector residual in order to obtain the motion vector by motion vector decoding module 910.
Motion vector decoding module 910 applies motion vector decoding for each current block encoded by motion prediction. Once an index of the motion vector predictor, for the current block has been obtained the actual value of the motion vector associated with the current block can be decoded and used to apply reverse motion compensation by module 906. The reference image portion indicated by the decoded motion vector is extracted from a reference image 908 to apply the reverse motion compensation 906. The motion vector field data 911 is updated with the decoded motion vector in order to be used for the inverse prediction of subsequent decoded motion vectors.
Finally, a decoded block is obtained. A deblocking filter 907 is applied, similarly to the deblocking filter 815 applied at the encoder. A decoded video signal 909 is finally provided by the decoder 90.
Figure 10 is a flow chart illustrating steps of a method according to a first embodiment of the invention for encoding mode information representing a prediction mode for encoding a current coding unit with respect to reference coding units by an intra coding process.
In an initial step S1001 the Intra prediction modes of the neighbouring Top and Left CUs of the current CU to be encoded are identified. In step S1002, two reference prediction mode values, referred to as 'Most Probable Modes' (MPMs), are derived from the identified intra prediction modes. In step S1003 the prediction mode value of the current coding unit is then compared to the two MPMs. If the prediction mode value of the current coding unit is equal to either of the MPMs then in step S1005 a first coding process (process 1) is applied.
This first coding process involves coding a flag signaling that the mode of the current block is equal to one of the MPMs, and then, coding the index of the MPM concerned.
If in step S1003 it is determined that the prediction mode value of the current coding block is not equal to one of the two MPMs, then in step S1004 the prediction mode value of the current coding block is compared with a predefined prediction mode value in order to determine whether or not the prediction mode value is equal to the predefined prediction mode value. If it is determined that the prediction mode value of the current coding block is equal to the predefined prediction mode value then in step S1006 a second coding process (process 2) is applied.
The second coding process involves coding a flag signaling that the prediction mode of the current block is equal to the predefined prediction mode value.
Otherwise, if it is determined in step S1004 that the prediction mode value of the current coding block is not equal to the predefined prediction mode value then in step S1007 a third coding process (process 3) is applied.
The third coding process involves coding the actual prediction mode value of the current block.
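The three-way selection of Figure 10 can be sketched as follows. This is a minimal illustration, not actual HEVC bitstream syntax: the function name, the returned tuples and the concrete mode numbers are assumptions.

```python
def select_coding_process(current_mode, mpm0, mpm1, predefined_mode):
    """Choose among the three coding processes of Figure 10.

    Returns a tuple describing what would be written to the bitstream
    (a stand-in for real flag/index entropy coding).
    """
    if current_mode in (mpm0, mpm1):
        # Process 1: flag signalling an MPM match, then the MPM index.
        mpm_index = 0 if current_mode == mpm0 else 1
        return ("process1", mpm_index)
    if current_mode == predefined_mode:
        # Process 2: a single flag signalling the predefined mode.
        return ("process2",)
    # Process 3: code the actual prediction mode value.
    return ("process3", current_mode)
```

Note how the most compact signalling (process 2) is reserved for the predefined mode, consistent with choosing a statistically likely mode such as planar.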
Thus, in the case where the prediction mode value of the current coding block is equal to the predefined prediction mode, only a short flag code needs to be transmitted to the decoder to inform the decoder of the prediction mode of the current coding block, thereby leading to a reduction in transmission and processing overhead. Once the decoder has decoded the codeword indicative of the prediction mode, it is directly informed of the mode value without any additional processing. Decoding processing is therefore simplified.
In order to use fewer bits for signalling the predefined mode a coding flag may be set as 0 to indicate that the prediction mode corresponds to the predefined prediction mode.
The predefined prediction mode may be selected such that it corresponds to a prediction mode which is statistically more likely to occur. For example, the predefined prediction mode may correspond to a planar prediction mode.
In alternative embodiments, the predefined prediction mode value may be set to a mode value corresponding to a horizontal prediction mode, a vertical prediction mode or a DC prediction mode.
In some embodiments the predefined prediction mode may be set according to the context of the application. For example, the predefined prediction mode value may be dependent upon the content of the image being encoded. In some particular embodiments the predefined prediction mode value may be adaptively derived based on mode probabilities representative of the probability of occurrence of respective prediction modes, said mode probabilities being regularly computed.
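Such adaptive derivation could be sketched as follows. The adaptation rule below, picking the most frequently occurring recent mode, is one plausible choice for the "mode probabilities" mentioned above, not a rule mandated by the text; the default value is likewise an assumption.

```python
from collections import Counter

def adapt_predefined_mode(recent_modes, default_mode=33):
    """Hypothetical adaptation rule: the predefined prediction mode is
    the most frequent recently observed mode, falling back to a default
    (e.g. a planar mode value) when no statistics are available yet."""
    if not recent_modes:
        return default_mode
    return Counter(recent_modes).most_common(1)[0][0]
```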
The predefined prediction mode value can be signalled in the bitstream, at image level, or at image portion level.
In the case of a decoder, such as that of Figure 9, receiving the codeword resulting from the method of Figure 10, the codeword being indicative of the prediction mode of the current image portion, the following steps are implemented.
In an initial step the decoder receives the codeword related to the prediction mode of the current image portion and then determines, based on the codeword, the prediction mode from among the potential prediction modes for decoding the current image portion.
In the case where the prediction mode value of the current coding unit is different from reference prediction mode values MPMO and MPM1 derived from prediction modes of the neighbouring top and left coding units and is equal to the predefined prediction mode value, the codeword is a flag indicative that the prediction mode is the predefined prediction mode (process 2 of Figure 10).
The decoder in this case decodes the current coding unit using the predefined prediction mode, e.g. a planar prediction mode.
Otherwise, in the case where the prediction mode value of the current coding unit is different from the reference prediction mode values MPMO and MPM1 derived from prediction modes of the neighbouring top and left coding units and is different from the predefined prediction mode value, the codeword is a coded value derived from the prediction mode value of the current coding unit and from the reference prediction mode values (process 3 of Figure 10), and the decoding step comprises decoding the current coding unit using the prediction mode represented by the prediction mode value.
In the case where the codeword indicates that the prediction mode value of the current coding unit is equal to the first reference prediction mode value MPMO (process 1 of Figure 10), the decoder decodes the current coding unit using the prediction mode associated with MPMO.
In the case where the codeword indicates that the prediction mode value of the current coding unit is equal to the second reference prediction mode value MPM1 (process 1 of Figure 10), the decoder decodes the current coding unit using the prediction mode associated with MPM1.
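The decoder-side selection just described can be sketched as the mirror of the encoder's choice. The codeword representation here is hypothetical; a real decoder would parse flags and indexes from the entropy-coded bitstream.

```python
def decoder_select(codeword_kind, payload, mpm0, mpm1, predefined_mode):
    """Map a received (hypothetical) codeword to a prediction mode,
    following processes 1-3 of Figure 10 at the decoder side."""
    if codeword_kind == "process1":
        # Payload is the MPM index (0 or 1).
        return mpm0 if payload == 0 else mpm1
    if codeword_kind == "process2":
        # Flag only: the predefined mode, e.g. planar.
        return predefined_mode
    # Process 3: payload carries the explicit prediction mode value.
    return payload
```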
Figure 11 is a flow chart illustrating steps of a method according to a second embodiment of the invention for encoding mode information representing a prediction mode for encoding a current coding unit with respect to reference coding units by an intra coding process.
In an initial step S1101 a first reference prediction mode value, referred to as MPMO, is derived from the intra prediction mode of a single neighbouring coding unit of the current coding unit to be encoded. In step S1102 a second reference prediction mode value, referred to as MPM1, is derived from intra prediction modes of the top and left neighbouring coding units of the current coding unit to be encoded.
In a particular embodiment, the left coding unit is used as the single neighbouring coding unit for the derivation of MPMO, because in practical hardware or software implementations it is generally more efficient, in terms of access bandwidth, to access left data than top data.
In step S1103 the prediction mode value of the current coding unit is then compared to the two reference prediction mode values MPMO and MPM1. If the prediction mode value of the current coding unit is equal to either of the MPMs then in step S1104 a first coding process (process 1) is applied.
This first coding process involves coding a flag signaling that the mode of the current block is equal to one of the MPMs, and then, coding the index of the MPM concerned.
If in step S1103 it is determined that the prediction mode of the current block is not equal to one of the two MPMs, then in step S1105 a second coding process (process 2) is applied.
The second coding process involves coding the actual prediction mode value of the current block.
Figure 11B is a flow chart illustrating steps of another possible implementation of the method according to the second embodiment of the invention for encoding mode information representing a prediction mode for encoding a current coding unit with respect to reference coding units by an intra coding process.
In an initial step S1501 a first reference prediction mode value, referred to as MPMO, is derived from the intra prediction mode of a single neighbouring coding unit of the current coding unit to be encoded. In step S1502 the prediction mode value of the current coding unit is then compared to the first reference prediction mode value MPMO. If the prediction mode value of the current coding unit is equal to MPMO then in step S1503 a first coding process (process 1a) is applied.
If in step S1502 it is determined that the prediction mode of the current block is not equal to MPMO, then in step S1504 a second reference prediction mode value, referred to as MPM1, is derived from intra prediction modes of the top and left neighbouring coding units of the current coding unit to be encoded.
In step S1505 the prediction mode value of the current coding unit is then compared to the second reference prediction mode value MPM1. If the prediction mode value of the current coding unit is equal to MPM1 then in step S1506 a second coding process (process 1b) is applied.
If in step S1505 it is determined that the prediction mode of the current block is not equal to MPM1, then a third coding process (process 2) is applied.
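The early-exit flow of Figure 11B can be sketched as follows. The MPM derivation rules used here follow the summary given in the paragraphs below; the concrete mode numbers (planar = 33, DC = 34, vertical = 8) are assumed for illustration only.

```python
PLANAR, DC, VERTICAL = 33, 34, 8  # hypothetical mode numbering

def select_coding_process_11b(current_mode, left_mode, top_mode):
    """Figure 11B: MPM1 is derived only when MPM0 does not match, so the
    top neighbour is never accessed on the most frequent path."""
    # S1501: MPM0 from the left CU only.
    mpm0 = DC if left_mode == PLANAR else left_mode
    if current_mode == mpm0:
        return "process1a"                                  # S1503
    # S1504: MPM1 from both neighbours, only reached on mismatch.
    if top_mode == PLANAR or top_mode == left_mode:
        # Adjacent superior angular direction, or vertical fallback.
        mpm1 = left_mode + 1 if left_mode <= 32 else VERTICAL
    else:
        mpm1 = top_mode
    if current_mode == mpm1:
        return "process1b"                                  # S1506
    return "process2"                                       # code actual mode
```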
A method for deriving the first and second reference prediction mode values MPMO and MPM1 according to an embodiment of the invention will now be described.
In determining the first reference prediction mode value MPMO, in the case where the neighbouring left coding unit of the current coding unit has a fifth predefined mode value, MPMO is set to a prediction mode value corresponding to a sixth predefined mode value. Otherwise MPMO is set to the prediction mode value of the neighbouring left coding unit.
In an embodiment the fifth predefined mode value is planar prediction mode and the sixth predefined mode value is DC prediction mode.
In determining the second reference prediction mode value MPM1, in the case where the neighbouring top coding unit of the current coding unit has the fifth predefined mode value, or the prediction mode value of the neighbouring top coding unit is equal to that of the neighbouring left coding unit, the second reference prediction mode value MPM1 is set to a prediction mode value corresponding to an angular direction adjacent to the angular direction of the left neighbouring image portion; otherwise the second reference prediction mode value MPM1 is set to the prediction mode value of the top neighbouring image portion.
In one particular embodiment the angular direction adjacent to the angular direction of the left neighbouring image portion is the superior angular direction of the left neighbouring image portion.
In the case where the prediction mode value of the left neighbouring image portion corresponds to a non directional prediction mode then the second reference prediction mode value MPM1 is set to a prediction mode value corresponding to a seventh predefined mode value.
In one particular embodiment, the seventh predefined mode value is set to vertical prediction mode.
In one particular embodiment, in the case where the prediction mode value of the top neighbouring image portion corresponds to the fifth predefined mode value or to the prediction mode value of the left neighbouring image portion of the current image portion, then the second reference prediction mode value MPM1 is set to an eighth predefined prediction mode value, otherwise the second reference prediction mode value MPM1 is set to the prediction mode value of the top neighbouring image portion. For example the eighth predefined prediction mode value may correspond to a DC prediction mode.
The method is summarised as follows:
- For MPMO (no access to the top CU):
  - if (left == Planar), MPMO = DC
  - else, MPMO = left
- For MPM1:
  - if (top == Planar or top == left), MPM1 = left++
  - else, MPM1 = top
In one embodiment left++ is the nearest authorized superior angular direction to the direction of left. If left is not an angular mode (that is, DC mode), left++ is set to vertical mode.
In another embodiment, a given predefined mode is used instead of the left++ mode (for instance DC mode).
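The derivation rules summarised above can be sketched directly. The concrete mode numbers (planar = 33, DC = 34, vertical = 8) are assumptions for illustration, with angular modes occupying indexes 0 to 32 as in the later angular-index embodiment.

```python
PLANAR, DC, VERTICAL = 33, 34, 8  # hypothetical mode numbering

def derive_mpms(left_mode, top_mode):
    """Derive MPM0 from the left CU only, and MPM1 from both
    neighbours, per the summary above (left++ = left + 1)."""
    # MPM0: no access to the top CU is needed.
    mpm0 = DC if left_mode == PLANAR else left_mode

    # MPM1: fall back to an adjacent angular direction on conflict.
    if top_mode == PLANAR or top_mode == left_mode:
        if left_mode <= 32:        # left is an angular mode
            mpm1 = left_mode + 1   # nearest superior angular direction
        else:                      # left is non-directional (e.g. DC)
            mpm1 = VERTICAL
    else:
        mpm1 = top_mode
    return mpm0, mpm1
```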
While in this embodiment the mode prediction value of the left neighbouring coding unit is used to determine the first reference prediction mode value it will be appreciated that in alternative embodiments another coding unit may be used such as the neighbouring top coding unit.
Since the mode derivation process accesses only one neighbouring coding unit to derive the first reference prediction mode value, the coding and corresponding decoding processes are simplified.
In the case of a decoder, such as that of Figure 9, receiving the codeword resulting from the method of Figure 11, the codeword being indicative of the prediction mode of the current image portion, the following steps are implemented.
In an initial step the decoder receives the codeword related to the prediction mode of the current image portion and then determines, based on the codeword, the prediction mode from among the potential prediction modes for decoding the current image portion.
The decoder is configured to determine based on the codeword, whether the prediction mode value of the current coding unit corresponds to reference prediction mode values MPMO or MPM1.
In the case where the codeword indicates that the prediction mode value of the current coding unit is equal to the first reference prediction mode value MPMO (process 1 of Figure 11), the decoder decodes the current coding unit using the prediction mode associated with MPMO. In this case only the derivation process of MPMO needs to be applied, which means access to only one single neighboring image portion.
In the case where the codeword indicates that the prediction mode value of the current coding unit is equal to the second reference prediction mode value MPM1 (process 1 of Figure 11), the decoder decodes the current coding unit using the prediction mode associated with MPM1.
Otherwise the decoder determines that the codeword is coded data representative of the prediction mode value of the current coding unit (process 2 of Figure 11) and decodes the coding unit using the corresponding prediction mode.
Figure 12 is a flow chart illustrating steps of a method according to a third embodiment of the invention for encoding mode information representing a prediction mode for encoding a current coding unit with respect to reference coding units by an intra coding process.
In an initial step S1201 prediction mode information is represented by angular indexes representative of the angular direction of the corresponding prediction mode. In step S1202 the angular indexes of the prediction modes of the neighboring Top and Left CUs of the current CU to be encoded are identified. In step S1203, two reference angular index values, referred to as 'Most Probable Modes' (MPMs), are derived from the identified angular indexes.
In step S1204 the angular index of the current coding unit is then compared to the two reference angular indexes. If the angular index of the current coding unit is equal to either of the reference angular indexes then in step S1205 a first coding process (process 1) is applied.
This first coding process involves coding a flag signaling that the mode of the current block is equal to one of the MPMs, and then, coding the index of the MPM concerned.
Otherwise, if it is determined in step S1204 that the angular index of the current coding unit is not equal to either of the reference angular indexes, then in step S1206 a second coding process (process 2) is applied.
The second coding process involves coding the angular index of the prediction mode of the current block.
Replacing the current prediction mode numbering by a direct angular index numbering enables a look-up table associating prediction mode values with angular indexes to be omitted.
As illustrated in Figure 13, in this embodiment of the invention the prediction modes are represented as angular indexes ordered in the increasing order.
In Figure 13, the prediction modes represented by the shaded boxes correspond to prediction modes that are only supported for coding units which are larger than 4x4 pixels. Accordingly, it can be noted that angular modes supported by coding units of size 4x4 pixels are represented by even numbered angular index values.
Non-directional prediction modes, namely the planar prediction mode and the DC prediction mode, are attributed values 33 and 34 respectively, greater than the highest angular index of the directional modes. The DC prediction mode is attributed an even numbered value. It may be noted that DC and planar modes are also supported by coding units of size 4x4 pixels.
As a consequence, in this particular embodiment of the invention:
- Modes 0 to 32 correspond to angular indexes 0 to 32
- Mode 33 corresponds to the PLANAR prediction mode
- Mode 34 corresponds to the DC prediction mode
Therefore, with the exception of the planar mode, all modes supported by 4x4 pixel sized CUs have even numbered angular index values. When the planar mode is considered as a preferred mode, as in the first embodiment of the invention, this enables further modifications and simplifications for the 4x4 CU case, as will be described in what follows.
Since 4x4 sized CUs only support prediction modes represented by even numbered values (except in the case of the planar prediction mode), the mapping process, invoked to map the mode of a neighbouring coding unit to the authorized modes of 4x4 coding units, can thus be improved as follows.
The new mapping process for attributing an angular index value includes the following operation: mapped_mode = (neighbor_mode >> 1) << 1, which means that the closest even numbered angular index to the angular index of the neighbouring coding unit is used in the case where the prediction mode of the neighbouring coding unit is not supported by the current coding unit.
Moreover, since 4x4 sized CUs only support prediction modes represented by even numbered values (except in the case of the planar prediction mode), in the case where process 2 applies and the angular index is coded, the angular index value is divided by 2 prior to being encoded as the remaining mode. At the decoder side, the decoded mode conversely has to be multiplied by 2 to obtain the actual 4x4 prediction mode.
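Both 4x4-specific operations, the even-index mapping and the halving of the coded remaining mode, are simple bit manipulations. The following sketch is illustrative only; the function names are not from the text.

```python
def map_to_4x4_mode(neighbor_mode):
    # (x >> 1) << 1 clears the least-significant bit, so an odd
    # angular index maps to the closest even index below it.
    return (neighbor_mode >> 1) << 1

def encode_remaining_4x4(angular_index):
    # Even-only indexes make the low bit redundant: halve before coding,
    # so the remaining mode fits in 4 bits instead of 5.
    return angular_index >> 1

def decode_remaining_4x4(coded_value):
    # Inverse step at the decoder: multiply the decoded value by 2.
    return coded_value << 1
```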
The use of angular indexes in place of the previous mode numbering provides a number of advantages.
Simplification, by virtue of:
* removal of the need to use look-up tables associating mode numbers with angle indexes;
* simple derivation of mode 'left++', which simply involves incrementing by one the value of the left mode. In previous designs it was required first to access the angle index corresponding to the left mode, then to increase this angle index, then to identify which mode the incremented index corresponded to.
Coding efficiency, by virtue of:
* an improved, straightforward mapping to 4x4 modes, more logical than the HEVC mapping to the planar mode.
In the case of a decoder, such as that of Figure 9, receiving the codeword resulting from the method of Figure 12, the codeword being indicative of the prediction mode of the current image portion, the following steps are implemented.
In an initial step the decoder receives the codeword related to the prediction mode of the current image portion and then determines, based on the codeword, the prediction mode from among the potential prediction modes for decoding the current image portion.
The decoder is configured to determine based on the codeword, whether the prediction mode value of the current coding unit corresponds to reference prediction mode values MPMO or MPM1.
In the case where the codeword indicates that the angular index of the prediction mode of the current coding unit is equal to the first reference prediction mode value MPMO (process 1 of Figure 12), the decoder decodes the current coding unit using the prediction mode associated with the angular index of MPMO. MPMO is built by only accessing the left coding unit.
In the case where the codeword indicates that the angular index of the prediction mode of the current coding unit is equal to the second reference prediction mode value MPM1 (process 1 of Figure 12), the decoder decodes the current coding unit using the prediction mode associated with the angular index of MPM1.
Otherwise the decoder determines that the codeword is coded data representative of the angular index of the prediction mode of the current coding unit (process 2 of Figure 12) and decodes the coding unit using the corresponding prediction mode.
Figure 14 is a flow chart illustrating steps of a method according to a fourth embodiment of the invention for decoding mode information representing a prediction mode for encoding a current coding unit with respect to reference coding units by an intra coding process. This embodiment combines features of the previous embodiments.
In step S1401 a first syntax element "MPM_flag" is decoded. If this first syntax element indicates that the prediction mode of the current coding unit to be decoded is equal to one of the reference angular indexes, a second syntax element "mpm_idx" is decoded in step S1402. If this second syntax element indicates that the prediction mode of the current coding unit to be decoded is equal to the first reference angular index, this first reference angular index MPMO is derived in step S1407 from the angular index of the prediction mode of a neighbouring left coding unit of the current coding unit and the mode of the current coding unit to be decoded is set equal to MPMO in step S1407.
Otherwise, if the second syntax element "mpm_idx" indicates that the prediction mode of the current coding unit to be decoded is equal to the second reference angular index, this second reference angular index MPM1 is derived in step S1408 from angular indexes of neighbouring top and left coding units of the current coding unit and the mode of the current coding unit to be decoded is set equal to MPM1 in step S1408.
Otherwise, if the first syntax element "MPM_flag" indicates that the prediction mode of the current coding unit to be decoded is not equal to one of the reference angular indexes, a third syntax element "remaining_flag" is decoded in step S1403. If this third syntax element indicates that the prediction mode of the current coding unit to be decoded is equal to planar mode, the mode of the current coding unit to be decoded is set equal to planar mode in step S1405. Otherwise, if the third syntax element "remaining_flag" indicates that the prediction mode of the current coding unit to be decoded is not equal to planar mode, a fourth syntax element "remaining_mode" is decoded in step S1404. In a particular embodiment, in the case of 16 possible remaining mode values for 4x4 CUs the codeword is composed of 4 bits, while for other sized CUs the codeword is composed of 5 bits since 32 remaining mode values are possible. In step S1406, the final mode value has to be derived from the decoded remaining mode value. Step S1406 involves first deriving the two MPM values. Then the remaining mode value is incremented if it is larger than the MPM values. The mode of the current coding unit to be decoded is set equal to the resulting remaining mode value.
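The parse order of Figure 14 can be sketched as follows. The syntax-element names follow the figure, but the dictionary representation of already-parsed elements and the ">=" convention for re-inserting the skipped MPM values are assumptions about details the text leaves open.

```python
PLANAR = 33  # hypothetical numbering from the angular-index scheme

def decode_mode(syntax, mpm0, mpm1):
    """Sketch of the Figure 14 decoding order. `syntax` holds the
    already-parsed syntax elements keyed by the figure's names."""
    if syntax["MPM_flag"]:
        # S1407/S1408: mode equals one of the two reference values.
        return mpm0 if syntax["mpm_idx"] == 0 else mpm1
    if syntax["remaining_flag"]:
        # S1405: the predefined (planar) mode.
        return PLANAR
    # S1404/S1406: remaining mode, shifted past the two MPM values
    # that the encoder skipped (assumed ">=" convention).
    mode = syntax["remaining_mode"]
    for mpm in sorted((mpm0, mpm1)):
        if mode >= mpm:
            mode += 1
    return mode
```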
In further embodiments of the method illustrated in Figure 14 the mode information may be represented by a mode prediction value instead of angular index. In other embodiments of the method of Figure 14 the first reference MPMO may be derived from the prediction modes of neighbouring top and left coding units instead of a single neighbouring coding unit, or from a single top neighbouring coding unit.
Embodiments of the invention thus provide ways of reducing the computational complexity for the encoding and decoding of a prediction mode in an HEVC encoder.
Although the present invention has been described hereinabove with reference to specific embodiments, the present invention is not limited to the specific embodiments, and modifications will be apparent to a skilled person in the art which lie within the scope of the present invention. In particular the different features from different embodiments may be interchanged or combined, where appropriate.
For example, in the embodiment of Figure 10 angular indexes may be used to represent the prediction modes and/or the first MPM may be derived using a single coding unit. Similarly, in the embodiments of Figure 11A or 11B angular indexes may be used to represent the prediction modes and/or an extra step may be added comparing the prediction mode of the current coding unit with a predefined prediction mode in the case where the prediction mode of the current coding unit does not correspond to the first or second MPM.
Many further modifications and variations will suggest themselves to those versed in the art upon making reference to the foregoing illustrative embodiments, which are given by way of example only and which are not intended to limit the scope of the invention, that being determined solely by the appended claims.
In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. The mere fact that different features are recited in mutually different dependent claims does not indicate that a combination of these features cannot be advantageously used.
Claims (1)
- <claim-text>CLAIMS1. A method of encoding mode information representing a prediction mode for encoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes each prediction mode being represented by a prediction mode value, the method comprising: comparing a prediction mode value of the current image portion to be encoded with reference prediction mode values, each derived from one or more prediction modes of respective image portions in order to determine an encoding process, from among a plurality of encoding processes, to encode the mode information; wherein if the prediction mode value of the current image portion is different from the reference prediction mode values, the prediction mode value of the current image portion is compared with a further predefined prediction mode value, the method further comprising selecting, based on the further comparison, an encoding process for encoding the mode information.</claim-text> <claim-text>2. A method according to claim 1, wherein if the prediction mode value of the current image portion is equal to the further predefined prediction mode value the selected encoding process comprises encoding information indicating a predefined relationship between the prediction mode value of the current image portion and the further predefined prediction mode value, otherwise if the prediction mode value of the current image portion is different from the further predefined prediction mode value the selected encoding process comprises encoding information representative of the prediction mode value of the current image portion. 3. A method according to claim 1 or 2, wherein the further predefined prediction mode value is set to a prediction mode value corresponding to a planar prediction mode. 4. 
A method according to claim 1 or 2, wherein the further predefined prediction mode value is set to a mode value corresponding to a horizontal prediction mode, a vertical prediction mode or a DC prediction mode.5. A method according to claim 1 or 2, wherein the further predefined prediction mode value is dependent upon the content of the image being encoded.6. A method according to claim 1 or 2, wherein the further predefined prediction mode value depends on mode probabilities representative of the probability of occurrence of respective prediction modes.7. A method according to claim 6, wherein the mode probabilities are regularly computed and the further predefined prediction mode value is adaptively derived based on said mode probabilities.8. A method according to any preceding claim wherein the reference prediction mode values comprise a first reference prediction mode value based on the prediction mode of a single reference image portion and a second reference prediction mode value based on the respective prediction modes of at least two reference image portions.9. A method according to claim 8 wherein the single reference image portion comprises a neighbouring portion such as the left neighbouring image portion of the current image portion.10. A method according to claim 8 or 9 wherein if the prediction mode value of the single reference image portion corresponds to the further predefined prediction mode value then the first reference prediction mode value is set to a second predefined prediction mode value, otherwise the first reference prediction mode value is set to the prediction mode value of the single reference image portion.11. A method according to any one of claims 8 to 10, wherein the two reference image portions comprise neighbouring image portions such as the left neighbouring image portion and the top neighbouring image portion of the current image portion.12. 
A method according to claim 11, wherein if the prediction mode value of the top neighbouring image portion corresponds to the further predefined prediction mode or to the prediction mode value of the left neighbouring image portion of the current image portion, then the second reference prediction mode value is set to a prediction mode value corresponding to an angular direction adjacent to the angular direction of the left neighbouring image portion, otherwise the second reference prediction mode value is set to the prediction mode value of the top neighbouring image portion.13. A method according to claim 12, wherein if the prediction mode value of the left neighbouring image portion corresponds to a non directional prediction mode then the second reference prediction mode value is set to a prediction mode value corresponding to a third predefined prediction mode such as a vertical prediction mode.14. A method according to claim 11, wherein if the prediction mode value of the top neighbouring image portion corresponds to the further predefined prediction mode value or to the prediction mode value of the left neighbouring image portion of the current image portion, then the second reference prediction mode value is set to a fourth predefined prediction mode value, otherwise the second reference prediction mode value is set to the prediction mode value of the top neighbouring image portion.15. A method according to claim 14, wherein the fourth predefined prediction mode value corresponds to a DC prediction mode.16. 
A method according to any preceding claim, wherein prediction mode values are each represented by an angular index representative of the angular direction of the corresponding prediction mode; the angular index of the current image portion is compared with reference angular indexes in order to determine the encoding process; and in the case where the angular index of the current image portion is different from the reference angular indexes, the determined encoding process comprises encoding the angular index value of the current image portion.17. A method according to claim 16 wherein angular indexes with even numbered values correspond to prediction modes supported by image portions of predetermined size and/or shape.18. A method according to claim 16 or 17, wherein non-directional prediction modes are attributed an angular index having a greater value than the angular index values of directional prediction modes.19. A method according to any one of claims 16 to 18 wherein a DC prediction mode has an even number angular index.20. A method according to any one of claims 16 to 19 wherein a planar prediction mode has an odd numbered angular index such as 33.21. A method according to any one of claims 16 to 20 wherein if the prediction mode of the reference image portion is not supported by the image portion to be encoded, the angular index of the current image portion is set to the lowest closest even number to the angular index value of the reference image portion.22. A method according to any one of claims 16 to 21 wherein the angular index is divided by an integer such as two, prior to encoding.23. 
A method of decoding mode information representing a prediction mode for decoding of a current image portion by an intra mode decoding process, the prediction mode being one of a plurality of prediction modes, the method comprising: receiving a codeword related to the prediction mode of the current image portion; determining, based on the codeword, the prediction mode from among the plurality of prediction modes for decoding the current image portion; decoding the current image portion using the determined prediction mode wherein in the case where the prediction mode value of the current image portion is different from reference prediction mode values derived from prediction modes of respective image portions and is equal to a further predefined prediction mode value, the codeword comprises a flag indicative that the prediction mode is the further predefined prediction mode and the decoding step comprises decoding the current image portion using the predefined prediction mode; otherwise in the case where the prediction mode value of the current image portion is different from reference prediction mode values derived from prediction modes of respective image portions and is different from the further predefined prediction mode value, the codeword comprises information representative of the prediction mode value of the current image portion and the decoding step comprises decoding the current image portion using the prediction mode represented by the prediction mode value.24. 
An encoder for encoding mode information representing a prediction mode for encoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes each prediction mode being represented by a prediction mode value, the encoder comprising: comparison means for comparing a prediction mode value of the current image portion to be encoded with reference prediction mode values, each derived from one or more prediction modes of respective image portions in order to determine an encoding process, from among a plurality of encoding processes, to encode the mode information; the comparison means being configured to compare the prediction mode value of the current image portion with a further predefined prediction mode value in the case where the prediction mode value of the current image portion is different from the reference prediction mode values; and selection means for selecting, based on the further comparison, an encoding process for encoding the mode information; and encoding means for encoding the mode information using the selected encoding process.25. 
A decoder for decoding mode information representing a prediction mode for decoding of a current image portion by an intra mode decoding process, the prediction mode being one of a plurality of prediction modes, the decoder comprising: reception means for receiving a codeword related to the prediction mode of the current image portion; determining means for determining, based on the codeword, the prediction mode from among the plurality of prediction modes for decoding the current image portion; decoding means for decoding the current image portion using the determined prediction mode wherein in the case where the prediction mode value of the current image portion is different from reference prediction mode values derived from prediction modes of respective image portions and is equal to a further predefined prediction mode value, the codeword comprises a flag indicative that the prediction mode is the further predefined prediction mode and the decoding means is configured to decode the current image portion using the predefined prediction mode; otherwise in the case where the prediction mode value of the current image portion is different from reference prediction mode values derived from prediction modes of respective image portions and is different from the further predefined prediction mode value, the codeword comprises information representative of the prediction mode value of the current image portion, and the decoding means is configured to decode the current image portion using the prediction mode represented by the prediction mode value.26. 
A method of encoding mode information representing a prediction mode for encoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes, each prediction mode being represented by a prediction mode value, the method comprising: deriving a first reference prediction mode value based on the prediction mode of a single reference image portion; and comparing the prediction mode value of the current image portion with at least the first reference prediction mode value in order to determine an encoding process, from among a plurality of encoding processes, to encode the mode information of the current image portion.
27. A method according to claim 26, wherein in the case where the prediction mode value of the current image portion is not equal to the first reference prediction mode value, the prediction mode value of the current image portion is compared with a second reference prediction mode value based on the respective prediction modes of at least two reference image portions.
28. A method according to claim 26 or 27, wherein the single reference image portion comprises a neighbouring image portion, such as the left neighbouring image portion of the current image portion.
29. A method according to any one of claims 26 to 28, wherein if the prediction mode value of the single reference image portion corresponds to the further predefined prediction mode value, then the first reference prediction mode value is set to a second predefined prediction mode value; otherwise the first reference prediction mode value is set to the prediction mode value of the single reference image portion.
30. A method according to claim 29, wherein the second predefined prediction mode value corresponds to a DC prediction mode.
31.
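The first-reference derivation of claims 26 to 30 (left neighbour, with a DC substitution when the neighbour uses the further predefined mode) can be sketched as follows; the mode numbers `DC_MODE` and `PREDEFINED_MODE` are assumed values for illustration only.

```python
# Sketch of the first-reference derivation in claims 26-30. Mode numbers
# are hypothetical stand-ins for the "second predefined" (DC, claim 30)
# and "further predefined" prediction mode values.
DC_MODE = 1
PREDEFINED_MODE = 34  # assumed value of the further predefined mode

def first_reference_mode(left_neighbour_mode: int) -> int:
    # Claim 29: when the single reference portion itself uses the further
    # predefined mode, substitute DC so that the predefined mode remains
    # separately signallable via its dedicated flag.
    if left_neighbour_mode == PREDEFINED_MODE:
        return DC_MODE
    return left_neighbour_mode
```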
A method according to any one of claims 26 to 30, wherein the two reference image portions comprise a first neighbouring image portion and a second neighbouring image portion of the current image portion.
32. A method according to claim 31, wherein the first neighbouring image portion is the left neighbouring image portion and the second neighbouring image portion is the top neighbouring image portion.
33. A method according to claim 31 or 32, wherein if the prediction mode value of the second neighbouring image portion corresponds to the further predefined prediction mode value or to the prediction mode value of the first neighbouring image portion of the current image portion, then the second reference prediction mode value is set to a prediction mode value corresponding to an angular direction adjacent to the angular direction of the first neighbouring image portion; otherwise the second reference prediction mode value is set to the prediction mode value of the second neighbouring image portion.
34. A method according to claim 31 or 32, wherein if the prediction mode value of the first neighbouring image portion corresponds to a non-directional prediction mode, then the second reference prediction mode value is set to a prediction mode value corresponding to a third predefined prediction mode, such as a vertical prediction mode.
35. A method according to claim 31 or 32, wherein if the prediction mode value of the second neighbouring image portion corresponds to the predefined prediction mode value or to the prediction mode value of the first neighbouring image portion of the current image portion, then the second reference prediction mode value is set to a fourth predefined prediction mode value; otherwise the second reference prediction mode value is set to the prediction mode value of the third neighbouring image portion.
36. A method according to claim 35, wherein the fourth predefined prediction mode value corresponds to a DC prediction mode.
37.
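The second-reference derivation of claims 31 to 36 resolves the case where the top neighbour duplicates the left one or uses the predefined mode. A minimal sketch of the DC-fallback variant of claims 35 and 36 follows; the default constants are assumed, not taken from the patent.

```python
# Sketch of the second-reference derivation (claims 31-36), using the
# fixed DC fallback of claims 35-36. The values 34 (assumed "predefined"
# mode) and 1 (assumed DC) are illustrative only.
def second_reference_mode(left_mode: int, top_mode: int,
                          predefined: int = 34, dc_fallback: int = 1) -> int:
    if top_mode == predefined or top_mode == left_mode:
        # The top neighbour adds no new information: substitute DC
        # (claim 33 instead substitutes an angular direction adjacent
        # to that of the left neighbour).
        return dc_fallback
    return top_mode
```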
A method according to any one of claims 26 to 36, wherein prediction mode values are represented by respective angular indexes, each representative of the angular direction of the corresponding prediction mode; the angular index of the current image portion is compared with first and second reference angular indexes to determine the encoding process; and in the case where the angular index of the current image portion is different from the reference angular indexes, the determined encoding process comprises encoding the angular index of the current image portion.
38. A method according to claim 37, wherein angular indexes with even-numbered values correspond to prediction modes supported by image portions of a predetermined size and/or shape.
39. A method according to claim 37 or 38, wherein non-directional prediction modes are attributed angular index values greater than the angular index values of directional prediction modes.
40. A method according to any one of claims 37 to 39, wherein a DC prediction mode has an even-numbered angular index.
41. A method according to any one of claims 35 to 38, wherein a planar prediction mode has an odd angular index, such as 33.
42. A method according to any one of claims 37 to 41, wherein if the prediction mode of the reference image portion is not supported by the image portion to be encoded, the angular index of the current image portion is set to the closest lower even number to the angular index of the reference image portion.
43. A method according to any one of claims 37 to 42, wherein the angular index is divided by an integer, such as two, prior to encoding.
44.
A method of decoding mode information representing a prediction mode for decoding of a current image portion by an intra mode decoding process, the prediction mode being one of a plurality of prediction modes, each prediction mode being represented by a prediction mode value, the method comprising: receiving a codeword related to the prediction mode of the current image portion; determining, based on the codeword, whether the prediction mode value of the current image portion corresponds to reference prediction mode values; wherein in the case where the codeword indicates that the prediction mode value of the current image portion is equal to a first reference prediction mode value, the prediction mode value of the current image portion is derived from the prediction mode value of a single reference image portion; and in the case where the codeword indicates that the prediction mode value of the current image portion is equal to a second reference prediction mode value, the prediction mode value of the current portion is derived from the prediction mode values of at least two reference image portions; otherwise it is determined that the codeword comprises data representative of the prediction mode value of the current image portion.
45. An encoder for encoding mode information representing a prediction mode for encoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes, each prediction mode being represented by a prediction mode value, the encoder comprising: means for deriving a first reference prediction mode value based on the prediction mode of a single reference image portion; and means for comparing the prediction mode value of the current image portion with at least the first reference prediction mode value in order to determine an encoding process, from among a plurality of encoding processes, to encode the mode information of the current image portion.
46.
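The decoder selection of claim 44 can be illustrated with a small dispatch on an assumed prefix code: one prefix hits the first reference value, another the second, and the remaining case carries an explicit mode value. The prefix bit patterns here are hypothetical, chosen only to make the three-way selection concrete.

```python
# Illustrative three-way selection for the claim-44 decoder. The prefix
# codes "1", "01", "00" are assumptions; the patent does not fix them.
def select_decoded_mode(prefix: str, payload: int,
                        first_ref: int, second_ref: int) -> int:
    if prefix == "1":
        # Mode equals the first reference value (from one neighbour).
        return first_ref
    if prefix == "01":
        # Mode equals the second reference value (from two neighbours).
        return second_ref
    # "00": the payload is the explicit prediction mode value.
    return payload
```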
An encoder according to claim 45, further comprising means for deriving a second reference prediction mode value based on the respective prediction modes of at least two reference image portions; wherein the means for comparing is configured to further compare the prediction mode value of the current image portion with the second reference prediction mode value.
47. A decoder for decoding mode information representing a prediction mode for decoding of a current image portion by an intra mode decoding process, the prediction mode being one of a plurality of prediction modes, each prediction mode being represented by a prediction mode value, the decoder comprising: reception means for receiving a codeword related to the prediction mode of the current image portion; determining means for determining, based on the codeword, whether the prediction mode value of the current image portion corresponds to reference prediction mode values; and deriving means for deriving the prediction mode value, wherein the deriving means is configured: to derive the prediction mode value of the current image portion from the prediction mode value of a single reference image portion, in the case where the codeword indicates that the prediction mode value of the current image portion is equal to a first reference prediction mode value; to derive the prediction mode value of the current portion from the prediction mode values of at least two reference image portions, in the case where the codeword indicates that the prediction mode value of the current image portion is equal to a second reference prediction mode value; and to derive the prediction mode value from the codeword in the case where it is determined that the codeword comprises data representative of the prediction mode value of the current image portion.
48.
A method of encoding mode information representing a prediction mode for encoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes, the method comprising: representing each prediction mode by an angular index representative of the angular direction of the corresponding prediction mode; and comparing the angular index of the current image portion with reference angular indexes in order to determine an encoding process, from among a plurality of encoding processes, to encode the mode information; wherein in the case where the angular index of the current image portion is different from the reference angular indexes, the determined encoding process comprises encoding the angular index of the current image portion.
49. A method according to claim 48, wherein angular indexes with even-numbered values correspond to prediction modes supported by image portions of predetermined shape and/or size.
50. A method according to claim 48 or 49, wherein non-directional prediction modes are attributed an angular index value greater than the angular index values of directional prediction modes.
51. A method according to any preceding claim, wherein a DC prediction mode has an even-numbered angular index.
52. A method according to any one of claims 48 to 51, wherein a planar prediction mode has an odd-numbered angular index, such as an angular index of 33.
53. A method according to any one of claims 48 to 52, wherein if the prediction mode of the reference image portion is not supported by the image portion to be encoded, the angular index of the current image portion is set to the closest lower even number to the angular index value of the reference image portion.
54. A method according to any one of claims 48 to 53, wherein the angular index is divided by an integer, such as two, prior to encoding.
55.
A method of decoding mode information representing a prediction mode for decoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes, each prediction mode being represented by a prediction mode value, the method comprising: receiving a codeword related to the prediction mode of the current image portion; wherein in the case where the angular index value of the current image portion is different from reference angular index values derived from one or more reference image portions, the codeword is an angular index value of the current image portion and the method further comprises decoding the angular index value of the current image portion.
56. An encoder for encoding mode information representing a prediction mode for encoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes, wherein each prediction mode is represented by an angular index representative of the angular direction of the corresponding prediction mode, the encoder comprising: comparison means for comparing the angular index of the current image portion with reference angular indexes in order to determine an encoding process, from among a plurality of encoding processes, to encode the mode information; and encoding means for encoding the angular index of the current image portion in the case where the angular index of the current image portion is different from the reference angular indexes.
57.
A decoder for decoding mode information representing a prediction mode for decoding of a current image portion by an intra mode coding process, the prediction mode being one of a plurality of prediction modes, each prediction mode being represented by a prediction mode value, the decoder comprising: reception means for receiving a codeword related to the prediction mode of the current image portion; wherein in the case where the angular index value of the current image portion is different from reference angular index values derived from one or more reference image portions, the codeword is an angular index value of the current image portion; and decoding means for decoding the angular index value of the current image portion.
58. A computer program product for a programmable apparatus, the computer program product comprising a sequence of instructions for implementing a method according to any one of claims 1 to 23; 26 to 44; or 48 to 55 when loaded into and executed by the programmable apparatus.
59. A computer-readable storage medium storing instructions of a computer program for implementing a method according to any one of claims 1 to 23; 26 to 44; or 48 to 55.
60. A method of encoding mode information representing a prediction mode for encoding of a current image portion by an intra mode coding process, substantially as hereinbefore described with reference to, and as shown in, Figures 10, 11A, 11B, 12 or 14.
61. A method according to claim 17, 38 or 49, wherein the predetermined size or shape corresponds to a square block of 4x4 pixels.</claim-text>
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1200285.3A GB2498225B (en) | 2012-01-09 | 2012-01-09 | Method and device for encoding or decoding information representing prediction modes |
GB1206592.6A GB2498234A (en) | 2012-01-09 | 2012-04-13 | Image encoding and decoding methods based on comparison of current prediction modes with reference prediction modes |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1200285.3A GB2498225B (en) | 2012-01-09 | 2012-01-09 | Method and device for encoding or decoding information representing prediction modes |
Publications (3)
Publication Number | Publication Date |
---|---|
GB201200285D0 GB201200285D0 (en) | 2012-02-22 |
GB2498225A true GB2498225A (en) | 2013-07-10 |
GB2498225B GB2498225B (en) | 2015-03-18 |
Family
ID=45788656
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1200285.3A Active GB2498225B (en) | 2012-01-09 | 2012-01-09 | Method and device for encoding or decoding information representing prediction modes |
GB1206592.6A Withdrawn GB2498234A (en) | 2012-01-09 | 2012-04-13 | Image encoding and decoding methods based on comparison of current prediction modes with reference prediction modes |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1206592.6A Withdrawn GB2498234A (en) | 2012-01-09 | 2012-04-13 | Image encoding and decoding methods based on comparison of current prediction modes with reference prediction modes |
Country Status (1)
Country | Link |
---|---|
GB (2) | GB2498225B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3414906A4 (en) * | 2016-02-08 | 2019-10-02 | Sharp Kabushiki Kaisha | Systems and methods for intra prediction coding |
WO2020133380A1 (en) * | 2018-12-29 | 2020-07-02 | 富士通株式会社 | Image intra-block coding or decoding method, data processing device, and electronic apparatus |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102252323B1 (en) | 2018-05-10 | 2021-05-14 | 삼성전자주식회사 | Video encoding method and apparatus, video decoding method and apparatus |
US20200099927A1 (en) * | 2018-09-24 | 2020-03-26 | Qualcomm Incorporated | Most probable modes (mpms) construction |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110243230A1 (en) * | 2010-03-31 | 2011-10-06 | Futurewei Technologies, Inc. | Multiple Predictor Sets for Intra-Frame Coding |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090161757A1 (en) * | 2007-12-21 | 2009-06-25 | General Instrument Corporation | Method and Apparatus for Selecting a Coding Mode for a Block |
US8761253B2 (en) * | 2008-05-28 | 2014-06-24 | Nvidia Corporation | Intra prediction mode search scheme |
- 2012-01-09 GB GB1200285.3A patent/GB2498225B/en active Active
- 2012-04-13 GB GB1206592.6A patent/GB2498234A/en not_active Withdrawn
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110243230A1 (en) * | 2010-03-31 | 2011-10-06 | Futurewei Technologies, Inc. | Multiple Predictor Sets for Intra-Frame Coding |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3414906A4 (en) * | 2016-02-08 | 2019-10-02 | Sharp Kabushiki Kaisha | Systems and methods for intra prediction coding |
US11233990B2 (en) | 2016-02-08 | 2022-01-25 | Sharp Kabushiki Kaisha | Systems and methods for intra prediction coding |
US11765347B2 (en) | 2016-02-08 | 2023-09-19 | Sharp Kabushiki Kaisha | Method of decoding intra prediction data |
US12101474B2 (en) | 2016-02-08 | 2024-09-24 | Sharp Kabushiki Kaisha | Method of decoding intra prediction data |
WO2020133380A1 (en) * | 2018-12-29 | 2020-07-02 | 富士通株式会社 | Image intra-block coding or decoding method, data processing device, and electronic apparatus |
CN112106368A (en) * | 2018-12-29 | 2020-12-18 | 富士通株式会社 | Image intra-block coding or decoding method, data processing device and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
GB201200285D0 (en) | 2012-02-22 |
GB2498234A (en) | 2013-07-10 |
GB201206592D0 (en) | 2012-05-30 |
GB2498225B (en) | 2015-03-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10666938B2 (en) | Deriving reference mode values and encoding and decoding information representing prediction modes | |
US10158870B2 (en) | Method and apparatus for processing motion compensation of a plurality of frames | |
US10313668B2 (en) | Method and device for encoding or decoding an image comprising encoding of decoding information representing prediction modes | |
CN114556957A (en) | Video coding apparatus and method based on in-loop filtering | |
US20160080753A1 (en) | Method and apparatus for processing video signal | |
WO2014106746A1 (en) | Method and device for processing prediction information for encoding an image | |
US20220337814A1 (en) | Image encoding/decoding method and device using reference sample filtering, and method for transmitting bitstream | |
GB2501125A (en) | Providing adaptation parameters to a decoder by including an identifier to a relevant characteristic set in a bit stream portion. | |
US20230319315A1 (en) | Coding enhancement in cross-component sample adaptive offset | |
US20240137546A1 (en) | Coding enhancement in cross-component sample adaptive offset | |
US20160088305A1 (en) | Method and apparatus for processing video signal | |
US20240205438A1 (en) | Coding enhancement in cross-component sample adaptive offset | |
JP2023123699A (en) | Luma mapping- and chroma scaling-based video or image coding | |
US20240195996A1 (en) | Coding enhancement in cross-component sample adaptive offset | |
US20240214595A1 (en) | Coding enhancement in cross-component sample adaptive offset | |
US20240259578A1 (en) | Cross-component sample adaptive offset | |
US20240314339A1 (en) | Cross-component sample adaptive offset |