
WO2006064250A1 - Reduced bandwidth flicker-free displays - Google Patents

Reduced bandwidth flicker-free displays

Info

Publication number
WO2006064250A1
WO2006064250A1 (PCT/GB2005/004864)
Authority
WO
WIPO (PCT)
Prior art keywords
frames
image processing
video stream
processing device
interpolation
Prior art date
Application number
PCT/GB2005/004864
Other languages
French (fr)
Inventor
John Lazar Barbur
Jonathan Alistar Harlow
Original Assignee
City University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by City University filed Critical City University
Publication of WO2006064250A1 publication Critical patent/WO2006064250A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0127Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter
    • H04N7/0132Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter the field or frame frequency of the incoming video signal being multiplied by a positive integer, e.g. for flicker reduction

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

An image processing device comprising a receiver for receiving a first video stream defining a series of received frames of image data and an interpolation arrangement for generating one or more supplementary frames of image data by interpolating between successive ones of the received frames and thereby forming a second video stream comprising the received frames and the supplementary frames.

Description

REDUCED BANDWIDTH FLICKER-FREE DISPLAYS
The present invention relates to display devices that can generate flicker-free images from low bandwidth signals. In particular, the present invention relates to display devices that generate "morphed" image frames by interpolating between received image frames.
A moving object can be recorded on film by capturing a large number of still images of the object and then replaying the still images in rapid succession. The human brain reassembles the still images into a single moving scene. The larger the number of still images that are captured per second, the smoother the motion of an object in the resulting film appears to the human eye. If too few images per second are captured, then motion appears jerky and discontinuous, which is known as "flicker". Therefore, in order to obtain high-quality moving images it is preferable to have a high frame rate.
Visual images may be transmitted as wired or wireless signals. However, whatever the type of communication link over which the signals are transmitted, there are always bandwidth constraints that make it desirable to keep the signal bandwidth to a minimum. This is particularly true for telecommunications applications such as mobile phones, where it may be desired to transmit video clips etc over a limited bandwidth transmission channel. Therefore, for transmitting visual images, it is preferable for those signals to have a limited bandwidth, i.e. for the frame rate to be as low as possible.
A typical motion channel has a reduced bandpass temporal response characteristic with little signal power over 10Hz. However, flicker detection mechanisms within the human brain are able to respond to rapid local light flux changes at a much higher frequency (the typical cut-off being around 50 to 75Hz) than the received frame frequency. Therefore, a compromise must always be made between the quality of moving images displayed to a user of a display device and satisfying the bandwidth constraints for transmitting the images to the display device. There is therefore a need for an improved method for transmitting video, and a display device that uses the transmitted information to generate high-quality moving images.
According to one embodiment of the invention, there is provided an image processing device comprising a receiver for receiving a first video stream defining a series of received frames of image data and an interpolation arrangement for generating one or more supplementary frames of image data by interpolating between successive ones of the received frames and thereby forming a second video stream comprising the received frames and the supplementary frames.
Preferably a display is arranged to display the second video stream.
Preferably the interpolation arrangement is arranged to form the supplementary frames such that the luminance of each area of a successive frame is interpolated between the luminances of the corresponding areas of those of the received frames immediately preceding and succeeding the respective successive frame.
Preferably the interpolation arrangement is arranged to form the supplementary frames such that the chromaticity of each area of a successive frame is interpolated between the chromaticities of the corresponding areas of those of the received frames immediately preceding and succeeding the respective successive frame.
The said interpolation may be a linear interpolation or a non-linear interpolation.
The said areas may be pixels of the respective frames. The said areas may be multi- pixel blocks of the respective frames. Preferably the frame rate of the first video stream is below that at which flicker is perceived by a human viewer. Preferably the frame rate of the second video stream is above that at which flicker is perceived by a human viewer.
The interpolation arrangement may be implemented as a plurality of processing units each of which performs interpolation for a respective pixel of the display.
Preferably the interpolation arrangement is provided in a separate housing from the display.
According to a second embodiment of the invention, there is provided an image processing method comprising receiving by means of a receiver a first video stream defining a series of received frames of image data and generating one or more supplementary frames of image data by interpolating between successive ones of the received frames and thereby forming a second video stream comprising the received frames and the supplementary frames.
Preferably the method comprises displaying the second video stream.
Preferably the method comprises transmitting the first video stream to the receiver.
Preferably the method comprises forming the first video stream by processing an initial video stream to reduce the frame rate thereof prior to the said transmission step.
Preferably the step of forming the first video stream comprises omitting frames from the initial video stream.
For a better understanding of the present invention, reference is made to the following drawings in which:
Figure 1 shows the arrangement of a basic pixel; Figure 2 shows the arrangement of a basic pixel in a liquid crystal display;
Figure 3 shows received frames and morphed frames;
Figure 4 shows two consecutive received frames and an interpolated morphed frame;
Figure 5 shows two consecutive received frames and an interpolated morphed frame for motion interpolation;
Figure 6 shows two consecutive received frames and an interpolated morphed frame for luminance and chromaticity interpolation;
Figure 7 shows the percentage contribution to the interpolated morphed frames of two received frames in a linear interpolation scheme;
Figure 8 shows the percentage contribution to the interpolated morphed frames of two received frames in a non-linear interpolation scheme;
Figure 9 shows a suitable display device for implementing the present invention;
Figure 10 shows the structure of a pixel in a liquid crystal display; and
Figure 11 shows a graph of calibration data for a typical cathode ray tube display.
Received images are displayed on display devices by dividing up an image into a large number of tiny areas. The human brain is unable to distinguish between the individual tiny areas and interprets them as a complete image. A display for displaying images is typically composed of a large number of pixels, each of which displays one of the tiny areas. Figure 1 illustrates a typical pixel. The pixel in figure 1, shown generally at 101, comprises red 102, green 103 and blue 104 components. Images are displayed by controlling the luminance and chromaticity (i.e. the colour) of each pixel. The required luminance and chromaticity values are typically represented in an image file by three values: R, G and B. Typically, each of the "RGB" values is an 8-bit integer in the range 0 to 255. The RGB values for each pixel are used to calculate the intensity with which the red, green and blue components of each pixel should be activated, corresponding to the desired luminance values (Li) and chromaticity values (xi and yi) for each pixel.
Figure 1 illustrates a pixel in which the red, green and blue components of the pixel are arranged as stripes. However, the three components may also be arranged as dots, or any other suitable arrangement. A pixel arrangement such as that illustrated in figure 1 is typically used in cathode ray tube (CRT) displays, in which each component is composed of a different phosphor that emits light when hit by an electron beam. One phosphor emits red light, one emits green and the other emits blue light when hit by an electron beam. In liquid crystal displays (LCDs) the arrangement is slightly different and each pixel emits a single colour of light. For example, a simple LCD arrangement is illustrated in figure 2 and comprises a light source 201, a liquid crystal pixel arrangement 202 and a colour filter 203. Therefore, in an LCD display each pixel has a red, a green or a blue colour filter to create the required red, green and blue components.
An overview of the present invention will be given with reference to figure 3. Figure 3 illustrates a number of frames 301 to be displayed by a display device. Two of the frames 302 are received by the device at a frequency of fm Hz, so that one frame is received by the display device every Tm seconds. The frame frequency fm might typically be the motion sample rate, i.e. the rate at which the original images were captured, which could be e.g. 8Hz. The frames 301 that the device is to display also include additional frames 303. These frames are generated by the display device itself by "morphing" between two adjacent received frames, i.e. received frames n and n+1 in this case. By generating these additional "morphed" frames, the display device has m frames, rather than only one frame, to display in the time Tm seconds between received frames. Therefore, the display device displays still images at an increased display frequency ff, which might be e.g. 80 Hz.
The received frame frequency fm is typically in the range of 6 to 12Hz. The frequency of displayed frames that results from generating the morphed frames might typically be around 80Hz. As the morphed frames are generated within the display device, the transmitted signals can still be of a relatively low bandwidth even though the displayed frame rate is high enough to avoid perceptible flicker.
If, for example, the received frame frequency is 8Hz and the display frequency is 80Hz, a display device might generate nine intermediate morphed frames for every received frame. When the display frequency is not an exact multiple of the frame frequency, the display device interpolates between consecutive received frames as usual but may insert a different number of morphed frames between received frames. For example, if the received frame frequency is 6Hz and the display frequency is 80Hz, the display device may generate 13 morphed frames to interpolate between some successive frames and 14 frames to interpolate between other successive frames. Human visual systems are unable to detect this difference in the number of displayed frames that is required to adapt the frame frequency to the display frequency of the display device. Therefore, a display device can accept any input frame frequency and generate interpolated frames at the display frequency.
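This allocation can be expressed with simple integer arithmetic. The Python sketch below is our own illustration, not the patent's implementation: the function names and the exact "spread the remainder evenly" rule are assumptions, since the passage only requires that the per-interval counts differ where the rates do not divide evenly. It spreads the display ticks of one second across the received-frame intervals so that interval sizes differ by at most one tick; the first tick of an interval can show the received frame itself, the rest showing morphed frames.

```python
def ceil_div(a: int, b: int) -> int:
    # Integer ceiling division, exact for the rate arithmetic below.
    return -(-a // b)

def display_slots_per_interval(f_m: int, f_f: int) -> list[int]:
    """Spread f_f display ticks per second across the f_m received-frame
    intervals so that the interval sizes differ by at most one tick."""
    return [ceil_div((n + 1) * f_f, f_m) - ceil_div(n * f_f, f_m)
            for n in range(f_m)]

print(display_slots_per_interval(8, 80))  # [10]*8: each received frame plus 9 morphed
print(display_slots_per_interval(6, 80))  # [14, 13, 13, 14, 13, 13]: uneven intervals
```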
The morphed frames may be generated pixel by pixel by the display device. In figure 4, portions of frames n and n+1 (401, 402) are illustrated. Pixel 403 brightens considerably between the two frames. The display device therefore generates a morphed frame 404 between frames n and n+1 by interpolating between the two "brightness" values of pixel 403.
Figure 4 shows two received frames and a single morphed frame for the purposes of example only. Typically, the display device would generate more than one morphed frame to display between each pair of received frames. As explained above, a typical pixel has two basic qualities that can be controlled in order to create an image from a large number of pixels. The first is luminance and the second is chromaticity. The display device may interpolate between one or both of these values for pixels in adjacent received frames to generate the morphed frames. However, to create a smooth perception of moving objects it is more important to interpolate luminance than chromaticity. Therefore, if a device has limited processing ability, it is preferred to interpolate between luminance values only, and to keep the chromaticity value constant at, for example, that of the initial received frame of the pair being interpolated between.
Embodiments of the present invention could also be used for motion interpolation. For example, in figure 5 the luminance and chromaticity values of pixel 504 in frame n (501) effectively move to pixel 506 in frame n+1 (502). Therefore, one option for interpolating between frames n and n+1 is to transfer the luminance and chromaticity values of pixel 504 to pixel 505 in morphed frame 503. However, this would require increased processing power and could introduce unwanted perceptual artefacts. Instead, it is preferred to implement a system that works with images on an individual pixel-by-pixel basis, rather than one in which the image as a whole must be considered. Figure 6 illustrates the situation of figure 5 but with the interpolation being done on a pixel-by-pixel basis according to the method outlined above. Therefore, in figure 6 the values of pixels 604 and 605 in morphed frame 603 are generated by interpolating between the values of these pixels in frame n (601) and frame n+1 (602). Although it may appear that the frame sequence in figure 5 provides a smoother representation of a moving object than the sequence of figure 6, research has shown that human visual systems do in fact perceive smooth motion from a frame sequence such as the one illustrated in figure 6.
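The pixel-by-pixel scheme reduces to a weighted blend of co-located pixels, with no motion estimation at all. A minimal sketch, assuming frames are held as 8-bit numpy arrays (the function name and array representation are our own):

```python
import numpy as np

def morph_frame(frame_n: np.ndarray, frame_n1: np.ndarray, w: float) -> np.ndarray:
    """Pixel-by-pixel interpolation: each pixel of the morphed frame is a
    blend of the co-located pixels in frames n and n+1. w is the fractional
    position of the morphed frame between frame n (w = 0) and frame n+1 (w = 1)."""
    blended = (1.0 - w) * frame_n.astype(np.float32) + w * frame_n1.astype(np.float32)
    return np.rint(blended).clip(0, 255).astype(np.uint8)

# Nine morphed frames between two received frames, as in the 8Hz -> 80Hz case:
# morphs = [morph_frame(f0, f1, k / 10) for k in range(1, 10)]
```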
The interpolation could be done in a single operation for blocks of pixels that have very similar luminance and chromaticity values in adjacent received frames. This is especially useful if the received frames are compressed by a protocol that renders similar adjacent pixels in the uncompressed data with identical luminance and/or chromaticity in the compressed image. For example, the interpolation method described herein could be combined with a data compression format such as MPEG. In this way, a single interpolation calculation can be performed for a block of pixels, where the pixels have very similar luminance and chromaticity values in adjacent received frames.
The morphed frames may be generated by either linear or non-linear interpolation between pixel values. For example, figure 7 illustrates the contribution that successive received frames n and n+1 might make to the morphed frames generated by the device in a linear interpolation scheme. Thus, in figure 7, the contribution made by the pixel values of frame n (701) decreases linearly from 100% in frame n to 0% in frame n+1, while the contribution of the pixel values of frame n+1 (702) increases linearly from 0% in frame n to 100% in frame n+1. An example of a non-linear interpolation scheme is illustrated in figure 8. In this example, the contribution to the morphed frames from received frame n+1 is initially small but grows in progressively larger steps as the frames morph towards received frame n+1.
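The two weight schedules can be written down directly. In the sketch below, the linear schedule follows figure 7 exactly; for figure 8 the patent does not fix the curve, so a quadratic ease-in is used purely as one example of a contribution that grows in increasing steps:

```python
def linear_weights(m: int) -> list[float]:
    """Contribution of frame n+1 to each of m morphed frames (figure 7);
    frame n contributes the complement, 1 - w."""
    return [k / (m + 1) for k in range(1, m + 1)]

def eased_weights(m: int) -> list[float]:
    """A non-linear schedule in the spirit of figure 8: the contribution of
    frame n+1 starts small and grows in progressively larger steps."""
    return [(k / (m + 1)) ** 2 for k in range(1, m + 1)]

print(linear_weights(4))  # [0.2, 0.4, 0.6, 0.8]
print(eased_weights(4))   # [0.04, 0.16, 0.36, 0.64] -- increments 0.04, 0.12, 0.20, 0.28
```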
Where a device is interpolating between both luminance and chromaticity values of pixels in adjacent received frames, the two sets of values can be interpolated according to different interpolation schemes. For example, one set of values could be interpolated according to a linear interpolation scheme whereas the other set could be interpolated according to a non-linear scheme. Also, the same number of interpolations need not be used for each set of values. For example, half as many interpolations might be performed for the chromaticity values as for the luminance values. Thus, the chromaticity value of pixels within an image might be updated only every second morphed frame, with the luminance value of those pixels being updated every morphed frame.
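The interleaved update rates might look as follows; this is a sketch under the assumption that luminance and chromaticity are held as separate numpy float arrays, with names of our own choosing:

```python
import numpy as np

def morph_sequence(lum0, lum1, chrom0, chrom1, m: int):
    """Generate m morphed frames in which luminance is re-interpolated for
    every morphed frame while chromaticity is re-interpolated only every
    second one (and held in between), halving the chromaticity workload."""
    frames = []
    chrom = np.asarray(chrom0, dtype=np.float32)
    for k in range(1, m + 1):
        w = k / (m + 1)
        lum = (1.0 - w) * lum0 + w * lum1      # updated every morphed frame
        if k % 2 == 0:                          # updated every second morphed frame
            chrom = (1.0 - w) * chrom0 + w * chrom1
        frames.append((lum, chrom.copy()))
    return frames
```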
An example of a display device for implementing the interpolation method described above is illustrated in figure 9. The device comprises a receiver 901, a signal decoder 902, an interpolation unit 904 and a display 905. The receiver 901 might be capable of receiving signals over a wired or wireless link. The received signals will typically include audio and visual information. The received signals are passed to the signal decoder 902, which extracts the frame information from the received signals. The frame information specifies the luminance and chromaticity values for each pixel of the display for each image frame. The extracted frame information is optionally passed to a buffer 903. The buffer may be implemented as part of the interpolation unit 904. The buffer introduces a delay of one frame into the system, thereby enabling the interpolation unit to generate morphed frames by interpolating between two consecutive received frames.
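To make the role of the one-frame buffer concrete, here is a minimal Python generator sketch (our own illustration, not the patent's implementation; frames are assumed to be numpy float arrays):

```python
def interpolate_stream(received_frames, m: int):
    """Yield the second video stream from the first: each received frame
    followed by m morphed frames towards the next received frame. The
    local variable `prev` plays the role of buffer 903, introducing the
    one-frame delay needed so that both endpoints of an interval are
    available before morphing begins."""
    prev = None
    for frame in received_frames:
        if prev is not None:
            yield prev                              # the received frame itself
            for k in range(1, m + 1):
                w = k / (m + 1)
                yield (1.0 - w) * prev + w * frame  # morphed frames
        prev = frame
    if prev is not None:
        yield prev                                  # final received frame
```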
Given that the received frames will be delayed by at least one frame period (Tm) before being displayed on the display, it might be advisable to also delay any accompanying audio signal by the same length of time. This may be unnecessary if Tm is too short a time period for human perception to be able to detect any delay between the visual images and the accompanying audio signal. However, the longer that Tm becomes, i.e. the lower that the transmitted frame frequency becomes, the more likely it is that any associated delay between the displayed images and accompanying soundtrack will be noticeable to humans.
The interpolation unit 904 compares the luminance, and optionally the chromaticity, of each pixel in consecutive received frames and calculates the interpolated value for each pixel and each morphed frame. For example, if the luminance is interpolated according to a linear interpolation scheme such as that illustrated in figure 7 to produce four morphed frames, the interpolated luminance values Ln between real values L0 and L5 might be calculated according to:

Ln = L0 + (n/5)(L5 - L0), for n = 1, 2, 3, 4.

The resulting image frames are then fed to the display 905, on which they are displayed.
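As a worked instance of this formula (the luminance values are illustrative only): with L0 = 20 cd/m2 and L5 = 70 cd/m2, the four morphed frames would be assigned luminances L1 = 30, L2 = 40, L3 = 50 and L4 = 60 cd/m2, stepping evenly between the two received values.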
The display 905 might be a cathode ray tube (CRT) display screen, an LCD panel, a plasma screen display or any other suitable display. These different types of display are given for the purposes of example only and it should be understood that the present invention is not limited to any specific type of display. Figure 9 is a simple diagram that shows the relevant functional components of the display device as discrete functional blocks. This is for the purposes of example only and it should be understood that the various functions could be implemented separately or in combination: for example, the interpolation unit could be implemented in a separate housing from the display screen. Similarly, it should be understood that further components may be present in an actual display device for implementing the present invention. For example, the signal decoder will extract information other than frame information from the received signal, e.g. audio and control information, which will be forwarded to other functional blocks of the display device, e.g. a loudspeaker.
The pixels of the display may be addressed directly for each morphed and interpolated frame. For example, if the display 905 is a CRT display, then the electron beam will scan each pixel for each received and morphed frame. Alternatively, the interpolation between received frames may be achieved by the display itself. For example, if the display is one in which the pixels are addressed directly through electronic means, e.g. an LCD display (see figure 10) where each pixel (1004) is controlled via a transistor (1003) and gate (1004) and signal (1002) lines, then additional circuitry may be added to the display so that the morphing between received frames is achieved by the pixels themselves. For example, for each received frame, every pixel might be addressed with two sets of luminance and chromaticity values: the first set representing the first received frame and the second set the subsequent received frame. Circuitry coupled to each pixel, for example leaky charge-hold circuitry, would then gradually adjust the luminance and chromaticity of the pixel from the first set of values to the second set. In such an embodiment of the present invention, the electronic circuitry controlling each pixel can be considered to be the interpolation unit.
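The gradual adjustment performed by such hold circuitry can be modelled in discrete time. The sketch below is an assumed exponential-approach model; the patent does not specify the circuit dynamics, and the leak fraction alpha is our own illustrative parameter:

```python
def leaky_hold(v_start: float, v_target: float, ticks: int, alpha: float = 0.3):
    """Model of leaky charge-hold behaviour: at each display tick the stored
    pixel value moves a fraction alpha of the way towards the second stored
    value, so the pixel glides between received frames rather than jumping."""
    v, trace = v_start, []
    for _ in range(ticks):
        v += alpha * (v_target - v)
        trace.append(round(v, 2))
    return trace

print(leaky_hold(20.0, 70.0, 9))  # gradual approach over nine display ticks
```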
Displays differ in their particular characteristics. For example, different CRTs have different phosphors that respond differently to an electron beam. Therefore, a particular image encoded in an image file will not necessarily look the same to the human eye on different display devices. In particular, the image generated for the same image file specification may differ significantly in luminance, contrast and colour, unless some procedure is employed to adapt the RGB values so as to take into account the response characteristics of the display. Therefore, in addition to the interpolation calculations described above, the display device may calculate luminance and chromaticity values specific to the device on which the image is to be displayed.
The ideal image file contains RGB data (usually as 8-bit integers) that is used to drive the red, green and blue components of each pixel in a display to generate the corresponding luminance values. This file may also contain parameters describing the characteristics of the specific display device on which the image file was originally generated. For example, the luminance versus applied voltage relationships differ for each display, e.g. for each phosphor, and also for the same phosphor from one display device to another. Calibration data for a typical CRT is illustrated in figure 11. Therefore, if the image file contains such information and the display device knows its own performance characteristics, the computations needed to achieve an invariant colour/luminance reproduction of an image file can be carried out within the display device.
According to this embodiment of the present invention, the image data file contains sufficient information to compute either the luminance and chromaticity of each pixel (i.e. Li, xi, yi) or the corresponding tristimulus values (i.e. Xi, Yi, Zi). The tristimulus values represent how the colour and luminance of each pixel would appear to the human eye, i.e. to a subject with normal red, green and blue cone photoreceptors in the eye. These two sets of values are equivalent and one can convert from one to the other (see e.g. G. Wyszecki and W. Stiles, "Color Science: Concepts and Methods, Quantitative Data and Formulae", John Wiley & Sons, 1982).
In order to generate the same colour/luminance specification on the new display device, it is necessary to compute the luminance of the red, green and blue components (primaries) that are needed to produce the same tristimulus values (i.e. Xi, Yi, Zi) as the image when displayed on the device on which it was generated. By using the luminance versus applied voltage relationships for each primary of the particular display device, the RGB values needed to generate the required luminances for the red, green and blue primaries can be calculated. The nearest colour/luminance specification will be produced when the specified X, Y and Z values fall outside the gamut of the display device.
The display device also has its chromaticity co-ordinates (CIE-1931) and its luminance versus applied voltage relationship for each of the three primaries, i.e.
Red phosphor (or primary): xr, yr, zr; Green phosphor (or primary): xg, yg, zg; and Blue phosphor (or primary): xb, yb, zb.
The tristimulus values are related to the luminances Lr, Lg and Lb of the red, green and blue primaries by:

X = (xr/yr)Lr + (xg/yg)Lg + (xb/yb)Lb (1)

Y = Lr + Lg + Lb (2)

Z = (zr/yr)Lr + (zg/yg)Lg + (zb/yb)Lb (3)

We need to find Lr, Lg and Lb; in other words we need to solve three simultaneous equations. The solution in matrix notation is as follows:

Lr = Dr/D; Lg = Dg/D; Lb = Db/D (4)

where Dr, Dg, Db and D are the determinants formed from the three simultaneous equations. D is the determinant of the coefficient matrix,

    | xr/yr  xg/yg  xb/yb |
D = |   1      1      1   |
    | zr/yr  zg/yg  zb/yb |

and Dr, Dg and Db are obtained from D by replacing its first, second and third columns respectively with the column (X, Y, Z).
Once the luminance needed for each primary has been calculated, i.e. Lr, Lg and Lb, the RGB values required to reproduce each of these luminances can be determined from a knowledge of the luminance versus applied voltage relationship for each primary. Therefore, by using the above equations, the display device can achieve invariant colour/luminance reproduction, irrespective of the display characteristics of the device on which the image was originally generated.
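Numerically, the determinant ratios of equation (4) amount to solving a 3x3 linear system, and the final mapping from luminance to drive value inverts the calibration curve. The sketch below is illustrative: the chromaticities and the power-law (gamma) response are assumptions standing in for a device's measured data (the patent only requires that the luminance versus applied voltage relationship, cf. figure 11, is known):

```python
import numpy as np

def primary_luminances(X, Y, Z, prim):
    """Solve equations (1)-(3) for Lr, Lg and Lb, the primary luminances that
    reproduce tristimulus values (X, Y, Z) on a display whose primaries have
    CIE-1931 chromaticities prim = {'r': (x, y, z), 'g': ..., 'b': ...}.
    np.linalg.solve gives the same result as the ratios Dr/D, Dg/D, Db/D."""
    (xr, yr, zr), (xg, yg, zg), (xb, yb, zb) = prim['r'], prim['g'], prim['b']
    A = np.array([[xr / yr, xg / yg, xb / yb],
                  [1.0,     1.0,     1.0    ],
                  [zr / yr, zg / yg, zb / yb]])
    return np.linalg.solve(A, np.array([X, Y, Z], dtype=float))

def drive_value(L, L_max, gamma=2.2, levels=255):
    """Invert an assumed power-law response L = L_max * (v / levels) ** gamma
    to get an 8-bit drive value; a real device would invert its measured
    calibration curve rather than this idealised gamma."""
    v = levels * (max(L, 0.0) / L_max) ** (1.0 / gamma)
    return min(int(round(v)), levels)

# Example chromaticities (Rec. 709-like, purely illustrative):
prim = {'r': (0.64, 0.33, 0.03), 'g': (0.30, 0.60, 0.10), 'b': (0.15, 0.06, 0.79)}
Lr, Lg, Lb = primary_luminances(95.05, 100.0, 108.9, prim)  # a D65-like white
```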
The present invention may be implemented using any suitable display device, e.g. televisions (particularly high definition televisions), video phones, mobile telephones etc. In particular, the invention is beneficial in applications where video data is transmitted over links having bandwidth restrictions, e.g. in networks where there is pressure to maximise the number of channels and/or users that may be accommodated at any one time, such as mobile phone networks or the internet. The invention may also be beneficially used in devices that play video data that is stored in the device itself. Such devices generally have limited storage capacity, which can be utilised more efficiently by using the invention. The invention may therefore be particularly advantageous in applications such as personal video players, mobile telephones, camcorders, video on demand services, telephone uplink etc, or in any device that plays back data that is received from a remote server, such as a personal computer. In addition, the invention may be advantageously implemented in military applications, such as drones (i.e. pilotless planes) for front-line reconnaissance work. The present invention may also be implemented in any suitable transmission system, e.g. telecommunications systems, network systems, cable and satellite systems etc. Similarly, the invention may be implemented using any format of transmitted image data, for example, MPEG, JPEG, TIFF etc.
The present invention provides a display device and a method for displaying images that requires little information to be transmitted and yet does not compromise the human perception of motion in the resulting displayed images. Therefore, the result is a flicker-free display and a large reduction in required transmission bandwidth. Although some additional signal processing is required within the display unit, this is offset by a reduction in the transmission bandwidth and improved motion perception. The transmission bandwidth required per transmission channel is reduced and therefore, more channels can be encoded using the same transmission system.
Since the present invention allows video data transmitted at a relatively slow frame rate to be displayed in a flicker-free manner, original video data at a relatively fast frame rate could be reduced in rate before transmission (e.g. to a rate at which flicker would be perceptible), and then interpolated when received so that it can be displayed at a higher rate at which flicker will not be perceived. To reduce the rate before transmission it is preferable to simply omit some frames.
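Omitting frames before transmission is a one-line operation; a minimal sketch (the function name and the decimation factor are our own illustration):

```python
def decimate(frames, k: int):
    """Reduce the frame rate before transmission by simply omitting frames:
    keep every k-th frame, e.g. k = 10 turns an 80Hz stream into 8Hz."""
    return frames[::k]
```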
The applicant hereby discloses in isolation each individual feature described herein and any combination of two or more such features, to the extent that such features or combinations are capable of being carried out based on the present specification as a whole in light of the common general knowledge of a person skilled in the art, irrespective of whether such features or combinations of features solve any problems disclosed herein, and without limitation to the scope of the claims. The applicant indicates that aspects of the present invention may consist of any such feature or combination of features. In view of the foregoing description it will be evident to a person skilled in the art that various modifications may be made within the scope of the invention.

Claims

1. An image processing device comprising: a receiver for receiving a first video stream defining a series of received frames of image data; and an interpolation arrangement for generating one or more supplementary frames of image data by interpolating between successive ones of the received frames and thereby forming a second video stream comprising the received frames and the supplementary frames.
2. An image processing device as claimed in claim 1, comprising a display arranged to display the second video stream.
3. An image processing device as claimed in claim 1 or 2, wherein the interpolation arrangement is arranged to form the supplementary frames such that the luminance of each area of a successive frame is interpolated between the luminances of the corresponding areas of those of the received frames immediately preceding and succeeding the respective successive frame.
4. An image processing device as claimed in any preceding claim, wherein the interpolation arrangement is arranged to form the supplementary frames such that the chromaticity of each area of a successive frame is interpolated between the chromaticities of the corresponding areas of those of the received frames immediately preceding and succeeding the respective successive frame.
5. An image processing device as claimed in claim 3 or 4, wherein the said interpolation is a linear interpolation.
6. An image processing device as claimed in claim 3 or 4, wherein the said interpolation is a non-linear interpolation.
7. An image processing device as claimed in any of claims 3 to 6, wherein the said areas are pixels of the respective frames.
8. An image processing device as claimed in any of claims 3 to 6, wherein the said areas are multi-pixel blocks of the respective frames.
9. An image processing device as claimed in any preceding claim, wherein the frame rate of the first video stream is below that at which flicker is perceived by a human viewer.
10. An image processing device as claimed in any preceding claim, wherein the frame rate of the second video stream is above that at which flicker is perceived by a human viewer.
11. An image processing device as claimed in claim 2 or any of claims 3 to 10 as dependent on claim 2, wherein the interpolation arrangement is implemented as a plurality of processing units each of which performs interpolation for a respective pixel of the display.
12. An image processing device as claimed in claim 2 or any of claims 3 to 10 as dependent on claim 2, wherein the interpolation arrangement is provided in a separate housing from the display.
13. An image processing method comprising: receiving by means of a receiver a first video stream defining a series of received frames of image data; and generating one or more supplementary frames of image data by interpolating between successive ones of the received frames and thereby forming a second video stream comprising the received frames and the supplementary frames.
14. An image processing method as claimed in claim 13, comprising displaying the second video stream.
15. An image processing method as claimed in claim 13 or 14, comprising transmitting the first video stream to the receiver.
16. An image processing method as claimed in claim 15, comprising forming the first video stream by processing an initial video stream to reduce the frame rate thereof prior to the said transmission step.
17. An image processing method as claimed in claim 16, wherein the step of forming the first video stream comprises omitting frames from the initial video stream.
18. An image processing device substantially as described herein with reference to the accompanying drawings.
19. An image processing method substantially as described herein with reference to the accompanying drawings.
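By way of illustration only, and forming no part of the claims, the frame-rate reduction of claims 16 and 17 and the interpolation of claims 1 to 7 may be sketched as follows. The sketch is a minimal, non-authoritative example: it assumes frames represented as NumPy floating-point arrays, and all function names used here are hypothetical. It shows the linear interpolation of claim 5; a non-linear interpolation (claim 6) would replace the weights (1 - t) and t with a non-linear function of t.

import numpy as np

def reduce_frame_rate(initial_frames, keep_every=3):
    # Form the first video stream by omitting frames from an initial
    # video stream (claims 16 and 17): here, keep every third frame.
    return initial_frames[::keep_every]

def interpolate_frame(preceding, succeeding, t):
    # Form one supplementary frame (claims 3 to 5 and 7): each pixel's
    # luminance (and, channel-wise, its chromaticity) is linearly
    # interpolated between the corresponding pixels of the received
    # frames immediately preceding and succeeding it; 0 < t < 1 is the
    # fractional temporal position of the supplementary frame.
    return (1.0 - t) * preceding + t * succeeding

def form_second_stream(received_frames, factor=3):
    # Form the second video stream (claim 1): the received frames plus
    # (factor - 1) supplementary frames between each successive pair.
    second = []
    for prev_f, next_f in zip(received_frames, received_frames[1:]):
        second.append(prev_f)
        for k in range(1, factor):
            second.append(interpolate_frame(prev_f, next_f, k / factor))
    second.append(received_frames[-1])
    return second

On this sketch, a first video stream received at 25 frames per second and upconverted with factor=3 yields a second video stream of approximately 75 frames per second, so that the first stream can lie below, and the second above, the rate at which flicker is perceived (claims 9 and 10). Interpolating over multi-pixel blocks (claim 8) would apply the same expression to block values rather than individual pixels, and the plurality of processing units of claim 11 would each evaluate the same expression in parallel for one display pixel.
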
PCT/GB2005/004864 2004-12-15 2005-12-15 Reduced bandwidth flicker-free displays WO2006064250A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0427689.5 2004-12-15
GB0427689A GB0427689D0 (en) 2004-12-15 2004-12-15 Reduced bandwidth flicker-free displays

Publications (1)

Publication Number Publication Date
WO2006064250A1 true WO2006064250A1 (en) 2006-06-22

Family

ID=34090242

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2005/004864 WO2006064250A1 (en) 2004-12-15 2005-12-15 Reduced bandwidth flicker-free displays

Country Status (2)

Country Link
GB (1) GB0427689D0 (en)
WO (1) WO2006064250A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0396360A2 (en) * 1989-04-28 1990-11-07 Victor Company Of Japan, Limited Apparatus for inter-frame predictive encoding of video signal
EP1307056A1 (en) * 1995-06-30 2003-05-02 Mitsubishi Denki Kabushiki Kaisha Scan conversion apparatus with improved vertical resolution and flicker reduction apparatus
US5995154A (en) * 1995-12-22 1999-11-30 Thomson Multimedia S.A. Process for interpolating progressive frames
US6229571B1 (en) * 1998-07-23 2001-05-08 Nec Corporation Scan converter with interpolating function
US20020175882A1 (en) * 2001-05-22 2002-11-28 Koninklijke Philips Electronics N.V. Display devices and driving method therefor
US20040071313A1 (en) * 2002-08-07 2004-04-15 Marko Hahn Apparatus and method for motion-vector-aided interpolation of a pixel of an intermediate image of an image sequence

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BLUME H: "NEW ALGORITHM FOR NONLINEAR VECTOR-BASED UPCONVERSION WITH CENTER WEIGHTED MEDIANS", JOURNAL OF ELECTRONIC IMAGING, SPIE / IS & T, US, vol. 6, no. 3, 1 July 1997 (1997-07-01), pages 368 - 378, XP000704802, ISSN: 1017-9909 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8295364B2 (en) 2009-04-02 2012-10-23 Sony Corporation System and method of video data encoding with minimum baseband data transmission

Also Published As

Publication number Publication date
GB0427689D0 (en) 2005-01-19

Similar Documents

Publication Publication Date Title
US9635377B2 (en) High dynamic range image processing device and method
US8947498B2 (en) Method and device for processing multi-picture video image
JP6005882B2 (en) Global display management based light modulation
KR102176398B1 (en) A image processing device and a image processing method
EP1704727B1 (en) Flicker-free adaptive thresholding for ambient light derived from video content mapped through unrendered color space
US8295364B2 (en) System and method of video data encoding with minimum baseband data transmission
US20070091111A1 (en) Ambient light derived by subsampling video content and mapped through unrendered color space
EP2713611B1 (en) Image display adjustment method, device and system
TW201009804A (en) Converting three-component to four-component image
US20130044122A1 (en) Color adjustment circuit, digital color adjustment device and multimedia apparatus using the same
TW200308165A (en) Signal processing unit and liquid crystal display device
EP2030437A2 (en) Display information feedback
US20100156956A1 (en) Grayscale characteristic for non-crt displays
US20030218695A1 (en) Color reproduction method and system, and video display method and device using the same
US7443453B2 (en) Dynamic image saturation enhancement apparatus
CN107277475A (en) Laser television image processing method, laser television and computer-readable recording medium
WO2006064250A1 (en) Reduced bandwidth flicker-free displays
CN1787648B (en) Video-signal-processing device and video-signal-transfer method
US8350949B2 (en) Method for transmitting man-machine operation picture, mobile video device thereof, and video system using the same
JP6602977B2 (en) Transmission device, transmission method, control program, and recording medium
WO2004107256A1 (en) Method of color compression
KR100640814B1 (en) Deinterlacing device and method of video equipment
Litwic et al. Delivery of high dynamic range video using existing broadcast infrastructure
JPH09214921A (en) Device, method for processing image and image communication system
Kurita 30.1: Invited Paper: Cooperation of Video‐System Components for Construction of High‐Image‐Quality Systems

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 05818459

Country of ref document: EP

Kind code of ref document: A1

WWW Wipo information: withdrawn in national office

Ref document number: 5818459

Country of ref document: EP