
CN101373590B - Image frame processing method and device for displaying moving images to a variety of displays - Google Patents


Info

Publication number
CN101373590B
CN101373590B CN200810161915.5A
Authority
CN
China
Prior art keywords
frame
picture
image
fast
picture frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN200810161915.5A
Other languages
Chinese (zh)
Other versions
CN101373590A (en)
Inventor
青木幸代
大场章男
冈正昭
佐佐木伸夫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Original Assignee
Sony Computer Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2005100075A (JP4789494B2)
Application filed by Sony Computer Entertainment Inc filed Critical Sony Computer Entertainment Inc
Publication of CN101373590A
Application granted granted Critical
Publication of CN101373590B
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Landscapes

  • Controls And Circuits For Display Device (AREA)
  • Digital Computer Display Output (AREA)
  • Image Generation (AREA)

Abstract

A rendering process for rendering an image frame and a postprocess for adapting the image frame to a display are separated. A rendering processing unit 42 generates an image frame sequence by performing rendering at a predetermined frame rate regardless of a condition that the image frame should meet for output to the display. A postprocessing unit 50 subjects the image frame sequence generated by the rendering processing unit to a merge process so as to generate and output an updated image frame sequence that meets the condition. Since the rendering process and the postprocess are separated, the image frame sequence can be generated regardless of the specification of the display such as resolution and frame rate of the display.

Description

Image frame processing method and apparatus for displaying moving images on various displays
This application is a divisional of the invention patent application No. 200580024443.3, filed on May 18, 2005, entitled "Image frame processing method and apparatus for displaying moving images on various displays".
Technical field
The present invention generally relates to image frame processing techniques, and more particularly to image frame processing techniques for generating image frame sequences adapted to various display conditions.
Background art
With improvements in the technology for manufacturing thin displays such as liquid crystal displays and plasma displays, and with their falling prices, a variety of display devices for reproducing moving images now surround us.
The specification of displayable image data, including frame rate and resolution, differs according to the type of display. According to the related art, image data output from an image generating apparatus is processed inside the display device so as to generate an image suited to that display. In this arrangement, as the variety of frame rates and resolutions that displays must accommodate increases, the manpower required to develop circuits and software increases accordingly. Along with this, the processing load placed on the display device also increases.
Summary of the invention
The present invention has been made in view of the above problems, and an object thereof is to provide an image frame processing technique adapted to various types of displays.
One aspect of the present invention is an image frame processing method. The method comprises: a rendering process for rendering an image frame sequence, and a postprocess for adapting the image frame sequence generated by the rendering process to a display. In the rendering process, the image frame sequence is generated by rendering at a predetermined frame rate, without considering the condition that an image frame should meet for output to the display. In the postprocess, the image frame sequence generated by the rendering process is subjected to predetermined processing so as to generate and output an image frame sequence that meets the condition.
Since the rendering process and the postprocess are separated, the image frame sequence can be generated independently of the specification of the display, such as its resolution and frame rate.
Another aspect of the present invention provides a moving image display method for reading an image frame sequence from a memory and subjecting the image frame sequence to predetermined processing for display, comprising: when a fast-forward request occurs, selectively reading from the memory the image frames to be displayed as a fast-forward picture; creating an updated image frame by subjecting the read image frames to a predetermined merge process; and displaying the updated image frame.
According to this aspect, since a fast-forward frame is created by merging a plurality of image frames, a value-added fast-forward picture can be obtained. "Merge process" herein refers to creating an updated image frame using part or all of the image information included in a plurality of image frames.
Another aspect of the present invention provides a moving image display method for reading an image frame sequence from a memory and subjecting the image frame sequence to a merge process for display, comprising: creating an updated image frame to be displayed as a fast-forward picture by merging a plurality of image frames in the memory, wherein a number of image frames determined according to the fast-forward request, out of all the frames included in the image frame sequence, are used in the merge; and displaying the updated image frame.
Another aspect of the present invention provides an image frame processing method for creating fast-forward frames using a moving image comprising an image frame sequence stored in a memory, comprising: in response to a fast-forward request, reading from the memory, using luminance information or motion information, image frames in which a predetermined feature occurs; merging the read image frames to create the fast-forward frames; and displaying the created fast-forward frames at a predetermined frame rate as a fast-forward picture.
Another aspect of the present invention provides an image frame processing apparatus for creating fast-forward frames in response to a fast-forward request, comprising: a storage unit for storing moving image data comprising an image frame sequence; a fast-forward processing unit for reading from the storage unit, in response to the fast-forward request and using luminance information or motion information, image frames in which a predetermined feature occurs, the fast-forward processing unit also merging the read image frames to create the fast-forward frames; and a picture synthesis unit for displaying the created fast-forward frames at a predetermined frame rate as a fast-forward picture.
Another aspect of the present invention provides an image frame processing apparatus for creating fast-forward frames in response to a fast-forward request, comprising: a storage unit for storing moving image data comprising an image frame sequence; a fast-forward processing unit for creating an updated image frame to be displayed as a fast-forward picture by merging a plurality of image frames in the memory, wherein a number of image frames determined according to the fast-forward request, out of all the frames included in the image frame sequence, are used in the merge; and a picture synthesis unit for displaying the created frames at a predetermined frame rate as a fast-forward picture.
Implementations of the present invention in the form of methods, apparatuses, systems, computer programs and recording media may also be practiced as additional modes of the present invention.
Brief description of the drawings
Fig. 1 shows the hardware configuration of the entertainment apparatus.
Fig. 2 is a functional block diagram of the image frame processing apparatus.
Fig. 3 shows the base coordinate system.
Fig. 4 shows how the offset value is changed every four frames in the image frame sequence output from the rendering processing unit.
Fig. 5 illustrates point sampling involving four image frames.
Fig. 6 illustrates the generation of an image frame four times the size of an original image frame from four image frames.
Fig. 7 illustrates motion blur using four image frames.
Fig. 8 illustrates four image frames merged to produce an image frame of the same size.
Fig. 9 illustrates the generation of an enlarged image by performing point sampling on four image frames followed by bilinear interpolation.
Figure 10 illustrates bilinear sampling.
Figure 11 is a flowchart for the merge method determined in the merge condition setting unit.
Figure 12 shows the hardware configuration of an image frame processing apparatus 200 according to a second embodiment of the invention.
Figure 13 is a functional block diagram of an image frame processing apparatus according to example 4.
Figure 14 shows the concept of processing for extracting image frames from an image frame sequence and creating fast-forward frames.
Figure 15 shows the concept of processing for merging a plurality of image frames to create fast-forward frames.
Figure 16 shows the concept of processing for reducing the number of merged image frames and creating a fast-forward picture with a reduced fast-forward speed.
Figure 17 shows a functional block diagram of an image frame processing apparatus according to example 5.
Figure 18 shows the concept of processing for extracting image frames based on luminance information.
Figure 19 is a functional block diagram of an image frame processing apparatus according to example 6.
Figure 20 shows the concept of processing for separating an image frame into a specific image region and a nonspecific image region.
Figure 21 is a functional block diagram of an image frame processing apparatus according to example 7.
Figure 22 shows the concept of path drawing processing.
Figure 23 is a functional block diagram of an image frame processing apparatus capable of implementing the processing according to examples 4 to 7.
Embodiment
(First embodiment)
A description will now be given of an image frame processing apparatus according to the present invention, by way of an embodiment in which the inventive apparatus is applied to an entertainment apparatus for rendering three-dimensional computer graphics (CG) images.
Fig. 1 shows the hardware configuration of the entertainment apparatus 100. The entertainment apparatus 100 is provided with a graphics chip 18 and is capable of rendering three-dimensional images and displaying them on a display 26 in real time.
The entertainment apparatus 100 is provided with a main CPU 12, a main memory 14, a geometry processor 16, the graphics chip 18, a display controller 28 and an input/output (I/O) port 30. These blocks are connected to each other via a graphics bus 36 so as to allow mutual data transmission and reception. The display controller 28 can be connected to any of various displays 26, each having a different specification and display condition.
The I/O port 30 is connected to an external storage device 32 such as a CD-ROM drive, a DVD-ROM drive or a hard disk drive, and to an input device 34 such as a keyboard or a mouse for feeding key-entry data and coordinate data to the entertainment apparatus 100. The I/O port 30 controls data input and output in both the external storage device 32 and the input device 34. The I/O port 30 reads rendering data or programs stored in the external storage device 32 and supplies them to the main CPU 12 and the geometry processor 16. The rendering data may, for example, be object data for objects to be rendered. The I/O port 30 may be configured to communicate with other apparatuses so as to input rendering data and programs.
The main CPU 12 globally controls the entertainment apparatus 100 and executes rendering programs stored in the external storage device 32. In executing a program, the main CPU 12 controls the graphics chip 18 according to user input from the input device 34, thereby controlling the display of images.
The main CPU 12 controls the entertainment apparatus 100 by controlling data transfer between its constituent devices. For example, the main CPU controls the transfer of geometry data generated by the geometry processor 16 to the graphics chip 18, using the main memory 14 as a buffer. The main CPU 12 also manages the synchronization of data transfer between the graphics chip 18, the external storage device 32, the input device 34 and the display 26. In this embodiment, the geometry processor 16 and the main CPU 12 are provided separately. Alternatively, these elements may be integrated so that the main CPU 12 performs the functions of the geometry processor 16.
The main memory 14 stores object configuration data read from the external storage device 32 and rendering programs. The data for each object includes vertex data for the plurality of polygons constituting the associated object. The main memory 14 has a texture buffer for storing textures used in texture mapping.
Under control of the main CPU 12, the geometry processor 16 performs geometric processing on the object data stored in the main memory 14, such as coordinate conversion defining position or configuration, or processing related to light sources illuminating the vertices. The geometry data obtained as a result of the geometric processing includes the vertex coordinates of the object, the texture coordinates at the vertices, and object attributes such as the luminance at the vertices.
The graphics chip 18 comprises a rendering processor 20, a memory interface 22 and an image memory 24 such as an EDRAM. Under control of the main CPU 12, the rendering processor 20 sequentially reads the geometry data generated by the geometry processor 16 and performs rendering on the geometry data to generate image frames. The RGB values of the pixels in an image frame, and α values indicating transparency, are stored in the image memory 24. Z values indicating pixel depth are stored in a Z buffer (not shown). The Z buffer may be provided in the image memory 24.
The rendering processor 20 of the graphics chip 18 renders image frames into the image memory 24 via the memory interface 22, according to rendering commands supplied from the main CPU 12. High-speed bus connections are established between the rendering processor 20 and the memory interface 22, and between the memory interface 22 and the image memory 24, so that the rendering processor 20 can perform rendering in the image memory 24 at high speed. For example, the rendering processor 20 renders image frames of 640 × 480 pixels.
Image frames rendered by the rendering processor 20 are temporarily stored in the image memory 24. The main CPU 12 retrieves an image frame from the image memory 24 via the memory interface 22 and writes the image frame to another memory such as the main memory 14. As required, the main CPU 12 converts the image frame into an image frame that can be displayed on the display 26. The display controller 28 then receives this image frame via the bus 36 and displays it on the display 26.
Fig. 2 is a functional block diagram of the image frame processing apparatus 10. The functions in Fig. 2 are mainly implemented by the graphics chip 18, the main CPU 12 and the main memory 14. Fig. 2 is an illustration that focuses on functions. Accordingly, these functions may be implemented variously: by hardware only, by software only, or by a combination of hardware and software.
The object data reading unit 40 reads the geometry data of the objects to be rendered. The rendering processing unit 42 sequentially renders, at a predetermined frame rate, image frames including the objects at a predetermined resolution. The rendered image frames are stored in a first memory 44 serving as a buffer. The rendering processing unit 42 renders the image frames at a frame rate equal to or higher than the maximum frame rate of the display 26 intended for use. The rendering processing unit 42 corresponds, for example, to the rendering processor 20 of Fig. 1.
The transfer control unit 46 reads the image frames stored in the first memory 44 and stores them in a second memory 48. The second memory 48 stores a plurality of image frames such that the chronological order between the plurality of image frames can be identified. For example, the first memory 44 corresponds to the image memory 24 in Fig. 1, and the second memory 48 corresponds to the main memory 14. Alternatively, the second memory may be any storage device or memory provided in the image frame processing apparatus 10, such as the external storage device 32. Further alternatively, the first memory 44 and the second memory 48 may each correspond to different storage areas in the same physical memory.
The interface unit 52 acquires the resolution or frame rate information of the display 26 connected to the image frame processing apparatus 10. The interface unit 52 may acquire content information from the main CPU 12 or the rendering program, such as information indicating whether the image is a still image or a moving image. The interface unit 52 may also acquire the resolution or frame rate information of the display from the user via the input device 34. The acquired information is delivered to the postprocessing unit 50. The postprocessing unit 50 may correspond, for example, to the main CPU 12 in Fig. 1.
The postprocessing unit 50 comprises a merge condition setting unit 54, a frame sequence acquiring unit 56 and a merge executing unit 58. The postprocessing unit 50 performs postprocessing on the image frame sequence rendered by the rendering processing unit 42 and stored in the second memory 48, so as to generate images that can be displayed on the display.
More specifically, the merge condition setting unit 54 sets an appropriate merge condition for the image frame sequence based on the information received from the interface unit 52. This processing will be described later with reference to Figure 11. The frame sequence acquiring unit 56 retrieves the image frame sequence according to the condition information set by the merge condition setting unit 54, and delivers the retrieved image frames to the merge executing unit 58. The merge executing unit 58 performs a merge process on the received image frames. "Merge process" herein refers to generating a single image frame from a plurality of image frames.
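The passage above does not spell out how the merge condition setting unit chooses its condition (that is left to the flowchart of Figure 11). Purely as an illustration of the idea, the following sketch derives a merge count from the frame-rate ratio and picks a merge method from the resolutions; the function name, the dictionary keys and the selection rule are all assumptions, not taken from the patent:

```python
def set_merge_condition(render_fps, display_fps, render_res, display_res):
    """Illustrative merge-condition selection (assumed logic):
    the merge count follows the rendering/display frame-rate ratio,
    and the merge method depends on whether the display resolution
    exceeds the rendering resolution."""
    n = max(1, render_fps // display_fps)
    method = "point_sampling" if display_res > render_res else "motion_blur"
    return {"frames_per_merge": n, "method": method,
            "output_fps": render_fps // n}

# 240 fps rendering at 640x480, displayed on a 60 fps, 1280x960 display:
cond = set_merge_condition(240, 60, (640, 480), (1280, 960))
print(cond)
# {'frames_per_merge': 4, 'method': 'point_sampling', 'output_fps': 60}
```

Under these assumptions, four rendered frames are merged per output frame, matching the 240 fps → 60 fps reduction used in the examples below.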
The image frame generated by the merge process in the merge executing unit 58 is output to the image display unit 60 at the frame rate set by the merge condition setting unit 54.
As described above, one feature of the first embodiment is that the "rendering" and the "postprocessing" of an image frame sequence are not performed on the same chip, but are performed separately. An advantage obtained by separating the rendering and the postprocessing is that the dependence on the type of display connected to the image frame processing apparatus is eliminated.
More specifically, although the interface unit 52 acquires the condition that an image frame output to the display should meet, the rendering processing unit 42 generates the image frame sequence independently of the condition acquired by the interface unit 52. Subsequently, the postprocessing unit 50 performs predetermined processing on the image frame sequence generated by the rendering processing unit 42, so as to generate an updated image frame sequence for the display according to the condition acquired by the interface unit 52. Thus, the rendering method in the rendering processing unit 42 need not be changed according to the condition the image frame should meet. Therefore, the rendering processing unit 42 need only have a general-purpose structure. When the display connected to the image frame processing apparatus 10 is changed to a different type, the change is accommodated by modifying the processing in the postprocessing unit 50. Accordingly, a greater variety of displays can be connected, ensuring a considerably high level of compatibility. This ultimately reduces the manpower required to develop circuits and software for adapting the processing apparatus to various displays.
Further, according to the related art, even when a rendering processor has a high rendering capability, the specification of the display often fails to match that capability, which requires the rendering capability of the rendering processor to be limited. According to the first embodiment, however, the rendering capability of the rendering processor can be utilized fully, without limitation. The postprocessing can be arranged such that the capability for rendering at a high frame rate is allocated to improving resolution or the like. By ensuring that the postprocessing is highly versatile, flexibility in designing the image frame processing apparatus can be improved. In addition, since the display is not assigned the task of performing image frame processing, the processing load placed on the display can be reduced.
It is also noteworthy that, when the frame rate of rendering in a moving image such as an animated video is changed, the animated image displayed on the display may exhibit motion different from what the image creator intended. According to the related art, one method of dealing with this problem is to prepare a plurality of image sequences in consideration of the frame rate of the image frames ultimately displayed on the display, so that an optimal viewing experience is enjoyed at any frame rate. In contrast, according to the first embodiment, only one image frame sequence with a high frame rate need be prepared, irrespective of the frame rate of the display.
The first memory 44 serves as a frame buffer that stores, frame by frame, the image frame sequence rendered by the rendering processing unit 42. The image frames temporarily stored in the first memory 44 are sequentially transferred to the second memory 48. The second memory 48 thus mainly serves as a work area for the postprocessing unit 50. The first memory 44 serving as the frame buffer is usually implemented by an EDRAM or the like, whose cost is generally very high. If the second memory were not provided in this embodiment, the first memory would need to be a large-capacity memory, because, as described later, a maximum of four frames should be stored for the merge process during four frame periods. By providing the second memory 48, the first memory 44 need only have the capacity to store at least one image frame rendered by the rendering processing unit 42. Therefore, providing the second memory 48 as a work area for the postprocessing unit 50, in addition to the first memory 44, is advantageous. The first memory 44 may be built into the same semiconductor circuit element in which the rendering processing unit 42 is built.
Fig. 3 shows the base coordinate system in which the pixels of each image frame are located in the rendering processing unit 42. The horizontal axis is indicated by x and the vertical axis by y. A pixel coordinate set is indicated by (x, y). Each of the x and y coordinate values is a fixed-point value, represented by a 12-bit integer part and a 4-bit fraction part. As illustrated, the center of each pixel is located at an integer coordinate point in the base coordinate system. The rendered image frame is stored pixel by pixel in the first memory. The coordinate system used to indicate positions in the first memory is referred to as the window coordinate system. Storage address calculation is implemented using this coordinate system. The window coordinate system is a coordinate system used to indicate positions in the frame buffer, with the top-left point of the rectangular area in the buffer as its origin.
Assuming the base coordinate values are (Px, Py) and the offset values are (Offx, Offy), the window coordinate values (Wx, Wy) are given by the following equations.
Wx=Px-Offx
Wy=Py-Offy
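The conversion above can be sketched in a few lines of Python; this is only an illustration of the two equations (the 12.4 fixed-point representation is modeled with plain floats, and the function name is invented):

```python
def to_window(base, offset):
    """Convert base coordinates (Px, Py) to window coordinates (Wx, Wy)
    by subtracting the per-frame offset (Offx, Offy):
    Wx = Px - Offx, Wy = Py - Offy."""
    px, py = base
    offx, offy = offset
    return (px - offx, py - offy)

# A pixel centered at base (2.0, 3.0) in a frame rendered with offset
# (0.5, 0.5) lands at window coordinates (1.5, 2.5).
print(to_window((2.0, 3.0), (0.5, 0.5)))  # (1.5, 2.5)
```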
A description will now be given of several examples in which the image frame processing apparatus shown in Fig. 2 generates, in the postprocessing, image frame sequences adapted to various display conditions. Assume that the rendering processing unit 42 renders 640 × 480 images at a frame rate of 240 frames per second (hereinafter referred to as fps).
(Example 1)
Fig. 4 shows how the offset value is changed every four frames in the image frame sequence output from the rendering processing unit 42. For convenience, assume that image frames F1, F2, F3 and F4 are generated by rendering in the stated order. The rendering processing unit 42 renders the first image frame F1 without any offset, renders the second image frame F2 with an offset of (0.5, 0), renders the third image frame F3 with an offset of (0, 0.5), and renders the fourth image frame F4 with an offset of (0.5, 0.5). The offset in rendering by the rendering processing unit 42 is implemented by successively displacing the coordinates of the rendering start point in the rendering space. Hereinafter, such processing will be referred to as "pixel displacement processing".
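The four-frame offset cycle described here can be written down directly; a minimal sketch (the helper name and 0-based frame indexing are illustrative only):

```python
# Offset cycle used in pixel displacement processing: the pattern
# (0,0), (0.5,0), (0,0.5), (0.5,0.5) repeats every four frames.
OFFSETS = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (0.5, 0.5)]

def offset_for_frame(i):
    """Return the (Offx, Offy) rendering offset for frame index i (0-based)."""
    return OFFSETS[i % 4]

# Frames F1..F4 (indices 0..3), then the cycle repeats for F5 (index 4).
print([offset_for_frame(i) for i in range(5)])
# [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (0.5, 0.5), (0.0, 0.0)]
```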
Fig. 5 and Fig. 6 illustrate a first example of the merge process. In this example 1, the frame sequence is merged to create an image frame four times the size of a rendered image frame.
Fig. 5 is a schematic diagram illustrating how the pixels rendered with pixel displacement by the rendering processing unit 42 are arranged in the same window coordinate system. Referring to Fig. 5 and Fig. 6, the circles indicated by "1" represent pixels from the first image frame F1, the circles indicated by "2" represent pixels from the second image frame F2, the circles indicated by "3" represent pixels from the third image frame F3, and the circles indicated by "4" represent pixels from the fourth image frame F4. The spacing between the centers of neighboring pixels in each frame is "1" in the x and y directions. As a result of the pixel displacement processing performed by the rendering processing unit 42, the pixels from the image frame F2 are displaced by 0.5 in the x direction with respect to the associated pixels from the image frame F1, the pixels from the image frame F3 are displaced by 0.5 in the y direction with respect to the associated pixels from the image frame F1, and the pixels from the image frame F4 are displaced by 0.5 in the x direction and by 0.5 in the y direction with respect to the associated pixels from the image frame F1. Therefore, when these four image frames are arranged in the same coordinate system, the pixels from the respective image frames are equally spaced by 0.5 in the x and y directions, as illustrated in Fig. 5.
By performing grid sampling in the window coordinate system in units of 0.5 pixel, rather than in units of 1 pixel, an image frame with twice the number of pixels in each of the x and y directions can be created. This will be described with reference to Fig. 6, a schematic diagram illustrating how the pixels are arranged. Although for simplicity Fig. 6 shows only 4 horizontal pixels and 3 vertical pixels for each of the image frames 102, 104, 106 and 108, each frame actually has 640 horizontal pixels and 480 vertical pixels. All pixels are arranged in the relationship illustrated in the image frame 110 in Fig. 6. By arranging these 640 × 480-pixel image frames 102, 104, 106 and 108 in a grid configuration as described above, an image frame 110 with a size of 1280 × 960 pixels is created, four times the size of a 640 × 480-pixel image frame. Hereinafter, such a sampling method is referred to as "point sampling".
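The point-sampling merge amounts to interleaving the four displaced frames into a doubled grid: F1's pixels fill the even-row/even-column positions, F2 (shifted 0.5 in x) the odd columns, F3 (shifted 0.5 in y) the odd rows, and F4 the odd/odd positions. A minimal sketch on tiny frames (the function name and frame contents are invented for illustration):

```python
def point_sample_merge(f1, f2, f3, f4):
    """Interleave four equally sized frames (lists of rows) into one
    frame with twice the width and height, preserving the half-pixel
    displacement between them."""
    h, w = len(f1), len(f1[0])
    out = [[None] * (2 * w) for _ in range(2 * h)]
    for y in range(h):
        for x in range(w):
            out[2 * y][2 * x] = f1[y][x]          # even row, even col
            out[2 * y][2 * x + 1] = f2[y][x]      # even row, odd col (x-shift)
            out[2 * y + 1][2 * x] = f3[y][x]      # odd row, even col (y-shift)
            out[2 * y + 1][2 * x + 1] = f4[y][x]  # odd row, odd col (both)
    return out

# 1x1 frames, each labeled by its source frame number, merge into a
# 2x2 frame laid out exactly as in Fig. 5.
print(point_sample_merge([[1]], [[2]], [[3]], [[4]]))  # [[1, 2], [3, 4]]
```

With 640 × 480 inputs, the same loop produces the 1280 × 960 frame described above.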
According to this example 1, the rendering processing unit 42 generates a plurality of image frames by rendering with spatial displacement. Subsequently, the postprocessing unit 50 performs a merge process on the image frames that preserves the displacement between the image frames, so as to generate an image frame sequence with a higher spatial resolution than the image frame sequence output from the rendering processing unit 42. The phrase "preserving the displacement between the image frames" means that the final image frame is obtained by using the pixels from each offset image frame without any modification. Thus, image frame sequences of different resolutions adapted to different displays can be generated by the postprocessing.
Example 1 can also be construed as a measure for reducing the frame rate. For example, by generating one image frame from four image frames as described above, the frame rate is reduced to 1/4. This is advantageous when the maximum frame rate of the display is lower than the rendering frame rate of the rendering processing unit 42: point sampling can be performed to obtain images with a low frame rate and a high resolution.
The present embodiment is not limited to generating one image frame from four image frames. Alternatively, by generating one image frame from nine image frames, it is possible to generate an image frame with nine times as many pixels as an original image frame. The same holds when a larger number of image frames is involved. The larger the number of image frames, the lower the frame rate of the finally obtained image frames.
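In general, merging N = k × k displaced frames multiplies the pixel count by N and divides the output frame rate by N. A small helper (the function name and return shape are assumptions) makes the trade-off explicit:

```python
def merged_spec(width, height, render_fps, n_frames):
    """Resolution and frame rate after point-sample merging n_frames
    (a square number k*k) rendered frames into one output frame."""
    k = int(n_frames ** 0.5)
    assert k * k == n_frames, "n_frames must be a square number"
    return (width * k, height * k, render_fps // n_frames)

# Four 640x480 frames rendered at 240 fps -> one 1280x960 frame at 60 fps.
print(merged_spec(640, 480, 240, 4))  # (1280, 960, 60)
# Nine frames -> 1920x1440 at 26 fps (integer division of 240/9).
print(merged_spec(640, 480, 240, 9))  # (1920, 1440, 26)
```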
(example 2)
Fig. 7 and 8 has illustrated and has merged the second example of processing.In this example 2, by merging successive frame, realize motion blur effects.
Referring to Fig. 7, empty circle representative is from the pixel that has stood the picture frame F1-F4 of pixel displacement processing.In this example, from the rgb value of four adjacent image points, obtain average RGB value, the value then result being obtained is as new rgb value.Synthetic image frame makes as shown in the diagonal line shaded circles in Fig. 7, the centre between the center of the center of each pixel in four empty circle pixels.
Fig. 8 schematically illustrates this layout.That is to say, generated 680 * 480 picture frame, it has by the rgb value of the pixel in 640 * 480 picture frame F1-F4 is multiplied by 0.25 rgb value obtaining.
According to Example 2, the rendering processing unit 42 generates a plurality of image frames by rendering with spatial displacement. The post-processing unit 50 then performs on the image frames a merging process that cancels the displacement between the image frames. As a result, an updated image frame sequence is generated that has the same spatial resolution as the image frame sequence output from the rendering processing unit 42. The phrase "canceling the displacement between image frames" means that an unshifted image frame is ultimately obtained by blending the offset rendered image frames. In this example, one pixel is generated by blending four pixels. Canceling the displacement between image frames is substantially equivalent to generating time-divided frames between two temporally successive image frames and using the average of the time-divided images as the image of the target frame. Therefore, if the content of the image frame sequence rendered by the rendering processing unit is a moving image, canceling the displacement applies a motion blur effect to the moving image. Like Example 1, Example 2 can be understood as a measure for reducing the frame rate. More specifically, Example 2 makes it possible, while keeping the original image frame resolution, to output an image frame sequence whose frame rate is 1/4 of that of the rendering processing unit 42.
Example 2 can also be used when the content is a still image. In this case, an antialiasing effect is applied to the still image obtained by the merging. The merging process in this case is similar to "super sampling", in which an original pixel is divided into sub-pixels to obtain data for a target pixel of the image, and the average of the sub-pixel data is adopted as the pixel data.
(Example 3)
Fig. 9 illustrates the merging process according to Example 3. In Example 3, an image frame is generated whose aspect ratio differs from that of the original image frames generated by the rendering processing unit 42. For example, suppose that the rendering processing unit 42 generates a first image frame 112, a second image frame 114, a third image frame 116 and a fourth image frame 118, each with a resolution of 720×480 pixels, and that from the image frames 112-118 a target image frame 122 with a resolution of 1920×1080 pixels, whose aspect ratio differs from that of the original image frames 112-118, is generated. Referring to Fig. 9, the numerals "1", "2", "3" and "4" indicate that the associated pixels are from the first image frame 112, the second image frame 114, the third image frame 116 and the fourth image frame 118, respectively.
In a first step 130, the point sampling described in Example 1 is performed. This generates an image frame 120 of 1440×960 pixels (i.e., four times the size of each original image frame 112-118). Subsequently, in a second step 132, bilinear sampling is performed to generate an image frame 122 with a resolution of 1920×1080 pixels. Bilinear sampling is an image interpolation method. In this example, the color of a pixel to be rendered is determined by linear interpolation of the RGB values of the four surrounding pixels.
Bilinear sampling will now be described with reference to Fig. 10. Suppose that the image frame 120 (1440×960) is scaled to the image frame 122 (1920×1080); the coordinates of the pixel centers of the image frame 122 are computed in the coordinate system of the image frame 120. Fig. 10 shows a part 124 of the image frame 120 of Fig. 9. The empty circle 140 represents the center of a pixel in the image frame 122. To determine the color to be used when rendering this pixel at these coordinates, the RGB values are subjected to linear interpolation according to the coordinate displacements from the centers of the pixels 124a, 124b, 124c and 124d of Fig. 10. Note that the pixel 124a is a pixel from the first image frame 112 of Fig. 9, the pixel 124b is from the second image frame 114, the pixel 124c is from the third image frame 116, and the pixel 124d is from the fourth image frame 118. Letting α be the horizontal displacement from the centers of the four pixels 124a-124d and β the vertical displacement (see Fig. 10), the RGB values of the empty circle 140 computed by linear interpolation are given by the following equations.
R = (1-α)(1-β)Rs1 + α(1-β)Rs2 + (1-α)βRs3 + αβRs4        (1)
G = (1-α)(1-β)Gs1 + α(1-β)Gs2 + (1-α)βGs3 + αβGs4        (2)
B = (1-α)(1-β)Bs1 + α(1-β)Bs2 + (1-α)βBs3 + αβBs4        (3)
where Rs, Gs and Bs represent the RGB values of the four pixels 124a-124d, and the suffixes s1, s2, s3 and s4 denote the components of the pixels 124a, 124b, 124c and 124d, respectively. The image frame 122 is generated by computing equations (1)-(3) to determine the color of every pixel included in the image frame 122.
Equations (1) to (3) are based on the same principle of calculation as ordinary bilinear sampling. The difference from ordinary bilinear sampling is that the color components are obtained from pixels of different image frames.
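Equations (1)-(3) can be written directly as a short Python function. This is an illustrative sketch (the function name and tuple representation are assumptions); the four input pixels play the roles of 124a-124d, each coming from a different offset frame.

```python
def bilinear_sample(p1, p2, p3, p4, alpha, beta):
    """Equations (1)-(3): interpolate an RGB tuple from the four
    surrounding pixels 124a-124d, which originate from the four offset
    image frames. alpha/beta are the horizontal/vertical displacements
    of the target pixel center, each in [0, 1)."""
    w1 = (1 - alpha) * (1 - beta)  # weight of pixel 124a (frame 112)
    w2 = alpha * (1 - beta)        # weight of pixel 124b (frame 114)
    w3 = (1 - alpha) * beta        # weight of pixel 124c (frame 116)
    w4 = alpha * beta              # weight of pixel 124d (frame 118)
    return tuple(w1 * a + w2 * b + w3 * c + w4 * d
                 for a, b, c, d in zip(p1, p2, p3, p4))
```

With α = β = 0 the result is exactly pixel 124a; with α = β = 0.5 it is the plain average of the four pixels, since each weight becomes 0.25.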
When the resolution of the image frame sequence rendered by the rendering processing unit 42 and the resolution displayed on the display are not in a relation where one is an integral multiple of the other — for example, when the rendered image frames are 720×480 and the resolution of the image frames displayed on the display is 1920×1080 — the target resolution cannot be obtained merely by performing the pixel displacement process and point sampling. In this case, an intermediate image frame of 1440×960 is generated by the pixel displacement process and point sampling, and bilinear sampling is subsequently performed to obtain a 1920×1080 image frame. The frame rate is thereby reduced to 1/(the number of image frames used in point sampling).
The pixel displacement process may be dispensed with. More specifically, the final image frame may be obtained by performing bilinear sampling directly on the original image frames rendered by the rendering processing unit 42. However, by performing the pixel displacement process and point sampling before bilinear sampling, it is possible to obtain an image that exhibits less degradation when enlarged. Alternatively, each of four image frames in the image frame sequence (for example, of size 720×480) may be enlarged into an image frame suitable for display (for example, 1920×1080), whereupon the four enlarged image frames may be blended to obtain the final image frame.
The user may be given an opportunity to select one of the above Examples 1 to 3. In one method of practicing this embodiment, the image frame processing device makes this determination automatically. Fig. 11 is a flowchart of the automatic determination performed by the merging condition setting unit 54. This flowchart determines, according to the frame rate, the resolution and the content, which of the post-processes described in Examples 1 to 3 is to be performed on the image frame sequence rendered by the rendering processing unit 42.
The merging condition setting unit 54 compares the frame rate information of the display, obtained via the interface unit 52, with the frame rate of the image frames rendered by the rendering processing unit 42, to determine whether the frame rates match (S10). When the frame rates match (Y in S10), the transmission control unit 46 transfers the image frames from the first memory 44 to the second memory 48. The frame sequence acquiring unit 56 reads the image frames from the second memory 48 at intervals equal to the frame rate of the rendering processing unit 42. The merging execution unit 58 outputs the image frames to the image display unit 60 without performing any post-processing such as the merging process (S12). In this way, the rendering performance of the rendering processing unit 42 can be fully exploited, and images can be presented at the full-spec frame rate. In an alternative method, the image frame sequence may be output without being temporarily stored in the second memory 48. More specifically, using the first memory 44 as a buffer, the frame sequence acquiring unit 56 directly reads the image frame sequence rendered by the rendering processing unit 42 and outputs the image frames to the image display unit 60.
When the frame rates do not match (N in S10), the merging condition setting unit 54 compares the resolution of the image frames rendered by the rendering processing unit 42 (hereinafter referred to as the "rendered image frames") with the resolution information of the display obtained via the interface unit 52, to determine whether the resolutions match (S14). If the resolution of the display is higher than that of the rendered image frames (Y in S14), the merging condition setting unit 54 determines whether the image content presented on the display is a still image or a moving image (S16). This determination may be made by reading information registered in the header of the program or the like. Alternatively, the determination between still and moving images may be made based on the value of a motion component, calculated by a motion determining unit (not shown) as the difference between adjacent image frames. If the content is a still image, such as the screen image of a word-processor document or an HTML document (Y in S16), the merging condition setting unit 54 further determines whether the resolution of the display is an integral multiple of the resolution of the rendered image (S18). The determination "integral multiple" is made when the vertical and horizontal pixel counts of one resolution are integral multiples of the vertical and horizontal pixel counts of the other — for example, when the rendered image frames are 640×480 and the resolution of the display is 1280×960 or 1920×1440. When the resolution of the display is an integral multiple of that of the rendered image frames (Y in S18), Example 1 described with reference to Figs. 5 and 6 is performed to obtain an image frame sequence of the desired resolution (S22). Accordingly, when the resolution of the display is twice as high, the merging condition setting unit 54 makes the frame sequence acquiring unit 56 retrieve four image frames; when the resolution of the display is three times as high, it makes the frame sequence acquiring unit 56 retrieve nine image frames. The frame sequence acquiring unit 56 delivers the obtained image frames to the merging execution unit 58, whereupon the merging execution unit 58 performs point sampling on the delivered image frames so as to output an image frame of the desired resolution to the image display unit 60.
When it is determined at S18 that the resolution of the display is not an integral multiple of that of the rendered image frames (N in S18), Example 3 described with reference to Figs. 9 and 10 is performed to obtain an image frame sequence of the desired resolution (S20). More specifically, the merging condition setting unit 54 generates an image frame whose resolution is an integral multiple of the resolution of the rendered image frames and closest to the desired resolution. By performing bilinear sampling on the generated image frame, the merging condition setting unit 54 further generates an image in which the aspect ratio is not maintained.
When it is determined at S16 that the content is a moving image, such as motion CG or a movie (N in S16), the merging condition setting unit 54 performs Example 2 described with reference to Figs. 7 and 8, to obtain a moving image with a motion blur effect (S24). More specifically, the merging condition setting unit 54 performs the merging process using a number of image frames determined by (frame rate of the rendered image frames)/(frame rate of the display). For example, when the frame rate of the rendered image frames is 240 fps and the frame rate of the display is 60 fps, 4 (= 240/60) frames are used for the merging process. If the result of the division is not an integer — for example, when the frame rate of the rendered images is 240 fps and the frame rate of the display is 70 fps, the result of the division is approximately 3.4 (= 240/70) — the fractional part is discarded, and the merging process is performed on three image frames so that the resulting image frames are output at 70 fps.
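The frame-count rule of step S24, including the discarding of the fractional part, can be sketched in one line of Python (the function name is illustrative, not from the specification):

```python
def frames_per_merge(rendered_fps, display_fps):
    """Number of rendered frames merged into one output frame (step S24).
    The fractional part of the ratio is discarded, e.g. 240/70 -> 3."""
    return int(rendered_fps // display_fps)
```

For the examples given above, 240 fps rendered against a 60 fps display yields 4 frames per merge, and against a 70 fps display yields 3.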
In any case, the frame rate of the image frame sequence output from the post-processing unit 50 is lower than the frame rate of the image frames generated by the rendering processing unit 42. For example, performing point sampling reduces the frame rate to 1/4 in producing up to twice the resolution, and reduces the frame rate to 1/9 in producing up to three times the resolution. Therefore, even when the desired resolution is obtained as a result of the processing of S20 or S22, flicker or the like may occur on the screen due to the low frame rate. In an alternative method, a user prompting unit (not shown) may be provided which warns the user on screen that the frame rate will drop significantly when the desired resolution is obtained, and prompts the user to accept. When the user accepts, point sampling is performed; when not, point sampling is not performed. In yet another method, the merging condition setting unit 54 may refer to the specifications of the display (resolution and frame rate) and the post-processes that can be performed in the post-processing unit 50, and then display on screen a list of the possible pairs of resolution and frame rate. The user prompting unit (not shown) may prompt the user to select a desired pair, and the selected pair is transferred to the merging condition setting unit 54. In response, the merging condition setting unit 54 then directs the frame sequence acquiring unit 56 and the merging execution unit 58.
The above processing may be performed before the rendered image frames are displayed. Alternatively, image frames that have undergone the merging process according to a predefined algorithm may be displayed, so that the user, viewing the displayed result, can judge according to his or her taste whether to proceed with the above processing.
Returning to S14, when the resolution of the display is equal to or less than that of the rendered image frames (N in S14), the merging condition setting unit 54 determines whether the content is a still image (S26). When the content is a still image (Y in S26), the merging condition setting unit 54 skips some image frames for display (S28). More specifically, the merging condition setting unit 54 instructs the frame sequence acquiring unit 56 to obtain one image frame out of every given number of rendered image frames. The merging execution unit 58 outputs these image frames to the image display unit 60 without subjecting them to post-processing. For example, when the output frame rate of the rendered image frames is 240 fps and the frame rate of the display is 60 fps, every fourth image frame is output.
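The frame skipping of step S28 amounts to stride-based selection. The following sketch is illustrative (the function name is an assumption); it outputs one frame out of every (rendered fps / display fps) frames, e.g. every fourth frame for 240 fps against a 60 fps display.

```python
def skip_frames(frames, rendered_fps, display_fps):
    """Frame skipping for still content (step S28): keep one frame out of
    every (rendered_fps // display_fps) rendered frames, unprocessed."""
    step = rendered_fps // display_fps
    return frames[::step]  # e.g. every 4th frame for 240 -> 60 fps
```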
When it is determined in S26 that the content is a moving image (N in S26), Example 2 is performed as described above to obtain a motion-blurred moving image (S30).
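The decision flow of Fig. 11 (steps S10-S30) can be summarized as a small dispatch function. This is a sketch under stated assumptions: the return labels are illustrative names, not terms from the specification, and resolutions are taken as (width, height) pairs.

```python
def choose_postprocess(src_fps, dst_fps, src_res, dst_res, is_still):
    """Sketch of the merging condition setting unit's decisions (Fig. 11).
    src_* describe the rendered frames, dst_* describe the display."""
    if src_fps == dst_fps:                                    # S10
        return "pass-through"                                 # S12
    if dst_res[0] > src_res[0] and dst_res[1] > src_res[1]:   # S14
        if is_still:                                          # S16
            integral = (dst_res[0] % src_res[0] == 0 and
                        dst_res[1] % src_res[1] == 0)         # S18
            if integral:
                return "example1-point-sampling"              # S22
            return "example3-bilinear"                        # S20
        return "example2-motion-blur"                         # S24
    if is_still:                                              # S26
        return "frame-skipping"                               # S28
    return "example2-motion-blur"                             # S30
```

For instance, 640×480 still content rendered at 240 fps for a 1280×960 display selects the Example 1 path, while 720×480 frames for a 1920×1080 display fall through to Example 3.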
Thus, by comparing the frame rate or the resolution of the image frame sequence rendered by the rendering processing unit 42 with the frame rate or the resolution of the display connected to the image frame processing device, the merging condition setting unit 54 can automatically determine the conditions of the post-processing.
As described above, the rendering processing unit is allowed to perform rendering at a predetermined frame rate to generate an image frame sequence, independently of the conditions to be satisfied by the image frames to be output to the display. The post-processing unit then performs predetermined processing on the image frame sequence generated by the rendering process, and outputs an updated image frame sequence that satisfies the aforementioned conditions.
Since the rendering process and the post-processing are performed separately, it is possible to perform rendering at a predetermined frame rate to generate an image frame sequence regardless of display specifications such as resolution or frame rate.
The rendering processing unit 42 has been described as rendering 640×480-pixel images at 240 fps. Image frames of other pixel counts may be rendered, and the rendering rate of the image frames may also be lower or higher. For example, the image frame sequence may be rendered at 300 fps. In this case, image frame sequences suitable for both 50-Hz displays and 60-Hz displays can be generated.
In the embodiment described above, the merging process is described as being performed on pixels from four image frames. Alternatively, the merging process may be performed on a larger number of pixels. For example, six image frames may be rendered using pixel displacement such that the pixels of the respective frames are located at the respective vertices of a hexagon, and an image frame may be formed of pixels having the average RGB value of the six pixels.
In the above description of this embodiment, the image frame processing device is described as being built into an entertainment device for rendering CG images. However, the image frame processing technology according to the present invention can also be applied to DVD players, personal computers, digital video cameras and the like.
(Second embodiment)
In the first embodiment, more image frames are rendered than are needed to display a moving image on the display. The rendered image frames are then subjected to predetermined processing so that image frames are output for display. Conversely, an embodiment is also envisioned in which, when a moving image is provided in advance, a plurality of image frames are picked up from the moving image and subjected to predetermined processing, and fewer image frames than those read are then output. With the latter embodiment, a fast-forward picture of the original moving image can be created. In addition, a rewinding moving image can also be created. A rewinding moving image is an image output in the reverse direction along the time axis of the moving image. Hereinafter, "fast-forwarding" includes the action of rewinding.
These two embodiments differ at first sight. However, they share the same principle, in which more image frames than those ultimately provided to the user are subjected to predetermined processing, and updated image frames are then output. In other words, the difference between the two embodiments lies only in the interval at which image frames are output.
Recently, digital moving image recorders such as HDD (hard disk drive) video recording devices have become widespread. A large amount of moving image data can therefore easily be created, recorded or played personally. In such a device, the user searches for a part of interest in the recorded moving image data using a fast-forward function. However, when a moving image is fast-forwarded, the user often misses the part of interest during the search, and sometimes finds the search inconvenient.
Therefore, in the second embodiment, an image frame processing technology will be provided that outputs a moving image that is easy to watch even when the moving image is fast-forwarded.
Fig. 12 shows the hardware configuration of an image frame processing device 200 according to the second embodiment. The main CPU 12, the main memory 14, the display 26, the display controller 28, the input/output (I/O) port 30 and the external storage device 32 are identical to the blocks of the first embodiment shown in Fig. 1; the same numerals are therefore assigned to these blocks, and further description of them is omitted. A camera device 38 such as a digital video camera is connected to the input/output port 30. The moving image obtained by the camera device 38 is stored as digital data in the external storage device 32, such as a DVD (digital versatile disc) drive or hard disk drive. The graphics processor 80 selects an image frame sequence from the moving image data stored in the external storage device 32 and stores it in the main memory 14. The graphics processor 80 then performs predetermined processing on the image frame sequence to create an updated image frame sequence, and outputs the updated sequence to the display 26.
The image frame processing device 200 can be incorporated into various types of moving image display devices that display, on the display 26, moving images composed of image frame sequences. Such moving image display devices may include various devices for storing or playing movie content, such as DVD players and HDD video recorders. In addition, the moving image display device may also be incorporated into a personal computer, digital video camera or entertainment device.
The input device 84 provides input to the image frame processing device 200. Depending on the type of moving image display device, various types of equipment may serve as the input device 84. For example, if the moving image display device is a DVD player or HDD video recorder, the input device 84 may be the various buttons provided on a remote control or on the moving image display device itself. If the moving image display device is a general-purpose computer, the input device 84 may be a keyboard or a mouse.
In this second embodiment, the creation of a fast-forward image will be described for the case where a fast-forward request is received from the user for movie content created in advance and recorded on a mass storage device such as a DVD drive or HDD drive. Like the first embodiment, the second embodiment can be applied to an entertainment device that performs rendering to create a new image frame sequence for display on a display.
Now, methods for creating a value-added fast-forward moving image in the image frame processing device 200 shown in Fig. 12 will be described with reference to some examples.
(Example 4)
Fig. 13 is a functional block diagram of the image frame processing device 200 according to Example 4. The features in Fig. 13 can be implemented mainly using the graphics processor 80, the main CPU 12 and the main memory 14. Example 4 illustrates a method for providing a fast-forwarded, smooth moving image in response to a fast-forward request from the user.
The interface unit 202 obtains the fast-forward request made by the user via the input device 84. For example, if the image frame processing device 200 is incorporated in a DVD player, the fast-forward request corresponds to pressing a fast-forward button provided on the main body or remote control, or to fast-forward speed information specified with a dial, such as "double speed" or "4× (quadruple) speed". The fast-forward request may also be specified in the header portion of the moving image data rather than given by the user. The interface unit 202 sends the obtained information to the transmission frame number determining unit 206. The transmission frame number determining unit 206 determines the number of image frames needed to realize the fast-forward moving image at the received fast-forward speed. The frame transmission unit 208 reads, at constant timing, the number of image frames determined by the transmission frame number determining unit 206 from the image frame sequence stored in the storage unit 250. The frame transmission unit 208 then transfers these frames to the fast-forward unit 220. The storage unit 250 corresponds, for example, to the main memory 14 in Fig. 12. However, the storage unit 250 may be any storage unit or memory provided in the image frame processing device 200, such as the external storage device 32. In addition, the image frames in the storage unit 250 may be uncompressed images, or they may be images compressed using DCT (discrete cosine transform).
The fast-forward unit 220 comprises a frame sequence acquiring unit 222 and a merging execution unit 224. The frame sequence acquiring unit 222 obtains the transferred image frames and temporarily stores them. The merging execution unit 224 performs a merging process that generates one updated image frame from the plurality of image frames stored in the frame sequence acquiring unit 222. This merging process may be the blending process described in the first embodiment. The updated image frame is referred to as a "fast-forward frame".
The fast-forward frames generated by the merging execution unit 224 are transferred, in order of generation, to the picture composition unit 240. The picture composition unit 240 outputs these fast-forward frames at a predetermined frame rate displayable on the display 26. The user can thus watch the desired fast-forward picture on the display 26.
When the merging process is performed on a plurality of image frames, a pseudo-afterimage is produced in the fast-forward frame. By sequentially outputting such fast-forward frames, a fast-forward picture with a motion blur effect can be obtained. In this way, the user can enjoy lifelike and smooth moving pictures.
Incidentally, processing is conceivable in which one image frame is extracted from the image frame sequence in the storage unit 250 every predetermined number of frames, and the extracted frames are output at a predetermined frame rate without any merging process for creating the fast-forward picture. Now, in order to appreciate the advantage of Example 4, the shortcomings of a fast-forward picture created by such processing will be described with reference to Fig. 14.
Fig. 14 shows the principle of such processing, which comprises the steps of: extracting an appropriate number of image frames from a previously prepared image frame sequence 300; and creating a fast-forward frame sequence 310.
The image frame sequence 300 comprises image frames 301-309 and a large number of other image frames ranging before and after these frames. The image frames 301-309 represent a moving image in which a circular object 400 moves from the upper left to the lower right of the screen. In practice, the circular object 400 would not move smoothly unless many more image frames than the frames 301-309 were used. However, to simplify the explanation of Fig. 14, it is assumed that the smooth movement of the circular object 400 is achieved using only the image frames 301-309.
The stars 401 shown in the image frames 303 and 307 represent flashes of the circular object 400. In this image frame sequence 300, the circular object 400 appears in the upper left corner of the screen, then moves to the lower right corner, flashing twice.
In this example, one image frame is extracted from every three image frames. More specifically, in the image frame sequence 300, the image frames 301, 304 and 307 are extracted every three image frames. These extracted image frames then become the fast-forward frames 311, 312 and 313, respectively, without any processing. Accordingly, a fast-forward picture can be created by extracting one image frame from every appropriate number of image frames to create the fast-forward frame sequence 310, and outputting this sequence 310 at a predetermined frame rate. In the example shown in Fig. 14, a 3× fast-forward picture is obtained.
However, with such processing, when the differences between the image frames extracted from the original image frame sequence are large — particularly in the case of very fast forwarding — the picture may advance jerkily, frame by frame. For the user, the picture therefore becomes a poor one. Moreover, the image frame sequence 300 includes the image frame 303, which depicts a flash of the circular object 400, but the fast-forward frame sequence 310 does not include this image frame 303. A user watching the fast-forward picture comprising the frame sequence 310 therefore cannot recognize that the circular object 400 flashed twice.
This shows that, in such processing, an image frame carrying important information — such as the flash of the object in Fig. 14 — may not be included in the fast-forward picture. In other words, although some event occurs in the original image frame sequence, the user may be unable to see the event while watching the fast-forward picture. Consequently, when the user searches for a scene of interest from the fast-forward picture using specific information as a clue, the user cannot find the scene if the fast-forward picture lacks that information.
Next, the method for creating a fast-forward picture according to Example 4 will be described with reference to Fig. 15. Instead of extracting one image frame from every three image frames, the merging execution unit 224 performs the merging process on three image frames of the image frame sequence 300 to create each fast-forward frame. More specifically, the merging execution unit 224 performs the merging process on the image frames 301-303 to create a fast-forward frame 321, on the image frames 304-306 to create a fast-forward frame 322, and on the image frames 307-309 to create a fast-forward frame 323.
This merging process corresponds to creating an image frame in which each pixel is the weighted average of the pixels located at the same position in the source image frames. More specifically, when n image frames F_m (m = 1, …, n, n being a positive integer) are used to create a fast-forward frame F_f,
F_f = ∑ α_m·F_m        (4)
where α_m represents the weight coefficient of each image frame, and ∑ α_m = 1 is satisfied. As can be seen from expression (4), the weight ratios of the image frames need not be equal. For example, a high weight ratio may be applied to image frames adjacent to a certain image frame, and the further an image frame is from that image frame, the lower the weight ratio applied to it may be. How the values of α_m are distributed depends on the characteristics of the fast-forward frame F_f.
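Expression (4) can be sketched directly in Python. This is an illustrative fragment (the function name and array representation are assumptions): each frame contributes to the fast-forward frame in proportion to its weight α_m, and the weights sum to 1.

```python
import numpy as np

def weighted_merge(frames, weights):
    """Expression (4): F_f = sum(alpha_m * F_m), with sum(alpha_m) == 1.
    Sketch of the fast-forward merging; frames are equally sized arrays."""
    assert abs(sum(weights) - 1.0) < 1e-6  # the alpha_m must sum to 1
    acc = np.zeros(frames[0].shape, dtype=np.float32)
    for alpha, frame in zip(weights, frames):
        acc += alpha * frame.astype(np.float32)
    return (acc + 0.5).astype(np.uint8)  # round to nearest
```

Equal weights (α_m = 1/n) give the plain average used in Fig. 15; uneven weights, e.g. emphasizing the middle frame, shape the pseudo-afterimage discussed below.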
According to the merging process described above, fast-forward frames 321, 322 and 323 are obtained which bear pseudo-afterimages of the circular object 400 moving between the image frames. In Fig. 15, the afterimages of the circular object 400 are represented as empty circles or stars. Thus, when the fast-forward frame sequence 320 containing the fast-forward frames 321-323 is reproduced, a motion-blurred, smoothly moving picture can be obtained, and the user's eye fatigue can be alleviated. Moreover, as can be seen from the fast-forward frames 321 and 323, the flash image of the object remains in these frames as a pseudo-afterimage. Information in the original image frames is therefore not lost through the creation of the fast-forward frames; in other words, part of the information of the original image frames always remains in the fast-forward frames. Accordingly, when the user searches for a scene of interest from the fast-forward picture using specific information as a clue, this residual information makes it easy for the user to find the scene.
Fig. 16 shows the principle by which the number of image frames subjected to the merging process is increased or decreased to create fast-forward pictures at different speeds. The image frame sequence 350 comprises image frames 351-362 and a large number of image frames ranging before and after these frames. When creating a normal fast-forward picture, the merging execution unit 224 performs the merging process on four image frames to create a fast-forward frame. More specifically, the merging execution unit 224 performs the merging process on the image frames 351-354 to create a fast-forward frame 371, and on the image frames 355-358 to create a fast-forward frame 372.
When advanced fast while the particular image frame with specified conditions being detected image duration creating by frame sequence acquiring unit 222, merge 224 pairs of every two picture frames of performance element and carry out to merge and process to create the frame that advances fast.In Figure 16, when picture frame 359 meets these specified conditions, merge 224 pairs of picture frames 359 of performance element and 360 execution merging processing to create the frame 373 that advances fast, and merging 224 pairs of picture frames 361 of performance element and 362 are carried out merging processing to create the frame 374 that advances fast.
First the picture that advances fast that comprises the frame sequence 370 that advances fast that comprises the frame 371-374 that advances fast has 4X pace, but after the frame 373 that advances fast, pace drops to twice.Therefore,, by increasing rightly or reduce merged picture frame number, can obtain the picture that advances fast that its speed is put change at any time.
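The variable-pace grouping of Figure 16 can be sketched as follows. The predicate `is_particular` (a scene-change or motion test supplied by the caller) and the group sizes are illustrative assumptions:

```python
def group_for_merging(frames, is_particular, normal=4, slow=2):
    """Split the source frames into groups to be merged into fast-forward
    frames; once a group would contain a particular frame, shrink the
    group size from `normal` to `slow`, halving the fast-forward pace."""
    groups, i, step = [], 0, normal
    while i < len(frames):
        if step == normal and any(is_particular(f) for f in frames[i:i + step]):
            step = slow  # a particular frame is coming up: drop the pace
        groups.append(frames[i:i + step])
        i += step
    return groups

# Frames 1-8 are merged four at a time (4x pace); frame 9 is "particular",
# so from there on frames are merged two at a time (2x pace).
groups = group_for_merging(list(range(1, 13)), lambda f: f == 9)
```

Each group would then be passed to a merging routine such as the weighted blend of expression (4) to produce one fast-forward frame.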
Any well-known technique can be used to detect the particular image frame. For example, using a scene-change detection technique, the particular image frame can be taken at a scene change, so that a fast-forward picture whose pace is reduced at any particular scene is obtained. Alternatively, motion vectors between image frames may be computed, and a particular image frame may be detected as one whose motion vector has an absolute value greater than a predetermined value. By thus detecting image frames in which an object moves sharply on screen, a fast-forward picture whose pace is reduced after that particular frame can be obtained.
Because the pace is automatically reduced after such a point even while the user is fast-forwarding, the user can easily find interesting or important scenes. In addition, when the user searches the fast-forward picture for a scene of interest using specific information as a clue, the pace can be reduced automatically at the frames carrying that information, so the scene is found more easily. The following practical uses are conceivable. If the content is a drama, the scenes in which a specific actor appears can be reproduced at a reduced speed during fast-forwarding. If the content is a broadcast of a football match, scoring scenes can be reproduced at a reduced speed during fast-forwarding.
(example 5)
In example 4, a predetermined number of image frames are extracted from the image frame sequence without regard to the features of individual frames, and the merging process is performed on the extracted frames to create fast-forward frames. This processing is preferable for constructing a smoothly moving fast-forward picture. In some cases, however, it is preferable to preferentially extract image frames having certain features when creating the fast-forward picture. Example 5 therefore provides an image frame processing device that preferentially extracts image frames satisfying specified conditions to create a fast-forward picture that is highly efficient to watch.
Figure 17 shows a functional block diagram of the image frame processing device according to example 5. The interface unit 202, transmission frame number determining unit 206, frame transmission unit 208, frame sequence acquiring unit 222, picture synthesis unit 240, and storage unit 250 are identical to the blocks shown in Figure 13, so the same reference numerals are assigned to them and further description is omitted.
The fast-forward unit 220 comprises the frame sequence acquiring unit 222 and a characteristic frame extraction unit 226. Based on the luminance information of the image frames, the characteristic frame extraction unit 226 extracts, from the image frames transmitted from the frame transmission unit 208, those satisfying a predetermined condition as characteristic frames. For example, the characteristic frame extraction unit 226 computes the average of the pixels contained in the ten frames before and after a given image frame, and then extracts as characteristic frames those image frames containing pixels more than 50% brighter than this average. The characteristic frame extraction unit 226 also extracts an appropriate number of image frames besides these characteristic frames, creates the fast-forward frames, and sends them to the picture synthesis unit 240. The picture synthesis unit 240 outputs the fast-forward frames for display on the display 26 at a predetermined frame rate.
The extraction of characteristic frames according to example 5 will now be described more specifically. Figure 18 shows the principle of extracting certain image frames from an image frame sequence based on luminance information. As in Figure 14, the image frame sequence 300 comprises image frames 301-309 and a large number of other image frames before and after them. The characteristic frame extraction unit 226 extracts, as characteristic image frames, those whose pixel luminance is greater than that of the neighboring image frames. As described above, the circular object 400 flashes in image frames 303 and 307. Because image frames 303 and 307 contain pixels whose luminance is greater than that of the adjacent image frames, they are extracted as characteristic frames. These characteristic frames become fast-forward frames 331 and 332, respectively, without any further processing.
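A minimal sketch of the luminance test: a frame is characteristic when its mean brightness exceeds the average of its neighbourhood by the stated margin. The small window of one or two frames (the text uses ten on each side) and the pixel-list frame representation are illustrative assumptions:

```python
def characteristic_indices(frames, window=2, ratio=1.5):
    """Return the indices of frames whose mean luminance is more than
    `ratio` times the mean luminance of the surrounding frames
    (`window` frames on each side); ratio=1.5 means "50% brighter"."""
    means = [sum(f) / len(f) for f in frames]
    picked = []
    for i, m in enumerate(means):
        lo, hi = max(0, i - window), min(len(means), i + window + 1)
        local = sum(means[lo:hi]) / (hi - lo)
        if m > ratio * local:
            picked.append(i)
    return picked

# The object flashes in the third frame, so only that frame is extracted.
flash_sequence = [[10, 10], [10, 10], [200, 200], [10, 10], [10, 10]]
flashes = characteristic_indices(flash_sequence, window=1)
```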
By merely extracting characteristic frames from the image frame sequence 300, the frames needed to synthesize a fast-forward picture at the requested pace may not be obtained. Therefore, if no characteristic frame is found within a predetermined number of image frames, the characteristic frame extraction unit 226 preferably extracts one frame from that predetermined number regardless of luminance information. Conversely, if multiple characteristic frames are found within the predetermined number of image frames, the characteristic frame extraction unit 226 preferably extracts only one of them. In this way, the fast-forward frame sequence 330 can be constructed.
Alternatively, when multiple characteristic frames based on luminance information exist within the predetermined number of image frames, all image frames identified as characteristic frames may become fast-forward frames regardless of the pace information. In this way, image frames containing high-luminance pixels are extracted continuously for a certain period, so during that period a fast-forward picture whose speed is reduced toward normal playback speed is obtained. Thus, even during fast-forwarding, a moving picture nearly equal to normal playback can be obtained in characteristic scenes, which reduces the chance that the user misses important information in those scenes. The conditions defining a characteristic frame are preferably set according to the type of information the user wants to obtain.
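The "one frame per window, preferring a characteristic frame" policy described above can be sketched as a simple selection pass. The window size and the caller-supplied `is_characteristic` predicate are assumptions for illustration:

```python
def select_frames(frames, is_characteristic, window=4):
    """From every `window` consecutive frames pick one frame, preferring a
    characteristic frame when the window contains any (otherwise the first
    frame), so the requested pace is met without losing flashes."""
    selected = []
    for i in range(0, len(frames), window):
        chunk = frames[i:i + window]
        chars = [f for f in chunk if is_characteristic(f)]
        selected.append(chars[0] if chars else chunk[0])
    return selected

# Frame 6 is characteristic (a flash); it is picked from its window even
# though it is not the window's first frame.
picked = select_frames(list(range(1, 9)), lambda f: f == 6, window=4)
```

The alternative policy (keeping every characteristic frame in the window and ignoring the pace) would simply append all of `chars` instead of one.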
In example 5, because the fast-forward picture is created by extracting characteristic frames based on luminance information, the number of image frames carrying important information that are missed in the fast-forward picture is reduced.
It should be noted that the information used to extract characteristic frames is not limited to luminance information. For example, the motion information between image frames can also be used to preferentially extract image frames satisfying specified conditions.
Referring to Figure 17, the motion information detecting unit 210 receives the image frames transmitted from the frame transmission unit 208 and computes the motion information between these image frames. For example, the motion information detecting unit 210 obtains corresponding points between image frames using a well-known block matching method, and then computes motion vectors from the differences between corresponding points. These motion vectors are used as the motion information. If motion information for each region or object in the image frames has been prepared in advance as data, that data may also be used as the motion information.
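Block matching of the kind the motion information detecting unit 210 relies on can be sketched as an exhaustive sum-of-absolute-differences (SAD) search. The search radius, patch layout, and 2D-list frame representation are assumptions for illustration:

```python
def sad(a, b):
    """Sum of absolute differences between two equally sized 2D patches."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def patch(frame, y, x, h, w):
    """Extract an h-by-w patch whose top-left corner is at (y, x)."""
    return [row[x:x + w] for row in frame[y:y + h]]

def block_match(prev, cur, y, x, h, w, search=1):
    """Find the displacement (dy, dx), within +/-search pixels, that best
    matches the h-by-w block at (y, x) of `prev` inside `cur`; the
    displacement serves as the block's motion vector."""
    ref = patch(prev, y, x, h, w)
    best_cost, best_vec = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ny, nx = y + dy, x + dx
            if 0 <= ny <= len(cur) - h and 0 <= nx <= len(cur[0]) - w:
                cost = sad(ref, patch(cur, ny, nx, h, w))
                if best_cost is None or cost < best_cost:
                    best_cost, best_vec = cost, (dy, dx)
    return best_vec

# A 2x2 bright block moves one pixel down and one pixel right between frames.
prev = [[9, 9, 0, 0], [9, 9, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
cur = [[0, 0, 0, 0], [0, 9, 9, 0], [0, 9, 9, 0], [0, 0, 0, 0]]
vector = block_match(prev, cur, 0, 0, 2, 2)
```

A frame whose vectors have large absolute values would then qualify as a characteristic frame under the condition described below.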
Based on the motion information, the characteristic frame extraction unit 226 extracts, from the transmitted image frames, those satisfying a predetermined condition as characteristic frames. The condition is, for example, that the absolute value of the motion vector is greater than a predetermined value. Besides these characteristic frames, the characteristic frame extraction unit 226 extracts an appropriate number of image frames, creates the fast-forward frames, and sends them to the picture synthesis unit 240. The picture synthesis unit 240 outputs the fast-forward frames to the display 26 at a predetermined frame rate.
In another example, the characteristic frame extraction unit 226 receives from the interface unit 202 information written in the header of the moving image data, and extracts characteristic frames based on this information. For example, assuming the content of the moving picture is a drama, a flag indicating a scene change is set in the headers of the tens or hundreds of image frames before and after each scene-change point. The characteristic frame extraction unit 226 can extract the image frames so flagged as characteristic frames. In this way, the pace becomes equal to the normal playback speed at those points even during fast-forwarding, so the user can more easily grasp the content of the fast-forward picture.
(example 6)
In example 5, the extraction of image frames satisfying specified conditions as characteristic frames, using luminance information or motion information, was described. In other words, in example 5 the image frames in the image frame sequence are separated into two groups: one group comprises image frames useful to the user (that is, image frames carrying much information), and the other comprises image frames of less benefit to the user (that is, image frames carrying less information). More image frames are then picked from the first group to create the fast-forward picture.
In example 6, an image frame processing device will be described that separates each image frame into two parts: a part carrying more information and a part carrying less information. The former is emphasized, or the latter is made less noticeable, so that the user can obtain information from the fast-forward picture more easily.
Figure 19 shows a functional block diagram of the image frame processing device according to example 6. The interface unit 202, transmission frame number determining unit 206, frame transmission unit 208, motion information detecting unit 210, and storage unit 250 are identical to the blocks shown in Figure 17, so the same reference numerals are assigned to them and further description is omitted.
The fast-forward unit 220 comprises a separation unit 228, a merge execution unit 230, and a frame re-synthesis unit 232. The separation unit 228 receives the image frames transmitted from the frame transmission unit 208 and separates each image frame into a "specific image region" and a "non-specific image region". This separation is performed based on the motion information received from the motion information detecting unit 210. The specific image region is a region in which the absolute value of the motion vector is greater than a predetermined threshold; the non-specific image region is the region other than the specific image region. The merge execution unit 230 performs the merging process on the non-specific image regions between image frames, while picking the specific image region of any one of the image frames.
The frame re-synthesis unit 232 combines the extracted specific image region with the merged non-specific image region to create an updated image frame. The updated image frame is sent to the picture synthesis unit 240 as a fast-forward frame. The picture synthesis unit 240 outputs the fast-forward frame to the display 26 at a predetermined frame rate.
Figure 20 shows the principle of separating an image frame into a specific image region and a non-specific image region. The image frame sequence 380 comprises image frames 381-384 and a large number of image frames before and after them, and contains the image of a person. The person's image can be detected as follows: the user specifies the design and color of the clothes the person is wearing, and the person's image region is then detected using a well-known image matching technique with this design and color as a clue.
The separation unit 228 separates image frames 381-384 into a specific image region for the person's image and a non-specific image region for the background image. The merge execution unit 230 performs the merging process on the non-specific image regions in image frames 381-384 and picks one specific image region from image frames 381-384; in Figure 20, it picks the specific image region of image frame 382. The frame re-synthesis unit 232 then combines the specific image region picked by the merge execution unit 230 with the merged non-specific image region to create fast-forward frame 385. Because of the merging process, fast-forward frame 385 has a blurred background image. A fast-forward picture comprising frame 385 therefore shows the person against a motion-blurred background, so the user can recognize the person more easily.
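The separation-and-recombination of Figure 20 can be sketched as follows: the non-specific (background) pixels are averaged across frames while the specific region is copied intact from one picked frame. The boolean-mask representation of the specific region is an assumption for illustration:

```python
def compose_fast_frame(frames, specific_mask, pick):
    """frames        -- equal-length pixel lists
    specific_mask -- True where a pixel belongs to the specific image
                     region (e.g. the tracked person) of the picked frame
    pick          -- index of the frame whose specific region is kept

    Background pixels are averaged over all frames (motion blur); the
    specific region stays sharp."""
    n = len(frames)
    out = []
    for i, is_specific in enumerate(specific_mask):
        if is_specific:
            out.append(frames[pick][i])  # copy the object unchanged
        else:
            out.append(round(sum(f[i] for f in frames) / n))  # blur background
    return out

# The person occupies pixel 0; the background (pixels 1-2) changes between
# frames, so it blurs while the person is copied sharply from frame 1.
frames = [[100, 0, 0], [100, 50, 30], [100, 100, 60]]
fast_frame = compose_fast_frame(frames, [True, False, False], pick=1)
```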
As described above, according to example 6, the important parts of the image frames can be shown clearly in the fast-forward picture. In other words, the less important parts of the image frames can be made less noticeable by motion blur.
In this way, when the content of the moving picture is a drama or a sports broadcast, the people the user cares about can be shown conspicuously in the fast-forward picture.
Alternatively, when the non-specific image region is a still image, the merge execution unit 230 uses multiple non-specific image regions to enhance its picture quality.
(example 7)
Figure 21 shows a functional block diagram of the image frame processing device according to example 7. In example 7, the path (trajectory) of an object in the image frames is shown in the fast-forward picture.
The interface unit 202, transmission frame number determining unit 206, frame transmission unit 208, motion information detecting unit 210, picture synthesis unit 240, and storage unit 250 are identical to the blocks shown in Figure 17, so the same reference numerals are assigned to them and further description is omitted.
The fast-forward unit 220 comprises a path creating unit 236 and a frame re-synthesis unit 232. The path creating unit 236 creates a path image using the motion information received from the motion information detecting unit 210. The path image shows the flow line of a predetermined object across the image frames transmitted from the frame transmission unit 208. The frame re-synthesis unit 232 overwrites the path image onto the original image frame to create the fast-forward frame.
Figure 22 shows the principle of the path creation process according to example 7. As in Figure 14, the image frame sequence 300 comprises image frames 301-309 and a large number of other image frames before and after them.
The path creating unit 236 creates path image 411 from the difference between image frames 301 and 302, and path image 412 from the difference between image frames 302 and 303. The frame re-synthesis unit 232 puts path images 411 and 412 into image frame 303 to create fast-forward frame 341. Similarly, the path creating unit 236 creates path image 413 from the difference between image frames 304 and 305, and path image 414 from the difference between image frames 305 and 306. The frame re-synthesis unit 232 puts path images 413 and 414 into image frame 306 to create fast-forward frame 342. The same processing is repeated for image frame 307 and the subsequent image frames.
The picture synthesis unit 240 outputs the fast-forward frame sequence 340, which includes fast-forward frames 341 and 342, at a predetermined frame rate. A fast-forward picture showing the path of the moving circular object 400 can thus be obtained.
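The path creation of Figure 22 can be sketched as difference images whose union is overwritten onto a later frame. The change threshold, the marker value, and the pixel-list frame representation are illustrative assumptions:

```python
def path_image(prev, cur, threshold=50):
    """Mark the pixels that changed by more than `threshold` between two
    frames; accumulated over time, the marks trace the object's flow line."""
    return [abs(a - b) > threshold for a, b in zip(prev, cur)]

def overlay_paths(frame, paths, mark=255):
    """Overwrite every accumulated path image onto the frame, producing a
    fast-forward frame that shows the object's trajectory."""
    out = list(frame)
    for p in paths:
        for i, hit in enumerate(p):
            if hit:
                out[i] = mark
    return out

# A bright object moves one pixel per frame; the two difference images,
# drawn onto the third frame, show where the object has been.
f301, f302, f303 = [200, 0, 0, 0], [0, 200, 0, 0], [0, 0, 200, 0]
trail_411 = path_image(f301, f302)
trail_412 = path_image(f302, f303)
fast_frame_341 = overlay_paths(f303, [trail_411, trail_412])
```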
Various well-known methods can be used to select which of the objects present in the image frames should have its path shown. For example, a predetermined object (for example, a football) may be detected in each image frame using a well-known image recognition technique, and the path of the football can then be shown in the fast-forward picture.
According to example 7, information that does not appear in the original image frames can be shown in the fast-forward picture. In other words, by using the difference information between image frames, the information in the image frames can be enhanced.
(example 8)
By selecting one of the processes according to examples 4 to 7 described above, a fast-forward picture suited to the content or to the user's purpose can be produced. For example, a suitable fast-forward process can be selected depending on the content of the moving picture recorded in the storage device.
A fast-forward picture creation process will now be described on the assumption that the image frame processing device is incorporated in a moving picture playback device.
Figure 23 shows a functional block diagram of an image frame processing device capable of performing all the processes according to examples 4 to 7. The interface unit 202, transmission frame number determining unit 206, frame transmission unit 208, motion information detecting unit 210, picture synthesis unit 240, and storage unit 250 are identical to the blocks shown in Figure 13, so the same reference numerals are assigned to them and further description is omitted.
The fast-forward unit 220 is configured to perform all the processes described in examples 4 to 7. The image frame processing device also comprises a content determining unit 214, which determines the content type of the moving picture stored in the storage unit 250. This determination can be made based on the header information of the moving image data, or alternatively based on user input or on the motion information from the motion information detecting unit 210. The determined content type is sent to the fast-forward unit 220, which receives the image frames transmitted from the frame transmission unit 208 and performs one of the processes according to examples 4 to 7 depending on the content type.
The specific processing performed by the fast-forward unit 220 will now be described for the cases where the content type determined by the content determining unit 214 is a sports broadcast, a drama or film, or a film of the user's own creation.
A. Sports broadcast
When the content type is a video recording of a football match, the pace of the fast-forward picture can be reduced only at scoring scenes. A scoring scene can be detected as follows, assuming the moving picture is obtained by a camera fixed in a position pointing at the goal post. The goal-mouth region is designated in advance in the image obtained by this camera. When the image of the football enters the goal-mouth region (which can be detected by an image matching method), the fast-forward unit 220 determines that a scoring scene has occurred and extracts a number of image frames before and after that time point. In addition, the unit 220 can identify a specific player by an image matching technique, using color, pattern, or uniform number as a clue; a fast-forward picture that emphasizes the motion of the specific player can then be obtained by applying motion blur to the other players. Furthermore, the football can be recognized in the image frames by an image matching technique, yielding a fast-forward picture that shows the football's path.
B. drama/film
When the content type is a drama program, for example, the unit prompts the user to input the colors or patterns associated with his or her favorite actor in the drama. The fast-forward unit 220 then uses a well-known image matching technique to detect objects having regions corresponding to the input design and color. The unit 220 can thereby identify the scenes in which the favorite actor appears and create a fast-forward picture whose pace is reduced for the identified scenes.
C. Film created by the user
When the moving picture stored in the storage unit 250 has been shot by the user with a portable camera, the fast-forward unit 220 detects scene breaks using a well-known scene-change extraction technique. By including the image frames before and after each scene break in the fast-forward picture, the content can be grasped easily from the fast-forward picture. In addition, optical flow can be used to detect an object that the portable camera is chasing; the background other than the chased object is then motion-blurred in the fast-forward picture. A fast-forward picture in which the chased object is easy to watch can thus be created.
As described above, the image frame processing device shown in Figure 23 can produce a fast-forward picture processed appropriately for the content type of the moving picture or for the user's preferences.
According to the second embodiment of the present invention, when a fast-forward request is received, predetermined processing is performed on an image frame sequence stored in advance on a DVD drive, hard disk drive, or the like to create fast-forward frames, which are then output at the frame rate needed to show the requested fast-forward picture. A feature of the second embodiment is that, by applying various processes to the image frame sequence, the fast-forward picture can carry various kinds of added value, including retaining as much of the important information of the original image frame sequence as possible in the fast-forward picture (examples 4 and 5) and omitting from the fast-forward picture information of the image frame sequence that is not needed (example 6).
In the second embodiment, the fast-forward frames are created almost in real time after the fast-forward request signal is received and are output as the fast-forward picture. Therefore, each time a fast-forward request signal is received, a different fast-forward picture created by a different process can be output depending on conditions or on the user's instructions. In other words, the fast-forward unit in the second embodiment, which corresponds to a post-process, has high versatility and can provide various fast-forward pictures with different advantages.
The image frame sequence of the moving picture stored in the storage unit 250 is created at a predetermined frame rate unrelated to the pace, and the fast-forward unit performs processing depending on the fast-forward request signal or the content type to create an updated image frame sequence for fast-forwarding. As can be seen from the above, the principle common to the first and second embodiments is that the frames to be shown to the user are created by sampling image frames prepared at a rate higher than the display rate; the second embodiment corresponds to the special case of the first embodiment in which the time axis over which the sampling is performed is extended.
The application that it should be noted that the second embodiment is not limited to establishment advance fast picture or the picture that fasts rewind.For example, use the live image being obtained by high speed camera, can create the normal play image with more above-mentioned effects according to the second embodiment.In this case, should meet following condition:
N s≥N p≥N o
N wherein sthe number of the picture frame of each unit interval that representative is obtained by high speed camera, N prepresentative is stored in the number of the picture frame of each unit interval in storage unit, and N orepresentative finally outputs to the number of picture frame display, each unit interval.
The present invention has been described based on several embodiments. These embodiments are illustrative in nature, and it will be clear to those skilled in the art that various changes in elements and processes are possible within the scope of the invention. Optional combinations of the elements described in the embodiments, as well as implementations of the present invention in the form of methods, devices, systems, computer programs, and recording media, may also be practiced as other modes of the present invention.

Claims (8)

1. An image frame processing method for creating fast-forward frames by using a moving picture comprising an image frame sequence stored in a memory, the method comprising:
reading from the memory, in response to a fast-forward request, image frames in which a predetermined feature appears, using luminance information or motion information;
integrating the read image frames to create the fast-forward frames;
displaying the created fast-forward frames at a predetermined frame rate as a fast-forward picture;
separating each image frame into a specific image region and a non-specific image region, wherein the specific image region contains a predetermined object and the non-specific image region is the region other than the specific image region;
integrating the non-specific image regions between image frames;
selecting one region among the specific image regions in the image frames; and
combining the integrated non-specific image region and the selected specific image region to create the fast-forward frame.
2. An image frame processing method for creating fast-forward frames by using a moving picture comprising an image frame sequence stored in a memory, the method comprising:
reading from the memory, in response to a fast-forward request, image frames in which a predetermined feature appears, using luminance information or motion information;
integrating the read image frames to create the fast-forward frames;
displaying the created fast-forward frames at a predetermined frame rate as a fast-forward picture;
detecting motion information of a predetermined object moving in the image frames;
creating, based on the motion information, a path image showing the flow line of the predetermined object; and
combining the path image and the image frame from which the motion information was detected to create the fast-forward frame.
3. An image frame processing device for creating fast-forward frames in response to a fast-forward request, comprising:
a storage unit for storing moving image data comprising an image frame sequence;
a fast-forward processing unit for reading from the storage unit, in response to the fast-forward request, image frames in which a predetermined feature appears, using luminance information or motion information, the fast-forward processing unit also integrating the read image frames to create the fast-forward frames;
a picture synthesis unit for displaying the created fast-forward frames at a predetermined frame rate as a fast-forward picture;
a separation unit for separating each image frame into a specific image region and a non-specific image region, wherein the specific image region contains a predetermined object and the non-specific image region is the region other than the specific image region;
a merge execution unit for integrating the non-specific image regions between the image frames and selecting one region among the specific image regions of the image frames; and
a frame re-synthesis unit for combining the integrated non-specific image region and the selected specific image region to create the fast-forward frame.
4. The image frame processing device according to claim 3, further comprising a motion information detecting unit for computing the motion information between the image frames, wherein the separation unit determines the specific image region based on the motion information.
5. The image frame processing device according to claim 3, wherein, when the non-specific image regions are still images, the merge execution unit performs blending of the non-specific image regions, each separated from one of the image frames, to enhance the quality of the image region.
6. The image frame processing device according to claim 3, further comprising a content type determining unit for determining the content type of the moving picture,
wherein the fast-forward processing unit performs the integration process according to the content type.
7. An image frame processing device for creating fast-forward frames in response to a fast-forward request, comprising:
a storage unit for storing moving image data comprising an image frame sequence;
a fast-forward processing unit for reading from the storage unit, in response to the fast-forward request, image frames in which a predetermined feature appears, using luminance information or motion information, the fast-forward processing unit also integrating the read image frames to create the fast-forward frames;
a picture synthesis unit for displaying the created fast-forward frames at a predetermined frame rate as a fast-forward picture; and
a motion information detecting unit for computing the motion information between the image frames;
wherein the fast-forward processing unit comprises:
a path creating unit for creating, based on the motion information, a path image representing the path of movement of a predetermined object in the image frames; and
a frame re-synthesis unit for combining the path image and the original image frame to create the fast-forward frame.
8. The image frame processing device according to claim 7, further comprising a content type determining unit for determining the content type of the moving picture,
wherein the fast-forward processing unit performs the integration process according to the content type.
CN200810161915.5A 2004-05-19 2005-05-18 Image frame processing method and device for displaying moving images to a variety of displays Expired - Fee Related CN101373590B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2004149705 2004-05-19
JP149705/04 2004-05-19
JP2005100075A JP4789494B2 (en) 2004-05-19 2005-03-30 Image frame processing method, apparatus, rendering processor, and moving image display method
JP100075/05 2005-03-30

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN200580024443A Division CN100593188C (en) 2004-05-19 2005-05-18 Image frame processing method and device for displaying moving images to various displays

Publications (2)

Publication Number Publication Date
CN101373590A CN101373590A (en) 2009-02-25
CN101373590B true CN101373590B (en) 2014-01-22

Family

ID=38185464

Family Applications (2)

Application Number Title Priority Date Filing Date
CN200580024443A Expired - Fee Related CN100593188C (en) 2004-05-19 2005-05-18 Image frame processing method and device for displaying moving images to various displays
CN200810161915.5A Expired - Fee Related CN101373590B (en) 2004-05-19 2005-05-18 Image frame processing method and device for displaying moving images to a variety of displays

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN200580024443A Expired - Fee Related CN100593188C (en) 2004-05-19 2005-05-18 Image frame processing method and device for displaying moving images to various displays

Country Status (2)

Country Link
JP (1) JP4761403B2 (en)
CN (2) CN100593188C (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013146508A1 (en) * 2012-03-30 2013-10-03 Sony Corporation Image processing device and method, and program
GB2518846A (en) * 2013-10-01 2015-04-08 Ibm Diagnosing graphics display problems
EP3023987B1 (en) * 2014-11-20 2017-03-22 Axis AB Method and apparatus for visualizing information of a digital video stream
KR102503442B1 * 2015-12-24 2023-02-28 Samsung Electronics Co., Ltd. Electronic device and operating method thereof
CN109040837B (en) * 2018-07-27 2021-09-14 北京市商汤科技开发有限公司 Video processing method and device, electronic equipment and storage medium
KR102656237B1 * 2018-09-21 2024-04-09 LG Display Co., Ltd. Moving fingerprint recognition method and apparatus using display
CN111724460B (en) * 2019-03-18 2024-10-18 北京京东尚科信息技术有限公司 Dynamic display method, device and equipment for static picture
US11295660B2 (en) * 2019-06-10 2022-04-05 Ati Technologies Ulc Frame replay for variable rate refresh display
CN113709389A (en) * 2020-05-21 2021-11-26 北京达佳互联信息技术有限公司 Video rendering method and device, electronic equipment and storage medium
CN113596564B (en) * 2021-09-29 2021-12-28 卡莱特云科技股份有限公司 Picture playing method and device
CN115037992A (en) * 2022-06-08 2022-09-09 中央广播电视总台 Video processing method, device and storage medium
JP7700162B2 (en) * 2023-02-28 2025-06-30 株式会社ソニー・インタラクティブエンタテインメント Information processing device, information processing method, and program

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1334677A (en) * 2000-05-17 2002-02-06 三菱电机株式会社 Dynamic extraction of feature from compressed digital video signals by video reproducing system

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06149902A (en) * 1992-11-09 1994-05-31 Matsushita Electric Ind Co Ltd Animation image recording medium, animation image recorder and animation image reproducing device
US5543927A (en) * 1993-04-29 1996-08-06 Sony Corporation Variable speed playback of digital video stored in a non-tape media
JPH0738842A (en) * 1993-06-29 1995-02-07 Toshiba Corp Dynamic image editing device
US5559950A (en) * 1994-02-02 1996-09-24 Video Lottery Technologies, Inc. Graphics processor enhancement unit
JPH09307858A (en) * 1996-05-16 1997-11-28 Nippon Telegr & Teleph Corp <Ntt> Double speed playback method
JP2000101985A (en) * 1998-09-28 2000-04-07 Toshiba Corp Stream data fast forwarding mechanism
JP2002094947A (en) * 2000-09-12 2002-03-29 Matsushita Electric Ind Co Ltd High-speed video playback device, high-speed video playback method, and recording medium
US6690427B2 (en) * 2001-01-29 2004-02-10 Ati International Srl Method and system for de-interlacing/re-interlacing video on a display device on a computer system during operation thereof
JP2003078880A (en) * 2001-08-30 2003-03-14 Sony Corp Image processor, image processing method and program

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1334677A (en) * 2000-05-17 2002-02-06 三菱电机株式会社 Dynamic extraction of feature from compressed digital video signals by video reproducing system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JP Laid-Open No. 2003-78880 A 2003.03.14

Also Published As

Publication number Publication date
JP4761403B2 (en) 2011-08-31
CN101373590A (en) 2009-02-25
JP2009110536A (en) 2009-05-21
CN1989545A (en) 2007-06-27
CN100593188C (en) 2010-03-03

Similar Documents

Publication Publication Date Title
AU2005242447B2 (en) Image frame processing method and device for displaying moving images to a variety of displays
EP2870771B1 (en) Augmentation of multimedia consumption
US8810708B2 (en) Image processing apparatus, dynamic picture reproduction apparatus, and processing method and program for the same
AU2006252194B2 (en) Scrolling Interface
US9582610B2 (en) Visual post builder
US8386942B2 (en) System and method for providing digital multimedia presentations
US11875023B2 (en) Method and apparatus for operating user interface, electronic device, and storage medium
US8350929B2 (en) Image pickup apparatus, controlling method and program for the same
US20170116709A1 (en) Image processing apparatus, moving image reproducing apparatus, and processing method and program therefor
CN101373590B (en) Image frame processing method and device for displaying moving images to a variety of displays
US20160118080A1 (en) Video playback method
US8902361B2 (en) Relational display of images
CN101783886A (en) Information processing apparatus, information processing method, and program
US9083915B2 (en) 3D electronic program guide
CN102821261A (en) Display apparatus, object display method, and program
CN103797784A (en) Video Peeking
JP2009201041A (en) Content retrieval apparatus, and display method thereof
Hürst et al. HiStory: a hierarchical storyboard interface design for video browsing on mobile devices
JP5146282B2 (en) Information processing apparatus, display control method, and program
KR100878640B1 (en) Image frame processing method and apparatus for displaying moving images on various display devices
AU2006252198B2 (en) Animated sub-images for browsing

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: SONY COMPUTER ENTERTAINMENT, INC.

Free format text: FORMER OWNER: SNE PLATFORM INC.

Effective date: 20120828

C41 Transfer of patent application or patent right or utility model
C53 Correction of patent of invention or patent application
CB02 Change of applicant information

Address after: Tokyo, Japan

Applicant after: SNE Platform Inc.

Address before: Tokyo, Japan

Applicant before: Sony Computer Entertainment Inc.

COR Change of bibliographic data

Free format text: CORRECT: APPLICANT; FROM: SONY COMPUTER ENTERTAINMENT INC. TO: SNE PLATFORM INC.

TA01 Transfer of patent application right

Effective date of registration: 20120828

Address after: Tokyo, Japan

Applicant after: SONY COMPUTER ENTERTAINMENT Inc.

Address before: Tokyo, Japan

Applicant before: SNE Platform Inc.

C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20140122