CN1217526C - Image processing device, image processing method, image processing program and its recording medium - Google Patents
- Publication number
- CN1217526C (application CN03102562A)
- Authority
- CN
- China
- Prior art keywords
- image
- edge
- region
- image processing
- processing apparatus
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/403—Edge-driven scaling; Edge-based scaling
Abstract
An image processing apparatus has an edge direction estimation section for estimating edge direction information for a first image area containing a notable pixel; an edge pattern selection section for selecting the edge shape pattern corresponding to the first image area for a predetermined edge direction, estimated based on the edge direction information and the pixel values in the first image area containing the notable pixel; an image enhancement section for enhancing the pixel values of the first image area containing the notable pixel; an enlarged image block generation section for generating an enlarged image area using the selected edge shape pattern and the enhanced pixel values of the first image area; and an image block placement section for placing the enlarged image area generated by the enlarged image block generation section according to a predetermined method.
Description
Technical field
The present invention relates to processing for enlarging images represented in multilevel gray scale, and more specifically to an image processing apparatus, an image processing method, an image processing program, and a computer-readable recording medium storing the program, which perform high-quality image enlargement while suppressing image quality defects such as blurring and jaggies occurring in the input image.
Background art
Image enlargement is one of the basic operations of systems that edit, lay out, display, and print images. In recent years, with the spread of image data intended mainly for display on monitors, such as Internet web pages and digital video, high-quality enlargement has become increasingly important for obtaining good output when such low-resolution images are printed on high-resolution printers.
Existing techniques for enlarging images represented in multilevel gray scale (hereinafter called multilevel images) include the nearest-neighbor method, the linear (bilinear) interpolation method, and the cubic convolution method.
The nearest-neighbor method uses, as each pixel value after enlargement, the value of the nearest pixel in the original image when the enlarged pixel is inversely mapped onto the original image. The method is fast because its computational cost is small, but each pixel of the original image is simply enlarged into a rectangle. Jaggies therefore occur along oblique lines, and at large magnifications the image becomes mosaic-like; the degradation of image quality is severe.
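As an illustration of the inverse mapping just described, the nearest-neighbor method can be sketched as follows (a generic sketch, not code from the patent):

```python
def nearest_neighbor_enlarge(src, scale):
    """Enlarge a 2-D image (list of rows) by an integer factor.

    Each output pixel (x, y) is inversely mapped to the source pixel
    (x // scale, y // scale), so every source pixel is simply
    replicated into a scale-by-scale rectangle.
    """
    h, w = len(src), len(src[0])
    return [[src[y // scale][x // scale] for x in range(w * scale)]
            for y in range(h * scale)]

# A 2x2 image enlarged 2x: each pixel becomes a 2x2 rectangle,
# which is why oblique edges turn into jagged staircases.
out = nearest_neighbor_enlarge([[10, 20], [30, 40]], 2)
```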
The linear (bilinear) interpolation method assumes that pixel values change linearly between pixels, and interpolates among the four pixels surrounding the inverse-mapped point of an enlarged pixel to obtain its value. Its processing load is heavier than that of the nearest-neighbor method, but the computational cost is still small and jaggies are less likely to occur. On the other hand, its shortcoming is that the whole image becomes blurred, particularly at edges, which do not satisfy the linear-change assumption.
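The interpolation among the four neighboring pixels can likewise be sketched; this is a generic illustration with assumed clamping at the image border:

```python
def bilinear_sample(src, fx, fy):
    """Sample src at fractional coordinates (fx, fy) by linear
    interpolation among the four surrounding pixels, clamping
    indices at the image border."""
    x0, y0 = int(fx), int(fy)
    x1 = min(x0 + 1, len(src[0]) - 1)
    y1 = min(y0 + 1, len(src) - 1)
    tx, ty = fx - x0, fy - y0
    # Interpolate horizontally on the top and bottom pixel pairs,
    # then vertically between the two results.
    top = src[y0][x0] * (1 - tx) + src[y0][x1] * tx
    bot = src[y1][x0] * (1 - tx) + src[y1][x1] * tx
    return top * (1 - ty) + bot * ty
```

Because every output value is a weighted average of its neighbors, a sharp step between two pixels is smeared across the enlarged pixels in between, which is the edge blur noted above.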
The cubic convolution method defines an interpolation function that approximates the sinc function (sin(x)/x) on the basis of the sampling theorem, and convolves this approximate interpolation function with the 16 neighboring pixels (4 × 4 pixels in the X and Y directions) around the inverse-mapped point of an enlarged pixel to obtain the enlarged pixel value. This method gives better image quality than the two methods described above, but it requires a wide reference range and a large amount of computation, and it has a high-frequency enhancement characteristic. Its shortcomings are therefore that slight jaggies occur at edges and that noise components are emphasized.
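The piecewise-cubic interpolation function that approximates sinc can be sketched as below; the parameter a = -1 is a common choice in classical cubic convolution and is an assumption here, not a value taken from the patent:

```python
def cubic_kernel(t, a=-1.0):
    """Piecewise-cubic approximation of sinc used by cubic convolution.
    Nonzero only for |t| < 2, which is why 4 x 4 = 16 neighbors
    suffice for one output pixel."""
    t = abs(t)
    if t < 1:
        return (a + 2) * t**3 - (a + 3) * t**2 + 1
    if t < 2:
        return a * (t**3 - 5 * t**2 + 8 * t - 4)
    return 0.0
```

The negative lobe of this kernel (for example, cubic_kernel(1.5) = -0.125 with a = -1) is exactly the high-frequency enhancement characteristic mentioned above: it sharpens edges but also produces slight jaggies and amplified noise.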
As attempts to solve these problems, several techniques have been proposed in JP-A-7-182503, JP-A-2000-228723, JP-A-2000-253238, and in "High-quality image enlargement and reconstruction using an adaptive two-dimensional sampling function" (Journal of the Institute of Image Electronics Engineers of Japan, Vol. 28, No. 5, pp. 620-626).
In JP-A-7-182503, the maximum and minimum values are detected from the pixels of an N × M region (for example 3 × 3) surrounding a notable pixel, a contrast and a median are further computed, and according to the contrast a mean of the maximum (or minimum) value and another value is derived and used as a representative value. Linear interpolation is then performed on the N × M pixels around the notable pixel, the result is binarized using the computed median as a threshold, and the two representative values are placed according to the binarization result. If the contrast is small, the linearly interpolated pixel values are used as they are, giving the final enlarged image.
Jaggies therefore do not occur, and good conversion without interpolation blur is possible even for natural images. However, one problem is that edges are determined only from contrast, making it difficult to reproduce edge directions. Another problem is that, because the N × M region is composed of only two values, block distortion occurs at edges.
In JP-A-2000-228723, the original image is binarized, the directions of the oblique components contained in the original image are determined from the binary image by matching against given two-dimensional patterns (matrix data), and interpolation is performed along the obtained oblique directions. Elsewhere, linear interpolation is performed.
Since the original image is binarized with a predetermined threshold before the oblique directions are determined, this method is effective for edges with a large density difference, but it has problems reproducing edges with a small density difference.
In JP-A-2000-253238, the original image is enlarged by linear interpolation, and in parallel each N × M region surrounding a notable pixel in the original image is binarized and matched against preset oblique-line test patterns. If the notable pixel belongs to an edge area (matches an oblique-line test pattern), its direction information is mapped to the notable pixel. The direction information is further mapped onto the enlarged image obtained by linear interpolation, and smoothing filtering is performed along the mapped direction, thereby suppressing jaggies in the enlargement.
However, the technique described in JP-A-2000-253238 performs the series of operations of enlarging the original image, pattern matching, enlarging the direction information, and direction-dependent smoothing of the enlarged image, so the processing load becomes very large as the image size grows.
In the cubic convolution method, the sinc function is an infinite series, truncating it with an approximate function introduces error, and the sampling theorem assumes continuously differentiable signals. The technique described in "High-quality image enlargement and reconstruction using an adaptive two-dimensional sampling function" (Journal of the Institute of Image Electronics Engineers of Japan, Vol. 28, No. 5, pp. 620-626) takes as its starting point the fact that an image contains a large number of discontinuities; it uses, instead of the sinc function, an interpolation function suited to discontinuities and having locality, detects edge directions globally, and deforms the interpolation function along the detected edge directions, thereby providing an enlarged image with reduced blur and jaggies.
The function used in this technique is local, but the convolution matrix size grows as 4n with respect to the magnification n, and edge directions are detected globally, so the amount of computation is large.
Summary of the invention
It is therefore an object of the invention to provide an image processing apparatus, an image processing method, an image processing program, and a computer-readable recording medium storing the program, that make it possible to provide, at high speed and under a small processing load, an enlarged image with little blur and few jaggies.
According to the invention, there is provided an image processing apparatus for performing image enlargement, the image processing apparatus comprising:
an edge direction estimation section for estimating edge direction information for a first image area containing a notable pixel;
an edge pattern selection section for selecting, according to the edge direction information and the pixel values in the first image area, the edge shape pattern corresponding to the first image area for the edge direction indicated by the edge direction information;
an image enhancement section for enhancing the pixel values of the first image area;
an enlarged image area generation section for generating an enlarged image area using the edge shape pattern selected by the edge pattern selection section and the pixel values of the first image area enhanced by the image enhancement section; and
an image placement section for placing, according to a predetermined method, the enlarged image area generated by the enlarged image area generation section.
According to the invention, there is also provided an image processing method for performing image enlargement, the image processing method comprising the steps of:
estimating edge direction information for a first image area containing a notable pixel;
selecting, according to the edge direction information and the pixel values in the first image area, the edge shape pattern corresponding to the first image area for the edge direction indicated by the edge direction information;
enhancing the pixel values of the first image area;
generating an enlarged image area using the edge shape pattern and the enhanced pixel values of the first image area; and
placing the enlarged image area according to a predetermined method.
According to the invention, edge direction information is estimated for a first image area containing a notable pixel; the edge shape pattern corresponding to the first image area for that edge direction is selected according to the estimated edge direction information and the pixel values in the first image area containing the notable pixel; and the pixel values of the first image area containing the notable pixel are enhanced. An enlarged image area is generated using the selected edge shape pattern and the enhanced pixel values of the first image area, and the enlarged image area is placed according to a predetermined method.
In the invention, because the edge shape pattern corresponding to the edge direction is selected from the estimated edge direction information and the pixel values of the first image area containing the notable pixel, the edge shape pattern corresponding to the edge information can be determined by simple pattern matching alone. Furthermore, since the pixel values of the first image area are enhanced, the enlarged image area is generated from the selected edge shape pattern and the enhanced pixel values, and the enlarged image area is placed according to a predetermined method, an enlarged image that takes the edge direction into account can be produced, providing at high speed and under a small processing load an enlarged image with little blur and few jaggies.
Description of drawings
Fig. 1 is a block diagram showing the basic configuration of an embodiment of the invention;
Fig. 2 is a flowchart showing the processing flow of this embodiment of the invention;
Figs. 3A and 3B are diagrams showing examples of the image blocks extracted by the image block setting section, with their notable areas and peripheral areas;
Figs. 4A and 4B are diagrams describing a concrete example of a notable area and its peripheral area, and of the edge direction in the notable area;
Fig. 5 is a flowchart showing the flow of the edge direction estimation processing of the edge direction estimation section;
Figs. 6A to 6C are diagrams showing examples of the reference areas used for edge direction estimation;
Fig. 7 is a flowchart showing the flow of edge direction estimation processing for RGB color images in the edge direction estimation section;
Fig. 8 is a diagram showing the estimated edge direction in the notable area shown in Fig. 4;
Fig. 9 is a diagram showing a concrete example of the edge pattern table;
Figs. 10A to 10E are diagrams specifically describing the method of selecting the first edge pattern for the notable area in Fig. 4;
Fig. 11 is a diagram showing the second edge pattern for the notable area in Fig. 4;
Figs. 12A to 12C are diagrams showing concrete examples of the enhancement kernels used by the image enhancement section;
Fig. 13 is a diagram showing an example of enhancing a pixel P using the enhancement kernel of Fig. 12A;
Fig. 14 is a diagram describing an example of enlarging input image data 8 times using contrast enhancement with an enhancement kernel;
Fig. 15 is a flowchart showing the flow of the enlarged image block generation processing of the enlarged image block generation section;
Fig. 16 is a diagram showing a concrete example of 3 × 3 enlarged image block generation;
Fig. 17 is a diagram showing an example of 3 × 3 enlarged image block generation using a second edge pattern different from that in Fig. 16;
Fig. 18 is a diagram showing a concrete example of 3 × 4 enlarged image block generation;
Fig. 19 is a diagram describing the selection of reference pixels according to the estimated edge direction;
Fig. 20 is a diagram showing a concrete example of 4 × 4 enlarged image block generation;
Fig. 21 is a diagram showing an example of 4 × 4 enlarged image block generation when the estimated edge direction is direction 0 or 4;
Fig. 22 is a diagram showing the enlarged image block of a notable area;
Fig. 23 is a diagram showing a concrete example of the placement of 4 × 4 enlarged image blocks in the image block placement section.
Embodiment
Referring now to the accompanying drawings, a preferred embodiment of the invention will be described. Fig. 1 is a block diagram showing the processing part for enlargement in an image processing apparatus according to an embodiment of the invention. In the figure, the enlargement processing section 1 comprises an image block setting section 10, an edge direction estimation section 11, an image enhancement section 12, an edge pattern selection section 13, an enlarged image block generation section 14, an image block placement section 15, and an image data storage section 16. Each component of this embodiment and its operation are outlined below.
In Fig. 1, the image data are described in an image format that the image processing apparatus can handle (for example BMP, TIFF, or PNG), and are generated, in an image processor (not shown) such as a personal computer or a digital camera, from image data prepared by an application program for creation and editing.
The image data storage section 16 has the functions of temporarily storing the image data input from an input unit (not shown) until the enlargement processing section 1 enlarges them, and of temporarily storing the enlarged image data that have undergone resolution conversion or enlargement until they are output to an output unit (not shown).
The image block setting section 10 has the functions of setting the block sizes required by the processing of the edge direction estimation section 11 and the image enhancement section 12, of successively extracting image blocks of those sizes from the image data stored in the image data storage section 16, and of sending the image blocks to the edge direction estimation section 11 and the image enhancement section 12.
The edge direction estimation section 11 has the functions of calculating, from the pixel value distributions, the edge directions in the notable area of each image block successively extracted by the image block setting section 10 and in the reference areas around that notable area, and of estimating the edge direction of the notable area from the edge directions calculated in these areas.
The image enhancement section 12 has the function of enhancing the contrast of the image data in the notable area and its peripheral area of each image block extracted by the image block setting section 10, using a kernel whose elements and size correspond to the magnification of the enlargement processing in the enlargement processing section 1.
The edge pattern selection section 13 has the functions of selecting the first edge pattern, of the same size as the notable area, corresponding to the edge direction of the notable area estimated by the edge direction estimation section 11, and of specifying the second edge pattern, whose size differs from that of the first edge pattern, the first and second edge patterns being provided in one-to-one correspondence with each other.
The enlarged image block generation section 14 has the function of generating an enlarged image block for the notable area, using the edge direction estimated by the edge direction estimation section 11, the second edge pattern provided by the edge pattern selection section 13, and the contrast-enhanced pixel values provided by the image enhancement section 12.
The image block placement section 15 has the functions of successively placing the enlarged image blocks output from the enlarged image block generation section 14, and of outputting the resolution-converted and enlarged image data to the image data storage section 16.
Next, the overall flow of the enlargement processing in this embodiment will be described with reference to the flowchart of Fig. 2. In the following description, reference numerals not shown in Fig. 2 refer to Fig. 1.
First, at step S20, the image block setting section 10 sets the block sizes required by the processing of the edge direction estimation section 11 and the image enhancement section 12 for the input image data, which have been input from the input unit (not shown) and stored in the image data storage section 16, and extracts image blocks of those sizes from the input image data.
For example, as shown in Fig. 3, if the notable area is of 2 × 2 size and the peripheral area containing it is of 4 × 4 size, a 6 × 6 block as shown in Fig. 3A is extracted for 2× enlargement, and an 8 × 8 block as shown in Fig. 3B is extracted for 4× enlargement.
Next, at step S21, the edge direction estimation section 11 calculates the edge directions in the notable area of the extracted image block and in the reference areas within the peripheral area containing the notable area, and estimates the edge direction θ of the notable area from the calculated edge directions.
Next, at step S22, the edge pattern selection section 13 selects the first and second edge patterns using the edge direction estimated by the edge direction estimation section 11 and the pixel distribution pattern of the notable area. The first and second edge patterns are provided for each edge direction and each pixel distribution pattern, as described later, and are stored in a storage section (not shown in Fig. 1) in the form of, for example, table information.
Subsequently, at step S23, the image enhancement section 12 enhances the image data in the notable area and its peripheral area of the image block extracted by the image block setting section 10, using a kernel whose element values and size correspond to the magnification.
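Step S23 amounts to a small convolution around each pixel. The patent's actual kernel values appear in Figs. 12A to 12C and are not reproduced here, so the 3 × 3 sharpening kernel below is an assumed stand-in, purely for illustration:

```python
SHARPEN = [[0, -1, 0],
           [-1, 5, -1],
           [0, -1, 0]]  # assumed kernel, NOT the one from Figs. 12A-12C

def enhance_pixel(img, x, y, kernel=SHARPEN, norm=1):
    """Contrast-enhance the pixel at (x, y) by convolving with kernel,
    clamping coordinates at the image border. norm allows kernels whose
    weights do not already sum to 1."""
    k = len(kernel)
    off = k // 2
    acc = 0
    for j in range(k):
        for i in range(k):
            yy = min(max(y + j - off, 0), len(img) - 1)
            xx = min(max(x + i - off, 0), len(img[0]) - 1)
            acc += kernel[j][i] * img[yy][xx]
    return acc // norm
```

With this kind of kernel a pixel that already stands out from its neighbors is pushed further from them, which is the contrast enhancement the enlarged block generation relies on.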
Next, at step S24, the enlarged image block generation section 14 generates an enlarged image block for the notable area, using the edge direction θ of the notable area estimated by the edge direction estimation section 11, the edge pattern selected by the edge pattern selection section 13, and the pixel values of the notable area and peripheral area enhanced by the image enhancement section 12.
Next, at step S25, the image block placement section 15 places the enlarged image block of the notable area generated by the enlarged image block generation section 14 according to the predetermined method described below, and stores the placed image block in the image data storage section 16.
At step S26, it is determined whether generation of the enlarged image data to be output for the input image data is complete. If it is not complete, the processing returns to step S20, another image block is extracted, and the above processing is repeated. If it is complete, the enlarged image data are output and the enlargement processing ends.
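The loop of steps S20 to S26 can be sketched as follows. The five callables stand in for sections 11 to 15; their signatures are assumptions made for illustration, not the patent's interfaces:

```python
def iter_blocks(image, size=2):
    """Yield the top-left corner of each size-by-size notable area."""
    for y in range(0, len(image), size):
        for x in range(0, len(image[0]), size):
            yield x, y

def enlarge_image(image, scale, estimate_direction, select_pattern,
                  enhance, generate, place):
    """Skeleton of the S20-S26 loop in Fig. 2."""
    h, w = len(image), len(image[0])
    out = [[0] * (w * scale) for _ in range(h * scale)]
    for x, y in iter_blocks(image):                        # S20: extract block
        theta = estimate_direction(image, x, y)            # S21: edge direction
        pattern = select_pattern(theta)                    # S22: edge patterns
        enhanced = enhance(image, x, y)                    # S23: enhancement
        block = generate(theta, pattern, enhanced, scale)  # S24: enlarged block
        place(out, x * scale, y * scale, block)            # S25: place block
    return out                                             # S26: all blocks done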
The overview and operation of the image processing apparatus in this embodiment have been described. Below, the main components of the image processing apparatus, namely the edge direction estimation section 11, the image enhancement section 12, the edge pattern selection section 13, the enlarged image block generation section 14, and the image block placement section 15, are described in detail.
The edge direction estimation performed by the edge direction estimation section 11 of this embodiment is described below in detail, taking as an example the case, shown earlier in Fig. 3, in which the notable area is of 2 × 2 size and the peripheral area containing it is a 4 × 4 block.
Fig. 4 A is the figure that shows the example of marking area and peripheral region.In Fig. 4, marking area is pixel { a, b, c, d}={15,177,86,203}, wherein these numeral pixel values that centered on by a frame.Flow chart below with reference to Fig. 5 estimates that to the edge direction of edge direction estimation portion 11 flow process of handling describes.
First, at step S50, the edge direction θ in the notable block surrounded by the thick frame in Fig. 4 is calculated according to expression (1) below. The angle reference count, a variable used at step S55 (described later) to count the number of angle references used in estimating the edge direction of the notable area, is set to 1.
gx = (a + c - b - d)/2
gy = (a + b - c - d)/2
θ = arctan(gy/gx)    (1)
For example, for the concrete notable area {15, 104, 86, 203} shown in Fig. 4A, expression (1) gives gx = -103 and gy = -85, and the edge direction θ in the notable area becomes -140.5°, as shown in Fig. 4A. Furthermore, as shown in Fig. 4B, if the obtained edge direction θ is normalized in steps of 22.5° (8 directions), direction 2 is obtained for θ = -140.5°.
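Expression (1) and the 22.5° normalization can be checked numerically. This sketch uses atan2 so that the quadrant information a plain arctan(gy/gx) would lose is preserved:

```python
import math

def block_direction(a, b, c, d, n_dirs=8):
    """Edge angle of a 2x2 notable area {a b / c d} by expression (1),
    plus its normalization into one of n_dirs quantized directions."""
    gx = (a + c - b - d) / 2.0
    gy = (a + b - c - d) / 2.0
    theta = math.degrees(math.atan2(gy, gx))
    step = 180.0 / n_dirs            # 22.5 degrees for 8 directions
    direction = round((theta % 180.0) / step) % n_dirs
    return theta, direction

theta, d = block_direction(15, 104, 86, 203)
# theta is about -140.5 degrees, normalized to direction 2 (Fig. 4B)
```

Passing n_dirs=12 or n_dirs=16 gives the finer 15.0° or 11.25° quantizations mentioned later in the text.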
Next, at step S51, the reference areas to be used for edge direction estimation are selected in the peripheral area shown in Fig. 4A, according to the edge direction θ in the notable area calculated at step S50.
Figs. 6A to 6C show examples of reference area selection based on the edge direction in the notable area. For this notable area the normalized edge direction is direction 2, so the four reference areas shown in Fig. 6C (the four 2 × 2 areas above, below, to the left of, and to the right of the notable area) are selected. Reference area selection is not limited to that shown in Fig. 6. For example, in Fig. 6C, eight reference areas could be used (the eight 2 × 2 areas above, below, to the left, to the right, and diagonal to the notable area).
Next, at step S52, the edge direction θ is calculated for each reference area selected at step S51, according to the same expression (1) as at step S50.
Next, at step S53, the edge direction calculated at step S52 for the selected reference area is compared with the edge direction in the notable area calculated at step S50.
If the difference between the two edge directions is less than a predetermined threshold Th, the angle reference count is incremented by 1 at step S54, and the processing moves to step S55. If the difference between the two edge directions is greater than the predetermined threshold Th, the reference area is determined to be unsuitable for edge direction estimation, and the processing moves to step S55.
Next, at step S55, it is determined whether the edge direction calculation has been completed for all selected reference areas. If it has, the processing moves to step S56; if not, steps S52 to S54 are repeated.
At step S56, the sum of the edge direction in the notable area and the edge directions in the reference areas determined at step S53 to be suitable for edge direction estimation is calculated, and the average edge direction obtained by dividing this sum by the angle reference count is adopted as the estimated edge direction of the notable area.
In this way the edge direction of the notable area is estimated by considering not only the edge direction calculated in the notable area itself but also the edge directions derived from the reference areas, so edge directions can be detected with high accuracy using only simple calculations.
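Steps S53 to S56 can be sketched as follows. The threshold value Th = 22.5° is an assumed number, since the text only calls it a predetermined threshold Th:

```python
def estimate_direction(theta_notable, theta_refs, th=22.5):
    """Average the notable area's angle with the angles of those
    reference areas that agree with it to within threshold th (degrees).
    th = 22.5 is an assumption; the patent leaves Th unspecified."""
    total, count = theta_notable, 1     # count starts at 1 (step S50)
    for t in theta_refs:
        if abs(t - theta_notable) < th: # S53/S54: keep consistent areas
            total += t
            count += 1
    return total / count                # S56: average edge direction
```

Reference areas whose angle disagrees strongly with the notable area (for example, one lying on a different edge) are simply excluded from the average, which is what keeps the estimate stable.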
The description above assumed that the input image data of the edge direction estimation section 11 are a grayscale image, but the invention is not limited to grayscale images. As an example, the flow of edge direction estimation processing for an RGB color-space image is described below with reference to the flowchart of Fig. 7.
First, at step S70, the edge strength G is calculated for each RGB color-space block of the notable area, using expression (1) above and expression (2) below:
G = gx*gx + gy*gy    (2)
Next, at step S71, the maximum edge strength is selected from among the edge strengths of the RGB color-space blocks calculated according to expression (2), and the color-space block corresponding to that maximum edge strength is selected.
Then, at step S72, steps S50 to S56 of the edge direction estimation processing described above with reference to Fig. 5 are carried out on the color-space block selected at step S71.
Then, at step S73, the estimated edge direction of the color-space block with the maximum edge strength is adopted as the estimated edge direction of the other color-space blocks, and the processing of the edge pattern selecting unit 13 and the enlarged image block generating unit 14, described later, is performed for each color-space block.
By executing steps S70 to S73, image quality degradation factors such as color shift at the edge portions of the enlarged color image data can be suppressed.
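Steps S70 and S71 amount to computing G = gx*gx + gy*gy per color plane and keeping the plane with the largest response. A minimal sketch, assuming a simple mapping from plane name to its (gx, gy) gradient pair (an interface invented here, not the patent's data layout):

```python
def select_color_plane(gradients):
    """Steps S70-S71 sketch: compute the edge strength G = gx*gx + gy*gy
    for each color-space block and keep the plane with the largest
    strength. `gradients` maps a plane name to its (gx, gy) pair."""
    strengths = {plane: gx * gx + gy * gy
                 for plane, (gx, gy) in gradients.items()}
    best = max(strengths, key=strengths.get)
    return best, strengths[best]

# Hypothetical gradients: R has the strongest edge response (G = 10.0),
# so the edge direction would be estimated on the R plane (step S72)
# and reused for the G and B planes (step S73).
plane, strength = select_color_plane({"R": (3.0, 1.0),
                                      "G": (1.0, 1.0),
                                      "B": (2.0, 2.0)})
```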
In the embodiment described above, the edge directions calculated with expression (1) in the target area and the reference areas are normalized to 8 directions, but the invention is not restricted to this embodiment. If a higher-precision edge direction is required, the normalization may proceed to 12 directions (15.0°) or 16 directions (11.25°).
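Normalizing a continuous angle into 8, 12, or 16 quantized directions can be done as below. Only the step sizes (22.5°, 15.0°, 11.25°) follow from the text; the numbering of the directions is an assumption here and need not match Fig. 8.

```python
def normalize_direction(angle_deg, n_directions=8):
    """Quantize an edge angle in degrees to one of `n_directions`
    equally spaced directions spanning 180 degrees. Step sizes follow
    the text (22.5 for 8, 15.0 for 12, 11.25 for 16 directions). Under
    this assumed numbering, directions 0 and 4 of the 8-direction case
    fall on 0 and 90 degrees. Python's % keeps the result in range."""
    step = 180.0 / n_directions
    return round(angle_deg / step) % n_directions
```

For example, under this numbering 90.0° quantizes to direction 4 and 22.5° to direction 1 in the 8-direction case.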
Next, the operation of the edge pattern selecting unit 13 will be described in detail. The edge pattern selecting unit 13 uses an edge pattern table, such as that shown in Fig. 9, to select the first edge pattern corresponding to the edge direction in the target area estimated by the edge direction estimating unit 11, and determines the second edge pattern corresponding to the selected first edge pattern.
Specifically, the target area enclosed by the frame shown in Fig. 4A and the surrounding area of that target area are used in the following description. Following the processing steps of the edge direction estimating unit 11 described above, the edge direction of the target area shown in Fig. 4A is estimated as direction 1 shown in Fig. 8 (sum of the edge directions in the target area and the four reference areas (737.3) divided by the number of angle references (5) = 147.5°, which normalizes to direction 1).
Based on the estimated edge direction (direction 1) in the target area, candidates for the edge pattern corresponding to the target area are selected from the edge pattern table shown in Fig. 9. In this case, the four patterns 0 to 3 of the first edge pattern for direction 1 become the candidates for the first edge pattern of the target area shown in Fig. 4A.
The edge pattern selecting unit 13 then selects one of patterns 0 to 3 as the first edge pattern for the target area, as described below.
Figs. 10A to 10E illustrate the method of selecting one edge pattern from the four edge pattern candidates for the target area shown in Fig. 4A. First, the first edge pattern candidates are converted to the bit patterns shown in Fig. 10B (white parts are set to 0, all others to 1). Alternatively, the edge pattern table of Fig. 9 may be converted to a bit pattern table in advance and stored in a storage unit (not shown in Fig. 1), in which case this step can be omitted.
Next, the average pixel value in the target area (102 in the case of Fig. 4A) is calculated, this average is subtracted from each pixel value of the target area according to the following expression (3), and a pixel value pattern of the target area is prepared from the resulting signs (Fig. 10C). This pixel value pattern is then converted to a bit pattern (Fig. 10D).
Mean=(a+b+c+d)/4
a_sign=a-Mean
b_sign=b-Mean
c_sign=c-Mean
d_sign=d-Mean (3)
Next, pattern matching is performed between the bit pattern of each edge pattern candidate in Fig. 10B and the bit pattern of the target area in Fig. 10D, and the first edge pattern for the target area is selected (Fig. 10E). In this case, pattern 2 is selected as the first edge pattern for the target area.
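The sign-based binarization of expression (3) and the subsequent pattern matching can be sketched as follows. The pixel values and candidate bit patterns below are hypothetical stand-ins (the real candidates come from the Fig. 9 table), and the sign-to-bit convention is an assumption.

```python
def to_bit_pattern(values):
    """Expression (3) sketch: subtract the area mean from each pixel and
    keep only the sign, giving a binary pattern (here 1 for at-or-above
    the mean, 0 for below; this convention is an assumption)."""
    mean = sum(values) / len(values)
    return tuple(1 if v >= mean else 0 for v in values)

def select_first_edge_pattern(target_values, candidate_bit_patterns):
    """Match the target area's bit pattern against each candidate and
    return the index of the first exact match, or None if none matches."""
    target_bits = to_bit_pattern(target_values)
    for index, bits in enumerate(candidate_bit_patterns):
        if bits == target_bits:
            return index
    return None

# Hypothetical 2x2 target area (a, b, c, d) with mean 101.25, and four
# hypothetical direction-1 candidates expressed as 0/1 tuples.
candidates = [(0, 0, 1, 1), (0, 1, 0, 1), (1, 0, 1, 0), (1, 1, 0, 0)]
chosen = select_first_edge_pattern([90, 120, 80, 115], candidates)
```

Here the target area binarizes to (0, 1, 0, 1), so candidate 1 is chosen; with the actual Fig. 4A values and the Fig. 9 table, pattern 2 is the one selected.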
Once the first edge pattern has been selected, the second edge pattern provided in correspondence with the first edge pattern is finally determined. In this case, the second edge pattern shown in Fig. 11 is selected. The second edge pattern is used by the enlarged image block generating unit 14, described later, to produce the enlarged image block for the target area.
The first and second edge patterns are not limited to those shown in Fig. 9. For example, edge patterns different from those of Fig. 9 may be used according to the type of input image data, and the number of first and second edge pattern candidates for each angle may be increased or decreased.
Next, the image enhancing unit 12 will be described in detail. The image enhancing unit 12 enhances the contrast of the image data of the target area and its surrounding area within the image block extracted by the image block setting unit 10 shown in Fig. 3, using a kernel whose elements and size correspond to the magnification ratio of the enlargement processing in the enlargement processing section 1.
Figs. 12A to 12C show specific examples of enhancement kernels used by the image enhancing unit 12. Fig. 13 illustrates how pixel P is enhanced using kernel 0 shown in Fig. 12A. In this case, the enhanced pixel value P' of pixel P is calculated according to the following expression (4):
P' = 1.60*P - 0.15*(a+b+c+d) (4)
(where a, b, c, d, and P represent the pixel values at the positions shown in Fig. 13)
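A direct transcription of expression (4), shown here as a sketch to make the kernel's behaviour concrete:

```python
def enhance_pixel(p, a, b, c, d):
    """Expression (4) with kernel 0 of Fig. 12A:
    P' = 1.60*P - 0.15*(a + b + c + d), where a, b, c, d are the
    neighbour pixels at the positions shown in Fig. 13."""
    return 1.60 * p - 0.15 * (a + b + c + d)

# The coefficients sum to 1.60 - 4 * 0.15 = 1.0, so a flat neighbourhood
# is left unchanged while a pixel brighter than its neighbours is pushed
# further up (and a darker one further down), sharpening the edge.
flat = enhance_pixel(100.0, 100.0, 100.0, 100.0, 100.0)   # ~100.0
peak = enhance_pixel(140.0, 100.0, 100.0, 100.0, 100.0)   # ~164.0
```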
Fig. 14 shows an example in which contrast enhancement is applied while the input image data is enlarged 8 times using the enhancement kernels of Figs. 12A to 12C. First, to enlarge the input image data 2 times, contrast enhancement is applied to the input image data using kernel 0 shown in Fig. 12A.
Next, to enlarge the 2x-enlarged image data a further 2 times, contrast enhancement is applied to the 2x-enlarged image data using kernel 1 shown in Fig. 12B. Likewise, to enlarge the input image data 8 times in total, contrast enhancement is applied to the 4x-enlarged image data using kernel 2 shown in Fig. 12C.
The kernels used for contrast enhancement are not limited to those of Figs. 12A to 12C; kernels whose element values and element spacing differ from those of Figs. 12A to 12C may be used according to the type or size of the input image data.
Next, the enlarged image block generating unit 14 will be described. The enlarged image block generating unit 14 produces the enlarged image block for the target area by using the second edge pattern provided by the edge pattern selecting unit 13 and the pixel values contrast-enhanced by the image enhancing unit 12.
An example flow of the enlarged image block generation processing of the enlarged image block generating unit 14 is described below with reference to the flowchart of Fig. 15. First, at step S150, a 3 * 3 enlarged image block is produced, as shown in Fig. 16, using the target area contrast-enhanced by the image enhancing unit 12 and the second edge pattern selected by the edge pattern selecting unit 13. The pixel values of the 3 * 3 enlarged image block are calculated according to the expressions shown in Fig. 16. A pixel value calculation expression is determined for each second edge pattern; Fig. 17 shows specific examples of the calculation expressions for the other second edge patterns.
Next, at step S151, the edge direction of the target area estimated by the edge direction estimating unit 11 is examined. If the estimated edge direction is any of directions 1 to 3 or directions 5 to 7, the processing proceeds to step S152; if the estimated edge direction is direction 0 or 4, the processing proceeds to step S153.
At step S152 (when the estimated edge direction in the target area is any of directions 1 to 3 or directions 5 to 7), a 4 * 4 enlarged image block is produced from the 3 * 3 enlarged image block produced at step S150.
As shown in Fig. 18, first, a 3 * 4 enlarged image block is produced using the 3 * 3 enlarged image block and the reference pixels (r0 to r5) in the surrounding area contrast-enhanced by the image enhancing unit 12. The pixel values of the 3 * 4 enlarged image block are determined according to the calculation expressions shown in Fig. 18. The reference pixels (r0 to r5) in the surrounding area are selected according to the estimated edge direction in the target area.
Fig. 19 shows specific examples of reference pixels selected according to the estimated edge direction. The selection of reference pixels is not limited to the two selection patterns shown in Fig. 19; a larger number of reference pixel selection patterns may be provided according to the estimated edge direction.
Next, as shown in Fig. 20, a 4 * 4 enlarged image block is produced using the 3 * 4 enlarged image block and the reference pixels (r0 to r7) in the surrounding area contrast-enhanced by the image enhancing unit 12. The pixel values of the 4 * 4 enlarged image block are determined according to the calculation expressions shown in Fig. 20.
In the 4 * 4 enlarged image block generation of step S152 described above, a 3 * 3 → 3 * 4 → 4 * 4 generation flow was shown, but a 3 * 3 → 4 * 3 → 4 * 4 flow may be adopted instead, with the reference pixel selection changed accordingly.
At step S153 (when the estimated edge direction in the target area is direction 0 or 4), a 4 * 4 enlarged image block is produced from the 3 * 3 enlarged image block produced at step S150.
Fig. 21 shows an overview of the 4 * 4 enlarged image block generation processing at step S153. First, the 4 * 4 block consisting of the target area and its surrounding area, contrast-enhanced by the image enhancing unit 12, is enlarged 1.25 times. The enlargement technique may be linear enlargement or projection enlargement.
Next, the central part (3 * 3) of the resulting 5 * 5 block is replaced by the 3 * 3 enlarged image block produced at step S150, and the resulting 5 * 5 block is further enlarged 1.2 times. Again, the enlargement technique may be linear enlargement or projection enlargement.
Finally, the central part (4 * 4) of the resulting 6 * 6 block is extracted as the 4 * 4 enlarged image block for the target area. By executing steps S150 to S153, an enlarged image block for the target area, for example as shown in Fig. 22, is produced.
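The step S153 flow (enlarge x1.25, splice in the step S150 block, enlarge x1.2, crop the centre) can be sketched as follows. Plain bilinear enlargement stands in for the "linear or projection" enlargement the text allows, and all names are illustrative.

```python
def bilinear_resize(img, new_h, new_w):
    """Plain bilinear enlargement of a 2-D list of floats (the text
    allows linear or projection enlargement; linear is used here)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * new_w for _ in range(new_h)]
    for y in range(new_h):
        sy = y * (h - 1) / (new_h - 1)
        y0 = min(int(sy), h - 2)
        fy = sy - y0
        for x in range(new_w):
            sx = x * (w - 1) / (new_w - 1)
            x0 = min(int(sx), w - 2)
            fx = sx - x0
            out[y][x] = (img[y0][x0] * (1 - fy) * (1 - fx)
                         + img[y0][x0 + 1] * (1 - fy) * fx
                         + img[y0 + 1][x0] * fy * (1 - fx)
                         + img[y0 + 1][x0 + 1] * fy * fx)
    return out

def enlarge_block_direction_0_or_4(block4, block3):
    """Step S153 sketch: enlarge the enhanced 4x4 target+surrounding
    block by 1.25 to 5x5, replace its central 3x3 with the block from
    step S150, enlarge the result by 1.2 to 6x6, and extract the
    central 4x4 as the enlarged image block for the target area."""
    five = bilinear_resize(block4, 5, 5)
    for i in range(3):
        for j in range(3):
            five[1 + i][1 + j] = block3[i][j]
    six = bilinear_resize(five, 6, 6)
    return [row[1:5] for row in six[1:5]]

# Sanity check: constant-valued blocks pass through unchanged, since the
# bilinear weights sum to one at every output pixel.
out = enlarge_block_direction_0_or_4([[7.0] * 4 for _ in range(4)],
                                     [[7.0] * 3 for _ in range(3)])
```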
Next, the image block arranging unit 15 will be described. The image block arranging unit 15 sequentially arranges the enlarged image blocks for the target areas, produced by the enlarged image block generating unit 14, according to a predetermined method.
Fig. 23 shows a specific example of arranging the 4 * 4 enlarged image blocks produced by the enlarged image block generating unit 14. In the example of Fig. 23, the sequentially produced enlarged image blocks 0 and 1 are arranged so as to overlap each other, and each overlapping pixel is set to the average of the newly placed pixel value and the value previously stored there.
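The overlap-and-average arrangement of Fig. 23 can be sketched as follows, using `None` to mark canvas pixels that no block has written yet (a bookkeeping choice made here, not specified in the text):

```python
def place_block(canvas, block, top, left):
    """Arrange an enlarged image block on the output canvas; wherever it
    overlaps a previously placed block, store the average of the new
    pixel value and the value already there, as in Fig. 23."""
    for i, row in enumerate(block):
        for j, value in enumerate(row):
            previous = canvas[top + i][left + j]
            canvas[top + i][left + j] = (
                value if previous is None else (previous + value) / 2.0
            )

canvas = [[None] * 6 for _ in range(4)]
place_block(canvas, [[10.0] * 4 for _ in range(4)], 0, 0)  # block 0
place_block(canvas, [[20.0] * 4 for _ in range(4)], 0, 2)  # block 1, overlapping
```

Columns 2 and 3 of the canvas end up holding (10.0 + 20.0) / 2 = 15.0, matching the averaging of overlapping pixels described above.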
In the image enlargement processing performed by the image processing apparatus and image processing method of this embodiment, the edge shape pattern corresponding to the edge direction is selected by nothing more than simple pattern matching, and the enlarged image is produced from pixel values chosen according to the selected edge shape pattern with the edge direction taken into account, making it possible to provide an enlarged image with suppressed blurring and jaggies under a small processing load.
The image processing method described above may be executed as an image processing program on a personal computer or a digital camera, may be distributed over a communication line such as the Internet, or may be recorded on a computer-readable recording medium such as a CD-ROM.
The image processing program can also be applied to the case where an input image is digitally enlarged on a machine for processing digital images, such as a digital still camera, a digital video camera, a mobile phone, or a PDA (personal digital assistant).
As described above, in the image enlargement processing of the present invention, the edge direction is detected with high accuracy and the enlarged image is produced from pixel values corresponding to the edge direction, so that high-quality image enlargement with suppressed blurring and jaggy defects can be accomplished at high speed under a small load.
Although the present invention has been described with reference to certain preferred embodiments, it will be apparent to those skilled in the art that various changes and modifications can be made in light of the teachings of the present invention. Such changes and modifications are considered to fall within the spirit, scope, and intent of the invention as defined by the claims.
Claims (18)
1. An image processing apparatus for performing image enlargement processing, the image processing apparatus comprising:
an edge direction estimating unit for estimating edge direction information of a first image area containing a pixel of interest;
an edge pattern selecting unit for selecting, according to the edge direction information and the pixel values of the first image area, an edge shape pattern which corresponds to the first image area and matches the edge direction indicated by the edge direction information;
an image enhancing unit for enhancing the pixel values of the first image area;
an enlarged image area generating unit for producing an enlarged image area by using the edge shape pattern selected by said edge pattern selecting unit and the pixel values of the first image area enhanced by said image enhancing unit; and
an image arranging unit for arranging the enlarged image areas produced by said enlarged image area generating unit according to a predetermined method.
2. The image processing apparatus according to claim 1, wherein
said edge direction estimating unit calculates each edge angle from the pixel value differences between the first image area containing the pixel of interest and the image areas surrounding the first image area, and estimates the edge direction information of the first image area containing the pixel of interest according to the calculated edge angles.
3. The image processing apparatus according to claim 1, wherein
said edge direction estimating unit selects the image areas to be used for estimating the edge direction information of the first image area containing the pixel of interest.
4. The image processing apparatus according to claim 1, wherein
the edge direction information estimated by said edge direction estimating unit is normalized into quantized directions.
5. The image processing apparatus according to claim 1, wherein
said edge direction estimating unit calculates edge strength information of the first image area.
6. The image processing apparatus according to claim 5, wherein
said edge direction estimating unit selects color space data from a plurality of pieces of color space data according to the edge strength information, and estimates the edge direction information by using the selected color space data.
7. The image processing apparatus according to claim 1, wherein
said edge pattern selecting unit selects the edge shape pattern for given edge direction information from among a plurality of edge shape patterns.
8. The image processing apparatus according to claim 1, wherein
when two or more edge shape patterns are selectable for given edge direction information, said edge pattern selecting unit converts the pixels in the first image area into a binary pattern and performs pattern matching between the binary pattern and each of the edge shape patterns to select the edge shape pattern.
9. The image processing apparatus according to claim 1, wherein
said edge pattern selecting unit selects the edge shape pattern from edge shape patterns of a plurality of different sizes corresponding to the first image area, each of the provided edge shape patterns corresponding to a different size.
10. The image processing apparatus according to claim 1, wherein
said image enhancing unit selects, according to the magnification ratio, one enhancement kernel from a plurality of enhancement kernels that differ in their kernel element values and element spacing, and enhances the pixel values of the first image area by using the selected enhancement kernel.
11. The image processing apparatus according to claim 1, wherein
said enlarged image area generating unit comprises a first enlarged image area generating unit and a second enlarged image area generating unit.
12. The image processing apparatus according to claim 11, wherein
the first enlarged image area generating unit enlarges the first image area to produce a first enlarged image area, for each edge shape pattern, by using the pixel values in the first image area enhanced by said image enhancing unit, or by using values calculated from two or more of the pixel values in the first image area enhanced by said image enhancing unit.
13. The image processing apparatus according to claim 11, wherein
the second enlarged image area generating unit comprises a plurality of enlarging units that enlarge the first enlarged image area by using different processing; and the second enlarged image area generating unit selects the enlarging unit to be used from the plurality of enlarging units according to the edge direction information estimated by said edge direction estimating unit.
14. The image processing apparatus according to claim 13, wherein
one of the enlarging units produces a second enlarged image area through predetermined calculation using the pixel values in the first enlarged image area and the pixel values around the first image area enhanced by said image enhancing unit.
15. The image processing apparatus according to claim 13, wherein
one of the enlarging units selects the pixel values around the first image area according to the edge direction information estimated by said edge direction estimating unit.
16. The image processing apparatus according to claim 13, wherein
one of the enlarging units produces the second enlarged image area by using values obtained by converting, through predetermined calculation, the pixel values around the first image area enhanced by said image enhancing unit.
17. The image processing apparatus according to claim 1, wherein
said image arranging unit sequentially arranges the enlarged image areas produced by said enlarged image area generating unit so that the enlarged image areas overlap one another.
18. An image processing method for performing image enlargement processing, the image processing method comprising the steps of:
estimating edge direction information for a first image area containing a pixel of interest;
selecting, according to the edge direction information and the pixel values in the first image area, an edge shape pattern which corresponds to the first image area and matches the edge direction indicated by the edge direction information;
enhancing the pixel values of the first image area;
producing an enlarged image area by using the edge shape pattern and the enhanced pixel values of the first image area; and
arranging the enlarged image area according to a predetermined method.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP076896/2002 | 2002-03-19 | ||
JP2002076896A JP3915563B2 (en) | 2002-03-19 | 2002-03-19 | Image processing apparatus and image processing program |
Publications (2)
Publication Number | Publication Date |
---|---|
CN1445982A CN1445982A (en) | 2003-10-01 |
CN1217526C true CN1217526C (en) | 2005-08-31 |
Family
ID=27785230
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN031025625A Expired - Fee Related CN1217526C (en) | 2002-03-19 | 2003-02-10 | Image processing device, image processing method, image processing program and its recording medium |
Country Status (5)
Country | Link |
---|---|
US (1) | US7149355B2 (en) |
EP (1) | EP1347410B1 (en) |
JP (1) | JP3915563B2 (en) |
CN (1) | CN1217526C (en) |
DE (1) | DE60321739D1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103888706A (en) * | 2012-12-20 | 2014-06-25 | 索尼公司 | Image processing apparatus, image pickup apparatus, and image processing method |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3915563B2 (en) * | 2002-03-19 | 2007-05-16 | 富士ゼロックス株式会社 | Image processing apparatus and image processing program |
JP3767513B2 (en) * | 2002-04-26 | 2006-04-19 | 三菱電機株式会社 | Pixel interpolation circuit, scanning line interpolation circuit, pixel interpolation method, scanning line interpolation method, and application apparatus |
JP4407801B2 (en) * | 2003-12-16 | 2010-02-03 | セイコーエプソン株式会社 | Edge generation apparatus, edge generation method, and edge generation program |
US7362376B2 (en) | 2003-12-23 | 2008-04-22 | Lsi Logic Corporation | Method and apparatus for video deinterlacing and format conversion |
JP4668185B2 (en) * | 2004-02-19 | 2011-04-13 | 三菱電機株式会社 | Image processing method |
TWI288890B (en) * | 2005-04-01 | 2007-10-21 | Realtek Semiconductor Corp | Method and apparatus for pixel interpolation |
JP4730525B2 (en) * | 2005-06-13 | 2011-07-20 | 富士ゼロックス株式会社 | Image processing apparatus and program thereof |
DE102005044866A1 (en) * | 2005-09-20 | 2007-04-19 | Siemens Ag | Device for measuring the position of electrical components |
DE102005052061A1 (en) * | 2005-11-01 | 2007-05-16 | Carl Zeiss Imaging Solutions G | Method and device for image processing |
JP4736121B2 (en) * | 2005-11-21 | 2011-07-27 | 富士ゼロックス株式会社 | Image analysis apparatus, image processing apparatus, image analysis method, image analysis program, and recording medium recording the same |
JP5013243B2 (en) * | 2005-12-20 | 2012-08-29 | 富士ゼロックス株式会社 | Image processing apparatus, image processing method, image processing program, and recording medium recording the same |
JP4703504B2 (en) * | 2006-07-21 | 2011-06-15 | ソニー株式会社 | Image processing apparatus, image processing method, and program |
JP4999392B2 (en) | 2006-07-28 | 2012-08-15 | キヤノン株式会社 | Image processing apparatus, control method therefor, computer program, and computer-readable storage medium |
JP4882680B2 (en) | 2006-07-31 | 2012-02-22 | 富士ゼロックス株式会社 | Image processing system and image processing program |
US8059915B2 (en) * | 2006-11-20 | 2011-11-15 | Videosurf, Inc. | Apparatus for and method of robust motion estimation using line averages |
JP4898531B2 (en) * | 2007-04-12 | 2012-03-14 | キヤノン株式会社 | Image processing apparatus, control method therefor, and computer program |
KR100866492B1 (en) * | 2007-05-30 | 2008-11-03 | 삼성전자주식회사 | Noise reduction device and method with edge enhancement function |
US8538203B2 (en) * | 2007-07-24 | 2013-09-17 | Sharp Laboratories Of America, Inc. | Image upscaling technique |
EP2187359B1 (en) * | 2007-09-07 | 2012-08-01 | Glory Ltd. | Paper sheet identification device and paper sheet identification method |
WO2009093185A2 (en) * | 2008-01-24 | 2009-07-30 | Koninklijke Philips Electronics N.V. | Method and image-processing device for hole filling |
JP5115361B2 (en) * | 2008-06-27 | 2013-01-09 | 富士通株式会社 | Pixel interpolation device, pixel interpolation method, and pixel interpolation program |
US8364660B2 (en) * | 2008-07-11 | 2013-01-29 | Videosurf, Inc. | Apparatus and software system for and method of performing a visual-relevance-rank subsequent search |
US8364698B2 (en) | 2008-07-11 | 2013-01-29 | Videosurf, Inc. | Apparatus and software system for and method of performing a visual-relevance-rank subsequent search |
JP5369526B2 (en) * | 2008-07-28 | 2013-12-18 | 株式会社日立製作所 | Image signal processing device, display device, recording / playback device, and image signal processing method |
KR101527409B1 (en) * | 2008-09-22 | 2015-06-09 | 삼성전자주식회사 | Image interpolation device using region classification and its method |
US8682094B2 (en) * | 2009-05-12 | 2014-03-25 | Dynamic Invention Llc | Adaptive subpixel-based downsampling and filtering using edge detection |
JP5315157B2 (en) * | 2009-07-27 | 2013-10-16 | キヤノン株式会社 | Information processing apparatus, line noise reduction processing method, and program |
JP2011154418A (en) * | 2010-01-26 | 2011-08-11 | Nippon Telegr & Teleph Corp <Ntt> | Super-high resolution image generating device, super-high resolution image generation method and program therefor |
US8849029B2 (en) | 2010-02-26 | 2014-09-30 | Nec Corporation | Image processing method, image processing device and program |
US9508011B2 (en) | 2010-05-10 | 2016-11-29 | Videosurf, Inc. | Video visual and audio query |
CN102609959B (en) * | 2011-01-19 | 2015-06-10 | 炬芯(珠海)科技有限公司 | Image block type judging method and system |
JP2012191465A (en) * | 2011-03-11 | 2012-10-04 | Sony Corp | Image processing apparatus, image processing method, and program |
CN104134189B (en) * | 2014-07-31 | 2017-07-28 | 青岛海信电器股份有限公司 | A kind of method and device of image amplification |
JP6226206B2 (en) * | 2015-05-22 | 2017-11-08 | 京セラドキュメントソリューションズ株式会社 | Image processing apparatus, image processing method, and image processing program |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3267289B2 (en) | 1989-05-10 | 2002-03-18 | キヤノン株式会社 | Color image processing method |
US5161035A (en) * | 1989-10-31 | 1992-11-03 | Brother Kogyo Kabushiki Kaisha | Digital image processing device for enlarging original image with high fidelity |
EP0497537B1 (en) * | 1991-02-01 | 1997-07-16 | Canon Kabushiki Kaisha | Image processing method and apparatus |
JP3073381B2 (en) | 1993-12-24 | 2000-08-07 | キヤノン株式会社 | Image processing method and apparatus |
JPH0998282A (en) | 1995-09-29 | 1997-04-08 | Fuji Xerox Co Ltd | Image processing unit |
JP3753197B2 (en) * | 1996-03-28 | 2006-03-08 | 富士写真フイルム株式会社 | Image data interpolation calculation method and apparatus for performing the method |
JPH09326921A (en) | 1996-06-04 | 1997-12-16 | Matsushita Electric Ind Co Ltd | Magnification method for digital image |
US6415053B1 (en) * | 1998-04-20 | 2002-07-02 | Fuji Photo Film Co., Ltd. | Image processing method and apparatus |
JP3075269B2 (en) * | 1998-10-13 | 2000-08-14 | セイコーエプソン株式会社 | Image data interpolation method, image data interpolation device, and medium recording image data interpolation program |
JP2000228723A (en) | 1999-02-05 | 2000-08-15 | Matsushita Electric Ind Co Ltd | Device and method for converting number of pixels |
JP2000253238A (en) | 1999-03-01 | 2000-09-14 | Mitsubishi Electric Corp | Picture processor and picture processing method |
JP3614324B2 (en) * | 1999-08-31 | 2005-01-26 | シャープ株式会社 | Image interpolation system and image interpolation method |
AUPQ377899A0 (en) | 1999-10-29 | 1999-11-25 | Canon Kabushiki Kaisha | Phase three kernel selection |
JP3915563B2 (en) * | 2002-03-19 | 2007-05-16 | 富士ゼロックス株式会社 | Image processing apparatus and image processing program |
JP4407801B2 (en) * | 2003-12-16 | 2010-02-03 | セイコーエプソン株式会社 | Edge generation apparatus, edge generation method, and edge generation program |
- 2002-03-19 JP JP2002076896A patent/JP3915563B2/en not_active Expired - Fee Related
- 2003-02-10 US US10/360,874 patent/US7149355B2/en not_active Expired - Fee Related
- 2003-02-10 EP EP03002641A patent/EP1347410B1/en not_active Expired - Lifetime
- 2003-02-10 DE DE60321739T patent/DE60321739D1/en not_active Expired - Lifetime
- 2003-02-10 CN CN031025625A patent/CN1217526C/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
EP1347410A2 (en) | 2003-09-24 |
JP2003274157A (en) | 2003-09-26 |
US7149355B2 (en) | 2006-12-12 |
EP1347410A3 (en) | 2006-06-28 |
US20030179935A1 (en) | 2003-09-25 |
CN1445982A (en) | 2003-10-01 |
EP1347410B1 (en) | 2008-06-25 |
JP3915563B2 (en) | 2007-05-16 |
DE60321739D1 (en) | 2008-08-07 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20050831 Termination date: 20180210 |