WO2000030340A1 - A method and device for combining partial film scan images - Google Patents
A method and device for combining partial film scan images
- Publication number
- WO2000030340A1 (PCT/IB1999/001828)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- edge
- digital image
- digital
- pixels
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 70
- 230000009466 transformation Effects 0.000 claims abstract description 27
- 238000013500 data storage Methods 0.000 claims abstract description 15
- 239000003550 marker Substances 0.000 claims description 45
- 238000013519 translation Methods 0.000 claims description 18
- 238000011161 development Methods 0.000 claims description 14
- 238000004891 communication Methods 0.000 claims description 12
- 230000001131 transforming effect Effects 0.000 claims 9
- 238000001914 filtration Methods 0.000 claims 6
- 230000003595 spectral effect Effects 0.000 claims 1
- 230000000295 complement effect Effects 0.000 abstract description 7
- 238000012937 correction Methods 0.000 description 14
- 238000012545 processing Methods 0.000 description 8
- 239000012634 fragment Substances 0.000 description 7
- 238000012935 Averaging Methods 0.000 description 5
- 230000003466 anti-cipated effect Effects 0.000 description 3
- 238000013507 mapping Methods 0.000 description 3
- 238000010606 normalization Methods 0.000 description 2
- 239000007787 solid Substances 0.000 description 2
- 238000000844 transformation Methods 0.000 description 2
- 238000013459 approach Methods 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 238000007373 indentation Methods 0.000 description 1
- 238000012417 linear regression Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000002093 peripheral effect Effects 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 238000001429 visible spectrum Methods 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
- H04N1/3876—Recombination of partial images to recreate the original image
Definitions
- the present invention pertains to a digital image processing device and method, and more particularly to a digital image device and method that aligns and/or combines partial film scan images prior to completion of digital film processing.
- Numerous scanning devices are known for capturing digital images of objects. For example, there are numerous flat-bed and sheet-fed scanners currently available on the market for converting photographs, pages of text, and transparencies to digital images. There are also conventional film scanners available which scan photographic film to produce digital images. Most of the current film scanners use one set of three linear charge-coupled devices (CCDs) to scan photographic film. Each of the three CCDs scans in one region of the visible spectrum: typically red, green and blue channels. In such conventional scanners, image data is captured in each color channel as the three CCDs pass over the film once, thus providing three separate color channel scans at substantially the same time.
- a problem with scanning photographic film prior to its being fully developed is that the locations of the images on the film are generally unknown. For example, the locations of images early in the development process may not be known at all, while the locations of images may be known with low precision at an intermediate stage. It may only be late in the development process that the locations of the images become known with high precision. Consequently, if one were to process a scan area roughly the size of an image on the exposed photographic film, for example, it will usually contain fragments of two different images. In other words, the scan area will straddle two images such that it contains a partial image fragment of two different images. This results in the image fragments from two separate scan areas having to be aligned and combined together to produce a final joined output image.
- an area of an image medium is scanned to produce a raster image of the scanned area.
- the image medium is photographic film.
- a second area of the photographic film is scanned to produce a second raster image of the second scanned area.
- the second scanned area is displaced along the longitudinal direction of the photographic film with respect to the first scanned area such that the scanned areas are abutting or partially overlapping.
- the length of each of the first and second scanned areas is selected to be approximately equal to or greater than a longitudinal length of an image recorded in the photographic film.
- the photographic film has regularly spaced perforation holes, such as sprocket holes. Regularly spaced indentations or notches, or other similar indicia are also suitable.
- the width of each of the first and second scanned areas is preferably of a width such that it contains at least a fraction of the sprocket holes, or other indicia, along at least one of the edges of the photographic film. More preferably, the width of each of the first and second scanned areas is approximately the width required to extend entirely across the photographic image out to at least the half-way point of the sprocket holes along each side of the photographic film.
- the sprocket holes, notches, dents, or other indicia provide reference markers which are fixed relative to the photographic film.
- the first raster image of the first scanned area is filtered with a high-pass spatial filter to produce an edge image corresponding to the first raster image.
- the reference markers, which are portions of the sprocket holes in the preferred embodiment, appear in the edge image as outlines of the reference markers.
- the sprocket holes are used to establish reference points in order to align and combine complementary portions of an image in the first and second scanned images.
- at least one corner, and more preferably, both corners, of a sprocket hole half on each opposing edge of the photographic film determine reference points.
- the most preferred embodiment determines four reference points in each of the two scanned images; thus, eight reference points in total. The following description refers to the reference points in the preferred embodiment as fiducial points.
- Each fiducial point is determined by the intersection of a vertical edge line and a horizontal edge line corresponding to its respective sprocket hole portion.
- the vertical edge line is taken to be parallel to pixel columns in the edge image.
- the column address of the vertical edge line for each of the fiducial points is preferably determined by a weighted average: each column address is multiplied by the number of sprocket hole edge pixels in that column, all products for that edge of the sprocket hole are summed, and the sum is divided by a normalization factor.
- a similar procedure is followed to determine the horizontal edge line for each of the fiducial points, except that the horizontal edge line is taken to be parallel to the pixel rows and is assigned a weighted average row number, as sketched below.
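For illustration only, the weighted-average procedure described in the two items above could be sketched as follows, assuming a binary edge image in which edge pixels are 1 and background pixels are 0; the function names and the NumPy representation are assumptions, not the patented implementation.

```python
import numpy as np

def weighted_average_column(edge_pixels: np.ndarray) -> float:
    """Weighted-average column address of a roughly vertical edge.

    edge_pixels is a binary (0/1) sub-image containing one edge of a
    sprocket hole: each column index is weighted by the number of edge
    pixels in that column, the products are summed, and the sum is
    divided by the normalization factor (here, the total edge pixel count).
    """
    counts = edge_pixels.sum(axis=0)            # edge pixels per column
    columns = np.arange(edge_pixels.shape[1])   # column addresses
    return float((columns * counts).sum() / counts.sum())


def weighted_average_row(edge_pixels: np.ndarray) -> float:
    """Same procedure for a roughly horizontal edge, weighting row indices."""
    counts = edge_pixels.sum(axis=1)            # edge pixels per row
    rows = np.arange(edge_pixels.shape[0])      # row addresses
    return float((rows * counts).sum() / counts.sum())


# A fiducial point is then the intersection of the two edge lines, e.g.
# fiducial = (weighted_average_row(horizontal_edge_window),
#             weighted_average_column(vertical_edge_window))
```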
- one fiducial point in each of the scanned areas is sufficient to make translational corrections.
- a minimum of two fiducial points in each scanned image is required to make rotational corrections in addition to the translational corrections.
- the most preferred embodiment uses four fiducial points in each of the scanned images.
- the fiducial points are preferably established for two sprocket holes in each scanned area which are proximate to the joining regions of the image fragments and which correspond to the same sprocket hole, but as viewed in the first and second images.
- fiducial points are determined for one sprocket hole along each transverse side of the photographic film along the joining region of the image fragments for each of the first and second scanned areas.
- the two fiducial points proximate to each sprocket hole are averaged to provide one average fiducial point intermediate between its respective fiducial points.
- for a translational correction, at least one of the fiducial points in the first scanned image is required to coincide with a corresponding fiducial point in the second scanned image. This provides a translational correction rule.
- the translational correction rule is then applied to all pixels, preferably within one image, such that there is a uniform translational correction of all the pixels of the first raster image relative to the second raster image.
- the translational correction rule is preferably determined by mapping one of the above-mentioned average fiducial points onto its corresponding average fiducial point, as in the sketch below.
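A minimal sketch of deriving and applying such a translational correction rule is shown below, assuming fiducial points are represented as (row, column) pairs; the helper names are illustrative and not part of the patent.

```python
import numpy as np

def average_fiducial(p1, p2):
    """Average of two fiducial points, each given as (row, col)."""
    return ((p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0)


def translation_rule(avg_first, avg_second):
    """Row/column offset that brings the first image's average fiducial
    point into coincidence with the corresponding point in the second."""
    return (avg_second[0] - avg_first[0], avg_second[1] - avg_first[1])


def translate_points(points, offset):
    """Apply the same translation uniformly to every (row, col) coordinate."""
    return np.asarray(points, dtype=float) + np.asarray(offset, dtype=float)
```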
- the rotational correction can similarly be applied to fiducial points or average fiducial points.
- the rotational correction is performed after the translational correction based on one average fiducial point. Consequently, after the translational correction, the coinciding average fiducial point establishes a first line in the first scanned image passing through the average fiducial point on the opposing side of the photographic film. Similarly, the coinciding average fiducial point establishes a second line in the second scanned image passing through the average fiducial point on the opposing side of the photographic film. In general, there will be a non-zero angle between the first and second lines.
- a rotational transformation rule is determined by rotating the first line to coincide with the second line.
- the fiducial point in the second scanned image which is not in coincidence with the average fiducial point in the first scanned image is rotated to bring them into substantial coincidence.
- the pivot point of the rotation is the point at which the two substantially coinciding fiducial points were brought together in the translational correction (see the sketch below).
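The rotation described above could be sketched as follows, reusing the (row, column) convention from the translation sketch; the angle is taken between the line from the pivot to the off-pivot average fiducial point in one image and the corresponding line in the other image, and all names are illustrative rather than taken from the patent.

```python
import numpy as np

def rotation_angle(pivot, p_first, p_second) -> float:
    """Angle (in radians) between the line pivot -> p_first and the line
    pivot -> p_second, using (row, col) coordinates."""
    a = np.arctan2(p_first[0] - pivot[0], p_first[1] - pivot[1])
    b = np.arctan2(p_second[0] - pivot[0], p_second[1] - pivot[1])
    return float(b - a)


def rotate_about_pivot(points, pivot, angle):
    """Rotate an (N, 2) array of (row, col) coordinates about the pivot by
    `angle` radians, using the same angle convention as rotation_angle()."""
    c, s = np.cos(angle), np.sin(angle)
    rel = np.asarray(points, dtype=float) - np.asarray(pivot, dtype=float)
    rows = c * rel[:, 0] + s * rel[:, 1]
    cols = -s * rel[:, 0] + c * rel[:, 1]
    return np.stack([rows, cols], axis=1) + np.asarray(pivot, dtype=float)
```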
- the combined image is cropped to remove the edge region that includes images of the sprocket holes, and to separate it from adjoining images.
- FIGURE 1 is a schematic illustration of a digital image combining device according to the preferred embodiment of the invention.
- FIGURE 2 is a schematic illustration showing a more detailed view of the digital image combining device according to the preferred embodiment of the invention.
- FIGURE 3 is an illustration of an edge image of photographic film with the locations of images on the film shown schematically as large solid rectangles and the scan area shown schematically as a dashed rectangular box;
- FIGURE 4 is an illustration of a portion of one of the sprocket holes illustrated in FIGURE 3 along with examples of fiducial points;
- FIGURE 5 is an illustration of a portion of one of the sprocket holes illustrated in FIGURE 3 along with examples of fiducial points;
- FIGURE 6 is an edge image of a portion of photographic film, like FIGURE 3, except the scan area is displaced relative to the scan area illustrated in FIGURE 3;
- FIGURE 7 is similar to FIGURE 4, but shows a portion of a sprocket hole illustrated in FIGURE 6 along with corresponding fiducial points;
- FIGURE 8 is similar to FIGURE 5, except that it shows a portion of a sprocket hole illustrated in FIGURE 6 along with examples of fiducial points;
- FIGURE 9 shows a case in which the scan area illustrated in FIGURE 3 is displaced and rotated relative to the scan area illustrated in FIGURE 6;
- FIGURE 10 illustrates the scan areas shown in FIGURE 9, but after a translational transformation between the two scan areas;
- FIGURE 11 illustrates the scan areas shown in FIGURE 10, but after a rotational transformation between the two scan areas;
- FIGURE 12 is a flow-chart illustrating the method of combining portions of a digital image according to the preferred embodiment of the invention.
- the digital image combining device is designated generally by the reference numeral 20 in FIGURE 1.
- the digital image combining device 20 generally has an image scanning device 22, a data processor 24, an output display device 26, and an input device 28.
- a digital image combining device 20 may include one or more generally available peripheral devices 30.
- the image scanner 22 is an electronic film developing apparatus, such as described in U.S. Patent Application No. 08/955,853
- the data processor 24 is a personal computer or a work station
- the output device 26 is a video monitor
- the input device 28 is a keyboard.
- while the image scanner 22 is an electronic film developer in the preferred embodiment, the image scanner is not limited to being only an electronic film developer. Other scanning devices, including devices which scan media other than photographic film, are encompassed within the scope and spirit of the invention.
- FIGURE 2 is a schematic illustration of several elements of the data processor 24.
- FIGURE 2 also illustrates, schematically, some components interior to the electronic film developer 22.
- the electronic film developer 22 is in communication with the data processor 24.
- the scanning device 22 has at least two scanning stations 32 and 34.
- while the example of the digital image scanner 22 illustrated in FIGURE 2 is a schematic illustration of a digital film scanner that has two scanning stations, it is anticipated that other digital film scanners will have three or more scanning stations.
- the digital film scanner 22 may have a single scanning station.
- exposed photographic film 36 is directed to move through the scanning stations in the longitudinal direction 38.
- the photographic film 36 has reference markers 40 at one transverse edge of the photographic film 36.
- the reference markers 40 are sprocket holes, such as sprocket hole 42, in the photographic film 36.
- the photographic film 36 has additional reference markers 44 in the transverse direction 46 opposing the reference markers 40.
- a portion of the photographic film 48 is scanned over a time interval at scanning station 32.
- a portion of the photographic film 49 is scanned at scanning station 34 over a period of time.
- the photographic film 36 is typically subjected to a film development treatment prior to the scanning station 32 and another film development treatment between scanning stations 32 and 34. Consequently, the film 36 is at one stage of development at scanning station 32 and at another stage of development at scanning station 34. It is anticipated that many digital film scanners will also have at least a third scanning station; in that case, the stage of film development at scanning stations 32 and 34 will also differ from that at the subsequent scanning stations.
- Scanned image data is transferred from each scanning station 32 and 34 to the data processor 24.
- the data processor 24 has a digital image data processor 50 that is in communication with the scanning stations 32 and 34.
- the digital image data processor 50 is also in communication with a data storage unit 52 that stores processed image data, preferably in a conventional raster image format.
- the data storage unit 52 is in communication with a high-pass spatial filter 54 such that it receives stored raster image data from the storage unit 52.
- a reference mark detector 56 is in communication with a high-pass spatial filter 54 such that it receives filtered images from the high-pass spatial filter 54.
- the reference mark detector 56 is also in communication with the data storage unit 52.
- the partial image combiner 58 is in communication with the reference mark detector 56 and the data storage unit 52.
- the digital image data processor 50, the high-pass spatial filter 54, the reference mark detector 56 and the partial image combiner 58 are implemented in practice by programming a personal computer or a workstation.
- the invention includes other embodiments in which the components are implemented as dedicated hardware components.
- the digital image data processor 50 is a conventional digital image data processor that processes scanned data from scanning stations 32 and 34 and outputs a digital raster image in a conventional format to be stored in the data storage unit 52.
- the data storage unit 52 may be either a conventional hard drive or semiconductor memory, such as random access memory (RAM), or a combination of both. It is anticipated that other data storage devices may be used without departing from the scope and spirit of the invention.
- the high-pass spatial filter uses a conventional spatial mask such as that described in R.C.
- a three-pixel-by-three-pixel mask is usually sufficient, although one may select larger masks.
- the center mask element is given a weight with a value of 8 and each neighbor pixel is given a weight of -1.
- the mask is then applied to each pixel of the raster image. If the subject pixel is in a fairly uniform region of the image, the sum of all the neighboring pixel values multiplied by the mask value will cancel with the central value, thus leading to essentially a zero output value. However, in the region of an edge of the image, the output will be non-zero. Consequently, such a filter will provide an output image which represents the edges of the original image. These are thus referred to as edge images.
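As an illustration of the mask just described (not necessarily the implementation used in the preferred embodiment), the following NumPy sketch applies the 3x3 high-pass mask to a grey-scale raster image to produce an edge image.

```python
import numpy as np

# 3x3 high-pass mask: centre weight 8, each of the eight neighbours -1.
HIGH_PASS_MASK = np.array([[-1, -1, -1],
                           [-1,  8, -1],
                           [-1, -1, -1]], dtype=float)


def high_pass_filter(raster):
    """Apply the mask to every interior pixel of a grey-scale raster image.

    In a fairly uniform region the weighted neighbourhood sum cancels to
    roughly zero; near an edge it does not, so the output is an edge image.
    """
    raster = np.asarray(raster, dtype=float)
    rows, cols = raster.shape
    edge = np.zeros_like(raster)
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = raster[r - 1:r + 2, c - 1:c + 2]
            edge[r, c] = np.sum(window * HIGH_PASS_MASK)
    return edge
```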
- FIGURE 3 is a schematic illustration of photographic film 60 which may be the same or similar to photographic film 36.
- the photographic film 60 has a series of substantially uniformly spaced sprocket holes 62 proximate to one longitudinal edge of the film 60, and another series of substantially uniformly spaced sprocket holes 64 proximate to a second longitudinal edge of the film 60 opposing the first longitudinal edge.
- the film 60 may have notches spaced at regular intervals such as notches 66, 68, 70 and 72, or notch 74. Alternatively, one could deliberately cut notches into the film 60 at regular intervals.
- FIGURE 3 thus illustrates an example of an edge image from a section of photographic film, showing the first and second longitudinal edges of the film 60, the notches 66, 68, 70, 72 and 74, and the sprocket holes 62 and 64.
- the edge image in FIGURE 3 is shown with the edges in black and the background in white.
- locations of images on the film 60 are indicated schematically as solid rectangles with reference numerals 76, 78, 80 and 82 in FIGURE 3.
- the dashed rectangle 84 indicates a region of the photographic film that has been scanned such that it includes at least a portion of the sprocket holes within the regions labeled 4 and 5.
- FIGURE 4 is a blown-up section of a reference sprocket hole as indicated in FIGURE 3.
- the portion of the sprocket hole illustrated in FIGURE 4 has a first vertical edge portion 86 and a second vertical edge portion 88.
- the portion of the sprocket hole illustrated in FIGURE 4 which is contained within the scanned region 84 has a horizontal edge portion 90.
- a first fiducial point 92 is proximate to a corner of the portion of the sprocket hole shown in FIGURE 4.
- the first fiducial point 92 is the intersection between the vertical edge line 94 and the horizontal edge line 96.
- the positions of the vertical edge line 94 and horizontal edge line 96 are determined according to an averaging procedure.
- the vertical edge line 94 is determined by a weighted average of the number of pixels corresponding to the section 86 within pixel columns of the edge image.
- the edge line 94 is substantially parallel to the pixel columns of the edge image.
- each pixel in the edge image has a unique coordinate address, preferably corresponding to a pixel row and column number in the conventional raster image representation.
- the concept of the invention is not limited to the usual Cartesian representation of raster images. It is also known to represent images in other coordinate representations, such as polar coordinates, or other orthogonal or non-orthogonal coordinate systems.
- each pixel column number in the region around the edge 86 is multiplied by the number of pixels corresponding to the edge 86 that fall within that pixel column, the products are summed and then divided by a normalization factor to provide a weighted average pixel column number that defines the vertical edge line 94.
- a similar procedure for the horizontal edge 90, with respect to pixel rows, provides a weighted average pixel row number for the fiducial point 92. Consequently, the weighted average pixel column number and weighted average pixel row number define the fiducial point 92.
- while the fiducial point 92 is determined by finding the weighted average vertical edge line 94 from the vertical edge portion 86, the invention is not limited to establishing a reference marker in only this way.
- the invention anticipates generally establishing reference markers in the edge image, as long as the reference markers are fixed relative to the image medium. For example, if the edge lines of the sprocket holes are significantly misaligned with rows and columns of the raster image, then it is preferable to determine the edges by a linear regression analysis.
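Where the sprocket-hole edges are noticeably tilted with respect to the pixel grid, an edge line could instead be fitted by least-squares regression, for example along the lines of the sketch below; the helper name and the binary edge sub-image input are assumptions for illustration.

```python
import numpy as np

def fit_edge_line(edge_pixels):
    """Least-squares fit of a straight line through the edge pixels of a
    binary sub-image, expressing column as a function of row (suitable
    for a roughly vertical edge). Returns (slope, intercept)."""
    rows, cols = np.nonzero(edge_pixels)
    slope, intercept = np.polyfit(rows, cols, deg=1)
    return float(slope), float(intercept)

# The fiducial point can then be taken as the intersection of the fitted
# edge lines instead of the weighted-average row and column lines.
```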
- a second fiducial point 98 is determined by a similar procedure as that used to determine the fiducial point 92.
- the vertical edge line 100 is determined as a weighted average vertical edge line corresponding to the edge 88.
- the fiducial point 98 is the intersection of the vertical edge line 100 and the horizontal edge line 96.
- a blow-up view of section 5 illustrated in FIGURE 3 is shown in more detail in FIGURE 5.
- the sprocket hole is at the opposing edge of the photographic film 60 relative to the sprocket hole illustrated in FIGURE 4.
- the fiducial points 102 and 104 are determined by weighted average vertical edge line 106 and horizontal edge line 108, and vertical edge line 110 and horizontal edge line 108, respectively.
- FIGURE 6 is an illustration of the edge image of the same section of photographic film 60 as in FIGURE 3, but with a different scan region 112.
- the sprocket holes in the regions labeled 7 and 8 correspond to the sprocket holes in the regions labeled 4 and 5 in FIGURE 3.
- the sprocket holes in regions 7 and 8 are part of a second image of the same sprocket holes in regions 4 and 5 of FIGURE 3.
- the scan region 84 in FIGURE 3 contains a portion of the image 80, while the scan region 112 in FIGURE 6 contains a complementary portion of the same image 80.
- FIGURE 7 shows a portion of the sprocket hole that is also illustrated in the region 7 in FIGURE 6. This corresponds to the sprocket hole in the region 4 illustrated in FIGURE 3. Since the sprocket hole in the regions 7 and 4 is fixed relative to the film 60, it can be used to line up the image portions of the image 80 and join them together. Similar to FIGURE 4, the portion of the sprocket hole in the region 7 has fiducial points 114 and 120.
- the fiducial point 114 is determined as the intersection of the vertical edge line 116 and the horizontal edge line 118, each preferably determined by a weighted averaging procedure.
- the fiducial point 120 is determined by the intersection of the vertical edge line 122 and the horizontal edge line 118.
- FIGURE 8 contains a second edge image of the same sprocket hole that is illustrated in the first edge image in the region 5 of FIGURE 3.
- FIGURE 8 shows an enlarged view of the portion of the sprocket hole in the region 8.
- the sprocket hole in the region 8 also determines two fiducial points, labeled 124 and 126 in FIGURE 8.
- the intersection of the vertical edge line 128 with the horizontal edge line 130 determines the fiducial point 124.
- the intersection of the vertical edge line 132 and the horizontal edge line 130 determines the fiducial point 126.
- the vertical edge lines 128 and 132 and the horizontal edge line 130 are determined by a weighted averaging procedure.
- FIGURE 9 shows an example of the scanned region 84 illustrated in FIGURE 3 displaced and rotated relative to the scanned region 112 illustrated in FIGURE 6.
- the displacement and rotation are greatly exaggerated to facilitate the explanation.
- the first scan region 84 has the four fiducial points 92, 98, 102 and 104 as reference markers for aligning and combining the second scan region 112 with the first scan region 84.
- the second scan region 112 has four fiducial points 114, 120, 124 and 126 in the preferred embodiment.
- the invention is not limited specifically to four reference points in each scanned image.
- As few as one reference marker in each image will be sufficient to make at least translational transformations to bring one portion of an image, such as image 80 into alignment for combining, or joining, with the complementary portion of the image in another scan region.
- As few as two reference markers in each scan region permit one to make both translational and rotational corrections to combine the image portions.
- each of the pairs of fiducial points are averaged to provide one average point.
- FIGURE 10 shows an example in which the average of the fiducial points 114 and 120 is translated to coincide with the average point of the fiducial points 92 and 98. Alternatively, one could have translated the average of the fiducial points 124 and 126 to coincide with the average of the fiducial points 102 and 104.
- FIGURES 9 and 10 show the regions of the photographic film outside of the scan areas 84, 112, 84' and 112' for illustration purposes only. In actual practice of the invention, it is the areas within the scan regions 84, 112, 84' and 112' that will contain the edge image data. In other embodiments, one could calculate edge images for wider regions up to and including one or both edges of the photographic film 60, or, alternatively, use narrower regions than that shown for the preferred embodiment.
- the reference numerals for the scan areas 84 and 112 and features within the scan areas are shown with primes to indicate that the relative coordinates of the pixels have been transformed. However, the film 60 and the notch 70 are not shown with primes to indicate that they represent the underlying film itself, and not image data.
- FIGURE 11 illustrates the result of a rotation after the translation illustrated in FIGURE 10.
- the pivot point 134 of the rotation is the coinciding point of the average of fiducial points 92 and 98 and the point which is the average of point 114 and 120.
- the line 136 shown in FIGURE 10 is defined by the pivot point 134 and the average point of the fiducial points 102' and 104'.
- the line 138 is defined by the pivot point 134 and the average of the fiducial points 124' and 126'.
- a first region 84 of photographic film 60 is scanned with a digital image scanner 22 which is preferably a digital film scanner.
- the scanned image will have a plurality of image channels, such as the front reflection, back reflection and through (transmission front to back and/or back to front) channels discussed in U.S. Patent Application No. 08/955,853.
- the scanned image data is then processed by the digital image data processor 50 and stored in the data storage device 52.
- the processed and stored image data is then processed by the high-pass spatial filter 54 to produce a first edge image.
- Reference marks are detected by the reference mark detector 56.
- the reference mark data are stored in data storage unit 52.
- a second region 112 of the photographic film 60 is scanned by the digital film scanner 22.
- both the first and second scanned areas are sufficiently wide to include at least one-half of the sprocket holes 62 and 64 spaced along the opposing edges of the film 60.
- the scan regions are preferably approximately equal to or wider than a single image on the photographic film.
- the second scanned image is similarly processed by the digital image data processor 50 and stored in the data storage unit 52.
- the second image is processed by the high-pass spatial filter 54 to produce a second edge image.
- the reference mark detector 56 detects reference marks in the second edge image.
- the partial image combiner determines a mapping rule to align corresponding and complementary partial images from the first and second scans, transforms the partial images such that they are properly aligned, and combines the first and second partial images into a single combined image.
- FIGURE 12 is a flowchart that schematically illustrates the method of combining portions of a digital image according to the present invention.
- First image data from the first scan is processed by the image data processor and filtered by a high-pass spatial filter to produce a first edge image. At least one reference location is determined in the first edge image.
- Second image data from the second scan are processed by the image data processor and high-pass spatial filtered to produce a second edge image. At least one reference location is determined in the second edge image.
- a mapping rule is determined to bring the reference locations substantially into coincidence such that a partial image in the first scan is properly aligned with a corresponding, complementary portion in the second image scan. The mapping is applied to align the portions of the digital image and the portions are combined into a single joined image.
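To tie the flow chart together, the outline below strings the earlier illustrative helpers into a single sequence; `detect_fiducials` is a hypothetical stand-in for the reference mark detector 56, the assumed ordering of the returned fiducial points is stated in the docstring, and the listing is an interpretation of the described flow rather than the patented implementation.

```python
import numpy as np

def combine_partial_scans(first_raster, second_raster, detect_fiducials):
    """Outline of the flow chart: filter, detect reference marks, derive
    the mapping (translation then rotation), and return the transform.

    `detect_fiducials` is a hypothetical callable returning an (N, 2)
    array of (row, col) fiducial points, ordered so that points 0 and 1
    straddle one sprocket hole and points 2 and 3 the opposing one.
    """
    # 1. High-pass filter each raster image to obtain edge images.
    first_edges = high_pass_filter(first_raster)
    second_edges = high_pass_filter(second_raster)

    # 2. Detect reference locations (fiducial points) in each edge image.
    first_pts = np.asarray(detect_fiducials(first_edges), dtype=float)
    second_pts = np.asarray(detect_fiducials(second_edges), dtype=float)

    # 3. Translational correction: bring one pair of average fiducial
    #    points into coincidence and shift all coordinates uniformly.
    offset = translation_rule(average_fiducial(first_pts[0], first_pts[1]),
                              average_fiducial(second_pts[0], second_pts[1]))
    first_pts = translate_points(first_pts, offset)

    # 4. Rotational correction about the now-coinciding pivot point.
    pivot = average_fiducial(second_pts[0], second_pts[1])
    angle = rotation_angle(pivot,
                           average_fiducial(first_pts[2], first_pts[3]),
                           average_fiducial(second_pts[2], second_pts[3]))

    # The same offset and angle would then be applied to every pixel of
    # the first raster image before the complementary image fragments
    # are joined and the sprocket-hole margin is cropped away.
    return offset, angle
```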
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Editing Of Facsimile Originals (AREA)
- Image Processing (AREA)
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP99956255A EP1135925A1 (en) | 1998-11-12 | 1999-11-12 | A method and device for combining partial film scan images |
AU12897/00A AU1289700A (en) | 1998-11-12 | 1999-11-12 | A method and device for combining partial film scan images |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US19145998A | 1998-11-12 | 1998-11-12 | |
US09/191,459 | 1998-11-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2000030340A1 true WO2000030340A1 (en) | 2000-05-25 |
Family
ID=22705584
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB1999/001828 WO2000030340A1 (en) | 1998-11-12 | 1999-11-12 | A method and device for combining partial film scan images |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP1135925A1 (en) |
AU (1) | AU1289700A (en) |
TW (1) | TW496077B (en) |
WO (1) | WO2000030340A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI475883B (en) * | 2012-03-01 | 2015-03-01 | Altek Corp | Camera device and divided image pickup method thereof |
-
1999
- 1999-11-12 WO PCT/IB1999/001828 patent/WO2000030340A1/en not_active Application Discontinuation
- 1999-11-12 AU AU12897/00A patent/AU1289700A/en not_active Abandoned
- 1999-11-12 EP EP99956255A patent/EP1135925A1/en not_active Withdrawn
-
2000
- 2000-02-09 TW TW88119668A patent/TW496077B/en not_active IP Right Cessation
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4231069A (en) * | 1977-05-10 | 1980-10-28 | Dr. Ing. Rudolf Hell Gmbh | Method and apparatus for producing image combinations |
EP0323849A2 (en) * | 1988-01-08 | 1989-07-12 | Fuji Photo Film Co., Ltd. | Color film analyzing method and apparatus therefore |
US5526096A (en) * | 1993-12-13 | 1996-06-11 | Kabushiki Kaisha Toshiba | Image-forming apparatus with an area-selecting function which selects an area in a surface-up condition of an original and which forms an image on the basis of the selected area of the original placed in a surface-down condition |
EP0794454A2 (en) * | 1996-03-04 | 1997-09-10 | Fuji Photo Film Co., Ltd. | Film scanner |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1460586A2 (en) * | 2003-03-20 | 2004-09-22 | Eastman Kodak Company | Producing enhanced photographic products from images captured at known picture sites |
EP1460586A3 (en) * | 2003-03-20 | 2005-02-09 | Eastman Kodak Company | Producing enhanced photographic products from images captured at known picture sites |
WO2013057486A1 (en) * | 2011-10-19 | 2013-04-25 | Windense Ltd | Motion picture scanning system |
US20150125125A1 (en) * | 2011-10-19 | 2015-05-07 | Windense Ltd. | Motion picture scanning system |
US9661236B2 (en) * | 2011-10-19 | 2017-05-23 | Windense Ltd. | Motion picture scanning system |
EP2769538B1 (en) * | 2011-10-19 | 2018-07-25 | iMetafilm Ltd | Motion picture scanning system |
Also Published As
Publication number | Publication date |
---|---|
EP1135925A1 (en) | 2001-09-26 |
AU1289700A (en) | 2000-06-05 |
TW496077B (en) | 2002-07-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6570612B1 (en) | System and method for color normalization of board images | |
US6535650B1 (en) | Creating high resolution images | |
RU2421814C2 (en) | Method to generate composite image | |
US8437032B2 (en) | Image processing apparatus and image processing method within inclination angle correction | |
US20020008715A1 (en) | Image resolution improvement using a color mosaic sensor | |
KR100916867B1 (en) | Image processing apparatus and image processing method | |
JPH01170169A (en) | Method for processing image graphic correction for multicolor printing | |
US20040218071A1 (en) | Method and system for correcting the chromatic aberrations of a color image produced by means of an optical system | |
JP2004199548A (en) | Image processor, image processing method, image processing program, printed matter inspection device, printed matter inspection method, and printed matter inspection program | |
EP0351062A1 (en) | Method and apparatus for generating composite images | |
JPH0879529A (en) | Image processing device | |
US7742658B2 (en) | System and method for boundary artifact elimination in parallel processing of large format images | |
US6728425B1 (en) | Image processing device | |
JP3059205B2 (en) | Image processing device | |
EP1135925A1 (en) | A method and device for combining partial film scan images | |
US7525702B2 (en) | Methods and systems for correcting color distortions | |
US6118478A (en) | Telecine systems for high definition use | |
JP2002507848A (en) | Image data coordinate transformation method having randomly shifted pixels | |
JPH0342831B2 (en) | ||
US7847987B2 (en) | Method of compensating a zipper image by a K-value and a method of calculating a K-value | |
JP3260891B2 (en) | Edge extraction method | |
JP4206294B2 (en) | Image processing device | |
WO2000011862A1 (en) | A method and device for the alignment of digital images | |
EP1096783A1 (en) | Document imaging system | |
JP2634399B2 (en) | Color image processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref country code: AU Ref document number: 2000 12897 Kind code of ref document: A Format of ref document f/p: F |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 1999956255 Country of ref document: EP |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
WWP | Wipo information: published in national office |
Ref document number: 1999956255 Country of ref document: EP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 1999956255 Country of ref document: EP |