
CN101137023A - Image processing system, projector and image processing method - Google Patents


Info

Publication number
CN101137023A
CN101137023A, CNA2007101626370A, CN200710162637A
Authority
CN
China
Prior art keywords
information
area
image
projection
edge detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2007101626370A
Other languages
Chinese (zh)
Other versions
CN100562082C (en)
Inventor
松田秀树
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Seiko Epson Corp
Original Assignee
Seiko Epson Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Seiko Epson Corp
Publication of CN101137023A
Application granted
Publication of CN100562082C
Anticipated expiration
Legal status: Expired - Fee Related

Landscapes

  • Projection Apparatus (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract



The present invention provides an image processing system, a projector, and an image processing method capable of generating position information for a projection target area quickly and accurately. In the image processing method, a first calibration image is projected onto a projection target; the projected first calibration image is captured to generate first imaging information at a low resolution, at or below a predetermined resolution, and third imaging information at a high resolution above that low resolution; edge detection is performed based on the first imaging information to generate first edge detection information; based on the first edge detection information, the projection target area corresponding to the projection target within the imaging area is provisionally detected to generate provisional detection information; based on the provisional detection information and the third imaging information, edge detection is performed on the pixel group near the boundary line of the provisionally detected projection target area to generate third edge detection information; and based on the third edge detection information, the projection target area is detected and projection target area information relating to its position is generated.


Description

Image processing system, projector and image processing method
This application is a divisional application of Chinese patent application No. 200510059742.2, filed on March 29, 2005, entitled "Image processing system, projector and image processing method".
Technical field
The present invention relates to an image processing system, a projector, and an image processing method that detect a projection area based on imaging information.
Background art
In recent years, schemes have been proposed in which a projector equipped with a CCD camera projects an image onto a projection target such as a screen, the projected image is captured with the CCD camera, the coordinates of the four corners of the projection area corresponding to the projected image within the imaging area are detected, and the position of the projected image is adjusted accordingly.
For example, Japanese Unexamined Patent Publication No. 5-30520 adopts a structure in which an image is displayed within the displayable area of a screen and the entire displayable area is captured, whereby the projection position of the image is adjusted.
However, a structure such as that of JP-A-5-30520 presupposes that the image is displayed within the displayable area; when part of the image is projected outside that area, the projection position of the image cannot be adjusted.
In particular, projectors have become smaller in recent years, and users carry them as so-called mobile projectors and project images at their destinations. In such cases, because of constraints such as limited installation space, part of the projected image may be displayed outside the screen. In these situations, a projector that can properly adjust the position of the projected image is required.
Furthermore, although JP-A-5-30520 describes detecting the positions of the four corners of the displayable area of the screen by having a CPU read luminance information stored in a frame memory and perform image processing, it does not specifically describe what kind of image processing is performed.
In conventional image processing in general, when the positions of the four corners of the screen are detected from imaging information, filtering must be applied to the entire captured image, which not only takes time but also requires a large amount of computation.
As a result, calibration is time-consuming and the user's waiting time becomes long.
Summary of the invention
The present invention was made in view of the above problems, and its object is to provide an image processing system, a projector, and an image processing method capable of generating position information for a projection target area accurately and in a shorter time.
To achieve the above object, the image processing system and the projector of the present invention are characterized by comprising:
an image projection device that projects a first calibration image onto a projection target;
an imaging device that captures the projected first calibration image and generates first imaging information at a low resolution, at or below a predetermined resolution, and third imaging information at a high resolution above the low resolution; and
a projection target area detection device that, based on the first and third imaging information, generates projection target area information relating to the position of the projection target area corresponding to the projection target within the imaging area of the imaging device;
wherein the projection target area detection device comprises: an edge detection device that performs edge detection based on the first imaging information to generate first edge detection information, and performs edge detection based on the third imaging information to generate third edge detection information; and a projection target area information generation device that provisionally detects the projection target area based on the first edge detection information to generate provisional detection information, and generates the projection target area information based on the third edge detection information;
and wherein the edge detection device generates the third edge detection information by performing edge detection, based on the provisional detection information, on the pixel group near the boundary line of the provisionally detected projection target area.
The image processing method of the present invention is characterized in that:
a first calibration image is projected onto a projection target;
the projected first calibration image is captured to generate first imaging information at a low resolution, at or below a predetermined resolution, and third imaging information at a high resolution above the low resolution;
edge detection is performed based on the first imaging information to generate first edge detection information;
based on the first edge detection information, the projection target area corresponding to the projection target within the imaging area of the imaging section is provisionally detected to generate provisional detection information;
based on the provisional detection information and the third imaging information, edge detection is performed on the pixel group near the boundary line of the provisionally detected projection target area to generate third edge detection information;
and based on the third edge detection information, the projection target area is detected and projection target area information relating to its position is generated.
According to the present invention, the image processing system and the like provisionally detect the projection target area from low-resolution imaging information and then detect the projection target area from high-resolution imaging information near its boundary line, so that the position information of the projection target area can be generated accurately and in a shorter time.
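This coarse-to-fine idea can be illustrated with a small sketch. This is an illustrative simplification, not the patented implementation: the downsampling `factor`, the refinement `margin`, and the helper names are all assumptions. A provisional boundary is found on a cheap low-resolution copy, and edge detection on the full-resolution image is then confined to a narrow window around it.

```python
import numpy as np

def detect_edge_column(row, lo, hi):
    """Return the index of the strongest horizontal step in row[lo:hi]."""
    grad = np.abs(np.diff(row[lo:hi].astype(float)))
    return lo + int(np.argmax(grad))

def coarse_to_fine_edge(image, factor=4, margin=2):
    """Find the left boundary of a bright region in each row, coarse first."""
    coarse = image[:, ::factor]              # cheap low-resolution copy
    edges = []
    for y in range(image.shape[0]):
        cx = detect_edge_column(coarse[y], 0, coarse.shape[1])  # provisional
        lo = max(0, (cx - margin) * factor)                     # refine window
        hi = min(image.shape[1], (cx + margin + 1) * factor)
        edges.append(detect_edge_column(image[y], lo, hi))      # full-res pass
    return edges

# synthetic capture: dark background, bright region starting at column 37
img = np.zeros((8, 64), dtype=np.uint8)
img[:, 37:] = 200
print(coarse_to_fine_edge(img))   # → [36, 36, 36, 36, 36, 36, 36, 36]
```

The full-resolution gradient is evaluated only over a window of a few pixels per row rather than the whole row, which is the source of the time saving claimed above.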
Further, in the image processing system and the projector, the edge detection device may detect edges at a plurality of positions in a first captured image based on the first imaging information to generate the first edge detection information, and the projection target area information generation device may generate the provisional detection information by setting approximation straight lines or approximation curves based on the position information of the plurality of positions obtained from the first edge detection information.
Likewise, the image processing method may detect edges at a plurality of positions in the first captured image based on the first imaging information to generate the first edge detection information, and generate the provisional detection information by setting approximation straight lines or approximation curves based on the position information of the plurality of positions obtained from the first edge detection information.
In this way, by performing edge detection with the area subject to edge detection further narrowed, the image processing system and the like can generate the position information of the projection target area in a shorter time.
Further, in the image processing system and the projector, the projection target area detection device may comprise a detection point evaluation device that evaluates a plurality of edge detection points; the detection point evaluation device judges whether each of the plurality of edge detection points deviates from the approximation straight line or approximation curve by a predetermined value or more, and controls the projection target area information generation device so that detection points deviating by the predetermined value or more are removed and the approximation straight line or approximation curve is set again.
Likewise, the image processing method may judge whether a plurality of edge detection points deviate from the approximation straight line or approximation curve by a predetermined value or more, remove the detection points deviating by the predetermined value or more, and set the approximation straight line or approximation curve again.
In this way, by removing the detection points that deviate from the approximation straight line before further processing, the image processing system and the like can reduce the influence of noise and the like and generate the position information of the projection target area more accurately.
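As a sketch of this evaluation step (illustrative only; the threshold value, the refit-once policy, and the function name are assumptions, not taken from the patent), one can fit a straight line to the edge detection points, discard points whose deviation from the line exceeds a set value, and refit on the remainder:

```python
import numpy as np

def fit_line_with_rejection(points, thresh=5.0):
    """Fit y = a*x + b by least squares, remove points deviating from the
    line by `thresh` or more, then refit on the remaining points."""
    pts = np.asarray(points, dtype=float)
    a, b = np.polyfit(pts[:, 0], pts[:, 1], 1)       # provisional line
    resid = np.abs(pts[:, 1] - (a * pts[:, 0] + b))  # per-point deviation
    kept = pts[resid < thresh]                       # drop outliers
    a, b = np.polyfit(kept[:, 0], kept[:, 1], 1)     # refit without them
    return a, b

# nine collinear edge points on y = 2x + 1 plus one noisy outlier
points = [(x, 2 * x + 1) for x in range(9)] + [(4, 30)]
a, b = fit_line_with_rejection(points)
print(round(a, 3), round(b, 3))   # → 2.0 1.0
```

A single outlier — for instance an edge point falsely detected on a background object — shifts the provisional fit, but after rejection the refit recovers the true boundary line exactly.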
Further, the image processing system and the projector may be configured as follows:
they comprise a difference image generation device that generates a difference image, and a central reference position detection device that detects, based on the difference image, a plurality of central reference positions of a predetermined central block area within the imaging area of the imaging device;
the projection target area detection device comprises a search range determination device that sets the edge detection range of the edge detection device;
the image projection device projects a second calibration image;
the imaging device captures the projected second calibration image and generates second imaging information;
the difference image generation device generates the difference image based on the first and second imaging information;
the search range determination device sets the edge detection range outside the central block area;
the first calibration image is a monochromatic calibration image;
and the second calibration image includes the central block area, which is smaller than the second calibration image and located near its center.
Likewise, the image processing method may be as follows:
a second calibration image is projected;
the projected second calibration image is captured with the imaging section to generate second imaging information;
the difference image is generated based on the first and second imaging information;
based on the difference image, a plurality of central reference positions of a predetermined central block area within the imaging area are detected;
the edge detection range is set outside the central block area;
the first calibration image is a monochromatic calibration image;
and the second calibration image includes the central block area, which is smaller than the second calibration image and located near its center.
In this way, by executing the processing so that the area subject to edge detection becomes ever smaller, the image processing system and the like can generate the position information of the projection target area in a shorter time.
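A minimal sketch of how a difference image between the two calibration captures can isolate the central block — the threshold value, pixel values, and function name here are invented for illustration; the patent does not specify them:

```python
import numpy as np

def central_block_bounds(capture1, capture2, thresh=30):
    """Locate the central block from the difference of the two captures."""
    diff = np.abs(capture1.astype(int) - capture2.astype(int))
    ys, xs = np.nonzero(diff > thresh)               # pixels that changed
    return int(xs.min()), int(xs.max()), int(ys.min()), int(ys.max())

cap1 = np.full((40, 60), 200, dtype=np.uint8)        # all-white first capture
cap2 = cap1.copy()
cap2[15:25, 20:35] = 60                              # darker central block
print(central_block_bounds(cap1, cap2))              # → (20, 34, 15, 24)
```

Only the pixels that differ between the two captures survive the subtraction, so ambient scenery and the projection target itself cancel out, leaving the calibration pattern; the edge detection range can then be set just outside the returned bounds.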
Further, in the image processing system and the projector, the second calibration image may be composed of the central block area, peripheral block areas located around the central block area, and a background area other than the central and peripheral block areas;
each pixel in the central and peripheral block areas has an index value different from that of each pixel in the background area;
and the image processing system and the projector comprise: a peripheral reference position detection device that detects, based on the central reference positions, a plurality of peripheral reference positions of the peripheral block areas within the imaging area; and a projection area information generation device that generates, based on the central reference positions and the peripheral reference positions, projection area information relating to the position of the projection area within the imaging area.
Likewise, in the image processing method, the second calibration image may be composed of the central block area, peripheral block areas located around the central block area, and a background area other than the central and peripheral block areas;
each pixel in the central and peripheral block areas has an index value different from that of each pixel in the background area;
based on the central reference positions, a plurality of peripheral reference positions of the peripheral block areas within the imaging area are detected;
and based on the central reference positions and the peripheral reference positions, projection area information relating to the position of the projection area within the imaging area is generated.
In this way, by detecting the central reference positions of a central block area smaller than the projection area corresponding to the projected image, the image processing system and the like can generate the position information of the projection area based on the central reference positions even when part of the projected image is displayed outside the projection target.
In particular, since the image processing system and the like can grasp the position of the projection area not only from the central reference positions but also from the peripheral reference positions of the peripheral block areas located around them, the position information of the projection area can be generated more accurately.
Further, in the image processing system and the projector, the projection area information generation device may generate the projection area information by setting a plurality of approximation straight lines or approximation curves based on the central reference positions and the peripheral reference positions and grasping the shape or arrangement of the central block area and the peripheral block areas.
Likewise, the image processing method may generate the projection area information by setting a plurality of approximation straight lines or approximation curves based on the central reference positions and the peripheral reference positions and grasping the shape or arrangement of the central block area and the peripheral block areas.
Further, in the image processing system and the projector, the projection area and the central block area may be rectangular areas;
and the projection area information generation device grasps the positions of the four corners of the central block area by deriving the intersections of the plurality of approximation straight lines or approximation curves, and generates, based on those four corner positions, projection area information representing the positions of the four corners of the projection area.
Likewise, in the image processing method, the projection area and the central block area may be rectangular areas;
the positions of the four corners of the central block area are grasped by deriving the intersections of the plurality of approximation straight lines or approximation curves, and projection area information representing the positions of the four corners of the projection area is generated based on those four corner positions.
In this way, since the image processing system and the like can grasp the positions of the four corners of the projection area from the positions of the four corners of the central block area, the four corners of the projection area can be grasped with less processing.
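Deriving corner positions as intersections of approximation lines reduces, in the simplest case, to solving a 2x2 linear system per corner. The following is an assumed sketch — the line representation `a*x + b*y = c` and the axis-aligned example coordinates are illustrative choices, not the patent's:

```python
def line_intersection(l1, l2):
    """Intersection of two lines given as (a, b, c) with a*x + b*y = c."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    if det == 0:
        raise ValueError("parallel lines")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# axis-aligned example: the four boundary lines y=5, y=45, x=10, x=50
top, bottom = (0, 1, 5), (0, 1, 45)
left, right = (1, 0, 10), (1, 0, 50)
corners = [line_intersection(v, h) for v in (left, right) for h in (top, bottom)]
print(corners)   # → [(10.0, 5.0), (10.0, 45.0), (50.0, 5.0), (50.0, 45.0)]
```

In the general case the block boundaries in the captured image are not axis-aligned, but the same determinant formula applies to any two non-parallel fitted lines.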
Further, the image processing system and the projector may comprise a projection target area boundary point detection device that detects a plurality of boundary points of the projection target area based on the first imaging information and the central reference positions;
and the peripheral reference position detection device detects the peripheral reference positions based on the boundary points.
Likewise, the image processing method may detect a plurality of boundary points of the projection target area based on the first imaging information and the central reference positions;
and detect the peripheral reference positions based on the boundary points.
Here, the peripheral reference position detection device and the image processing method may detect, based on the boundary points, peripheral reference positions located closer to the boundary points than to the central reference positions.
In this way, since the image processing system and the like can generate the position information of the projection area based on a plurality of reference positions spaced far apart, they are less susceptible to the influence of errors and can generate the position information of the projection area more accurately.
Further, the image processing system and the projector may comprise an image distortion correction device that grasps the distortion of the image projected by the image projection device based on the projection target area information and the projection area information, and corrects the image signal so that the distortion is corrected;
and the image projection device projects the image based on the image signal corrected by the image distortion correction device.
Likewise, the image processing method may grasp the distortion of the projected image based on the projection target area information and the projection area information, and correct the image signal so that the distortion is corrected.
In this way, since the image processing system and the like can generate the position information of the projection target area and the projection area with less processing than before, image distortion can be corrected more efficiently than before.
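Once the four corners of the detected area and the desired rectangle are known, one common way to express the distortion correction is as a perspective transform (homography) between the two quadrilaterals. The sketch below uses the direct linear transform; the corner coordinates and function names are made up for illustration, and the patent itself does not prescribe this method:

```python
import numpy as np

def homography(src, dst):
    """Solve for H mapping src points to dst points (4 correspondences, DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1].reshape(3, 3)          # null-space vector is the solution
    return h / h[2, 2]

def apply_h(h, pt):
    x, y, w = h @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w

# detected (distorted) corners of the projection area and the target rectangle
detected = [(12, 8), (98, 15), (102, 90), (9, 84)]
target = [(0, 0), (100, 0), (100, 100), (0, 100)]
h = homography(detected, target)
u, v = apply_h(h, (12, 8))
print(round(u, 4), round(v, 4))   # first detected corner maps to ≈ (0.0, 0.0)
```

Pre-warping the image signal with the inverse of such a transform makes the projected image appear rectangular on the projection target.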
Description of drawings
Fig. 1 is a schematic diagram showing the image projection situation of the first embodiment.
Fig. 2 is a schematic diagram showing the positional relationship between the screen and the projected image in the first embodiment.
Fig. 3 is a functional block diagram of the projector of the first embodiment.
Fig. 4 is a hardware block diagram of the projector of the first embodiment.
Fig. 5 is a flowchart showing the flow of the projection area position detection processing of the first embodiment.
Fig. 6A is a schematic diagram of the first calibration image, and Fig. 6B is a schematic diagram of the second calibration image.
Fig. 7 is a schematic diagram showing the first-stage search method when detecting the central reference positions in the first embodiment.
Fig. 8 is a schematic diagram showing the second-stage search method when detecting the central reference positions in the first embodiment.
Fig. 9 is a schematic diagram showing the first-stage search method when detecting the peripheral reference positions in the first embodiment.
Fig. 10 is a schematic diagram showing the second-stage search method when detecting the peripheral reference positions in the first embodiment.
Fig. 11 is a schematic diagram showing the first stage of setting the approximation straight lines in the first embodiment.
Fig. 12 is a schematic diagram showing the second stage of setting the approximation straight lines in the first embodiment.
Fig. 13 is a functional block diagram of the projector of a variation of the first embodiment.
Fig. 14 is a schematic diagram showing the first-stage search method when detecting the peripheral reference positions in the variation of the first embodiment.
Fig. 15 is a schematic diagram showing the second-stage search method when detecting the peripheral reference positions in the variation of the first embodiment.
Fig. 16 is a functional block diagram of the projector of the second embodiment.
Fig. 17 is a flowchart showing the flow of the projection target area position detection processing of the second embodiment.
Fig. 18 is a schematic diagram showing the first-stage search method when detecting the projection target area in the second embodiment.
Fig. 19 is a schematic diagram showing the second-stage search method when detecting the projection target area in the second embodiment.
Fig. 20 is a schematic diagram showing the third-stage search method when detecting the projection target area in the second embodiment.
Fig. 21 is a schematic diagram showing the edge detection point evaluation processing of the second embodiment.
Fig. 22 is a schematic diagram showing the first-stage search method when detecting the peripheral reference positions in a variation of the second embodiment.
Fig. 23 is a schematic diagram showing the second-stage search method when detecting the peripheral reference positions in a variation of the second embodiment.
Embodiment
Hereinafter, a case where the present invention is applied to a projector having an image processing system will be described as an example with reference to the drawings. The embodiments described below do not limit in any way the content of the invention recited in the claims. Moreover, not all of the configurations shown in the following embodiments are necessarily essential as means for solving the invention recited in the claims.
First embodiment
Fig. 1 is a schematic diagram showing the image projection situation of the first embodiment, and Fig. 2 is a schematic diagram showing the positional relationship between the screen 10 and the projected image 12 in the first embodiment.
The projector 20 projects an image onto the screen 10, whereby the projected image 12 is displayed on the screen 10.
The projector 20 of the first embodiment also has a sensor 60 serving as an imaging device. The sensor 60 captures, through its imaging plane, the screen 10 on which the projected image 12 is displayed, and generates imaging information. Based on the imaging information, the projector 20 adjusts the distortion and display position of the projected image 12.
However, as shown for example in Fig. 2, when part of the projected image 12 extends beyond the screen 10 and is displayed outside it, a conventional projector cannot adjust the distortion and display position of the projected image 12 based on the image information.
This is because, when the distance between the screen 10 and the wall behind it is large, even if the projected image 12 falls within the imaging range of the sensor 60, a conventional projector cannot convert the position of a vertex of the projected image 12 that lies somewhere on the wall or a background object, or that is not displayed at all, into a position on the plane of the screen 10.
By using calibration images different from conventional ones and performing simple search processing based on the imaging information of those calibration images, the projector 20 of the first embodiment can accurately grasp the position of the projected image 12 under less restrictive conditions than before.
Next, the functional blocks of the projector 20 for implementing such functions will be described.
Fig. 3 is a functional block diagram of the projector 20 of the first embodiment.
The projector 20 comprises an input signal processing section 110 that converts analog RGB signals (R1, G1, B1) from a PC (personal computer) or the like into digital RGB signals (R2, G2, B2); a color conversion section 120 that converts the digital RGB signals (R2, G2, B2) into digital RGB signals (R3, G3, B3) in order to correct the color and brightness of the image; an output signal processing section 130 that converts the digital RGB signals (R3, G3, B3) into analog RGB signals (R4, G4, B4); and an image projection section 190 that projects an image based on the analog RGB signals.
The image projection section 190 comprises a spatial light modulator 192, a drive section 194 that drives the spatial light modulator 192, a light source 196, and a lens 198. The drive section 194 drives the spatial light modulator 192 based on the image signal from the output signal processing section 130, and the image projection section 190 projects the light from the light source 196 via the spatial light modulator 192 and the lens 198.
The projector 20 also comprises a calibration information generation section 172 that generates the calibration image information used to display the first and second calibration images, the sensor 60 that generates imaging information of the calibration images, and an imaging information storage section 140 that temporarily stores the imaging information from the sensor 60.
The projector 20 further comprises a projection area detection section 150 that detects the position of the projection area within the imaging plane (imaging area) of the sensor 60 based on the imaging information. The projection area detection section 150 comprises a difference image generation section 152 that generates the difference image between the first and second captured images, a central reference position detection section 154 that detects a plurality of central reference positions of the central block area included in the difference image, a peripheral reference position detection section 156 that detects a plurality of peripheral reference positions of the peripheral block areas included in the difference image, and a projection area information generation section 158 that generates projection area information representing the position of the projection area based on the respective reference positions.
Furthermore, the projector 20 has a function of correcting the distortion of the projected image 12. To implement this function, the projector 20 comprises a luminance peak position detection section 164 that detects, based on the imaging information and the projection area information, the luminance peak position (the position of the pixel with the maximum luminance value) within the projection area; an image distortion correction calculation section 162 that calculates an image distortion correction amount based on the luminance peak position; and an image distortion correction section 112 that corrects the input image signal based on the image distortion correction amount so that the distortion of the projected image 12 is corrected.
In addition, as the hardware of the function of each one that is used to install above-mentioned projector 20, for example, applicable following hardware.
Fig. 4 is the hardware block diagram of the projector 20 of the 1st embodiment.
For example, as input signal handling part 110, available for example A/D converter 930, image processing circuit 970 wait and install; As shooting information storage part 140, available for example RAM950 waits and installs; As view field's test section 150, brightness peak position detection part 164, available for example image processing circuit 970 waits to be installed; As picture distortion correction calculating part 162, available for example CPU910 waits and installs; As calibration information generating unit 172, available for example image processing circuit 970, RAM950 wait and install; As output signal handling part 130, available for example D/A converter 940 waits to be installed; As spatial light modulator 192, available for example liquid crystal panel 920 waits to be installed; As drive division 194, the available ROM960 that has for example stored the liquid crystal light valve driver of driving liquid crystal panel 920 waits and installs.
These sections can exchange information with one another via a system bus 980.
Each of these sections may be implemented in hardware as a circuit, or in software as a program.
Furthermore, the functions of the difference image generation section 152 and the other sections may be implemented by causing a computer to read a program from an information storage medium 900 that stores a program for realizing those functions.
As the information storage medium 900, a CD-ROM, DVD-ROM, ROM, RAM, HDD, or the like may be used, and the program may be read either by a contact method or by a contactless method.
Instead of using the information storage medium 900, the functions described above may also be implemented by downloading a program for realizing them from a host device or the like via a transmission line.
Next, the flow of the projection area position detection processing using these sections will be described.
Fig. 5 is a flowchart showing the flow of the projection area position detection processing of the first embodiment. Fig. 6A is a schematic view of the first calibration image 13, and Fig. 6B is a schematic view of the second calibration image 14.
First, the projector 20 projects an all-white calibration image (the entire image is white), shown in Fig. 6A, as the first calibration image 13 (step S1). More specifically, the calibration information generation section 172 generates calibration information (for example, an RGB signal) for the first calibration image 13, and the image projection section 190 projects the all-white calibration image on the basis of that calibration information.
The sensor 60 captures the first calibration image 13 on the screen 10 with an automatic exposure setting and generates first imaging information (step S2). The imaging information storage section 140 stores the first imaging information.
Next, the projector 20 projects the second calibration image 14 shown in Fig. 6B (step S3). More specifically, the calibration information generation section 172 generates calibration information for the second calibration image 14, and the image projection section 190 projects the second calibration image 14 on the basis of that calibration information.
In the present embodiment, the second calibration image 14 is a so-called checker pattern: when the entire image is divided evenly into nine blocks, the central block area and the four peripheral block areas at the four corners are black, and the other block areas are white.
The sensor 60 captures the second calibration image 14 on the screen 10 at the exposure used when the first calibration image 13 was captured, and generates second imaging information (step S4). The imaging information storage section 140 stores the second imaging information.
The difference image generation section 152 generates a difference image between the first calibration image 13 and the second calibration image 14 on the basis of the first and second imaging information (step S5). The difference image is obtained by calculating the difference of, for example, the luminance value of each pixel. For each pixel position, if the difference value is equal to or greater than a predetermined threshold value, that difference value is used as the value at the pixel position; otherwise the value 0 is used. The difference image generation section 152 need not perform the difference calculation over the entire image, and may perform it only over the range (the part of the image) required for the subsequent processing.
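As a concrete illustration of the thresholded difference calculation described above, the following sketch (Python with NumPy, used here purely for illustration and not part of the embodiment) keeps a pixel's difference value only when it reaches a predetermined threshold and sets it to 0 otherwise; the image size and luminance values are hypothetical.

```python
import numpy as np

def difference_image(first, second, threshold):
    # Per-pixel absolute luminance difference between the two captured
    # calibration images; differences below the threshold become 0,
    # differences at or above it keep their value.
    diff = np.abs(first.astype(np.int32) - second.astype(np.int32))
    diff[diff < threshold] = 0
    return diff

# Hypothetical 4x4 luminance captures: the checker pattern darkens one
# pixel strongly and changes another only slightly (below the threshold).
white = np.full((4, 4), 200, dtype=np.int32)
pattern = white.copy()
pattern[1, 1] = 40    # black checker block -> large difference, kept
pattern[2, 2] = 195   # small fluctuation -> suppressed to 0
diff = difference_image(white, pattern, threshold=10)
```

As noted above, the difference calculation may be restricted to only the part of the image needed by the subsequent search.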
After the difference image has been generated, the projection area detection section 150 detects the plurality of (four in the present embodiment) central reference positions of the central block area included in the difference image and the plurality of (eight in the present embodiment) peripheral reference positions of the peripheral block areas included in the difference image.
Fig. 7 is a schematic view showing the first-stage search method used when detecting the central reference positions in the first embodiment. Fig. 8 is a schematic view showing the second-stage search method used when detecting the central reference positions in the first embodiment.
The central reference position detection section 154 first detects the four central reference positions of the pattern image in order to detect the position of the projection area (the region corresponding to the projected image 12) within the imaging area 15 corresponding to the imaging plane (step S6). Although the projection target area (screen area) 18 is drawn in each figure to make the description easier to follow, the actual difference image may contain no projection target area 18, or may lack part of the peripheral block areas 17-1 to 17-4 lying outside the projection target area 18.
More specifically, as shown in Fig. 7, the central reference position detection section 154 searches the difference image pixel by pixel along the vertical line x = xc, on which the central block area 16 is expected to lie, from y = yp to y = ym, and identifies points P1 and P2 at which the difference value changes. Suppose, for example, that these are P1(xc, y1) and P2(xc, y2).
The search reference values such as xc, yp, and ym may be determined from the angles of view and positions of the lens 198 and the sensor 60, may be determined by experiment, or may be determined from the imaging result. The same applies to the other search reference values described later.
Next, as shown in Fig. 8, the central reference position detection section 154 searches pixel by pixel along the horizontal line y = yc, positioned with reference to P1 and P2, from x = xm to x = xp, and identifies points P4 and P3 at which the difference value changes. Here, for example, yc = (y1 + y2) / 2.
The central reference position detection section 154 then outputs central reference position information indicating the four central reference positions P1(xc, y1), P2(xc, y2), P3(x1, yc), and P4(x2, yc) of the central block area 16 to the peripheral reference position detection section 156.
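The pixel-by-pixel search for points where the difference value changes can be sketched as a one-dimensional scan. This minimal Python illustration (the profile values and positions are hypothetical; the real search runs over a two-dimensional difference image) returns the first and last change points found along a scan line, corresponding to P1 and P2 on the line x = xc.

```python
def find_change_points(values, start, stop):
    # Scan values[start:stop] pixel by pixel and record the first and
    # last indices at which the value differs from the previous pixel,
    # i.e. the entry into and exit from the black centre block.
    first = last = None
    prev = values[start]
    for i in range(start + 1, stop):
        if values[i] != prev:
            if first is None:
                first = i
            last = i
            prev = values[i]
    return first, last

# Hypothetical difference profile along x = xc, from y = yp to y = ym:
values = [0, 0, 0, 160, 160, 160, 0, 0]
p1_y, p2_y = find_change_points(values, 0, len(values))
```

The same scan, applied along the horizontal line y = yc, would yield P3 and P4.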
The peripheral reference position detection section 156 detects the eight peripheral reference positions of the pattern image on the basis of the central reference position information (step S7).
Fig. 9 is a schematic view showing the first-stage search method used when detecting the peripheral reference positions in the first embodiment. Fig. 10 is a schematic view showing the second-stage search method used when detecting the peripheral reference positions in the first embodiment.
More specifically, on the horizontal line y = yh, located m% above the y coordinate y1 of P1, the peripheral reference position detection section 156 searches the difference image pixel by pixel in the positive x direction, starting from the x coordinate xh, which is offset several percent toward the center from the x coordinate x1 of P3, for a point at which the difference value changes. It thereby identifies a point P5 at which the difference value changes.
Similarly, on the horizontal line y = yn, located m% below the y coordinate y2 of P2, the peripheral reference position detection section 156 searches pixel by pixel in the positive x direction from the x coordinate xh for a point at which the difference value of the difference image changes. It thereby identifies a point P6.
Using the same method, the points P7 to P12 are identified, as shown in Fig. 10. The peripheral reference position detection section 156 then outputs peripheral reference position information indicating the coordinates of these eight points, together with the central reference position information, to the projection area information generation section 158.
The projection area information generation section 158 detects the positions of the four corners of the projection area using approximate straight lines (approximation curves may also be used) on the basis of the peripheral reference position information and the central reference position information (step S8).
Fig. 11 is a schematic view showing the first stage of setting the approximate straight lines in the first embodiment.
Fig. 12 is a schematic view showing the second stage of setting the approximate straight lines in the first embodiment.
The projection area information generation section 158 sets the approximate straight line shown by the broken line in Fig. 11 on the basis of the coordinates of the points P5, P3, and P6. In the same way, as shown in Fig. 12, it sets four approximate straight lines, shown by broken lines, and identifies the four intersections A(xA, yA) to D(xD, yD) of these lines as the four corner points of the central block area 16.
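One straightforward way to set such an approximate straight line through, for example, P5, P3, and P6 is an ordinary least-squares fit. The following sketch is only an illustration: the embodiment does not specify the fitting method, and an approximation curve may also be used.

```python
def fit_line(points):
    # Least-squares straight line y = m*x + b through the given points.
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

# Hypothetical collinear reference points lying on y = 2x + 1:
m, b = fit_line([(0, 1), (1, 3), (2, 5)])
```

For near-vertical boundary lines the roles of x and y would be swapped, since the slope-intercept form cannot represent a vertical line.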
Because the central block area 16 corresponds to the original projected image 12 reduced to 1/9 of its area, the four corner points EFGH of the projection area corresponding to the projected image 12 are obtained as follows: E(xE, yE) = (2*xA - xC, 2*yA - yC), F(xF, yF) = (2*xB - xD, 2*yB - yD), G(xG, yG) = (2*xC - xA, 2*yC - yA), and H(xH, yH) = (2*xD - xB, 2*yD - yB).
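The corner extrapolation above pushes each corner of the central block area 16 outward through the diagonally opposite corner (E = 2A - C and so on). The following sketch applies it to hypothetical corner coordinates; an axis-aligned block is assumed purely for illustration.

```python
def projection_area_corners(A, B, C, D):
    # E = 2A - C, F = 2B - D, G = 2C - A, H = 2D - B: because the centre
    # block is the projection area scaled by 1/3 about its centre, each
    # projection-area corner lies on the ray from the diagonally opposite
    # block corner through the corresponding block corner.
    def ext(p, q):
        return (2 * p[0] - q[0], 2 * p[1] - q[1])
    return ext(A, C), ext(B, D), ext(C, A), ext(D, B)

# Hypothetical centre block with corners (3,3), (6,3), (6,6), (3,6),
# i.e. the middle ninth of a 9x9 projection area:
E, F, G, H = projection_area_corners((3, 3), (6, 3), (6, 6), (3, 6))
```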
As described above, according to the present embodiment, the projector 20 can detect the positions of the four corners of the projection area in the imaging area 15 not only when the projected image 12 is entirely contained within the screen 10, but also when part of the projected image 12 is displayed outside the screen 10. The projector 20 may, of course, also transform the positional information of the projection area into positions on the plane of the screen 10 and generate positional information of the four corners of the projected image 12.
The projector 20 can thereby appropriately correct the distortion of the projected image 12, adjust its position, and detect the position indicated within the projected image 12 by a laser pointer or the like.
For example, when correcting the distortion of the projected image 12 (so-called keystone correction), the projector 20 uses the luminance peak position detection section 164 to detect the luminance peak position, at which the luminance value is highest, within the projection area of the imaging area, on the basis of the imaging information of the first calibration image 13 and the projection area information, supplied from the projection area information generation section 158, indicating the positions of the four corners of the projection area.
For example, when the screen 10 directly faces the projector 20, the center of the projection area is the luminance peak position. When, for example, the luminance value on the left side of the projection area is high, it can be determined that the projection optical axis is shifted to the left of the center of the projected image 12, and that the projected image 12 is distorted into a trapezoid whose left side is short and whose right side is long. In this way, by determining the luminance peak position within the projection area, the distortion of the image can be determined.
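The luminance peak detection can be sketched as a search for the brightest pixel within the projection area. The version below (Python with NumPy, using a simple bounding-box region and hypothetical values, whereas the embodiment works with the quadrilateral defined by the four detected corners) is only an illustration.

```python
import numpy as np

def brightness_peak(image, corners):
    # Search the bounding box of the projection-area corners (x, y)
    # for the pixel with the maximum luminance value.
    xs = [c[0] for c in corners]
    ys = [c[1] for c in corners]
    x0, x1 = min(xs), max(xs) + 1
    y0, y1 = min(ys), max(ys) + 1
    sub = image[y0:y1, x0:x1]
    iy, ix = np.unravel_index(np.argmax(sub), sub.shape)
    return (int(x0 + ix), int(y0 + iy))

# Hypothetical capture of the all-white image: the brightest spot lies
# left of centre, suggesting the optical axis is shifted to the left.
img = np.full((8, 8), 100, dtype=np.uint8)
img[3, 2] = 250
peak = brightness_peak(img, [(1, 1), (6, 1), (6, 6), (1, 6)])
```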
The image distortion correction amount calculation section 162 then calculates a correction amount corresponding to the distortion of the image on the basis of the luminance peak position within the projection area.
Furthermore, the image distortion correction section 112 in the input signal processing section 110 corrects the input image signal in accordance with this correction amount, thereby correcting the distortion of the image.
Through the above procedure, the projector 20 can correct the distortion of the image even when part of the projected image 12 is displayed outside the screen 10. The method of correcting the image distortion is, of course, not limited to this method. For example, the projector 20 may detect the pixel with the maximum luminance value in the captured image and correct the distortion of the image on the basis of the position of that pixel.
In addition, by adopting the pattern image shown in Fig. 6B, which has features not only at its center but also at its periphery, the projector 20 can identify the four corners of the projection area more accurately than when a pattern image with features only at its center is used.
For example, when identifying the points P1 and P2 shown in Fig. 7, the projector 20 could also identify nearby points at which the luminance value changes. However, when an approximate straight line is set from several points spaced at very narrow intervals, a one-pixel error in a point on which the line is based affects the approximate straight line much more strongly than when the line is set from points spaced at wide intervals.
In the present embodiment, because the projector 20 uses the reference points of the central block area 16 together with the reference points of the peripheral block areas 17-1 to 17-4, it can set the approximate straight lines from several widely spaced points, and can therefore identify the four corners of the projection area more accurately.
The projector 20 can thereby also avoid the influence of shading (shadowing) by the projector 20 or the sensor 60 and accurately determine the position of the entire projection area.
Furthermore, according to the present embodiment, the projector 20 does not search the entire difference image but searches only the necessary regions of the difference image, and can therefore detect the position of the projection area more simply and at higher speed.
In addition, when projecting the calibration images, the projector 20 generates the first imaging information by capturing the all-white image once with an automatic exposure setting, and can therefore generate the first imaging information at an exposure suited to the usage environment. The projector 20 then generates the second imaging information at the exposure used when the all-white image was captured, and can therefore generate the second imaging information at an exposure suited to generating the difference image.
In particular, by capturing the all-white image with an automatic exposure setting, the sensor 60 can make effective use of its dynamic range, compared with capturing at a fixed exposure, regardless of whether the screen 10 is affected by ambient light, whether the reflected projection light is too weak because the projection distance is too long or the reflectivity of the screen 10 is too low, or whether the reflected projection light is too strong because the projection distance is too short or the reflectivity of the screen 10 is too high.
Variation of the first embodiment
Although the first embodiment has been described above, application of the present invention is not limited to the embodiment described above.
For example, the order of the searches is arbitrary; the projector 20 may search the difference image in the horizontal direction first and, after detecting central reference positions and peripheral reference positions, search in the vertical direction on the basis of those positions.
In addition to correcting image distortion on the basis of the projection area information, the projector 20 may also perform various other kinds of processing that use the positional information of the projection area, such as correcting color non-uniformity within the projection area or detecting an indicated position within the projection area.
Furthermore, the projector 20 may detect the projection target area 18 first and then detect the projection area.
Fig. 13 is a functional block diagram of the projector 20 of a variation of the first embodiment. Fig. 14 is a schematic view showing the first-stage search method used when detecting the peripheral reference positions in the variation of the first embodiment, and Fig. 15 is a schematic view showing the second-stage search method used when detecting the peripheral reference positions in the variation of the first embodiment.
For example, as shown in Fig. 13, a projection target area boundary point detection section 159 is provided in the projection area detection section 150.
The projection target area boundary point detection section 159 takes the first captured image as the search target and, as shown in Fig. 14, performs edge detection outward from the central block area 16 on the pixels lying on lines offset several percent inward, toward the central block area 16, from the intersections of the lines through P3 and P4 with the lines y = y1 and y = y2. A general edge detection method is used. The points T, U, V, and W shown in Fig. 14 are thereby identified. The projection target area boundary point detection section 159 then outputs projection target area boundary point information indicating the positions of the points T, U, V, and W to the peripheral reference position detection section 156.
The peripheral reference position detection section 156 detects the position Y = yQ serving as the reference for the horizontal search on the upper side, on the basis of y1 (the Y coordinate of P1) and the smaller of yT (the Y coordinate of the point T) and yU (the Y coordinate of the point U). Similarly, it detects the position Y = yR serving as the reference for the horizontal search on the lower side, on the basis of y2 (the Y coordinate of P2) and the smaller of yV (the Y coordinate of the point V) and yW (the Y coordinate of the point W).
Starting from the intersections of the four straight lines X = xT, X = xU, Y = yQ, and Y = yR, the peripheral reference position detection section 156 searches horizontally along Y = yQ and Y = yR of the difference image for pixels with output, and thereby identifies the four points P5 to P8. It identifies the remaining four points P9 to P12 in the same way.
By such a method, the projection area detection section 150 can also identify the central reference positions of the central block area 16 and the peripheral reference positions of the peripheral block areas 17-1 to 17-4, and identify the positions of the four corners of the projection area.
In particular, according to this method, the projection area information generation section 158 can avoid the undesirable situation of detecting a peripheral reference position outside the projection target area, which can occur with the method described above, and can furthermore obtain the approximate straight lines with wider intervals between the three points used to obtain each line. The projector 20 can thereby detect the position of the projection area more accurately.
Second embodiment
The projector 20 of the second embodiment determines the positional relationship between the screen 10 and the projected image 12 and the shape of the projected image 12 on the basis of the imaging information, and adjusts the distortion and display position of the projected image 12.
By performing image processing different from conventional image processing, the projector 20 of the second embodiment generates positional information of the projection target area (the region corresponding to the screen 10) and the projection area (the region corresponding to the projected image 12) within the imaging area of the sensor 60 in a shorter time and correctly.
Next, the functional blocks of the projector 20 for implementing these functions will be described.
Fig. 16 is a functional block diagram of the projector 20 of the second embodiment.
The projector 20 includes an input signal processing section 110 that converts analog RGB signals (R1, G1, B1) from a PC (personal computer) or the like into digital RGB signals (R2, G2, B2); a color conversion section 120 that converts the digital RGB signals (R2, G2, B2) into digital RGB signals (R3, G3, B3) in order to correct the color and brightness of the image; an output signal processing section 130 that converts the digital RGB signals (R3, G3, B3) into analog RGB signals (R4, G4, B4); and an image projection section 190 that projects an image on the basis of the analog RGB signals.
The image projection section 190 includes a spatial light modulator 192, a drive section 194 that drives the spatial light modulator 192, a light source 196, and a lens 198. The drive section 194 drives the spatial light modulator 192 on the basis of the image signal from the output signal processing section 130, and the image projection section 190 projects the light from the light source 196 via the spatial light modulator 192 and the lens 198.
The projector 20 also includes a calibration information generation section 172 that generates calibration image information for displaying the first and second calibration images, a sensor 60 that captures the calibration images and generates imaging information, and an imaging information storage section 140 that temporarily stores the imaging information from the sensor 60.
The projector 20 also includes a projection area detection section 150 that detects the position of the projection area within the imaging plane (imaging area) of the sensor 60 on the basis of the imaging information. The projection area detection section 150 includes a difference image generation section 152 that generates a difference image between the first captured image and the second captured image, a central reference position detection section 154 that detects a plurality of central reference positions of the central block area included in the difference image, a peripheral reference position detection section 156 that detects a plurality of peripheral reference positions of the peripheral block areas included in the difference image, and a projection area information generation section 158 that generates projection area information indicating the position of the projection area on the basis of these reference positions.
The projector 20 further includes a projection target area detection section 180 that generates projection target area information relating to the position, within the imaging area of the sensor 60, of the projection target area corresponding to the screen 10. The projection target area detection section 180 includes a search range determination section 182 that sets the edge detection range, an edge detection section 184, a detection point evaluation section 188 that evaluates the edge detection points, and a projection target area information generation section 186 that provisionally detects the projection target area to generate provisional detection information and then generates the projection target area information.
The projector 20 also has image distortion correction means for correcting the distortion of the projected image 12. More specifically, as the image distortion correction means, the projector 20 has an image distortion correction amount calculation section 162 that calculates an image distortion correction amount on the basis of the projection area information and the projection target area information, and an image distortion correction section 112 that corrects the image signal in accordance with this correction amount.
The functions of the sections of the projector 20 described above may be implemented with, for example, the hardware shown in Fig. 4.
For example, the input signal processing section 110 may be implemented with the A/D converter 930 and the image processing circuit 970; the imaging information storage section 140 with the RAM 950; the projection area detection section 150 and the projection target area detection section 180 with the image processing circuit 970; the image distortion correction amount calculation section 162 with the CPU 910; the calibration information generation section 172 with the image processing circuit 970 and the RAM 950; the output signal processing section 130 with the D/A converter 940; the spatial light modulator 192 with the liquid crystal panel 920; and the drive section 194 with the ROM 960 storing a liquid crystal light valve driver that drives the liquid crystal panel 920.
These sections can exchange information with one another via the system bus 980.
Each of these sections may be implemented in hardware as a circuit, or in software as a program.
Furthermore, the functions of the projection target area information generation section 186 and the other sections may be implemented by causing a computer to read a program from an information storage medium 900 that stores a program for realizing those functions.
As the information storage medium 900, a CD-ROM, DVD-ROM, ROM, RAM, HDD, or the like may be used, and the program may be read either by a contact method or by a contactless method.
Instead of using the information storage medium 900, the functions described above may also be implemented by downloading a program for realizing them from a host device or the like via a transmission line.
The position detection processing for the projection target area using these sections will now be described. Because the position detection processing for the projection area is the same as in the first embodiment, its description is omitted. In the second embodiment, unlike Fig. 7 of the first embodiment, it is assumed that the projection target area 18 does not extend beyond the peripheral block areas 17-1 to 17-4.
Next, the flow of the position detection processing for the projection target area 18 will be described.
Fig. 17 is a flowchart showing the flow of the position detection processing for the projection target area 18 of the second embodiment.
In the processing of step S2 described above, the sensor 60 captures the first calibration image 13 at a low resolution (for example, VGA) lower than a predetermined resolution (for example, SVGA) and generates the first imaging information (step S2). The imaging information storage section 140 stores the first imaging information.
Then, with the first calibration image 13 still projected on the screen 10, the sensor 60 captures the first calibration image 13 at a high resolution (for example, XGA, SXGA, or UXGA) higher than the low resolution, and generates third imaging information (step S11). The imaging information storage section 140 stores the third imaging information.
The projection target area detection section 180 generates projection target area information indicating the position of the projection target area 18 on the basis of the first and third imaging information.
Fig. 18 is a schematic view showing the first-stage search method used when detecting the projection target area 18 in the second embodiment; Fig. 19 is a schematic view showing the second-stage search method; and Fig. 20 is a schematic view showing the third-stage search method. Fig. 21 is a schematic view showing the edge detection point evaluation processing of the second embodiment.
First, in order to determine the edge detection targets, the search range determination section 182 sets the four search auxiliary lines shown by broken lines in Fig. 18 within the imaging area, on the basis of the first imaging information and the coordinate information of the four corners ABCD of the central block area 16 described above (step S12). More specifically, taking the white captured image captured at low resolution as the target, the search range determination section 182 sets the search auxiliary lines at positions offset outward by p% from the coordinates of the four corners ABCD of the central block area 16 detected by the central reference position detection section 154.
For example, the first search auxiliary line is y = round[{max(yA, yB) + (yA - yD) * p/100} * a];
the second search auxiliary line is x = round[{max(xB, xC) + (xB - xA) * p/100} * a];
the third search auxiliary line is y = round[{min(yC, yD) - (yA - yD) * p/100} * a];
and the fourth search auxiliary line is x = round[{min(xA, xD) - (xB - xA) * p/100} * a].
Here, max, min, round, and a are, respectively, a function that returns the maximum value of its arguments, a function that returns the minimum value of its arguments, a function that rounds the first decimal place of its argument to return an integer, and a coefficient for converting coordinates at the high resolution into coordinates at the low resolution of the first captured image. When no resolution conversion is needed, a is unnecessary.
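A literal transcription of the four search auxiliary line formulas, under the assumption (made purely for this illustration) that A, B, C, and D are the top-left, top-right, bottom-right, and bottom-left corners of the central block area 16 in a coordinate system with y increasing upward:

```python
def search_aux_lines(A, B, C, D, p, a):
    # y and x positions of the four search auxiliary lines, offset
    # outward by p% of the block height/width and scaled by the
    # resolution-conversion coefficient a.
    r = lambda v: int(round(v))
    y1 = r((max(A[1], B[1]) + (A[1] - D[1]) * p / 100) * a)  # 1st line
    x2 = r((max(B[0], C[0]) + (B[0] - A[0]) * p / 100) * a)  # 2nd line
    y3 = r((min(C[1], D[1]) - (A[1] - D[1]) * p / 100) * a)  # 3rd line
    x4 = r((min(A[0], D[0]) - (B[0] - A[0]) * p / 100) * a)  # 4th line
    return y1, x2, y3, x4

# Hypothetical 10x10 centre block, p = 10% offset, no resolution change:
lines = search_aux_lines((0, 10), (10, 10), (10, 0), (0, 0), p=10, a=1)
```

With these values the four lines form a box 10% outside the centre block on every side.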
By setting the four search auxiliary lines in this way, the four intersections IJKL of the four search auxiliary lines are determined, as shown in Fig. 18.
The edge detection section 184 performs edge detection pixel by pixel, taking the first captured image as the search target, on the search lines that run along the search auxiliary lines outward from the intersections IJKL, outside the area IJKL (step S13). As shown in Fig. 18, the eight edge detection points MNOPQRST are thereby detected.
The edge detection section 184 then performs edge detection pixel by pixel from the line segment TO toward the boundary line of the projection target area 18, from the line segment NQ toward the boundary line, from the line segment PS toward the boundary line, and from the line segment RM toward the boundary line.
Here, the case of performing edge detection from the line segment TO toward the boundary line of the projection target area 18 is described as an example. As shown in Fig. 19, the edge detection section 184 sets the edge detection range on the line TO in, for example, the positive direction parallel to the Y axis: it sets seven search lines on the line segment IJ and two search lines on each of the line segments TI and JO. The region in which the seven search lines are set is called the inner search region, and the two regions in which two search lines each are set are called the outer search regions.
Then, by performing edge detection pixel by pixel on these search lines, the edge detection section 184 can detect up to eleven edge detection points on the straight line MN, or up to thirteen including the points M and N. The edge detection section 184 performs edge detection in the same way for the other line segments NQ, PS, and RM.
Moreover, rim detection portion 184, paired MN, the OP of rim detection point, QR, ST each among a side point in camera watch region 15, fail under the detected situation, for being used to detect the outside search domain in the outside of detected point of failing, the setting and the rim detection of retrieval line are not carried out in the zone of regarding the boundary line that does not have projection objects zone 18 as in this zone.In addition, rim detection portion 184, each centering both sides's of paired MN, the OP of rim detection point, QR, ST point is failed detected situation in camera watch region 15 under, regard as with boundary line not exist, fail not carry out in the side search domain of detected line segment and the outside search domain setting and the rim detection of retrieval line being used for retrieving with the parallel direction of the detected line segment projection objects adjacent thereto zone 18 of failing.
By carrying out these processing, rim detection portion 184 can omit the rim detection that has the lower zone of possibility to projection objects zone 18, can carry out processing more quickly.
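As a rough illustration of the pixel-by-pixel scan along a search line, the sketch below walks a list of pixel coordinates and stops at the first large brightness step; the grayscale-image representation and the step threshold are assumptions for illustration, not details of the embodiment:

```python
def scan_search_line(image, line_pixels, threshold=32):
    """Walk a search line pixel by pixel and return the first edge point.

    image       -- 2-D list of grayscale values (row-major)
    line_pixels -- (x, y) coordinates ordered from the start of the scan
                   toward the expected boundary of the projection target area
    threshold   -- minimum brightness jump treated as an edge (assumed value)
    """
    prev = None
    for (x, y) in line_pixels:
        value = image[y][x]
        if prev is not None and abs(value - prev) >= threshold:
            return (x, y)          # edge detected: large brightness step
        prev = value
    return None                    # no edge found along this search line


img = [[10, 10, 10, 200, 200]]
line = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0)]
print(scan_search_line(img, line))
```

A caller could run this once per search line and, mirroring the skipping rule above, simply not schedule the scan in search regions judged to contain no boundary line.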
The projection target area information generation section 186 sets linear approximation lines or linear approximation curves, shown for example by the dotted lines in Figure 20, based on the edge detection points detected by the edge detection section 184, and provisionally determines the projection target area 18 (step S14).
Next, the detection point evaluation section 188 evaluates each edge detection point by judging whether it deviates by a predetermined value or more from the linear approximation line or linear approximation curve set by the projection target area information generation section 186 (step S15).
For example, as shown in Figure 21, when illumination light 19 is included in the captured image, part of the illumination light 19 may be detected during edge detection. Even in such a case, the edge detection point T shown in Figure 21 deviates from the boundary line of the projection target area 18 by the predetermined value or more, and is therefore excluded from processing by the projection target area information generation section 186.
In this way, using only the edge detection points that were not excluded, the projection target area information generation section 186 detects the projection target area 18 more accurately.
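The fit-evaluate-refit cycle of steps S14 and S15 can be sketched as follows; the least-squares line fit and the deviation threshold are illustrative assumptions standing in for the approximation method and the predetermined value of the embodiment:

```python
def fit_line(points):
    """Least-squares fit y = a*x + b through (x, y) points."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def refit_without_outliers(points, max_dev=2.0):
    """Fit, drop points deviating more than max_dev from the line, refit."""
    a, b = fit_line(points)
    kept = [(x, y) for x, y in points
            if abs(y - (a * x + b)) <= max_dev]   # deviation test
    return fit_line(kept), kept


# Ten points on y = 2x + 1 plus one outlier (like point T in Figure 21):
pts = [(x, 2 * x + 1) for x in range(10)] + [(5, 30)]
(a, b), kept = refit_without_outliers(pts, max_dev=3.0)
```

An outlying point such as T in Figure 21 is dropped by the deviation test and no longer influences the second fit.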
More specifically, the edge detection section 184 performs edge detection on the pixels neighboring the edge detection points that were not excluded, based on the high-resolution third imaging information (step S16), and outputs the resulting edge detection information to the projection target area information generation section 186.
Based on this edge detection information, the projection target area information generation section 186 sets linear approximation lines or linear approximation curves once more and thereby determines the projection target area 18 (step S17). It then generates projection target area information indicating the positions of the four corners of the projection target area 18.
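Once the four boundary approximation lines are fixed, the four corners of step S17 follow from the intersections of adjacent lines; a minimal sketch, with each line written as a*x + b*y = c (a representation chosen here purely for convenience):

```python
def intersect(l1, l2):
    """Intersection of two lines a*x + b*y = c given as (a, b, c) triples."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    if det == 0:
        return None                      # parallel lines: no corner
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return (x, y)

def corners(top, bottom, left, right):
    """Four corners of the area bounded by the four approximation lines."""
    return [intersect(top, left), intersect(top, right),
            intersect(bottom, right), intersect(bottom, left)]


# Axis-aligned example: y = 0, y = 10, x = 0, x = 20
quad = corners((0, 1, 0), (0, 1, 10), (1, 0, 0), (1, 0, 20))
```

The same intersection step applies unchanged when the fitted boundaries are not axis-aligned.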
As described above, according to the present embodiment, the projector 20 first provisionally detects the projection target area 18 from low-resolution imaging information and then detects it near its boundary line from high-resolution imaging information, so it can generate the position information of the projection target area 18 correctly in a shorter time. The projector 20 thereby reduces the computational load of the image processing system as a whole and performs image processing quickly and with a light load.
In addition, according to the present embodiment, by performing edge detection with the region subject to edge detection narrowed down, the projector 20 can generate the position information of the projection target area 18 in a shorter time.
Furthermore, by removing edge detection points that deviate from the linear approximation line before continuing, the projector 20 can reduce the influence of noise and the like and generate the position information of the projection target area 18 more correctly. Also, by using a calibration image free of high-frequency components, such as an all-white image, for edge detection, the projector 20 can avoid false edge detections caused by the projected image 12 and achieve high-precision detection.
The projector 20 can thus effectively correct distortion of the projected image 12, adjust its position, and detect positions indicated within the projected image 12 by a laser pointer or the like.
For example, in the present embodiment, the image distortion correction calculation section 162 grasps the positional relationship between the screen 10 and the projected image 12 from the projection area information supplied by the projection area information generation section 158 and the projection target area information supplied by the projection target area information generation section 186, and, when correcting the distortion of the projected image 12, computes an image distortion correction amount so that the projected image 12 attains the desired aspect ratio.
The image distortion correction section 112 then corrects the image signals (R1, G1, B1) according to this correction amount. The projector 20 can thereby project an undistorted image that keeps the desired aspect ratio.
Of course, the method of correcting image distortion is not limited to this one. For example, the projector 20 may detect the pixel with the maximum brightness value in the captured image and correct the image distortion based on that pixel's position.
In addition, by using a pattern image that, like the one shown in Fig. 6B, has features not only in its center but also in its periphery, the projector 20 can discriminate the four corners of the projection area more accurately than when using a pattern image with features only in the center.
For example, the projector 20 may also discriminate points whose nearby brightness values change, such as points P1 and P2 shown in Figure 7. However, when approximation lines are set from multiple points with very narrow spacing, a one-pixel error in a point serving as the basis of an approximation line affects the line far more than when the approximation lines are set from multiple widely spaced points.
In the present embodiment, because the projector 20 uses the reference points of the central block area 16 and of the peripheral block areas 17-1 to 17-4, it can set approximation lines from multiple widely spaced points and can therefore discriminate the four corners of the projection area more accurately.
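The sensitivity argument can be made concrete with a small numerical check: a one-pixel error in one endpoint changes the slope of a two-point line by 1/spacing, so widening the spacing from 5 to 100 pixels shrinks the slope error twentyfold. The spacings here are illustrative, not values from the embodiment:

```python
def slope(p, q):
    """Slope of the line through two points."""
    (x1, y1), (x2, y2) = p, q
    return (y2 - y1) / (x2 - x1)

# Slope error caused by a one-pixel error on the second point:
narrow = abs(slope((0, 0), (5, 1)) - slope((0, 0), (5, 0)))    # 5 px apart
wide = abs(slope((0, 0), (100, 1)) - slope((0, 0), (100, 0)))  # 100 px apart
print(narrow, wide)
```

The narrowly spaced pair yields a slope error of 0.2, the widely spaced pair only 0.01, which is why the widely spaced reference points give a more stable approximation line.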
Moreover, the projector 20 can thereby avoid the influence of shading (occlusion) by the projector 20 or the sensor 60 and accurately grasp the position of the entire projection area.
In addition, according to the present embodiment, the projector 20 searches only the necessary regions of the difference image rather than the entire difference image, so it can detect the position of the projection area more simply and at higher speed.
Also, the first imaging information is generated by capturing the all-white image once with automatic exposure while the calibration image is projected, so it can be generated with an exposure suited to the usage environment. By generating the second imaging information at the exposure used when the all-white image was captured, the projector 20 can generate the second imaging information with an exposure suited to generating the difference image.
In particular, by capturing the all-white image with automatic exposure, the projector 20 can make full use of the dynamic range of the sensor 60, compared with capturing at a fixed exposure, whether the screen 10 is affected by ambient light, whether the reflected projection light is too weak because the projection distance is too long or the reflectance of the screen 10 is too low, or whether the reflected projection light is too strong because the projection distance is too short or the reflectance of the screen 10 is too high.
Modification of the Second Embodiment
Although preferred embodiments of the present invention have been described above, application of the present invention is not limited to the embodiments described.
For example, when generating the first and third imaging information, the sensor 60 may capture the first calibration image 13 at high resolution and convert that high-resolution imaging information into low-resolution imaging information by image processing, thereby generating both the first and the third imaging information from a single capture.
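The conversion from one high-resolution capture to low-resolution imaging information can be done by block averaging; a minimal sketch under the assumption that the image is a 2-D list of grayscale values (the embodiment does not specify the conversion method):

```python
def downsample(image, factor=2):
    """Average non-overlapping factor x factor blocks of a grayscale image."""
    h, w = len(image), len(image[0])
    out = []
    for by in range(0, h - h % factor, factor):
        row = []
        for bx in range(0, w - w % factor, factor):
            block = [image[by + dy][bx + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out


low_res = downsample([[0, 2], [4, 6]])  # one capture serves both resolutions
```

The original high-resolution data remains available as the third imaging information, so one capture serves both the provisional and the fine detection stages.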
In addition, the angle-of-view adjustment function (zoom function) of the projector 20 may be used to adjust the position and size of the projected image 12. This allows the projection target area to be detected more reliably even in a dark room.
The search order is also arbitrary; for example, the projector 20 may first search the difference image in the horizontal direction and, after detecting the central reference positions and peripheral reference positions, search in the vertical direction based on those positions.
Besides image distortion correction based on the projection area information, the projector 20 may also perform various other processes that use the position information of the projection area, such as correcting color non-uniformity within the projection area or detecting indicated positions within the projection area, based on the projection area information.
Furthermore, the projector 20 may detect the projection area after detecting the projection target area 18. For example, a projection target area boundary point detection section may be provided that detects multiple boundary points of the projection target area 18 based on the first imaging information and the central reference positions. The peripheral reference position determination section 156 may then be configured to detect, based on these boundary points, peripheral reference positions located closer to the boundary points than the central reference positions are.
Figure 22 is a schematic diagram showing the first-stage search method used when detecting peripheral reference positions in the modification of the second embodiment. Figure 23 is a schematic diagram showing the second-stage search method used when detecting peripheral reference positions in the modification.
For example, as shown in Figure 22, the projection target area boundary point detection section may, based on the central reference positions P1 to P4 of the central block area 16, perform edge detection outward pixel by pixel on search auxiliary lines offset inward by several percent from points P3 and P4, starting from points y1 and y2, the Y coordinates of points P1 and P2. Four edge detection points T, U, V, and W are thereby detected.
The central reference position detection section 156 detects the position Y = yQ, the reference for the horizontal search on the upper side, from the smallest of yT (the Y coordinate of point T), yU (the Y coordinate of point U), and y1 (the Y coordinate of P1). Likewise, the edge detection section 184 detects the position Y = yR, the reference for the horizontal search on the lower side, from the smallest of yV (the Y coordinate of point V), yW (the Y coordinate of point W), and y2 (the Y coordinate of P2).
The peripheral reference position detection section 156 searches horizontally along Y = yQ and Y = yR of the difference image, starting from the intersections of the four straight lines X = xT, X = xU, Y = yQ, and Y = yR, and detects pixels with output, thereby discriminating four points P5 to P8. Using the same method, it discriminates the remaining four points P9 to P12.
By such a method, the projector 20 may discriminate the central reference positions of the central block area 16 and the peripheral reference positions of the peripheral block areas 17-1 to 17-4, and thereby discriminate the positions of the four corners of the projection area.
In particular, with this method, compared with the method described earlier, the projection area information generation section 158 can obtain approximation lines while keeping the spacing of the three points used for each approximation line larger. The projector 20 can therefore detect the position of the projection area more accurately.
Moreover the number of central reference position and the number of peripheral reference position are arbitrarily, are not limited to the foregoing description.
In addition, the 1st calibration chart as the 13 and the 2nd calibration chart as 14 figure, be not limited to the example shown in Fig. 6 A and Fig. 6 B, as long as becoming formation central block zone 16 under the state of difference image at least, particularly, preferably becoming formation central block zone 16 and peripheral piece zone 17-1~17-4 under the state of difference image.For example, also can adopt the 1st calibration chart that contains central block zone 16 as 13 and the 2nd calibration chart that contains peripheral piece zone 17-1~17-4 as 14.
In addition, the shape of calibration chart picture, central block zone 16 and peripheral piece zone 17-1~17-4 is not limited to rectangle, for example, also can adopt rectangle shapes in addition such as circle.Certainly, the shape of calibration integral image and the shape in central block zone 16 are not limited to similar figures, so long as the correspondence of both shapes shape as can be known gets final product.And then the number of peripheral piece zone 17-1~17-4 also is arbitrarily.
In addition, even if under the situation of projected image on the projection objects things such as the blackboard except screen 10, blank, the present invention also is effective.
In addition, for example, in the above-described embodiments, though the example that image processing system is installed in projector 20 is illustrated, cathode ray tube) etc. but except projector 20, can also be installed in CRT (CathodeRay Tube: on the image display device beyond the projector 20.In addition, as projector 20, except liquid crystal projector, for example, can also use and adopt DMD (DigitalMicromirror Device: projector digital micromirror device) etc.Moreover DMD is the trade mark of TIX.
In addition, the function of above-mentioned projector 20 for example, can be installed with projector's monomer, also can be with (for example, with projector and PC dispersion treatment) installation dispersedly on a plurality of processing unit.
And then, in the above-described embodiments, though be the structure that transducer 60 is built in projector 20, also can be with transducer 60 as dividing other independent device mutually with projector 20 and constituting.

Claims (19)

1. An image processing system comprising: an image projection device that projects a first calibration image onto a projection target object; an imaging device that captures the projected first calibration image and generates first imaging information at a low resolution equal to or below a predetermined resolution and third imaging information at a high resolution equal to or above the low resolution; and a projection target area detection device that generates, based on the first and third imaging information, projection target area information related to the position of a projection target area corresponding to the projection target object within the imaging area of the imaging device; wherein the projection target area detection device includes: an edge detection device that performs edge detection based on the first imaging information to generate first edge detection information and performs edge detection based on the third imaging information to generate third edge detection information; and a projection target area information generation device that provisionally detects the projection target area based on the first edge detection information to generate provisional detection information and generates the projection target area information based on the third edge detection information; and wherein the edge detection device generates the third edge detection information by performing edge detection, based on the provisional detection information, on a group of pixels near the boundary line of the provisionally detected projection target area.

2. The image processing system according to claim 1, wherein the edge detection device detects edges at multiple locations within a first captured image based on the first imaging information to generate the first edge detection information; and the projection target area information generation device generates the provisional detection information by setting a linear approximation line or a linear approximation curve based on position information of the multiple locations obtained from the first edge detection information.

3. The image processing system according to claim 2, wherein the projection target area detection device includes a detection point evaluation device that evaluates multiple edge detection points; and the detection point evaluation device judges whether the multiple edge detection points deviate from the linear approximation line or the linear approximation curve by a predetermined value or more, and controls the projection target area information generation device so as to remove the detection points deviating by the predetermined value or more and then set the linear approximation line or the linear approximation curve again.

4. The image processing system according to any one of claims 1 to 3, comprising: a differential image generation device that generates a differential image; and a central reference position detection device that detects, based on the differential image, multiple central reference positions of a predetermined central block area in the imaging area; wherein the projection target area detection device includes a search range determination device that sets the edge detection range of the edge detection device; the image projection device projects a second calibration image; the imaging device captures the projected second calibration image to generate second imaging information; the differential image generation device generates the differential image based on the first and second imaging information; the search range determination device sets the edge detection range outside the central block area; the first calibration image is a monochrome calibration image; and the second calibration image includes the central block area, which is smaller than the second calibration image and located near the center of the second calibration image.

5. The image processing system according to claim 4, wherein the second calibration image consists of the central block area, peripheral block areas located around the central block area, and a background area other than the central block area and the peripheral block areas; each pixel in the central block area and the peripheral block areas and each pixel in the background area have different index values; and the system includes a peripheral reference position detection device that detects, based on the central reference positions, multiple peripheral reference positions of the peripheral block areas in the imaging area, and a projection area information generation device that generates, based on the central reference positions and the peripheral reference positions, projection area information related to the position of the projection area in the imaging area.

6. The image processing system according to claim 5, wherein the projection area information generation device generates the projection area information by setting multiple approximation lines or approximation curves based on the central reference positions and the peripheral reference positions, thereby grasping the shapes or arrangement of the central block area and the peripheral block areas.

7. The image processing system according to claim 6, wherein the projection area and the central block area are rectangular areas; and the projection area information generation device grasps the positions of the four corners of the central block area by deriving the intersections of the multiple approximation lines or of the multiple approximation curves, and generates, based on those four corner positions, the projection area information indicating the positions of the four corners of the projection area.

8. The image processing system according to claim 5, comprising a projection target area boundary point detection device that detects multiple boundary points of the projection target area based on the first imaging information and the central reference positions; wherein the peripheral reference position detection device detects the peripheral reference positions based on the boundary points.

9. The image processing system according to claim 5, comprising an image distortion correction device that grasps the distortion of the image projected by the image projection device based on the projection target area information and the projection area information, and corrects an image signal so that the distortion is corrected; wherein the image projection device projects an image based on the image signal corrected by the image distortion correction device.

10. A projector comprising: an image projection device that projects a first calibration image onto a projection target object; an imaging device that captures the projected first calibration image and generates first imaging information at a low resolution equal to or below a predetermined resolution and third imaging information at a high resolution equal to or above the low resolution; and a projection target area detection device that generates, based on the first and third imaging information, projection target area information related to the position of a projection target area corresponding to the projection target object within the imaging area of the imaging device; wherein the projection target area detection device includes: an edge detection device that performs edge detection based on the first imaging information to generate first edge detection information and performs edge detection based on the third imaging information to generate third edge detection information; and a projection target area information generation device that provisionally detects the projection target area based on the first edge detection information to generate provisional detection information and generates the projection target area information based on the third edge detection information; and wherein the edge detection device generates the third edge detection information by performing edge detection, based on the provisional detection information, on a group of pixels near the boundary line of the provisionally detected projection target area.

11. An image processing method comprising: projecting a first calibration image onto a projection target object; capturing the projected first calibration image to generate first imaging information at a low resolution equal to or below a predetermined resolution and third imaging information at a high resolution equal to or above the low resolution; performing edge detection based on the first imaging information to generate first edge detection information; provisionally detecting, based on the first edge detection information, a projection target area corresponding to the projection target object within the imaging area of the imaging unit, to generate provisional detection information; generating third edge detection information by performing edge detection, based on the provisional detection information and the third imaging information, on a group of pixels near the boundary line of the provisionally detected projection target area; and detecting the projection target area based on the third edge detection information to generate projection target area information related to the position of the projection target area.

12. The image processing method according to claim 11, wherein edges at multiple locations within a first captured image based on the first imaging information are detected to generate the first edge detection information; and the provisional detection information is generated by setting a linear approximation line or a linear approximation curve based on position information of the multiple locations obtained from the first edge detection information.

13. The image processing method according to claim 12, wherein it is judged whether multiple edge detection points deviate from the linear approximation line or the linear approximation curve by a predetermined value or more; and the detection points deviating by the predetermined value or more are removed and the linear approximation line or the linear approximation curve is then set again.

14. The image processing method according to any one of claims 11 to 13, comprising: projecting a second calibration image; capturing the projected second calibration image with the imaging unit to generate second imaging information; generating a differential image based on the first and second imaging information; detecting, based on the differential image, multiple central reference positions of a predetermined central block area in the imaging area; and setting an edge detection range outside the central block area; wherein the first calibration image is a monochrome calibration image; and the second calibration image includes the central block area, which is smaller than the second calibration image and located near the center of the second calibration image.

15. The image processing method according to claim 14, wherein the second calibration image consists of the central block area, peripheral block areas located around the central block area, and a background area other than the central block area and the peripheral block areas; each pixel in the central block area and the peripheral block areas and each pixel in the background area have different index values; multiple peripheral reference positions of the peripheral block areas in the imaging area are detected based on the central reference positions; and projection area information related to the position of the projection area in the imaging area is generated based on the central reference positions and the peripheral reference positions.

16. The image processing method according to claim 15, wherein the projection area information is generated by setting multiple approximation lines or approximation curves based on the central reference positions and the peripheral reference positions, thereby grasping the shapes or arrangement of the central block area and the peripheral block areas.

17. The image processing method according to claim 16, wherein the projection area and the central block area are rectangular areas; and the positions of the four corners of the central block area are grasped by deriving the intersections of the multiple approximation lines or of the multiple approximation curves, and the projection area information indicating the positions of the four corners of the projection area is generated based on those four corner positions.

18. The image processing method according to claim 15, wherein multiple boundary points of the projection target area are detected based on the first imaging information and the central reference positions; and the peripheral reference positions are detected based on the boundary points.

19. The image processing method according to claim 15, wherein the distortion of the projected image is grasped based on the projection target area information and the projection area information, and an image signal is corrected so that the distortion is corrected.
CNB2007101626370A 2004-03-29 2005-03-29 Image processing system, projector and image processing method Expired - Fee Related CN100562082C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2004095936A JP3882927B2 (en) 2004-03-29 2004-03-29 Image processing system, projector, and image processing method
JP95936/2004 2004-03-29
JP95937/2004 2004-03-29

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CNB2005100597422A Division CN100380945C (en) 2004-03-29 2005-03-29 Image processing system, projector and image processing method

Publications (2)

Publication Number Publication Date
CN101137023A true CN101137023A (en) 2008-03-05
CN100562082C CN100562082C (en) 2009-11-18

Family

ID=35184512

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2007101626370A Expired - Fee Related CN100562082C (en) 2004-03-29 2005-03-29 Image processing system, projector and image processing method

Country Status (2)

Country Link
JP (1) JP3882927B2 (en)
CN (1) CN100562082C (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5535431B2 (en) * 2006-08-11 2014-07-02 ジーイーオー セミコンダクター インコーポレイテッド System and method for automatic calibration and correction of display shape and color
JP5509663B2 (en) * 2009-04-15 2014-06-04 セイコーエプソン株式会社 Projector and control method thereof
JP5736535B2 (en) * 2009-07-31 2015-06-17 パナソニックIpマネジメント株式会社 Projection-type image display device and image adjustment method
JP6554873B2 (en) * 2015-03-31 2019-08-07 株式会社リコー Projection system, image processing apparatus, calibration method, and program

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999031877A1 (en) * 1997-12-12 1999-06-24 Hitachi, Ltd. Multi-projection image display device
JP2000241874A (en) * 1999-02-19 2000-09-08 Nec Corp Method and device for automatically adjusting screen position for projector
JP2004260785A (en) * 2002-07-23 2004-09-16 Nec Viewtechnology Ltd Projector with distortion correction function

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111429366A (en) * 2020-03-03 2020-07-17 浙江大学 Single-frame low-light image enhancement method based on luminance transfer function
CN111429366B (en) * 2020-03-03 2022-05-17 浙江大学 Single-frame low-light image enhancement method based on luminance transfer function

Also Published As

Publication number Publication date
JP2005286573A (en) 2005-10-13
JP3882927B2 (en) 2007-02-21
CN100562082C (en) 2009-11-18

Similar Documents

Publication Publication Date Title
CN100380945C (en) Image processing system, projector and image processing method
CN100435551C (en) Image processing system, projector and image processing method
JP4055010B2 (en) Image processing system, projector, program, information storage medium, and image processing method
JP3871061B2 (en) Image processing system, projector, program, information storage medium, and image processing method
JP3925521B2 (en) Keystone correction using part of the screen edge
CN100459676C (en) Image processing system, projector and image processing method
US6932480B2 (en) Image processing system, projector, program, information storage medium and image processing method
CN100426126C (en) Image processing system, projector and image processing method
US8328366B2 (en) Projector, computer program product, and exposure adjusting method
CN100562082C (en) Image processing system, projector and image processing method
JP3882928B2 (en) Image processing system, projector, and image processing method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20091118

Termination date: 20210329